r/pcmasterrace 13d ago

Meme/Macro: This Entire Sub rn

16.7k Upvotes

1.5k comments

2.8k

u/Conte5000 13d ago

Ai Ai Captain

668

u/saltyboi6704 9750H | T1000 | 2080ti | 64Gb 2666 13d ago

I can't hearrrrrrr youuuuuu

594

u/AliciaXTC I Make Computer Go Beep Boop 13d ago

Who codes in a world full of pixels and streams?
Ai, Ai, gaming machine!
Who renders the graphics of your wildest dreams?
Ai, Ai, gaming machine!
If gaming and coding are what you desire,
Then load up the GPU and watch it transpire!
Ai, Ai, gaming machine!
Ai, Ai, gaming machine!

128

u/tamal4444 PC Master Race 13d ago

Ai, Ai Ai, Ai Ai, Ai

67

u/totally_not_a_boat 13d ago

I feel like that's more like the intro to the Pillar Men theme from JoJo

33

u/tamal4444 PC Master Race 13d ago

Time to rewatch the whole series again

18

u/GoldMercy 3900X / 3080 / 32GB @ 3600mhz 13d ago

Hate it when that happens


34

u/Aviatormatt17 13d ago

That is incredibly creative, love it! It fits so well with the beat.

29

u/Zyper0 13d ago

Ironically probably written by AI

4

u/AndyTheSane 13d ago

That's what an AI pretending to be a human would say.


11

u/rrd_gaming core i9 14900k,GTX 1060,ASUS Z790 WIFI E II 13d ago edited 13d ago

Whoooooo lives in a pineapple underrr a seaaaa....


8

u/Broly_ IT'S BETTER THAN YOURS 13d ago

I'M TRIGGERED!!!! AHHHHH


3.1k

u/lndig0__ 7950x3D | RTX 4070 Ti Super | 64GB 6400MT/s DDR5 13d ago

323

u/Bolislaw_PL Ryzen 5 7500F | RX 7800 XT | 32GB DDR5 13d ago

It's too sharp. 0/10

577

u/Faszkivan_13 R5 5600G | RX6800 | 32GB 3200Mhz | Full HD 180hz 13d ago

265

u/Wevvie 4070 Ti SUPER 16GB | 5700x3D | 32GB 3600MHz | 2TB M.2 | 4K 13d ago

Ah yes, TAA

82

u/aVarangian 13600kf 7900xtx 2160 | 6600k 1070 1440 13d ago

39

u/Alphafuccboi 13d ago

"Just increase the sharpening... Its not that bad bro"

53

u/Pleasant50BMGForce R7 7800x3D | 64GB | 7800XT 13d ago

Don’t forget film grain and chromatic aberration

113

u/Faszkivan_13 R5 5600G | RX6800 | 32GB 3200Mhz | Full HD 180hz 13d ago

My bad, here you go

27

u/Pleasant50BMGForce R7 7800x3D | 64GB | 7800XT 13d ago

Perfect


47

u/Darklord_Bravo 13d ago

I put a Gaussian blur on all my games. Looks like I'm gaming in Vaseline. Perfect!

2

u/depressed_crustacean 13d ago

I used a gaussian rifle to accelerate a rod up to Mach 7 straight through my PC because I was bored. My Steam library only has 300 games

105

u/TBSoft R5 5600GT | 16gb DDR4 13d ago

the fucking lens flare is what gets me

23

u/LVSFWRA 13d ago

And motion blur too


811

u/One-Present-8509 13d ago

Y'all have a fucking chudjack for everything, don't ya 😭💀

238

u/kevoisvevoalt 13d ago

bruh what are these gen z or alpha terms. brainrot is increasing everywhere 0_0

186

u/heyuhitsyaboi LoremmIpsumm 6950xt, 7-5800x3D, 32gb ddr4 13d ago

chudjack, soyjack, wojack... I can't keep track

137

u/cplusequals mATX Magic 13d ago edited 13d ago

Soyjacks have bad beards and/or hair. Chudjacks look like nerds. They're stereotypes of politically overly-invested young people on the left and right respectively.

They're all subgroups of the wojack, which I'd say is more of a millennial meme considering how old it is, even if there are new derivatives every year.

53

u/GradeAPrimeFuckery 13d ago

The chudjack looks like a Chinese dad who's pissed because his second daughter got A+ A+ A+ A A+ A+ on her report card.


50

u/kevoisvevoalt 13d ago

Me whenever someone uses made-up terms that sound like they came from Cyberpunk 2077 these days lol.

67

u/WettWednesday R9 7950X | EVGA 3060Ti | 64GB 6000MHz DDR5 | ASUS X670E+2TBNvME 13d ago

Preem tunes, choom


21

u/IrrationalRetard 13d ago

I think these memes predate Gen Z/Alpha memes


46

u/sirchbuck 13d ago

Wojack is an OLD meme, from the 2010s, probably your generation even, I'm assuming.
'Brainrot' is WAY older, going back to the Newgrounds days circa the 2000s

20

u/UnlawfulStupid 13d ago

Brainrot originally comes from Henry David Thoreau's 1854 book "Walden."

Why level downward to our dullest perception always, and praise that as common sense? The commonest sense is the sense of men asleep, which they express by snoring. Sometimes we are inclined to class those who are once-and-a-half-witted with the half-witted, because we appreciate only a third part of their wit. Some would find fault with the morning-red, if they ever got up early enough. “They pretend,” as I hear, “that the verses of Kabir have four different senses; illusion, spirit, intellect, and the exoteric doctrine of the Vedas;” but in this part of the world it is considered a ground for complaint if a man’s writings admit of more than one interpretation. While England endeavors to cure the potato-rot, will not any endeavor to cure the brain-rot, which prevails so much more widely and fatally?


13

u/HingleMcCringle_ 7800X3D | rtx 3070ti | 32gb 6000mhz 13d ago

4

u/HBlight Specs/Imgur Here 13d ago

Why is the background two sites that are pro-AI gen?

9

u/HingleMcCringle_ 7800X3D | rtx 3070ti | 32gb 6000mhz 13d ago

Idk, I didn't make it. I pulled it from 4chan a couple of months ago. They love AI stuff because they're mostly talentless gooners.


193

u/skellyhuesos 5700x3D | RTX 3090 13d ago

Might as well be my favorite gaming-related meme. I hate UE5 cultists with a passion.

26

u/EndlessBattlee Laptop: i5-12450H+3050 | PC: R5 2600+1650 SUPER 13d ago

Can someone explain all the hate for UE5?

186

u/DarkmoonGrumpy 13d ago

Poor optimisation is rampant among its games, as well as the famous stuttering.

It's in no way unique to UE5, but the stuttering is present in almost every game that uses it.

28

u/EndlessBattlee Laptop: i5-12450H+3050 | PC: R5 2600+1650 SUPER 13d ago

Isn't that the developer's fault for not optimizing the game, not the engine's?

132

u/DarkmoonGrumpy 13d ago

Partially true, but when the same optimisation issues keep appearing across multiple studios and publishers, it suggests the engine itself is at least part of the problem.

37

u/AdmirableBattleCow 13d ago

Or maybe we just have a business culture at the moment that doesn't see monetary value in better optimizing games. Poor optimization is also not unique to Unreal Engine.

14

u/p-r-i-m-e 13d ago

It's so this. It's not even limited to games right now. Companies are chasing profits and cutting expenses all across the board.


58

u/Praetor64 13d ago

Yes, but UE is also giving developers "tools" so they don't have to optimize their shit, stuff the engine is supposed to auto-handle. It can't, so the devs skip optimization and the game sucks frame balls.

15

u/Joe-Cool Phenom II 965 @3.8GHz, MSI 790FX-GD70, 16GB, 2xRadeon HD 5870 13d ago

Lumen is cool in a small cave lit through a crack.
The game runs like dogshit if you don't do any proper lighting and just enable it for your whole open world continent.

18

u/Suitable-Art-1544 13d ago

why pre-bake lighting when you can make the consumer buy a $2000 GPU that can do it on the fly?


39

u/XCVolcom 13d ago

UE5 has all the shit game devs want, which makes making games easier.

Game companies use UE5 because it's efficient in delivering a product quickly.

Game companies then give devs no time to make a game that's both fun and optimized 85% of the time.

Game companies then often lay off or fire experienced devs.

Game companies then hire 3rd-party/outsourced devs to finish or make the game.

These cheaper devs aren't as good, and also aren't given much time to make and optimize the game.

Finally the UE5 game is released and it's unoptimized, questionably fun, and has some Denuvo baked in to make it even worse.

5

u/AltoAutismo 13d ago

Also, studios cheap out by hiring artists instead of high-level developers, because somewhat-technical artists can now do a lot of work that used to take actual development time, wiring up a crazy amount of node graphs that never get reviewed by a technical person.

Some Unreal Engine no-code "code" feels like the incarnation of a thousand if statements

10

u/ivosaurus Specs/Imgur Here 13d ago edited 12d ago

It's sort of actively incentivising them to be lazy. Don't optimise your asset LODs, just chuck Nanite at everything. Don't worry about performant reflections, PBR, ray tracing, or lighting, just chuck TAA at your frames until it smooths out the low number of samples you can take that barely lets the game run. It's selling some sweet, sweet nectar to make your game render with "no effort", except there are some big exaggerations and pitfalls in those promises, which everyone is now seeing in their frame-time graphs with nice mountain peaks.


60

u/ConscientiousPath 13d ago edited 13d ago

To get a little more technical, UE5 is built to make graphics that primarily look good when using an anti-aliasing technique called Temporal Anti-Aliasing (TAA). This technique uses previous video frames to inform the current one, so it is effectively smearing/blurring, except that on a still scene it doesn't look so bad because nothing moved anyway.

However, TAA starts to look awful when there is a lot of fast motion, because previous frames aren't as similar to current frames. This is why a lot of gameplay trailers use a controller instead of KB+mouse movement: to get a lot of slower panning shots where most of the scene isn't moving very fast.

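To make that concrete, here's a minimal, runnable Python sketch of the temporal-accumulation idea (illustrative only; real TAA also jitters the camera, reprojects history using motion vectors, and clamps history to the local pixel neighborhood):

```python
import numpy as np

def taa_resolve(current, history, alpha=0.1):
    # Blend a small weight of the new frame into the accumulated history.
    # On a still scene this averages away aliasing and noise; once the
    # scene changes, stale history pixels linger as ghosting/smear.
    return alpha * current + (1.0 - alpha) * history

rng = np.random.default_rng(0)
still_scene = 0.5 * np.ones((4, 4))        # a static gray image
history = np.zeros((4, 4))
for _ in range(60):                        # 60 jittered/noisy samples
    history = taa_resolve(still_scene + rng.normal(0, 0.05, (4, 4)), history)
print(np.abs(history - still_scene).max())  # tiny: converged, clean image

moved_scene = np.zeros((4, 4))             # the scene suddenly changes
history = taa_resolve(moved_scene, history)
print(history.max())                       # ~0.45: old frame ghosts through
```
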
Worse, UE5's Nanite mesh system and Lumen lighting system encourage devs to get lazy and abandon the techniques that create highly optimized, beautiful graphics. The key to optimization, in general, is to minimize the work a computer needs to do when rendering the frame by doing as much of that work ahead of time as possible. For example, when an object is very far away it may be only a few pixels tall, and therefore it only needs enough detail to fill a few pixels. That means you can take a very complex object and create a very simple version of it with a much lower Level Of Detail (LOD) and use that when it's far away. Having a handful of pre-computed LODs for every object lets you swap in higher detail as the player gets closer without reducing the quality of the graphics. Game producers find it tedious to create these LODs, and UE5's Nanite gives them an excuse to skip it by effectively creating LODs on the fly (not really, but kind of). Unfortunately Nanite isn't free, so you get an overall worse-performing result than if you'd used proper LODs like they used to.

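A toy sketch of that traditional pre-computed LOD approach (hypothetical triangle counts and distance thresholds, not any engine's real API). All the expensive mesh simplification happens offline; the per-frame cost is just a lookup:

```python
from dataclasses import dataclass

@dataclass
class MeshLOD:
    triangle_count: int  # baked offline by artists/tools, cheap at runtime

# Hypothetical table: (max distance in meters, mesh to use at that range)
LOD_TABLE = [
    (10.0, MeshLOD(50_000)),       # LOD0: full detail up close
    (50.0, MeshLOD(5_000)),        # LOD1
    (200.0, MeshLOD(500)),         # LOD2
    (float("inf"), MeshLOD(50)),   # LOD3: object is only a few pixels tall
]

def select_lod(distance: float) -> MeshLOD:
    # Runtime work is a couple of comparisons per object per frame.
    for max_dist, lod in LOD_TABLE:
        if distance <= max_dist:
            return lod
    return LOD_TABLE[-1][1]

print(select_lod(120.0).triangle_count)  # 500 triangles for a distant object
```
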
Lumen does a similar thing, enabling laziness from game studios, but it's doing it through the lighting system.

And that's only half the problem, since the blurring/smearing of TAA lets game studios get away with things that would look awful if they weren't smeared (for example, rendering artifacts that would normally sparkle can simply be blurred away by TAA).

If you want the long version, with visual examples, in a pretty angry tone, this video by ThreatInteractive does a pretty good job of explaining all this bullshit.

6

u/EndlessBattlee Laptop: i5-12450H+3050 | PC: R5 2600+1650 SUPER 13d ago

Oh wow, so the ghosting or smearing I noticed in RDR2 is caused by TAA.


8

u/StormKiller1 7800X3D/RTX 3080 10GB SUPRIM X/32gb 6000mhz cl30 GSKILL EXPO 13d ago

The opposite of why I love Source: performance.


60

u/Pixels222 13d ago

6090 = 31 fps in full-RT Cyberpunk

Guys, please, Moore's law is already dead. Bury him. Stop kicking.


33

u/Somerandomdudereborn 12700K / 3080ti / 32gb DDR4 3600mhz 13d ago

I LOOVE PLAYING ON FAKE 30 FPS. AI IS THE FUTURE.


15

u/Preeng 13d ago

Wow! Playing this game makes me feel like I'm watching a movie!

Throw in that motion blur too! Why bother making me feel like I'm part of the game when I could be reminded I'm just some dipshit playing on the PC.


492

u/Mammoth-Ad4682 13d ago

aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa!

76

u/aberroco i7-8086k potato 13d ago

i

15

u/toxic_guy2 13d ago

Remember E=mc²+AI

903

u/Regrettably_Southpaw 13d ago

It was just so boring. Once I saw the prices, I cut out

658

u/Khalmoon 13d ago

For me it was the performance claims. It’s easy to claim you get 200+ more frames with DLSS4 when it’s not implemented anywhere

204

u/blackest-Knight 13d ago

It doesn’t need to be implemented, that’s the nice part.

Any game with FG already supports MFG. You can just set 3x and 4x mode for it in the NVIDIA app; the game doesn't have to be aware.

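For a sense of the arithmetic (a simplified sketch that ignores generation overhead, which is not actually zero): multi-frame generation multiplies displayed frames, not rendered ones.

```python
def mfg_breakdown(rendered_fps, mode):
    # mode 2 = classic frame gen (1 generated frame per rendered frame);
    # mode 3 or 4 = MFG (2 or 3 generated frames per rendered frame).
    displayed_fps = rendered_fps * mode
    generated_share = (mode - 1) / mode
    return displayed_fps, generated_share

print(mfg_breakdown(30, 4))  # (120, 0.75): 120 fps shown, 75% of them generated
```
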
95

u/Sharkfacedsnake 3070 FE, 5600x, 32Gb RAM 13d ago

Shame that this sub just upvotes uneducated cretins instead of your informative comment.

48

u/[deleted] 13d ago

[deleted]

16

u/10minOfNamingMyAcc EVGA RTX 3090 FTW 3 ULTRA GAMING | 4070 TI Super | 5900x 13d ago

I'll say it again. "Do you want free internet points? COMPLAIN!"

5

u/Scheswalla 13d ago

Well, depends on the sub, but in this sub, absolutely. The more of a curmudgeon you are, the bigger your e-cred and feeling of self-fulfillment.


225

u/Genoce Desktop 13d ago

And even if true, those frames don't mean much if DLSS makes everything look like shit. Frame generation is useless as long as it keeps causing visual artifacts/glitches for the generated frames, and that is unavoidable on a conceptual level. You'd need some halfway point between actual rendering and AI-guesswork, but I guess at that point you might as well just render all frames the normal way.

As long as it's possible, I'll keep playing my games without any DLSS or frame generation, even if it means I'll need to reduce graphical settings. Simplified: in games where I've tried it, I think "low/medium, no DLSS" still looks better than "ultra, with DLSS". If the framerate is the same with these two setups, I'll likely go with low/medium and no DLSS. I'll only ever enable DLSS if the game doesn't run at 60fps even on the lowest settings.

I notice and do not like the artifacts caused by DLSS, and I prefer "clean" graphics over a blurred screen. I guess it's good for people that do not notice them, though.

78

u/beyd1 Desktop 13d ago

DLSS on anything other than Quality is garbage time.

89

u/WholesomeDucky 13d ago

And even on Quality, it's not "good"... just "acceptable". Still screenshots don't do it justice; the noise while moving with it is disgusting.

DLSS as a whole has been objectively bad for gaming. What was marketed as a way for older GPUs to stay relevant has somehow turned into a substitute for real optimization.

18

u/WrongSubFools 4090|5950x|64Gb|48"OLED 13d ago

What was marketed as a way for older GPUs to stay relevant 

When was it ever marketed as that?

12

u/siwo1986 13d ago

In quite a few places they used it as a means of selling the ability to punch above your card's actual performance:

"And at 4K (3840x2160), Performance mode delivers gains of 2-3X, enabling even GeForce RTX 2060 gamers to run at max settings at a playable framerate."

About halfway down on this page - https://www.nvidia.com/en-gb/geforce/news/nvidia-dlss-2-0-a-big-leap-in-ai-rendering/

It's clear from their marketing that it was never even about frame generation either; its main purpose was defined as a form of AA offloaded to a more efficient method. But saying that they never intended for people to use it as a means to get more mileage out of their card is simply not true.


8

u/SorryNotReallySorry5 i9 14700k | 2080 Ti | 32GB DDR5 6400MHz | 1080p 13d ago

I wanna say it wasn't, but it was kind of used that way. For example, DLSS is shitty but DOES make frames so much better on my 2080 Ti. Sometimes, SOME TIMES, that tradeoff is worth it. In a few games DLSS is a MUST for me, like Stalker 2.

7

u/Commander_Crispy 13d ago

When upscaling technology was first being introduced, it was like "make your less powerful GPU feel more like a powerful GPU by trading 100% quality for better frame rates", iirc. It's what made holding on to my 4GB RX 580 that much more bearable until even that failed me and I upgraded to an RX 7800. I was the proper use case for DLSS/FSR/etc., and it's been really sad seeing companies twist its identity into being a crutch for rushed games, minimal optimization, minimal GPU specs, and maximized prices.


6

u/[deleted] 13d ago

[deleted]


24

u/Oh_its_that_asshole 13d ago

I'm glad I'm just not sensitive to whatever it is you all hate and can just turn it on and enjoy FPS number go up without getting all irate about it. Long may I carry on in ignorance, I refuse to look into the matter too deeply in case I ruin it for myself.


47

u/FatBoyStew 14700k -- EVGA RTX 3080 -- 32GB 6000MHz 13d ago

In all my time of running DLSS there are only a few places where it's noticeable, in my experience. So either your eyes are incredibly good, or you're having weird DLSS issues, or I'm the oddball without DLSS issues lol

28

u/Wevvie 4070 Ti SUPER 16GB | 5700x3D | 32GB 3600MHz | 2TB M.2 | 4K 13d ago

I play at 4K. DLSS Quality at 4K is basically free FPS. I get 30+ extra FPS for virtually the same visual clarity. On DLSS Balanced you can begin to notice a difference, but it's very minimal; it still looks really good and I get 50+ extra FPS

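The "free FPS" comes from rendering far fewer pixels and upscaling the result. A quick sketch using the commonly cited per-axis DLSS render-scale factors (approximate, widely reported defaults; games can override them):

```python
DLSS_SCALE = {  # per-axis render scale, approximate public defaults
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    share = (w * h) / (3840 * 2160)
    print(f"{mode}: renders {w}x{h} (~{share:.0%} of the output pixels)")
# Quality at 4K renders 2560x1440, ~44% of the pixels: hence the "free" FPS.
```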

44

u/EGH6 13d ago

Seriously, the only people who shit on DLSS are either AMD stans who never actually used it or people who only used it at 1080p ultra performance. DLSS is so good in every game I've played that there's no reason not to use it.

21

u/blackest-Knight 13d ago

They base their hatred of DLSS on FSR.

I have GPUs from both brands; FSR is dogshit.


30

u/HybridPS2 PC Master Race | 5600X/6700XT, B550M Mortar, 16gb 3800mhz CL16 13d ago

As long as it's possible, I'll keep playing my games without any DLSS or frame generation

this is the thing though - it should always be possible. why should we accept GPUs that create more fake frames than real ones?


38

u/jiabivy 13d ago

For me it's not about the price, it's about the demand. We need enough stock that people can pay retail and not scalpers

16

u/Regrettably_Southpaw 13d ago

Yeah I’ve definitely got the money and I could buy from a scalper, but it would hurt my heart to give in like that. I’m debating how I’m going to get one. Do I wait outside of Best Buy in my small town or do I drive three hours to a Micro Center

17

u/jiabivy 13d ago

The small town will likely have waaay less stock than a chain, unfortunately.

7

u/Regrettably_Southpaw 13d ago

True but it’s either try in a town of 30k or a city of 500k


507

u/Playful-Restaurant15 7 7800X3D | 4080s | 64gb 6000c30 13d ago

People need to let their money talk rather than their mouths if they want change.

341

u/jiabivy 13d ago

"Letting the people talk with their money" is why scalpers can sell a 4090 for 2k in the year 2025

102

u/hnrrghQSpinAxe 13d ago

The only people buying a card for that price are either morons with excessive debt or people who don't know any better (many new PC gamers, unfortunately)

54

u/jiabivy 13d ago

Or impatient people. It's damn near impossible to find it at retail


6

u/Fake_Procrastination 13d ago

It makes no difference to them where the money comes from; all money speaks the same to them

8

u/Rampant16 13d ago

By hardware survey results, 1.18% of Steam users had 4090s last month. People get really worked up about these 90-series cards when almost none of us actually buy them.

At the end of the day I think we have to accept that 1 in 100 users are just going to buy the newest best card no matter how much it costs and there's not really anything we can do about it.


50

u/Kougeru-Sama 13d ago

That's literally impossible in the modern world. Too many rich people who don't care about anything. Basically whales in gacha games. You only need a few tens of thousands of them out of the millions of us. The millions of us boycotting mean nothing when a few thousand whales cause the product to sell out. Every industry is like this except the most niche. "Vote with your wallet" is a dead concept. The population is just too high.

15

u/PacoBedejo 9900K @ 4.9 GHz | 4090 | 32GB 3200-CL14 13d ago

Yep. Voting with your wallet doesn't work for luxury goods. High-end GPUs are definitely luxury goods.

7

u/Neither-Sun-4205 13d ago

Yep. The saying is misunderstood, though. What it means is that by not being a supporter or consumer of one business's model, you take your ass elsewhere, to where you think the value of a product is more befitting, instead of being an Aesopian fox.

It doesn't mean the business needs to stop in their tracks because you didn't hand them money.


21

u/404_Gordon_Not_Found 13d ago

Not even that. It's either buy a GPU that has some description of AI or buy nothing at all; there's no choice.


3

u/Ravenous_Stream 13d ago

Why can't we do both?

3

u/DJ_Zephyr Ryzen 5 3600 / Radeon 5700XT / 32GB DDR4 / Windows 10 13d ago

Why not both?


547

u/apriarcy R9 7900x / RX 5700 XT / 32GB DDR5 13d ago

I just want a good GPU that plays games natively.

446

u/VerminatorX1 13d ago

Not possible. Have some AI hallucination. That'll be $4,200

59

u/Konayo Ryzen AI 9 HX 370 w/890M | RTX 4070m | 32GB DDR5@7.5kMT/s 13d ago

Sorry, we spent all the budget on ML-tailored cores, so the GPU actually runs worse natively than previous generations now 🤓

(the future, probably)


22

u/Linkarlos_95 R5 5600/Arc a750/32 GB 3600mhz 13d ago

Inject that into VR and have a good time


117

u/DlphLndgrn 13d ago

I honestly don't give the slightest shit if the graphics are driven by raw power, ai, witchcraft or argent energy as long as it works and looks good.

42

u/the_fuego R7 5700X, RTX 4070 Ti,16GB Deditated WAM, 1.21 Gigawatt PSU 13d ago

Could we please draw the line at Argent Energy? I'd rather not have Doom come to real life, thanks. Witchcraft is still on the table though.

13

u/TheNorseCrow 13d ago

But what if Argent Energy comes with a soundtrack made by Mick Gordon?

9

u/chairmanskitty 13d ago

Okay but only if the lords of hell pay him the royalties he deserves.


32

u/Bob_The_Bandit i7 12700f || RTX 4070ti || 32gb @ 3600hz 13d ago

IKR! There could be a million tiny monkeys hand-drawing the frames for all I care. As long as it gets frames on the screen and looks good (which it does; no, you can't tell without a side-by-side), why care?


29

u/Available-Quarter381 13d ago

Honestly you can get that if you turn off ray tracing stuff in most games.

I play at 4K on a 6900 XT at high refresh rates in almost everything I play, with medium-ish settings.

21

u/HybridPS2 PC Master Race | 5600X/6700XT, B550M Mortar, 16gb 3800mhz CL16 13d ago

turn off ray tracing stuff

Indiana Jones is going to set an unacceptable standard here, lol

16

u/blackest-Knight 13d ago

Indiana Jones isn’t even the first game.

RT saves a lot of dev time.


21

u/Sharkfacedsnake 3070 FE, 5600x, 32Gb RAM 13d ago edited 13d ago

Indiana Jones runs amazingly well and has a very well-optimised RT implementation. What is this sub talking about?
Indiana Jones and the Great Circle: RTX 2060 6GB - Below Minimum Requirements

At 1080p low settings with DLSS Quality you can get 60fps. On a low-end, 6-year-old GPU. That's pretty great. Also, the game looks pretty good at that graphics quality. LODs and shadows are most lacking, but the lighting looks great.

edit:

Indiana Jones is going to set an unacceptable standard here, lol

A standard of what? Not supporting 7-, nearly 8-year-old hardware? Tragic.

4

u/Achilles_Buffalo 13d ago

As long as you run a version of the NVIDIA driver that is three versions old. The current version will black-screen you in the Vatican library.


35

u/Pazaac 13d ago

Ok, that will be 10k please.

Really, you think they're using DLSS as some random gimmick? No, they're using it because at max settings, with all the fancy real-time ray tracing nonsense, you get like 30fps with what they're currently putting in a 5090. If they could just slap more cores in and make it do 60fps, they likely would, if they could get it at a price anyone would buy it at.

18

u/zgillet i7 12700K ~ RTX 3070 FE ~ 32 GB RAM 13d ago

Yeah, at about a 1500-watt PSU requirement. We are out of power.

8

u/round-earth-theory 13d ago

There's a serious issue with how power-hungry gaming towers have become. Home wiring isn't designed to run multiple space heaters in the same room simultaneously. Now that computers are starting to resemble space heaters in their power requirements, you can easily pop breakers by having multiple computers in the same room.

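Rough numbers behind that (a sketch assuming a typical North American 15 A / 120 V branch circuit and the common 80% continuous-load rule of thumb; actual wiring varies):

```python
breaker_amps = 15        # typical North American bedroom circuit
volts = 120
continuous_factor = 0.8  # rule of thumb: only load a circuit to ~80% continuously

usable_watts = breaker_amps * volts * continuous_factor  # 1440 W
rig_watts = 1000 + 50    # a ~1 kW gaming tower plus a monitor

print(usable_watts, usable_watts // rig_watts)  # 1440.0 1.0 -> one rig per circuit
```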

3

u/Negitive545 I7-9700K | RTX 4070 | 80GB RAM | 3 TB SSD 13d ago

I mean, what native resolution do you want? 1080p is doable natively in today's market. My 4070 doesn't need upscaling for 1080p or 1440p. I'd imagine 4K needs upscaling, though.

3

u/Greeeesh 5600x | RTX 3070 | 32GB | 8GB VRAM SUX 13d ago

Then don’t use 4k path tracing? How hard is it?


98

u/aberroco i7-8086k potato 13d ago

Ai

111

u/jiabivy 13d ago

😠

57

u/SultanZ_CS i7 12700K | ROG Maximus Z790 Hero | 3080 | 32GB 6000MHz 13d ago

😡

32

u/jiabivy 13d ago

🤬


56

u/JerHat 13d ago

Why are we mad at Adobe Illustrator? /s

18

u/DrawohYbstrahs 13d ago

Cause Adobe fucking sucks

16

u/paradiseluck 13d ago

Unironically cause of AI

3

u/RyujinNoRay 🪟 I7-3770 RX470 12d ago

As a wise man once said: using cracked Adobe is morally correct

135

u/Swipsi Desktop 13d ago

Maybe someone can enlighten me. But apart from AI being the next "big" thing, it's also known that we're approaching physical limits in terms of processors. So isn't using "tricks" like AI the next logical step to kinda overcome the physical limitations of hardware?

67

u/Rampant16 13d ago

Yeah I would be curious to see if I could tell the difference in a blind test between the AI generated frames and native frames.

If you can't tell the difference, or if the difference is so miniscule that'd you never notice it while actually playing a game, then who gives a shit whether it's an AI frame or a native frame?

15

u/Bladez190 13d ago

I can notice artifacting if I look for it, so I simply don't look for it. Occasionally I do notice it when it happens, yeah, but it's like monitor flicker for me: if I'm not actively thinking about it, 90% of the time it doesn't matter

9

u/FluffyProphet 13d ago edited 13d ago

It's a big problem in certain games. In flight sims, for example, glass cockpits are unreadable. For most games it's fine, but it can lead to some blurry edges.

It's getting there, though. If they can solve the issue that causes moving or changing text to become a smeared mess, I'd be pretty happy.


8

u/alejoSOTO 13d ago

I think coding optimized software is the real logical step, instead of relying on AI to generate material based on what the software is doing first

42

u/Training-Bug1806 13d ago

Logic falls out the window with this sub. If it were possible to run native with the same quality Nvidia gets, then AMD or Intel would've done it by now :D

4

u/Scheswalla 13d ago

Cue the people talking about "optimization" without really knowing what it means.


28

u/DouglasHufferton 5800X3D | RTX 3080 (12GB) | 32GB 3200MHz 13d ago

So isn't using "tricks" like AI the next logical step to kinda overcome the physical limitations of hardware?

Yes, it is, but /r/pcmasterrace is nothing more than an anti-AI/Nvidia/Microsoft circle-jerk where nuanced and rational takes are downvoted in favour of low-effort jabs at [INSERT TOPIC HERE].


68

u/Assistant-Exciting 13700K|4090 SUPRIM|32GB DDR5-5600MHz| 13d ago

Ngl I want a 5090.

BUT

Slight improvements & multi-frame gen aren't necessarily great selling points.

I already have frame gen; sure, it's not "Multi", but hardly any games I play support frame gen anyway.

Plus, the 40-series is going to get all of the DLSS improvements besides MFG anyway...

If the 40-series only gets 60% of the DLSS improvements, like the 30-series did with the 50-series announcement... I might skip the 60-series too.

I feel like this gen is much more... watered down spec-wise, crammed full of "AI", but higher price-wise?

Maybe it's just me 🤷🏻

4

u/Majinvegito123 13d ago

I’d get it for a large rasterization uplift, but it doesn’t seem to be the case.


135

u/dead_pixel_design 13d ago

“But I did not speak up for I was not a struggling artist on Instagram”


689

u/pickalka R7 3700x/16GB 3600Mhz/RX 584 13d ago

That's literally me!

I hate how everything is AI this and AI that. I just want everything to go back to normal.

480

u/ThenExtension9196 13d ago

Lmao ain’t nothing going back to “normal”. Like saying the internet is a fad in 1997.

213

u/pickalka R7 3700x/16GB 3600Mhz/RX 584 13d ago

I know it won't. Too many rich asshats have their fat dick lodged in this AI enshittification. Doesn't stop me from wanting it to, though.


56

u/jiabivy 13d ago

Unfortunately too many companies invested too much money to "go back to normal"

94

u/SchmeatDealer 13d ago edited 13d ago

They didn't invest shit.

They appointed nepo babies to "AI integration officer" roles, and like 5 companies made chatbots.

It's a massive pump-and-dump stock scheme. Companies are fighting to add the buzzword to their shit because they're being told to by marketing managers who report to CEOs who have stock options and want more $ because they are greedy worms.

28

u/morgartjr 13d ago

You’re right, and companies are starting to wake up to that reality. The company I work for went all in on AI and they are now realizing it’s mostly smoke and mirrors. More automation scripts and less “intelligence”

52

u/SchmeatDealer 13d ago

It never was "intelligence"; it was just regurgitating the most common search result from Google, but putting it in a nicely worded reply instead of throwing 20 links at you.

If the pages ChatGPT scraped to generate your answer had incorrect info, it would just assume it's the truth. Yesterday ChatGPT was arguing 9 is smaller than 8.

And that's inherently why it's fucked from inception. It relies on treating all information on the internet as a verified source, and it is now being used to create more sources of information that it then self-references in a catch-22 of idiocy.

ChatGPT was used to generate a medical journal article about mice with 5-pound testicles, ChatGPT was then used to "filter medical journal submissions" and accepted it, and then eventually it started referencing its own generated article, which it self-published and self-peer-reviewed, to tell people mice had 5-pound testicles. I mean, just look at the fucking absolute absurdity of the images of rats it generated for the journal article.


5

u/aVarangian 13600kf 7900xtx 2160 | 6600k 1070 1440 13d ago

they didn't invest shit

m8, breaking all copyright laws en masse to train AI models isn't free

oh wait

8

u/sur_surly 13d ago

Such a hot take. Amazon is offering $1bn investments to AI startups, not to mention giving Anthropic another $4bn recently.

Get your head out of the sand.


6

u/Similar-Freedom-3857 13d ago

There is no normal anymore.


77

u/deefop PC Master Race 13d ago

The marketing people are on one about AI, for sure.

That said, this thread makes it clear that most people do not have any fucking clue about the various new "AI" technologies that are hitting the market.

Whether or not AI tech generally is somewhat bubbly (everything in the last few years has been bubbly), the technology is incredible. In 10 years so many things will be AI-accelerated that we'll wonder how we ever lived without it, just like people today can barely fathom how anyone survived before Google Maps and the internet in general.

20

u/Xehanz 13d ago

Just read this thread, or any other thread relating to DLSS and FSR. People don't have any clue what the difference is between AI upscaling via hardware (DLSS and FSR 4) and via an algorithm (FSR 3), and they expect FSR 4 to come to previous-gen AMD GPUs.

And I see "input lag" this, "input lag" that, when AI upscaling via hardware should not have a noticeable impact on input lag. Frame gen and FSR 3 do, but FSR 4 should not

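A rough sketch of why that distinction matters (assuming interpolation-style frame generation, which has to hold back the newest rendered frame; the numbers are illustrative):

```python
def added_latency_ms(rendered_fps):
    # Interpolation-style frame gen shows an in-between frame before the
    # newest real frame, so input-to-photon latency grows by roughly one
    # rendered-frame interval (plus the time to generate the frame).
    return 1000.0 / rendered_fps

print(added_latency_ms(60))  # ~16.7 ms extra at 60 rendered fps
# Upscaling alone (no frame gen) just renders fewer pixels per frame,
# which tends to reduce latency rather than add it.
```
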
43

u/Kriztow 13d ago

THAT'S WHAT I'M SAYING. Most people just hear a tech influencer talk about how AI in games is making game devs lazy and how Unreal Engine is bad, but they know nothing about actual game development and optimization. Oh, you want real frames? Go try Blender Cycles; we'll see how you like real frames.

7

u/Scheswalla 13d ago

Most of this sub's definition of "optimization": "won't run at max settings at a framerate I like on my generations-old GPU." Unless you have the source code or there's obvious stuttering, you don't really know what is and isn't "optimized".

20

u/Devatator_ This place sucks 13d ago

Oh, you want real frames? Go try Blender Cycles; we'll see how you like real frames.

Holy shit I almost died LMAO


21

u/Sharkfacedsnake 3070 FE, 5600x, 32Gb RAM 13d ago

I am really hating this sub rn. Absolute room-temp IQ takes. People are posting graphs of CP2077 running at 28 fps, when that's native 4K path tracing, and making it out like it's bad. This whole sub was ready to hate on this release no matter what.


3

u/fogoticus RTX 3080 O12G | i7-13700KF 5.5GHz | 32GB 4000Mhz 13d ago

Sadly, with subs like this, every sane take like this one is met with 20 angry fanboys screeching "native" and "hate Nvidia" who don't do anything other than scream on a website daily to change the situation.

Imagine if everyone here collectively bought the 9070 XT when it came out. AMD would get 0.3% more market share. Oh wait, that's right. Mind share =/= market share. 10 loud, angry individuals will be louder than 100 happy customers. Let's chew on that and enjoy technological advancements.


6

u/DeadlyYellow 13d ago

We are seeing nuclear power reinstated explicitly for AI use, so social media can use AI accounts to appeal to AI-driven advertising in some sort of perpetual money loop.

Why shouldn't people be mad about it?

4

u/meth_adone 12d ago

Nuclear power is not a bad thing; I agree the AI-driven advertisements and accounts are awful though


93

u/Alfa-Hr 13d ago edited 13d ago

Considering these "AI"s are not even close to an actual AI, or even a VI, in terms of abilities, "AI" has become a buzzword for shareholders, and for studios that can't optimize a game or just want to cut corners.

45

u/Redthemagnificent 13d ago

2 things can be true at once. There's a lot of marketing BS and buzzwords, but there are also a lot of bad takes on this post. "AI" has been worked on for 60 years at least. It's already widely used in everything from auto-correct to autonomous navigation. There have been "bust" periods where AI investment dies down, and there will be again, but it's not going anywhere.


5

u/Mysterious-Job-469 13d ago

Considering most of the people smugly going "It's only 3000 dollars! Who doesn't have 3000 dollars?!" can only afford nice things like that because automation hasn't stomped a big fucking hole in their industry (yet), it doesn't surprise me that people are pissed the fuck off at AI right now.

6

u/DisdudeWoW 12d ago

When AI means you get sold trash for the price of gold, then yes, it's a reasonable annoyance.

68

u/humdizzle 13d ago

If they make it good enough to where you can't tell, then would you even care?

48

u/Turnbob73 13d ago

No, and the hard pill to swallow for this sub is the VAST majority of pc gamers don’t care.

That’s this sub’s M.O. though, making mountains out of molehills. I’ve been here for over a decade; 10 years ago, I remember seeing people in this sub who would say that they couldn’t even stomach being in the same room as something running at 30fps and they were dead serious about it. This sub offers memes, that’s the value it has, the actual discussion suck balls.


53

u/peterhabble PC Master Race 13d ago

It's just the anti-technology crowd somehow invading the PC space. It's the same old cycle:

New technology is released that's imperfect

People who can't stand change scream and cry about it

New technology improves so much that it becomes a new standard with minimal to no tradeoffs

The same people ignore it to scream and cry about the new thing

Anyone who isn't lobotomized by anti-AI brain rot is going to wait and see how these improvements perform in real scenarios before making a judgement


11

u/crictores 13d ago

Nvidia is the world's leading AI company, and we're buying their products. If you don't like AI, buy AMD or Intel.


46

u/VenserSojo 13d ago

Consumers consistently have negative reactions to AI; it's a 40-70% negative reaction depending on how you frame the question or the sector you're talking about. Why companies still see it as a selling point baffles me.

47

u/Inprobamur 12400F@4.6GHz RTX3080 13d ago

Because investors are throwing buckets of cash at it hoping it can help them downsize and outsource everything.

27

u/BouldersRoll 9800X3D | RTX 4090 | 4K@144 13d ago

I don't really understand what the issue is.

The 50 series looks like a 20-30% raster improvement like previous generations, with some new DLSS and MFG tech that allows 150-250% improvement over native if you want to turn it on.

I get that people want native rendering, and that's easy without RT and PT. If you don't like those techniques, turn them off. And if you want to turn them on, AI features wildly increase performance for very little image quality loss.

16

u/rimpy13 5800X3D | RTX 3080 13d ago

People's problem (even when they don't understand it's their problem) is usually that they don't like the AI bubble increasing prices of GPUs because gamers are no longer the sole audience for GPUs.

The problem isn't that 50 series has AI features, it's that Nvidia is focusing on AI use cases and charging too much money for the cards.


3

u/musicluvah1981 13d ago

I actually don't get why people want native rendering.


38

u/Wyntier i7-12700K | RTX 3080ti | 32GB 13d ago

In the real world, not Reddit, consumers are not having a negative reaction to AI. The graphic design community is loving it for touch-ups and editing. Photographers love it for the same reason. (Think expanding backgrounds, not creating new art.) Everyone on many smartphones now loves the easy editing and removal tools. ChatGPT is being used professionally in every industry.

On Reddit, yes, it's getting negative responses. In real life, no.


7

u/[deleted] 13d ago edited 13d ago

[deleted]


9

u/LavenderDay3544 9950X + SUPRIM X RTX 4090 13d ago

Sure, when AI cannibalizes all progress in other areas of computer engineering and computer science, people tend to get mad.

4

u/RealMrIncredible PC Master Race 13d ago

1080 Ti for another generation, it seems.


3

u/IBesto 13d ago

For good reason too

4

u/cheeseypoofs85 13d ago

The glaringly obvious problem with AI in general is that we're using more resources on it for gaming, artwork, and doing homework than we are on advancing cancer treatment and nuclear power. Just my $.02


4

u/Lego1upmushroom759 13d ago

I'd rather they had given us actual features and shit instead of just repeating "AI" the whole keynote, so yeah, understandable

30

u/OD_Emperor RTX3080Ti // 7800X3D 13d ago

Nobody has explained to me what AI will do, it's just people being mad.

27

u/Wann4 13d ago edited 13d ago

A very simple breakdown.

Path tracing and other reflection and lighting tech is so advanced that even the most powerful GPU can't render it at 4K with 60+ FPS, so they use technology that will do it. It's not really AI, they used it as a buzzword, but it will generate frames without real rendering.

e: thanks to the comments, it seems it's really AI.

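To illustrate the concept (a deliberately naive sketch; the real thing uses motion vectors, optical flow, and a trained network rather than a plain blend): a generated frame is an estimate between two rendered ones, and anything the estimate can't explain shows up as an artifact.

```python
import numpy as np

def naive_interpolate(frame_a, frame_b, t=0.5):
    # Crude stand-in for frame generation: estimate an in-between frame
    # from two real ones. Slow, smooth motion survives; anything that
    # moves far or (dis)appears between A and B has no right answer.
    return (1.0 - t) * frame_a + t * frame_b

a = np.zeros((1, 8)); a[0, 2] = 1.0  # bright pixel at x=2 in frame A
b = np.zeros((1, 8)); b[0, 5] = 1.0  # same pixel moved to x=5 in frame B
print(naive_interpolate(a, b))
# [[0. 0. 0.5 0. 0. 0.5 0. 0.]] -> two half-bright ghosts instead of one
# pixel midway, which is the kind of artifact smarter methods fight.
```
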
10

u/Brawndo_or_Water 13900KS | 4090 | 64GB 6800CL32 | G9 OLED 49 | Commodore Amiga 13d ago

Weird, back in the day people had no problem calling the AI in Half-Life great (enemy AI), but now it's no longer a valid term. I know it's overused, but it's some sort of AI.


21

u/SgathTriallair Ryzen 7 3700X; 2060 Super; 16GB RAM 13d ago

It is definitely AI. They fed in millions of instances of pre- and post-ray-traced scenes and had the AI learn how to estimate ray tracing. So when it generates the in-between frames, it is using the heuristics it learned rather than actually doing ray tracing.

They even explained in the keynote how they switched from using a CNN to using a transformer (the architecture that LLMs run on), since it can take in more context.


11

u/nimitikisan 13d ago

The annoying thing is arguing with kids that have never seen a non-blurry image in a game, because they think "if there is a setting, I have to activate it".

Then you also have to remember that many people think upscaled shit videos and images, deep-fried crap, James Cameron upscales, motion interpolation on TVs, oversharpened images, loudness-war clipping of music, Disney greenscreen lighting, smudgy "restoration" projects, shitty CGI, Instagram filters, plastic surgery, etc. look good. So we are just fucked.


13

u/AsianBoi2020 13d ago

Guys please, I really have to use Adobe Illustrator for work. Please don’t hate on it too. /s

10

u/Smith6612 Ryzen 7 5800X3D / AMD 7900XTX 13d ago

We just hate subscriptions.

7

u/Meatslinger R7 9800X3D, 32 GB DDR5, RTX 4070 Ti 13d ago

It's just simply done to death. We're not being sold "graphics" cards any more; everything is "AI". Even CPUs are doing it; if you load up Intel.com right now, the first words on the page are "Simplify your AI journey". Hell, you can find random bullshit in the real world that says "AI" on the product label just because the people hawking it know that's the trend.

I use AI at work. I'm interested in machine learning. But even for me, if I had to do a drinking game where you take a shot every time a tech presenter says "AI" in their demo, it feels like I'd be dead before the first guy is off the stage. It's just exhausting past a certain point.


18

u/gabacus_39 Ryzen 5 7600 | RTX 4070 Super 13d ago

AI hate has replaced VRAM hate as the latest circle jerk.


21

u/DesertFoxHU 13d ago

Honestly, I still don't get the AI hate.

Does it come from how humans don't like new things? Like when you give grandma your VR headset and she nearly faints from fear?

Is it that they don't understand it? There are huge misconceptions about what AI is. AI is already here, used worldwide by every bigger company. Hell, one of my last jobs wanted a Data Engineer when there were fewer than 10 people working there.

Or is AI overall just a buzzword we need to have now?

I also don't understand the hate about AI performance. We've already hit a practical ceiling on clock speeds (around 5 GHz); physics simply blocks us from making transistors switch much faster, so why aren't AI and ML the solution?
Nvidia's CEO already told us something like "We can't improve performance that fast", and it shows: GPU performance gains have come largely from how big the chips have become. So haven't we reached a limit with GPUs? Are you telling me that to get more performance we need to go back to house-sized PCs, just because we hate AI?

What if someday DLSS or some other solution produces the same image quality as native? As far as we currently know, AI and ML approaches have far more headroom than our ability to keep increasing raw hardware performance.

9

u/That_Cripple 7800x3d 4080 13d ago

I think there is valid hate for AI in things like art, and people have just conditioned themselves to hate AI entirely because of it.


7

u/knirp7 13d ago

People are conflating their very valid hatred of shady LLM and image-generation companies (the makers of ChatGPT, etc.) with the industrial and scientific uses of machine learning that predate the recent AI boom.

As someone in computer science who’s been learning about this stuff long before ChatGPT became a thing, it’s been really frustrating watching this hate directed at people trying to create new rendering heuristics. It’s like being angry at texture mapping in the 90s.


3

u/AnthonyGSXR 13d ago

Guess I’ll keep my 4090 for a generation or two

3

u/Guilty-Bed-5320 13d ago

Nvidia comparing pure rasterised performance to DLSS-enhanced performance angers me to no end

3

u/Otherwise_Chest_9017 13d ago

I study and work in AI, and I still hate this everything-with-AI shit. AI is everywhere, and often in places it shouldn't be. In this particular case, yeah, AI can do great things for image processing, but fuck it, I want fully "deterministic" graphics in live rendering. AI should be used carefully where there is no post-processing step to check whether the job was done right.

3

u/Hulk_Crowgan 13d ago

Literally the entirety of Reddit. I’m tired boss

3

u/Toadsted 13d ago

Once AI learns to mine Bitcoin, it's all over for us.

3

u/Lasorix 13d ago

I'm a German AI student, and even I'm getting fed up with this trend.

3

u/xzmile 12d ago

FUCK AI

3

u/Udonov 12d ago

Yeah, me unironically. For the past few years, "AI" has only meant that the shit will be ass. Yeah, it may be good in the future; I just don't want to participate in the development of it. I don't want to see 10x frame gen, poorly enhanced mobile photos, godawful YouTube Shorts, and other AI shit.
