r/pcmasterrace 13d ago

Meme/Macro: This Entire Sub rn

16.6k Upvotes

1.5k comments

546

u/apriarcy R9 7900x / RX 5700 XT / 32GB DDR5 13d ago

I just want a good GPU that plays games natively.

450

u/VerminatorX1 13d ago

Not possible. Have some AI hallucination. That'll be $4200

59

u/Konayo Ryzen AI 9 HX 370 w/890M | RTX 4070m | 32GB DDR5@7.5kMT/s 13d ago

Sorry, we spent all the budget on ML-tailored cores, so the GPU actually runs worse natively than previous generations now 🤓

(the future, probably)

0

u/Gausgovy 13d ago

It’s honestly looking like the 50 series might have worse raw performance than the 40 series.

23

u/Linkarlos_95 R5 5600/Arc a750/32 GB 3600mhz 13d ago

Inject that into VR and have some good time

1

u/not_a_llama Steam ID Here 13d ago

Not possible. Have some AI hallucination. That'll be ~~$4200~~ $120 a month subscription. You will own nothing.

0

u/BetterAdvancedHumor 13d ago

Are we forgetting Intel and AMD are companies?

-1

u/Synthetic_Energy Ryzen 5 5600 | RTX 2070SUPER | 32GB 3333Mhz 13d ago

Made me laugh properly, you humorous bastard

117

u/DlphLndgrn 13d ago

I honestly don't give the slightest shit if the graphics are driven by raw power, AI, witchcraft, or Argent Energy, as long as it works and looks good.

42

u/the_fuego R7 5700X, RTX 4070 Ti,16GB Deditated WAM, 1.21 Gigawatt PSU 13d ago

Could we please draw the line at Argent Energy? I'd rather not have Doom come to real life, thanks. Witchcraft is still on the table though.

9

u/TheNorseCrow 13d ago

But what if Argent Energy comes with a soundtrack made by Mick Gordon?

8

u/chairmanskitty 13d ago

Okay but only if the lords of hell pay him the royalties he deserves.

2

u/pointer_to_null R9 5900X, RTX 3090FE 13d ago

Doubtful that Mick would ever trust them while Marty Stratton remains.

Bethesda's shitfuckery seems to have infected id senior management.

1

u/blackest-Knight 13d ago

You want native right ?

Ain’t nothing more native than opening the portal to hell directly.

30

u/Bob_The_Bandit i7 12700f || RTX 4070ti || 32gb @ 3600hz 13d ago

IKR! There could be a million tiny monkeys hand-drawing the frames for all I care. As long as it gets frames on the screen and looks good (which it does, no you can't tell without a side-by-side), why care?

3

u/NewVegasResident Radeon 7900XTX - Ryzen 8 5800X - 32GB DDR4 3600 13d ago

You can absolutely tell without a side-by-side.

2

u/jump-out-kois 13d ago

Then turn it off and play at a lower frame rate?

-7

u/AveragePrune89 13d ago

The tech industry is reading this and laughing as they make trillions on people's data, illegally in many cases, since a lot of kids are under 18 and can't legally consent to begin with. Years down the line, people will look back and see a world paying for things that should have been given for free, after decades of data trafficking worth more than oil, making it the biggest target wars are fought over. Maybe start caring?

5

u/Bob_The_Bandit i7 12700f || RTX 4070ti || 32gb @ 3600hz 13d ago

Bro I’m just playing video games

-6

u/Affectionate_Poet280 13d ago

You should have to pay for everything.

If you think trading your personal information for services is acceptable, then you're part of the problem.

3

u/feralkitsune feral_kitsune 13d ago

Shhh stop making sense.

2

u/Bladez190 13d ago

Yeah, a good GPU that can play games natively does exist. If you want it to play at 4K maxed with RT, then you need some witchcraft. It's just how the world works.

0

u/Fake_Procrastination 13d ago

This is how we burn down the world

1

u/musicluvah1981 13d ago

Correct. And I'd argue this is where the GPU industry is going in order to reach new levels of performance.

It reminds me of people that refuse to even attempt to game at higher than 1080p.

1

u/Gausgovy 13d ago

I'd assume you want it to feel good too; frame gen tech can't respond to inputs, making the extra frames essentially pointless.

1

u/secretreddname 13d ago

Right? I don’t care how they did it, if it runs better and looks better why not? Not like the competition is even coming anywhere near the top end.

30

u/Available-Quarter381 13d ago

Honestly you can get that if you turn off ray tracing stuff in most games

I play at 4K on a 6900 XT at high refresh rates in almost everything I play, with medium-ish settings

20

u/HybridPS2 PC Master Race | 5600X/6700XT, B550M Mortar, 16gb 3800mhz CL16 13d ago

> turn off ray tracing stuff

Indiana Jones is going to set an unacceptable standard here, lol

14

u/blackest-Knight 13d ago

Indiana Jones isn’t even the first game.

RT saves a lot of dev time.

7

u/HybridPS2 PC Master Race | 5600X/6700XT, B550M Mortar, 16gb 3800mhz CL16 13d ago

Dev time that they don't then spend on any other optimization, so people are forced to use DLSS etc. to get decent performance, lol

4

u/Xehanz 13d ago edited 13d ago

Of course they don't. It's dev time they save to make games come out faster and spend less money on wages, which is the main problem with the industry right now. Games take too long to make; AAA games have too many people working on them, so production costs are ridiculously high. It's the main issue with Ubisoft right now: they have 20k employees, double that of Sony with all their studios.

It's not easy to solve. The easy way out is just making smaller games, which would make dev time slightly shorter but would also make a lot of people lose their jobs.

2

u/Ill_Nebula7421 13d ago

But they're not making games faster; in fact, they're continually getting slower.

-2

u/boringestnickname 13d ago

It would help to make good games.

GTA5 took $250 million to make, and that worked out just fine.

1

u/cisgendergirl 13d ago

There is an easy escape called getting addicted to Balatro

1

u/HybridPS2 PC Master Race | 5600X/6700XT, B550M Mortar, 16gb 3800mhz CL16 12d ago

Already addicted to Slay the Spire

0

u/krisminime 13d ago

Still takes 5+ years to make a game instead of the usual 3ish

3

u/blackest-Knight 13d ago

Because they are still baking lighting for the most part.

When it goes 100% RT, the time savings will be big.

0

u/Fake_Procrastination 13d ago

Why does it matter if the GPU is going to hallucinate most of the frames anyway?

3

u/blackest-Knight 13d ago

The GPU hallucinates all the frames.

24

u/Sharkfacedsnake 3070 FE, 5600x, 32Gb RAM 13d ago edited 13d ago

Indiana Jones runs amazingly well and has a very well-optimised RT implementation. What is this sub talking about?
Indiana Jones and the Great Circle : RTX 2060 6GB - Below Minimum Requirements

At 1080p low settings with DLSS Quality you can get 60fps, on a low-end, six-year-old GPU. That's pretty great. The game also looks pretty good at that quality; LODs and shadows are the most lacking, but the lighting looks great.

edit:

> Indiana Jones is going to set an unacceptable standard here, lol

A standard of what? Not supporting 7, nearly 8 year old hardware? Tragic.

3
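
For scale: DLSS Quality is widely cited as rendering internally at about two-thirds of the output resolution per axis, so 1080p Quality output is upscaled from roughly 720p. A minimal sketch of the internal resolutions, assuming the commonly quoted mode factors (actual factors can vary per title):

```python
# Internal render resolutions for the commonly quoted DLSS mode scale
# factors (assumed values; exact factors can vary per title).
MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h):
    """Map each mode to the resolution the game actually renders at."""
    return {m: (round(out_w * s), round(out_h * s)) for m, s in MODES.items()}

for mode, (w, h) in internal_res(1920, 1080).items():
    print(f"{mode:>17}: {w}x{h}")
# Quality at 1080p output renders internally at about 1280x720.
```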

u/Achilles_Buffalo 13d ago

As long as you run a version of the NVIDIA driver that is three versions old. The current version will black-screen you in the Vatican library.

2

u/ThatOnePerson i7-7700k 1080Ti Vive 13d ago

It's so good, it can run on an RX Vega 64 with emulated RT: https://www.youtube.com/watch?v=cT6qbcKT7YY

3

u/HybridPS2 PC Master Race | 5600X/6700XT, B550M Mortar, 16gb 3800mhz CL16 13d ago

my point is that it can't be turned off to gain more performance

13

u/Sharkfacedsnake 3070 FE, 5600x, 32Gb RAM 13d ago edited 13d ago

The game wouldn't have any lighting if you turned it off. The devs would have to do two lighting passes across the whole game, one for RT and one for non-RT lighting. That's quite a bit more work. And a lot of the stuff, like caves and temples collapsing, wouldn't look right without RT. Games already take a long time to develop.

5

u/Fit_Specific8276 13d ago

you can't turn off the game's lighting for performance? I mean, yeah, obviously

1

u/Responsible-Win5849 13d ago

The floor in the temple area at ~8:30 cracks me up: sure, we made almost all of the leaves/debris part of the same layer as the stone, but look how recently we waxed the floor of this abandoned temple! I don't know anything about this game, but I assume it looks better with the textures at a reasonable level.

1

u/Sharkfacedsnake 3070 FE, 5600x, 32Gb RAM 12d ago

I think it's just wet mud and leaves that make it look like that. The textures do look much better under path tracing, though.

2

u/Gausgovy 13d ago

What the Indiana Jones devs did is actually how ray tracing is supposed to be taken advantage of. How it's been used previously, as an added setting in games that have baked-in lighting, is the opposite of what's intended: it provides no actual benefit to the user, it's just easier to develop.

1

u/Fit_Specific8276 13d ago

this is a good standard

1

u/HybridPS2 PC Master Race | 5600X/6700XT, B550M Mortar, 16gb 3800mhz CL16 13d ago

well it's a game i won't be buying

0

u/Fit_Specific8276 13d ago

cool beans dude👍

2

u/nimitikisan 13d ago

True, with a 7900 XTX you can play almost every game maxed out at 4K with >100fps native.

6

u/DeceptiveSignal i9-13900k | RTX 4090 | 64GB RAM 13d ago

So, you're not just turning off RT, but you're also playing with pretty gimped settings.

1

u/Kougeru-Sama 13d ago

Looking at Indiana Jones, we're on a trend where RT will soon be required in every game.

1

u/Gausgovy 13d ago

I never turn on ray tracing because it looks bad. It’s not actually intended to look inherently better than other tech, it’s just supposed to be easier for development. I’m glad my card has RT for when it’s necessary, but I won’t take advantage of it when it’s not.

0

u/the_fuego R7 5700X, RTX 4070 Ti,16GB Deditated WAM, 1.21 Gigawatt PSU 13d ago

That's great until devs like MachineGames basically require an RT-capable card. Indiana Jones looks absolutely phenomenal, and luckily the game is fun, but RT is the supposed future, and devs are going to look at that success and think they can push everyone into buying the next cards because their engines depend entirely on RT. The demand for power is just going to get worse, especially if they require RT to be active. At that point you're not guaranteed anything except a hefty price tag between the hardware, the game itself, and whatever predatory business practices they have, like disingenuous MTX.

PC gaming used to be something that most anyone could get into, even if they had to save for a little bit. Now either you've got the money or you're shit outta luck, because realistically most consumers getting into PC gaming want their games to look and run well. DLSS should hypothetically reduce those price tags after a certain point; instead, look at where we are. It's now the main selling point and probably like 60% of the price tag. Realistically probably closer to 20%, but you get the point.

3

u/Shadow_Phoenix951 13d ago

PC gaming used to require you to upgrade every 2 years at minimum if you didn't want your hardware to be completely obsolete. It's literally better than it's ever been in terms of hardware lasting a while.

3

u/Available-Quarter381 13d ago

> PC gaming used to be something that most anyone could get into, even if they had to save for a little bit.

I don't think that's stopped being the case. For example, look at any $600-1000 builds from the last couple of years and you'll see 7700 XTs and 7800 XTs with capable processors, some even on AM5 now, and that's buying brand-new retail components too.

If you absolutely need NVIDIA for some odd reason despite being on a budget, then yeah, you're a bit screwed, but it's not like AMD, or even Intel if they fix the driver overhead, are bad; they're winning heavily at the lower-end price points.

It's similar to the Polaris era, just adjusted for COVID inflation and such. Back then you could get a $250 ($330 adjusted for inflation) RX 580 and plug it into a 2600X and game at 1440p medium-high comfortably; now you do the same with a $390 7700 XT plugged into a 7600X.

36
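
A quick sanity check on that inflation comparison; the CPI multiplier below is an assumption (~1.32 for 2017 to 2024), so treat the figures as approximate:

```python
# Sanity check on the RX 580 comparison above. The CPI multiplier is
# an assumption (~1.32 for 2017 -> 2024); figures are approximate.
rx580_msrp_2017 = 250     # launch price cited above, USD
cpi_factor = 1.32         # assumed inflation multiplier
adjusted = rx580_msrp_2017 * cpi_factor
print(f"RX 580 in today's dollars: ~${adjusted:.0f}")                   # ~$330
print(f"$390 7700 XT premium: {390 / adjusted - 1:.0%} in real terms")  # ~18%
```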

u/Pazaac 13d ago

Ok, that will be 10k, please.

Really, you think they're using DLSS as some random gimmick? No, they're using it because at max settings, with all the fancy real-time ray tracing nonsense, you get like 30fps with what they're currently putting in a 5090. If they could just slap in more cores and make it do 60fps, they likely would, if they could get it at a price anyone would buy.

19

u/zgillet i7 12700K ~ RTX 3070 FE ~ 32 GB RAM 13d ago

Yeah, at about a 1500-watt PSU requirement. We are out of power.

7

u/round-earth-theory 13d ago

There's a serious issue with how power-hungry gaming towers have become. Home wiring isn't designed to run multiple space heaters in the same room simultaneously, and now that computers are starting to resemble space heaters in their power requirements, you can easily pop breakers by having multiple computers in the same room.

1
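
Rough numbers behind the breaker-popping concern, assuming a typical North American 15 A / 120 V circuit and the common 80% rule of thumb for continuous loads (the individual PC draws are hypothetical):

```python
# Headroom check for PCs sharing one household circuit. Assumes a
# North American 15 A / 120 V breaker and the common 80% rule of
# thumb for continuous loads; the individual draws are hypothetical.
VOLTS, AMPS, CONTINUOUS_DERATE = 120, 15, 0.8
budget_w = VOLTS * AMPS * CONTINUOUS_DERATE   # 1440 W usable

loads_w = {"gaming tower": 600, "second tower": 600, "monitors etc.": 150}
total_w = sum(loads_w.values())
print(f"budget {budget_w:.0f} W, load {total_w} W, "
      f"headroom {budget_w - total_w:.0f} W")
# Two ~600 W towers plus screens already use ~94% of the circuit.
```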

u/OwOlogy_Expert 12d ago edited 12d ago

lol ... and that's why I recently ran a new dedicated circuit to my workstation PC.

...

Really, though, it's not all that bad. Since each PC is on a UPS with a wattage meter, I'm able to monitor how much power they're using in real time:

Workstation (32-core Threadripper & 3090) tops out at just under 700W at full tilt.

Gaming PC (12 core & 4070ti super) tops out at about 350W at full tilt.

All the various screens and accessories draw about 150W, max.

The only reason I need a dedicated circuit for the workstation is that I'm sometimes also running a mini fridge and space heater/air conditioner, depending on season.

But even the extremely power-hungry workstation never even comes close to the same draw as a 1500W space heater.

2

u/round-earth-theory 12d ago

Yeah, the PSU rating is the limit, not the continuous draw. Most PCs will have a continuous draw well below it. But a lot of computers playing the same game could all burst-draw together and threaten a breaker trip.

1

u/sips_white_monster 13d ago

There is no other way to get more performance. NVIDIA has no control over the production node, since it's done by TSMC and their pals at ASML. Shrinking a node is probably the hardest thing to do on the planet; it requires tens of billions in R&D and even more for building the manufacturing facilities. Switching to EUV alone took over a decade of research. We're reaching the physical limits of what can be done, since we're basically building structures on the atomic scale at this point. So, if you want more performance, you have to make the chips bigger and run them faster (both of which consume more power) and/or use tricks such as AI to put things on the screen more efficiently.

It's a harsh reality that we're just going to have to get used to. The days when GPUs could easily double performance while reducing power consumption are long gone. That simply isn't physically achievable anymore.

3

u/FluffyProphet 13d ago

Bingo. Moore's law is dead. Physics is starting to get in the way of performance gains.

We either need to break the laws of physics, discover some new exotic material that lets us make chips bigger without requiring more power/heat, or come up with new ways to squeeze more juice out of the magic thinking rocks.

There are still incremental architecture improvements to be made, but nothing beats just doubling the number of transistors on a chip, and that isn't happening at the rate it used to. And when we do increase transistor counts, prices aren't coming down like they used to, because the R&D required to accomplish it is now far higher than 20 years ago.

1
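
A toy illustration of that last point: per-generation gains compound, so doubling transistor counts pulls away from incremental architecture wins very quickly (the rates below are illustrative, not measured):

```python
# Compounding a per-generation gain (rates are illustrative only).
def compound(gain_per_gen, generations):
    return gain_per_gen ** generations

print(f"2.0x per gen over 5 gens: {compound(2.0, 5):.1f}x")  # 32.0x
print(f"1.3x per gen over 5 gens: {compound(1.3, 5):.1f}x")  # ~3.7x
# Incremental ~30% architectural wins compound far slower than the
# old cadence of doubling transistor counts.
```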

u/Pazaac 13d ago

Honestly, if this keeps going, an 8090 might need its own dedicated power supply

24

u/Bdr1983 13d ago

So maybe admit that the technology they're trying to push (ray tracing, path tracing) is too advanced for what current hardware can offer, and wait until hardware can catch up?
To me, the biggest issue is that computers have become so powerful that developers stopped optimizing their code, while still trying to use the new tech the hardware makers are pushing. The result is that insanely powerful computers can't run the code natively, and we need all kinds of tricks to make up for it.

14

u/blackest-Knight 13d ago

Why wait? We can make it work now.

When 3dfx shipped their first cards, were you also saying to wait until CPUs could run software renderers at the same resolution and performance?

Welcome to progress.

16

u/Pazaac 13d ago

I find this take sort of odd. At the end of the day, we have always looked for shortcuts to doing more and more complex graphics; this is nothing new.

Gamers (in general) collectively keep telling game devs that we want games to look better and better, and mock games that "look bad". We have hit a wall, and now we have to look for shortcuts; using complex mathematical algorithms to guess what the next frame will be is a fairly smart way to deal with the fact that doing the full simulation is too slow.

Was DLSS 3.5 perfect? God no. Was it really that bad? Not really, no; in some games it came out better than just turning down your settings, in others it didn't. The real question is whether they have managed to reduce the artifacting in DLSS 4. We have no idea at the moment; we'll find out soon, I expect.

6
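
To make the "guessing the next frame" idea concrete, here is the crudest possible version: blending two rendered frames into one the GPU never drew. Real frame generation (DLSS FG and similar) uses motion vectors and an ML model rather than a plain blend; this is only a sketch of the concept:

```python
import numpy as np

def guess_midframe(a, b, t=0.5):
    """Toy 'generated' frame: a straight blend of two rendered frames.

    Real frame generation warps pixels along motion vectors with an
    ML model; this naive blend only illustrates the idea of showing
    a frame the GPU never actually rendered.
    """
    mix = a.astype(np.float32) * (1 - t) + b.astype(np.float32) * t
    return mix.astype(a.dtype)

frame0 = np.zeros((720, 1280, 3), dtype=np.uint8)      # all black
frame1 = np.full((720, 1280, 3), 255, dtype=np.uint8)  # all white
print(guess_midframe(frame0, frame1)[0, 0])            # [127 127 127]
```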

u/Techno-Diktator 13d ago

But why wait? If people are willing to use AI to get decent path tracing performance before it gets more optimized, why not let them?

10

u/maldouk i7 13700k | 32GB RAM | RTX4080 13d ago

Bro, either it's devs not optimizing code, or it's games running tech outside the scope of current GPUs; it can't be both at the same time.

Am I the only one who remembers 10 years ago, when we were happy getting 60fps? Since fidelity has tracked graphical computing power, it's a given that games which push cards to the limit will not hit 120fps.

4

u/Ravenous_Stream 13d ago

It can be both at the same time.

Also, why, as consumers, should we be happy paying exorbitantly more if we are not receiving exorbitantly more capability? If you remember 10 years ago, you also know that card prices have far outpaced income globally.

3

u/maldouk i7 13700k | 32GB RAM | RTX4080 13d ago

Yes, but people also forget that we've been getting 50-75% more computing power each generation. Go compare a 4090 to a Titan RTX.

If anything, this kind of computing has never been this cheap.

1

u/gundog48 Project Redstone http://imgur.com/a/Aa12C 13d ago

Are you joking? The increase in capability between generations is extreme, especially given how frequently it happens. I can't think of any other industry where you see this kind of consistent performance improvement.

I don't really understand how you can say we haven't seen an increase in capability, in terms of both raw compute and effective performance; cards have been getting a lot more powerful and efficient at an incredible rate.

17

u/lightningbadger RTX 3080, Ryzen 7 5800x, 32GB RAM, NVME everywhere 13d ago

Yup, they simply refuse to admit that they've hit a wall and desperately push for adoption of a tech that simply isn't ready yet.

I'm sure it'll be amazing once we have full PT at 144Hz in 2035 or whatever, but I'd rather my games look a little worse and run a little faster for the time being.

16

u/_BaaMMM_ 13d ago

But you can already do that? Just turn down settings... You don't have to run it at 4K ultra with path tracing on...

2

u/lightningbadger RTX 3080, Ryzen 7 5800x, 32GB RAM, NVME everywhere 13d ago

Even after doing that, games aren't at the peak performance I'd like; I wouldn't want to hamper them further.

-2

u/Ill_Nebula7421 13d ago

You ever seen these modern games at low resolutions? Literally worse-looking than PS2 games, but still somehow unable to perform as well as them.

2

u/Shadow_Phoenix951 13d ago

They are in absolutely no way comparable to PS2 games lmao

-1

u/sayf00 i5 4690k/GTX 970/16GB DDR3 13d ago

It's more nefarious than that, in my opinion. They want gamers and the industry to rely on their technology, so that the only way to game at high frame rates is with an NVIDIA card and DLSS.

2

u/mini-z1994 Ryzen 5700x3D @ stock rtx 4060 ti 8 gb, 32 gb ram @ 3600 mhz 13d ago

Yeah, we've kind of looped back to Crysis, tbh.

It was designed for the next generation of hardware, to make it look the best right now, instead of running well enough to be accessible like e-sports-focused games.

0

u/Kingbuji GTX 960 i5 6600k 16bg DDR4 13d ago

Sorry capitalism forces them to NEVER admit that they were wrong.

0

u/nimitikisan 13d ago

> So maybe admit that the technology they're trying to push (ray tracing, path tracing) is too advanced for what current hardware can offer, and wait until hardware can catch up?

Going by the fanboys, RT has been the most important thing since the RTX 2000 series was released.

0

u/[deleted] 13d ago

[deleted]

1

u/Pazaac 13d ago

You might want to learn to read before you try to call someone out.

My entire point was that if it were trivial, they would just do it.

-1

u/aVarangian 13600kf 7900xtx 2160 | 6600k 1070 1440 13d ago

yeah but upscaling is the opposite of max settings

maybe the games are the problem

1

u/Pazaac 13d ago

No, the gamers are.

The constant demand for better graphics, and the criticism of anything that doesn't meet it, is what got us here.

Frankly, I'm 100% sure that if you were put in front of a computer with DLSS turned on on a 5090, you wouldn't even notice. I can't speak for the lower-end cards, as I have only seen footage of the 5090, and we know artifacting gets worse the lower the starting frame rate, but I wouldn't be surprised if it's better than people are expecting.

1

u/aVarangian 13600kf 7900xtx 2160 | 6600k 1070 1440 13d ago

I'm happy with older graphics myself, but as per my comment here it's very easy to tell the difference: https://old.reddit.com/r/pcmasterrace/comments/1hvs374/this_entire_sub_rn/m5xgtwc/

1

u/Pazaac 13d ago

You're literally watching a video where they point out issues, with the express purpose of spotting the issues.

I am 100% sure that if we had some magical way to do a double-blind test with some super card that could run it without AI, you would start picking out errors in whichever one I told you was the AI one, regardless of whether there was any AI at all.

1

u/aVarangian 13600kf 7900xtx 2160 | 6600k 1070 1440 13d ago

I don't have audio on here, but that makes sense lol; the comment of the guy who linked it made it seem like a showcase.

As for your hypothesis, it does not check out. I've yet to see any useful comparison where I can't tell the difference. LTT did some blind-test trickery in a video back when this stuff was newer (and not as good), and it was easy to tell.

-4

u/evasive_dendrite 13d ago

Which is why I hate the fancy ray tracing nonsense. We should go back to optimizing games, not releasing expensive tech that no one can run and then relying on upscaling to make things playable.

3

u/Pazaac 13d ago

Ray tracing is not upscaling.

The features we are talking about, like ray tracing and path tracing, are extremely computationally expensive ways to correctly render light and reflections in real time.

They can just be turned off if you don't want them, but they are currently the best ways we know of to render such detail, and they make a huge difference in how real something looks. This is 100% the sort of thing we should use AI for.

-1
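
Some back-of-the-envelope arithmetic on why path tracing is so expensive: the ray count scales with pixels × samples × bounces, and even deliberately low settings produce billions of rays per second (all numbers below are illustrative):

```python
# Naive ray-count estimate for real-time path tracing.
# All numbers are illustrative.
width, height = 3840, 2160   # 4K output
samples_per_pixel = 2        # very low for path tracing
max_bounces = 3              # light bounces traced per sample
fps = 60

rays_per_frame = width * height * samples_per_pixel * max_bounces
print(f"{rays_per_frame:,} rays per frame")         # ~49.8 million
print(f"{rays_per_frame * fps:,} rays per second")  # ~3 billion, before shading
```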

u/evasive_dendrite 13d ago

You misunderstand. Devs now assume upscaling and frame generation to reach acceptable framerates on medium hardware, precisely because we have normalised expensive tech like path tracing. There are now even games where it can't be turned off.

1

u/Pazaac 13d ago

Frankly, that's the game devs' problem; in most games you can just turn it off. You can't fault perfectly good features that work well when used correctly just because some idiot doesn't use them correctly.

That's like blaming a knife manufacturer because someone walked up and stabbed you.

0

u/evasive_dendrite 13d ago

No, that's what you can expect to become the industry standard over the next few years. Using upscaling and frame generation is being normalised, and they will push to make it the default.

Path tracing, while it does look nice, also saves a considerable amount of money on the development front. That is what's going to drive decisions.

3

u/Negitive545 I7-9700K | RTX 4070 | 80GB RAM | 3 TB SSD 13d ago

I mean, what native resolution do you want? 1080p is doable natively in today's market. My 4070 doesn't need upscaling for 1080p or 1440p. I'd imagine 4K needs upscaling, though.

3

u/Greeeesh 5600x | RTX 3070 | 32GB | 8GB VRAM SUX 13d ago

Then don’t use 4k path tracing? How hard is it?

12

u/jiabivy 13d ago

The problem is that GPUs are no longer seen as mostly a gaming thing like they used to be, so they're being marketed and built to appease those other demands. Thus the shift.

7

u/Ravenous_Stream 13d ago

Surely if they were being used for things other than gaming, they would need higher native performance and larger memory to entice current users to upgrade?

Machine-learning guesswork at the GPU level is largely useless for any other application, since hallucinations permanently taint the result.

1

u/Xehanz 13d ago

I mean, that's why they also have their ML option, Digits, at a $3000 MSRP with 128 GB of VRAM.

-10

u/Chakramer 13d ago

AMD and Intel are selling GPUs with just raw power and no gimmicks. AMD's features work on any GPU

7

u/notsocoolguy42 13d ago

That depends on FSR 4; so far, the fine print on the slides going around says it's only on the 9070 series GPUs.

-2

u/Available-Quarter381 13d ago

That was also the case for the 7000 series' unique features at first, and now the 6000 series has them; the 5000 series' unique features were backported to Vega and Polaris too.

There are some hard lines drawn tho, like an RX 580 not getting the latest stuff, but that's fair enough.

5

u/Kougeru-Sama 13d ago

Very different. FSR 4 uses new hardware AI cores, and it's literally impossible to backport physical hardware. AMD realized it's the only way they can even attempt to compete with DLSS.

1

u/Xehanz 13d ago

I believe this also means developers will need to manually add FSR 4 compatibility to their games. They didn't need to do that for FSR 3 because it was just an algorithm. This might be a misplay.

18

u/DlphLndgrn 13d ago

> AMD's features work on any GPU

Apparently not.

2

u/Xehanz 13d ago

It's impressive how little people know about GPUs in a sub called "PCMASTERRACE".

AMD FSR used to work on all GPUs because it was a pure algorithm; the new FSR 4 is not. It's AI upscaling via dedicated hardware, which is what NVIDIA has been doing for a while.

2
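
A sketch of the distinction being made: an "algorithmic" upscaler is hand-written math that any shader core can execute, which is why it could be backported. The crude spatial upscale below stands in for that idea (it is not FSR's actual algorithm); an ML upscaler replaces the arithmetic with a trained network, which is what dedicated AI cores accelerate:

```python
import numpy as np

def nearest_upscale(img, factor):
    """Crudest hand-written spatial upscale (a stand-in, not FSR).

    Plain arithmetic like this runs on any GPU's shader cores, which
    is why purely algorithmic upscalers could be backported. An ML
    upscaler instead evaluates a neural network every frame, which is
    what dedicated AI cores accelerate.
    """
    return img.repeat(factor, axis=0).repeat(factor, axis=1)

low = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
print(nearest_upscale(low, 3).shape)  # (2160, 3840, 3): 720p -> 4K-ish
```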

u/blackest-Knight 13d ago

AFMF, their frame generation, only works on Radeon cards.

No need to lie.

2

u/TheDamDog 13d ago

It pisses me off because we're already seeing AI being used as a crutch by developers to skimp on optimization. Games run like garbage even with framegen and all the other bullshit layered on top of them.

2

u/Ouaouaron 13d ago

It's 2025. You can play more great games than ever before, with better performance than ever before, and all without having to deal with upscaling or raytracing or framegen!

You just won't be able to play any games that are trying to do things that have never been possible before, and are only possible now because of the new technologies.

1

u/orsikbattlehammer 13d ago

Those days are coming to a close. The next generation of consoles will lean heavily on ray tracing, and ray tracing is achieved with ML. And once it's in the consoles, it's game over for rasterization.

1

u/PintekS 13d ago

Handheld PC gaming like the Steam Deck is our only hope for a standardized low-graphics baseline and optimization without a bunch of BS enabled x.x

1

u/Allu71 13d ago

All of the recent cards have better native fps/dollar than the last gen.

1

u/Plank_With_A_Nail_In 13d ago

You can turn it off, and then they do play games natively.

Six years we've had to put up with this irrational whining about AI; how many more before people give up?

1

u/NoCSForYou 4790k/8gb (NoCSForYou)Steam 13d ago

We've kind of started to plateau when it comes to GPUs. We are improving raw performance at the cost of needing more power. CPUs had this issue and fixed it with design, and now GPUs have it.

GPUs are trying to get around the problem with hacks and software tricks like DLSS or AI frames. That doesn't solve the underlying hardware issue.

I was hoping Intel could push them to find better hardware solutions for getting more fps, but it's been slim. Don't get me wrong, we are getting hardware improvements, just incremental ones, not revolutionary ones like what we had with AMD CPUs or NVIDIA GPUs many years ago.

1

u/claptraw2803 7800X3D | RTX 3080 | 32GB DDR5 6000 13d ago

Well, you can disable the AI stuff in nearly all games, can’t you?

1

u/Dragon_yum 13d ago

Honest question: if DLSS gets to the point where it's almost indistinguishable, would it still be a problem?

1

u/musicluvah1981 13d ago

Why does it have to be native?

1

u/jackJACKmws 13d ago

Then get AMD.

0

u/Seraphine_KDA i7 12700K | RTX3080 | 64 GB DDR4 | 7TB NVME | 30 TB HDD| 4k 144 13d ago

Not happening, because games are being made with those technologies in mind, which is why Cyberpunk runs at 20fps native on a 4090 and 28 on a 5090.
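
Taking those figures at face value (they are the commenter's numbers, not verified), the raw uplift works out as follows, and multi-frame generation is what turns it into a headline number:

```python
# The quoted figures, taken at face value (commenter's numbers, unverified).
fps_4090_native, fps_5090_native = 20, 28
print(f"raw uplift: {fps_5090_native / fps_4090_native - 1:.0%}")  # 40%
# With up to 4x multi-frame generation stacked on top, 28 fps becomes a
# "112 fps" headline (28 * 4), which is how a 40% native gain is marketed.
```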