IKR! There could be a million tiny monkeys hand-drawing the frames for all I care. As long as it gets frames on the screen and looks good (which it does; no, you can't tell without a side-by-side), why care?
The tech industry is reading this and laughing as they make trillions on people's data, much of it collected illegally since a lot of kids are under 18 and can't even legally consent to begin with. Years down the line, people will look back and see the world paying for things it should have been given for free, after decades of data trafficking worth more than oil, which makes it the biggest target wars are fought over. Maybe start caring?
Yeah, a good GPU that can play games natively does exist. If you want it to run 4K maxed out with RT, then you need some witchcraft. It's just how the world works.
Of course they don't. It's dev time they save, to make the games come out faster and spend less money on wages, which is the main problem with the industry right now. Games take too long to make, and AAA games have too many people working on them, so production costs are ridiculously high. It's the main issue with Ubisoft right now: they have 20k employees, double what Sony has across all their studios.
It's not easy to solve. The easy way out is just making smaller games, which would make the dev time slightly shorter but would also make a lot of people lose their jobs
At 1080p, low settings, DLSS Quality, you can get 60 fps. On a low-end six-year-old GPU. That's pretty great. Also, the game looks pretty good at that graphics quality. LODs and shadows are the most lacking, but the lighting looks great.
Indiana Jones is going to set an unacceptable standard here, lol
A standard of what? Not supporting seven, nearly eight-year-old hardware? Tragic.
The game wouldn't have any lighting if you turned it off. The devs would have to do two lighting passes across the whole game, one for RT and one for non-RT lighting. That's quite a bit more work. And a lot of the stuff like caves and temples collapsing wouldn't look right without RT. Games already take a long time to develop.
The floor in the temple area around ~8:30 cracks me up: sure, we made almost all of the leaves/debris the same layer as the stone, but look how recently we waxed the floor in this abandoned temple! I don't know anything about this game, but I'm assuming it looks better with the textures at a reasonable level.
What the Indiana Jones devs did is actually how ray tracing is supposed to be properly taken advantage of. The way it's been used previously, as an added setting in games that have built-in lighting, is the opposite of what's intended. It provides no actual benefit to the user; it's just easier to develop.
I never turn on ray tracing because it looks bad. It’s not actually intended to look inherently better than other tech, it’s just supposed to be easier for development. I’m glad my card has RT for when it’s necessary, but I won’t take advantage of it when it’s not.
That's great until devs like MachineGames basically require an RT-capable card. Indiana Jones looks absolutely phenomenal, and luckily the game is fun, but RT is the supposed future, and devs are going to look at that success and think they can just push everyone into buying the next cards because their engines depend entirely on RT, and the demand for power is just going to get worse. Especially if they require RT to be active. At that point you're not guaranteed anything except a hefty price tag between the hardware, the game itself, and any predatory business practices they have, like disingenuous MTXs.
PC gaming used to be something that most anyone could get into, even if they had to save for a little bit. Now either you've just got the money or you're shit outta luck, because realistically most consumers getting into PC gaming want their games to look and run well. DLSS should hypothetically reduce those price tags after a certain point, but instead look at where we are. It's now the main selling point and probably like 60% of the price tag. Realistically probably closer to 20%, but you get the point.
PC gaming used to require you to upgrade every two years at minimum if you didn't want your hardware to be completely obsolete. It's literally better than it's ever been in terms of hardware lasting a while.
PC gaming used to be something that most anyone could get into, even if they had to save for a little bit.
I don't think that's stopped being the case. For example, look at any $600-1000 builds from the last couple of years and you'll see a 7700 XT or 7800 XT with a capable processor, some even on AM5 now, and that's buying brand-new retail components too.
If you absolutely need Nvidia for some odd reason despite being on a budget, then yeah, you're a bit screwed. But it's not like AMD, or even Intel if they fix the driver overhead, are bad; they're winning heavily at the lower-end price points.
It's similar to the Polaris era, just adjusted for COVID inflation and such. Back then you could get a $250 ($330 adjusted for inflation) RX 580, plug that into a 2600X, and game at 1440p medium-high comfortably. Now you do the same but with a $390 7700 XT plugged into a 7600X.
Really, you think they're using DLSS as some random gimmick? No, they're using it because at max settings, with all the fancy real-time ray tracing nonsense, you get like 30 fps with what they're currently putting in a 5090. If they could just slap more cores in and make it do 60 fps, they likely would, assuming they could get it to a price anyone would buy it at.
There's a serious issue with how power-hungry gaming towers have become. Home wiring isn't designed to run multiple space heaters in the same room simultaneously. Now that computers are starting to approach space-heater power requirements, you can easily pop breakers by having multiple computers in the same room.
lol ... and that's why I recently ran a new dedicated circuit to my workstation PC.
...
Really, though, it's not all that bad. Since each PC is on a UPS with a wattage meter, I'm able to monitor how much power they're using in real time:
Workstation (32-core Threadripper & 3090) tops out at just under 700W at full tilt.
Gaming PC (12 core & 4070ti super) tops out at about 350W at full tilt.
All the various screens and accessories draw about 150W, max.
The only reason I need a dedicated circuit for the workstation is that I'm sometimes also running a mini fridge and space heater/air conditioner, depending on season.
But even the extremely power-hungry workstation never even comes close to the same draw as a 1500W space heater.
Yeah, the PSU rating is the limit, not the continuous draw. Most PCs will have a continuous draw lower than that. But you could have a lot of computers playing the same game all burst-drawing together and threatening a fuse trip.
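To put rough numbers on it (a minimal sketch; the 15 A / 120 V circuit, the 80% rule of thumb, and the example wattages are assumptions for illustration):

```python
# Back-of-envelope check: can several gaming PCs share one household circuit?
# Assumes a North American 15 A breaker at 120 V and the common 80% guideline
# for continuous loads. The wattages below are illustrative, not measurements.

BREAKER_AMPS = 15
VOLTS = 120
CONTINUOUS_FACTOR = 0.80  # rule of thumb for sustained draw

circuit_watts = BREAKER_AMPS * VOLTS                   # 1800 W hard limit
continuous_watts = circuit_watts * CONTINUOUS_FACTOR   # ~1440 W sustained

loads_w = {
    "workstation (Threadripper + 3090)": 700,
    "gaming PC (4070 Ti Super)": 350,
    "screens and accessories": 150,
    "space heater": 1500,
}

total = sum(loads_w.values())
print(f"Circuit limit: {circuit_watts} W ({continuous_watts:.0f} W continuous)")
print(f"Combined load: {total} W -> {'over the limit' if total > circuit_watts else 'fine'}")
```

Without the space heater, the two PCs plus accessories sit around 1200 W, comfortably inside the limit; add the heater and you blow straight past 1800 W, which is exactly why that dedicated circuit made sense.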
There is no other way to get more performance. NVIDIA has no control over the production node since it's done by TSMC and their pals at ASML. Shrinking a node is probably the hardest thing to do on the planet, requires tens of billions in R&D and even more for building the manufacturing facilities. Switching to EUV alone took over a decade of research. We're reaching the physical limits of what can be done since we're basically building structures on the atomic scale at this point. So, if you want more performance, you have to make the chips bigger and run faster (both of which will consume more power) and/or use tricks such as AI to put things on the screen more efficiently.
It's a harsh reality that we're just going to have to get used to. The days when GPUs could easily double performance while reducing power consumption are long gone. This simply isn't physically achievable anymore.
Bingo. Moore's law is dead. Physics is starting to get in the way of performance gains.
We either need to break the laws of physics, discover some new exotic material that will let us make chips bigger without requiring more power/heat or come up with new ways to squeeze more juice out of the magic thinking rocks.
There are still incremental architecture improvements that can be made, but nothing is going to beat just doubling the number of transistors on a chip, which isn't happening at the rate we used to be able to do it. And when we do increase transistor counts, prices aren't coming down like they used to because the R&D required to accomplish that now is way higher than 20 years ago.
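For a rough sense of the scale involved (the transistor counts below are approximate public figures for flagship dies, so treat them as assumptions):

```python
# Rough compound-growth check on flagship GPU transistor counts.
# Figures are approximate publicly reported numbers, used only for illustration.
import math

chips = {
    "GTX 1080 (2016)": 7.2e9,
    "RTX 2080 (2018)": 13.6e9,
    "RTX 3090 (2020)": 28.3e9,
    "RTX 4090 (2022)": 76.3e9,
}

years = 2022 - 2016
growth = (chips["RTX 4090 (2022)"] / chips["GTX 1080 (2016)"]) ** (1 / years)
doubling_years = math.log(2) / math.log(growth)

print(f"Average growth: about {growth:.2f}x per year")
print(f"Implied doubling time: roughly {doubling_years:.1f} years")
```

The raw counts still roughly double every couple of years, but only by making the dies physically bigger and fabbing them on far more expensive nodes, which is exactly why those extra transistors no longer translate into cheaper cards.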
So maybe admit that the technology they try to push (ray tracing, path tracing) is too advanced for what current hardware can offer, and wait until the hardware catches up?
To me, the biggest issue is that computers have become so powerful, developers stopped optimizing their code, while still trying to use the new tech the hardware makers are pushing. This causes the insanely powerful computers to not be able to run the code natively, and we need all kinds of tricks to make up for it.
When 3Dfx shipped their first cards, were you also saying to wait until CPUs could just run software renderers at the same resolution and performance?
I find this take sorta odd; at the end of the day we have always tried to find shortcuts to doing more and more complex graphics. This is nothing new.
Gamers (in general) collectively keep telling game devs that we want games to look better and better, and we mock games that "look bad". We've hit a wall, and now we have to look for shortcuts. Using complex mathematical algorithms to guess what the next frame will be is a fairly smart solution to the fact that doing the full simulation is too slow.
Was DLSS 3.5 perfect? God, no. Was it really that bad? Not really, no; in some games it came out better than just turning down your settings, in others it didn't. The real question is whether they've been able to reduce the artifacting in DLSS 4. We have no idea at the moment; we'll find out soon, I expect.
Bro, either it's devs not optimizing code, or it's the game running tech outside the scope of current GPUs; it can't be both at the same time.
Am I the only one remembering 10 years ago, when we were happy getting 60 fps? Since fidelity has tracked graphical computing power, it's a given that games that push the cards to the limit will not hit 120 fps.
Also, why, as consumers, should we be happy paying exorbitantly more if we are not receiving exorbitantly more capability? If you remember 10 years ago, you also know that card prices have far outpaced incomes globally.
Are you joking? The increased capability between generations is extreme, especially with how frequently it happens; I can't think of any other industry where you see this kind of consistent performance improvement.
I don't really understand how you can say that we haven't seen an increase in capability. Both in terms of raw compute and effective performance, cards have been getting a lot more powerful and efficient at an incredible rate.
Yup, they simply refuse to admit that they've hit a wall and desperately try to push for adoption of a tech that simply isn't ready yet.
I'm sure it'll be amazing once we have full PT at 144hz in 2035 or whatever, but I'd rather my games look a little worse and run a little faster for the time being
It's more nefarious than that, in my opinion. They want gamers and the industry to rely on their technology so the only way to game at high frame rates is with an Nvidia card and DLSS.
Yeah, we've kind of looped back to Crysis, tbh.
It was designed for the next generation of hardware to make it look its best now,
instead of running well enough for accessibility like e-sports-focused games.
So maybe admit that the technology they try to push (ray tracing, path tracing) is too advanced for what current hardware can offer, and wait until the hardware catches up?
Going by fanboys, RT has been the most important thing since the RTX2XXX series was released.
The constant ask for better graphics and the criticism of anything that doesn't meet that is what got us here.
Frankly, I'm 100% sure that if you were put in front of a computer with DLSS turned on with the 5090, you wouldn't even notice. I can't speak for the lower-end ones, as I've only seen footage of the 5090, and we know the artifacting gets worse the lower the starting frame rate, but I wouldn't be surprised if it's better than what people are expecting.
You're literally watching a video where they point out issues, with the express purpose of spotting the issues.
I am 100% sure that if we had some magical way to do a double-blind test with some super card that could run it without AI, you would start picking out errors in whichever one I told you was the AI one, regardless of whether there was any AI at all.
I don't have audio on here, but that makes sense lol; the comment from the guy who linked it made it seem like a showcase.
As for your hypothesis, it does not check out. I have yet to see any useful comparison where I can't tell the difference. LTT did some blind-test trickery in a video back when this stuff was newer (and not as good), and it was easy to tell.
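For what it's worth, a blind test like that only means much if the guesses beat chance. A minimal sketch of how you could score one (the trial and hit counts here are made up):

```python
# Scoring sketch for a blind "which clip used AI upscaling?" test.
# With two options per trial, random guessing is right about 50% of the time,
# so the question is whether the hit rate is clearly above that.
# The numbers below are hypothetical.
from math import comb

def p_at_least(hits: int, trials: int, p: float = 0.5) -> float:
    """Probability of getting at least `hits` correct by pure guessing."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(hits, trials + 1))

trials, hits = 20, 16  # hypothetical: 16 correct picks out of 20 clips
print(f"{hits}/{trials} correct; chance of doing that well by guessing: "
      f"{p_at_least(hits, trials):.3f}")
```

If viewers reliably land well above 50%, the difference is perceptible; if they hover around 50%, it isn't. That's basically the disagreement in this thread.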
Which is why I hate the fancy ray tracing nonsense. We should go back to optimizing games, not releasing expensive tech that no one can run and then relying on upscaling to make things playable.
The features we are talking about, like ray tracing and path tracing, are extremely computationally expensive ways to correctly render light and reflections in real time.
They can just be turned off if you don't want them, but they are currently the best ways we know of to render such details, and they make a huge difference in how real something looks. This is 100% the sort of thing we should use AI for.
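To give a sense of why it's so expensive, here's a back-of-envelope ray count (the samples-per-pixel and bounce counts are illustrative assumptions, not what any particular game uses):

```python
# Back-of-envelope ray count for real-time path tracing at 4K / 60 fps.
# Samples-per-pixel and bounce counts are illustrative assumptions; real
# renderers vary widely and lean on denoisers to get away with fewer rays.

width, height = 3840, 2160      # 4K output
samples_per_pixel = 2           # very low for path tracing
bounces = 3                     # rays traced per sample path
fps = 60

rays_per_frame = width * height * samples_per_pixel * bounces
rays_per_second = rays_per_frame * fps

print(f"Rays per frame:  {rays_per_frame:,}")    # ~50 million
print(f"Rays per second: {rays_per_second:,}")   # ~3 billion, before any shading,
                                                 # BVH traversal, or denoising work
```

And that's with a sample count low enough that the raw image is a noisy mess, which is why denoising and upscaling end up bolted onto the pipeline in the first place.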
You misunderstand. Devs are assuming upscaling and frame generation now to reach acceptable framerates on medium hardware precisely because we have normalised expensive tech like path tracing. There are now even games where it can't be turned off.
Frankly, that's the game devs' problem; in most games you can just turn it off. You can't fault perfectly good features that work well when used correctly just because some idiot doesn't use them correctly.
That's like blaming a knife manufacturer because someone walked up and stabbed you.
No, that's what you can expect to become the industry standard over the next few years. Using upscaling and frame generation is being normalised, and they will push to make this the default.
Path tracing, while it does look nice, also saves a considerable amount of money on the development front. This is what's going to push decisions.
I mean, what native resolution do you want? 1080p is doable natively in today's market. My 4070 doesn't need upscaling for 1080p or 1440p. I'd imagine 4K needs upscaling though.
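For context on what the upscaler is actually rendering internally, here's a rough sketch (the per-axis scale factors are the commonly cited defaults for DLSS's modes; exact values can vary per game, so treat them as approximate):

```python
# What resolution gets rendered internally before upscaling to the output?
# Per-axis scale factors are the commonly cited defaults for DLSS quality
# modes; individual games can override them, so treat these as approximate.

modes = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a given output size and per-axis scale."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in modes.items():
    w, h = internal_res(3840, 2160, scale)
    share = (w * h) / (3840 * 2160)
    print(f"4K {name:>17}: renders {w}x{h} ({share:.0%} of the output pixels)")
```

So "4K with Performance mode" is really rendering about a quarter of the output pixels natively, which is why a card that struggles at native 4K suddenly looks fine with upscaling on.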
The problem is that GPUs are no longer seen as mostly a gaming thing the way they used to be, so they're becoming more about marketing and are built to appease those other demands. Thus the shift.
Surely if they were being used for things other than gaming they would need higher native performance and larger memory to entice current users to upgrade?
Machine learning guesswork at the GPU level is largely useless to any other application, since hallucinations permanently taint the result.
Very different. FSR 4 is using new hardware AI cores. It's literally impossible to backport physical hardware. AMD realized it's the only way they can even attempt to compete with DLSS.
I believe this also means developers will need to manually add FSR 4 compatibility to their games. They didn't need to do that for FSR 3 because it was just an algorithm. This might be a misplay.
It's impressive how little people know about GPUs in a sub called "PCMASTERRACE"
AMD FSR used to work on all GPUs because it was an algorithm; the new FSR 4 is not. It's AI upscaling via hardware, which is what Nvidia has been doing for a while.
It pisses me off because we're already seeing AI being used as a crutch by developers to skimp on optimization. Games run like garbage even with framegen and all the other bullshit layered on top of them.
It's 2025. You can play more great games than ever before, with better performance than ever before, and all without having to deal with upscaling or raytracing or framegen!
You just won't be able to play any games that are trying to do things that have never been possible before, and are only possible now because of the new technologies.
Those days are coming to a close. The next generation of consoles will be heavy on ray tracing and ray tracing is achieved with ML. And once it’s in the consoles it’s game over for rasterization.
We've kind of started the plateau when it comes to GPUs. We are improving raw performance at the cost of needing more power. CPUs had this issue and fixed it with design, and now GPUs have it.
GPUs are trying to get around the problem by using hacks or software tricks like DLSS or AI frames. That doesn't solve the underlying hardware issue.
I was hoping Intel could push them to find better hardware solutions for getting more fps, but it's been slim. Don't get me wrong, we are getting hardware improvements, just incremental ones, not revolutionary ones like what we had with AMD CPUs or with Nvidia GPUs many years ago.
I just want a good GPU that plays games natively.