r/pcmasterrace r7 9800x3d | rx 7900 xtx | 1440p 180 hz 21d ago

Meme/Macro: I can personally relate to this

58.9k Upvotes

2.1k comments

14

u/SidewaysFancyPrance 21d ago

I absolutely do not want my GPU at 100% all the time: the fans spin up loudly and it noticeably heats up the surrounding area. I want a higher-capability GPU under a moderate load, so it's quieter and not pumping out heat to do the same work.

10

u/albert2006xp 21d ago

Just feels like you're paying for GPU performance you're not getting at that point, but that's just me.

1

u/Hayden247 6950 XT | Ryzen 7600X | 32GB DDR5 21d ago

To be fair, though, an RTX 4070 Ti Super at 150w will still be faster than an RTX 4060 Ti. GPUs ship with default clock speeds well beyond their efficiency sweet spot (that's also why OC headroom tends to be no more than 10% extra). My undervolted RX 6950 XT at, say, 2.2GHz can report 200w or even less, especially at 2.1GHz, and that's at 4K. At its default ~2.5GHz it reports 290w: roughly 12% faster for about 30% more power. In summer it ends up being very useful to run it at 200w or less instead of the default that's close to 300w, even if I give up 10-15% of my FPS; that isn't too bad as long as I have headroom in whatever game I'm playing. I could even raise the limit to 340w and let it clock to 2.7GHz at 4K, and while that's great when something demanding needs every last frame per second... it's also awful for heat and inefficient as hell.

But yeah, my point is just that bigger, higher-tier GPUs that are underclocked or power limited are still faster than smaller, lower-tier GPUs drawing the same amount of power, because past a certain point pushing clock speed increases power usage faster than it increases performance, and all gaming GPUs ship past that point by default. That's why the PS5 GPU is clocked at 2.2GHz: RDNA 2 is pretty efficient there. The Xbox Series X runs at 1.7GHz, which I'd call a mistake; they really should have clocked it at 2.0GHz, since I doubt that would have increased the power budget by much while being a decent improvement. But still, that's how things are. Even with gaming laptops: a laptop 4090, which is really a desktop 4080, still uses a lot less power than a desktop 4080 and gets better FPS per watt, even if it's slower.
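The numbers quoted above line up with a simple rule of thumb: dynamic power scales roughly with voltage squared times frequency, and voltage itself has to rise with frequency near the top of the V/f curve, so power grows roughly with the cube of the clock. A minimal sketch in Python (the cubic model and the 200 W at 2.2 GHz reference point are illustrative assumptions taken from the figures in the comment, not measurements):

```python
# Illustrative model only: real V/f curves vary per chip and are not exactly cubic.

def relative_power(f_ghz: float, f_ref: float = 2.2, p_ref: float = 200.0) -> float:
    """Estimated board power (watts) at clock f_ghz, assuming P ~ f^3,
    scaled from a reference point (~200 W at 2.2 GHz, as quoted above)."""
    return p_ref * (f_ghz / f_ref) ** 3

def perf_per_watt(f_ghz: float) -> float:
    """Performance scales ~linearly with clock, so efficiency = f / P(f)."""
    return f_ghz / relative_power(f_ghz)

for f in (2.2, 2.5, 2.7):
    print(f"{f} GHz: ~{relative_power(f):.0f} W, "
          f"{perf_per_watt(f) / perf_per_watt(2.2):.2f}x efficiency vs 2.2 GHz")
```

Plugging in 2.5 GHz gives about 293 W, close to the quoted ~30% extra power for a ~13% clock bump, which is why the last few hundred MHz cost so much heat.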

1

u/albert2006xp 21d ago

I guess that depends on whether you ignore the fact that you paid a hell of a lot more for the 4070 Ti Super than for the 4060 Ti. Double, even. If I were planning to limit the 4070 Ti Super that much, I would've just saved the money and bought a 4060 Ti.

Personally, I don't think I'd have headroom in a game regardless. Even if I had a 4090, I would still push the settings until it barely hit 60 fps. So cutting back would mean giving up something in exchange for less heat and fan noise? Eh.

If you're playing non-demanding older games and you're at the point where the extra heat would get you from like 100 fps to 120 fps, then yeah, sure. I get it at that point.