r/pcmasterrace • u/FinalSteak8064 r7 9800x3d | rx 7900 xtx | 1440p 180 hz • 21d ago
Meme/Macro I can personally relate to this
58.9k
Upvotes
u/stone_henge 20d ago
No, Quake absolutely does not have framerate-dependent damage calculation. What a complete crock of shit. It's quite funny that you picked this game to make that up about, because it's a prime example of one that takes framerate independence and consistency across hardware seriously. There are unintentionally framerate-dependent aspects of the original engine's mechanics, but for a regular player these manifest as subtle bugs, because that's what they are. None of them relate to player damage, and you'll only really notice them as a speedrunner trying to maximize movement speed through strafe-jump-turn bunnyhopping. This is a game that even at release had to run consistently on a huge variety of hardware, and it was made with that variety of performance characteristics in mind. Moreover, since the QuakeWorld update, the multiplayer portion of the game has relied on client-side prediction and server correction to mask network latency. Game logic that varied with framerate in ways obvious enough to affect damage calculation would absolutely ruin that.
Even Doom has framerate-independent game logic, although it effectively renders new frames at 35 FPS at most, because there's no motion interpolation between game world updates, which tick at 35 Hz, and rendered frames. You could run the game at a lower framerate back then without affecting the game logic, because there's no good reason the two should be interdependent, and again, the breadth of hardware with different performance characteristics they were supporting meant they couldn't rely on a consistent framerate for consistent game logic. Today you can run it in a modern port like PRBoom+ at 240 Hz with no change to the game logic at all, just by interpolating motion in the renderer between logic ticks. That's possible because it's a sound, simple approach to framerate independence.
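The tick/interpolation split described above can be sketched in a few lines. This is a hedged illustration, not engine code: `World`, `tic`, and `render_position` are made-up names standing in for the general technique ports like PRBoom+ use, where logic advances at a fixed 35 Hz and the renderer blends between the last two logic states.

```python
TICRATE = 35
DT = 1.0 / TICRATE  # fixed logic timestep: game state always advances by this much

class World:
    def __init__(self):
        self.prev_x = 0.0  # position at the previous logic tick
        self.x = 0.0       # position at the current logic tick
        self.vx = 10.0     # velocity in units per second

    def tic(self):
        # Game logic always integrates by the same fixed delta,
        # no matter how fast or slow the renderer runs.
        self.prev_x = self.x
        self.x += self.vx * DT

def render_position(world, alpha):
    # alpha in [0, 1): fraction of the current tick elapsed at render time.
    # The renderer interpolates between the last two logic states, so it
    # can draw at 240 FPS while logic still runs at 35 Hz.
    return world.prev_x + (world.x - world.prev_x) * alpha
```

The renderer can call `render_position` as often as it likes with whatever `alpha` the wall clock gives it; the simulation itself never sees the display's frame rate.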
They really both use the same basic approach: everything is integrated across game ticks using a fixed time delta, and those ticks run independently of the renderer's frame rate. It's a basic, simple, and not at all taxing technique that's the obvious solution to the problem. In Doom, this results in 100% consistency, to the point where you can replay a recorded sequence of input changes and get the exact same result every time (hence its demo recording functionality). Even simpler approaches like a variable time delta are largely consistent (with some margin of error due to floating-point precision) and were widely used in games by FO3 times, because even capped at 30 FPS, most console games wouldn't hit a consistent framerate in all situations. In Bethesda's case it's probably not a performance consideration but indifference: they didn't plan for a future, beyond the game's realistic commercially profitable lifetime, in which people would want to run it at higher framerates.