Hacker News

nasretdinov · today at 12:40 PM

Ideally you'd want to measure the _perceived_ performance of the game, which would probably depend on the _lowest_ "fps" value during the specific interval. I've seen some games change the colour of the fps counter based on whether or not there were significant FPS _dips_ below the one-second average. So e.g. you might render 100 frames in a given second, but if one frame took 0.1s and the other 99 shared the remaining 0.9s, then for users it'll feel like the game runs at 10fps at that moment, even though the actual number of frames rendered is much higher.
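A dip-aware counter like the one described above could be sketched roughly as follows (the function names and the 50% dip threshold are my own illustrative choices, not from any particular game engine):

```python
def frame_stats(frame_times):
    """frame_times: per-frame durations (seconds) over a ~1s window."""
    avg_fps = len(frame_times) / sum(frame_times)
    worst_fps = 1.0 / max(frame_times)  # FPS implied by the slowest frame
    return avg_fps, worst_fps

def counter_colour(avg_fps, worst_fps, dip_ratio=0.5):
    # Flag a significant dip when the slowest frame implies an FPS
    # below half the one-second average (threshold is illustrative).
    return "red" if worst_fps < dip_ratio * avg_fps else "green"

# The example from the comment: one 0.1s frame plus 99 frames in 0.9s.
times = [0.1] + [0.9 / 99] * 99
avg, worst = frame_stats(times)
# avg is ~100 fps, but worst is 10 fps, so the counter turns red.
```

This is essentially why benchmarks report "1% low" or "0.1% low" frame rates alongside the average: the worst frame times dominate how smooth the game feels.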


Replies

embedding-shape · today at 1:22 PM

> but if one frame took 0.1s and the others took the rest, then for users it'll feel like the game plays at 10fps at that point

Wouldn't it feel like 10fps for that 0.1s only? I agree it's a good thing to measure, I think it's usually called "stutter", but I'm not sure you can say "it feels like 10 fps" since it's such a brief moment.
