My game does not push high framerates, and that's OK: there's heavy calculation every frame, and it's not an action game.
But I'm trying to get my head around two weird phenomena:
1. Changing .setRefreshRate has a HUGE impact even though I never come NEAR the target refresh rate:
At setRefreshRate 20, I get ~ 10 fps
At setRefreshRate 30, I get ~ 14 fps
At setRefreshRate 40, I get ~ 16 fps
At setRefreshRate 50, I get ~ 18 fps... AND the fps is steadier AND less affected by adding/removing/optimizing various effects
2. Those are the numbers reported by .showfps and Device Info... but they're visibly WAY higher than reality (a quick wall-clock check is sketched below):
When it says ~14 fps, I'm visibly getting only 4 or 5 fps
When it says ~18 fps... it's much closer, too fast for me to count reliably, but I don't think it's 18.
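To take the counter out of the equation, I could time frames myself. This is a minimal sketch, not my actual game loop; it assumes only playdate.getCurrentTimeMilliseconds() and print from the Lua SDK. Since it measures wall-clock time, if its numbers disagree with .showfps, then the counter isn't counting delivered frames:

```lua
-- Hypothetical frame timer: cross-checks .showfps against wall-clock time.
local lastMs = nil
local frames = 0
local windowStart = nil

function playdate.update()
    local now = playdate.getCurrentTimeMilliseconds()
    if lastMs == nil then
        windowStart = now
    else
        frames += 1
        -- Once a second, print how many frames were actually delivered
        -- per wall-clock second, plus the duration of the last frame.
        if now - windowStart >= 1000 then
            print(string.format("measured %.1f fps (last frame took %d ms)",
                frames * 1000 / (now - windowStart), now - lastMs))
            frames = 0
            windowStart = now
        end
    end
    lastMs = now

    -- ...heavy per-frame work goes here...
end
```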
After a day messing with removing effects to boost smoothness... and then putting them all back and just cranking up setRefreshRate... I'm already happy with the outcome, just confused about the causes. FWIW I'm doing next to nothing in coroutines; all the real work happens in update().
Any insights?
One factor could be: how does Playdate deal with a "missed" refresh deadline?
Say I'm targeting 10 fps, but the work to generate the next frame takes more than 1/10 second. Say it takes 25% extra. When the work is done, what happens?
A) It immediately refreshes. You get frames lasting .125 seconds instead of the targeted .1.
or
B) It WAITS for the NEXT frame, adding even MORE delay. (So every frame is always a multiple of 1/10.) You get frames lasting .2 seconds even though .125 should be possible.
If (B), then I can see how using a refreshRate of 50 is the way to get the best performance even when you know you'll never come close to it: a missed deadline on a 50 Hz grid only costs you until the next 20 ms slot, rather than a whole extra 100 ms one. (Battery aside.)
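Here's a small experiment that could tell (A) from (B). A sketch, assuming the Lua SDK, with a busy-wait standing in for my real per-frame work: target 10 fps and deliberately take ~125 ms per frame. (A) predicts printed frame times near 125 ms; (B) predicts them snapping to 200 ms.

```lua
playdate.display.setRefreshRate(10)  -- 100 ms frame budget

local lastMs = nil

function playdate.update()
    local now = playdate.getCurrentTimeMilliseconds()
    if lastMs then
        -- (A) predicts ~125 ms here; (B) predicts ~200 ms.
        print(string.format("frame took %d ms", now - lastMs))
    end
    lastMs = now

    -- Stand-in for heavy work: busy-wait ~125 ms, i.e. 25% over budget.
    local doneAt = playdate.getCurrentTimeMilliseconds() + 125
    while playdate.getCurrentTimeMilliseconds() < doneAt do end
end
```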
#2 still puzzles me.