Which of these two is how timers work?
Given this example...
• A 2000 ms timer: playdate.timer.new(2000, myTimedEvent)
• With playdate.timer.updateTimers() called on every playdate.update() (say, 15 updates/second)
• And in the middle of the timer's run, starting around 1000 ms, one extra-long 500 ms frame caused by an intensive calculation in a single update, with normal framerate resuming around 1500 ms and ample time left before the timer ends. (I've sketched this setup in code below.)
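For concreteness, here's a minimal sketch of the setup; the setRefreshRate call and the getCurrentTimeMilliseconds() logging are just mine, so I can see when the callback actually fires:

```lua
import "CoreLibs/timer"

playdate.display.setRefreshRate(15)  -- the "15 updates/second" from the example

local startMs = playdate.getCurrentTimeMilliseconds()

local function myTimedEvent()
    -- log how long the timer actually took, in wall-clock terms
    local elapsed = playdate.getCurrentTimeMilliseconds() - startMs
    print("myTimedEvent fired after " .. elapsed .. " ms")
end

playdate.timer.new(2000, myTimedEvent)

function playdate.update()
    playdate.timer.updateTimers()
    -- imagine one update around the 1000 ms mark blocking here for ~500 ms
    -- (some intensive calculation), then normal framerate resuming
end
```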
Does it work like this?
A. You create the timer, and myTimedEvent is triggered by the first updateTimers() call that occurs after 2000 ms have passed. The timer runs for the requested 2000 ms, despite the long frame in the middle.
Or does it work like this?
B. You create the timer, and calculations done while it runs may extend it arbitrarily, so that 500 ms frame in the middle makes the timer take 2500 ms to complete before triggering myTimedEvent, even though the long frame finished well before the intended 2000 ms mark.
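To make the difference concrete, here is rough pseudocode of the two mental models (not actual SDK internals; the t.startMs / t.elapsed / t.duration / t.callback fields are made up for illustration):

```lua
-- Model A: each updateTimers() checks real elapsed time, so a long frame can
-- only delay the callback until the next update, not stretch the timer itself.
local function updateTimerA(t)
    if playdate.getCurrentTimeMilliseconds() - t.startMs >= t.duration then
        t.callback()
    end
end

-- Model B: each updateTimers() advances the timer by a fixed nominal step, so
-- a 500 ms frame counts as only one step and the stall gets tacked onto the
-- timer's real-world duration.
local function updateTimerB(t)
    t.elapsed = t.elapsed + (1000 / 15)  -- nominal frame time at 15 fps
    if t.elapsed >= t.duration then
        t.callback()
    end
end
```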
Thanks in advance! I'm seeing weird timer behavior that B might explain.