Floating-point performance

I did some profiling to measure the performance impact of floats compared to integers. I set up identical arithmetic trials in C for each data type and measured how long they took with playdate->system->getElapsedTime(). I'm posting the results here for anyone who's curious.

The specific data types in question were int32_t and float. For the most part it was pretty much a tie, with the exception of division.
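
For anyone who wants to reproduce the setup, a trial along these lines gives the idea. The loop count, the volatile operands, and the PlaydateAPI pointer pd are illustrative choices here, not my exact code; volatile just keeps the compiler from optimizing the arithmetic away:

```c
#include <stdint.h>
#include "pd_api.h"

#define TRIALS 1000000

// Time TRIALS integer additions. The float version is identical apart from
// the operand types, and the other operators follow the same pattern.
static float timeIntAdd(PlaydateAPI* pd)
{
    volatile int32_t a = 1234, b = 5678, r;
    pd->system->resetElapsedTime();
    for (int32_t i = 0; i < TRIALS; i++)
        r = a + b;
    return pd->system->getElapsedTime();
}

static float timeFloatAdd(PlaydateAPI* pd)
{
    volatile float a = 1234.0f, b = 5678.0f, r;
    pd->system->resetElapsedTime();
    for (int32_t i = 0; i < TRIALS; i++)
        r = a + b;
    return pd->system->getElapsedTime();
}
```

The ratios below are just the float time divided by the int time for each operation.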

  • Addition: Floats took 0.99953× as much time as ints.
  • Subtraction: Floats took 1.00118× as much time as ints.
  • Multiplication: Floats took 0.99969× as much time as ints.
  • Division: Floats took 1.62270× as much time as ints.

For everything but division, the differences in execution time are probably just margin-of-error noise; for all intents and purposes, floats and integers take the same amount of time to compute. Division might have been affected by cases where the dividend was zero: I don't know whether the integer divide instruction short-circuits that case, but if it does, it could account for part of the discrepancy. If not, then the difference is more likely the additional exception-handling overhead of floating-point division, since zeros, indefinite results, and NaNs can all raise exceptions.
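
If anyone wants to rule out the zero-dividend possibility, a variant of the division trial with operands that are guaranteed nonzero would do it. This is just a sketch reusing the harness from the example above; the operand values are arbitrary:

```c
// Division trial with a fixed nonzero dividend and divisor, so any fast path
// the hardware might have for a zero dividend never kicks in. A matching
// float version would use float operands and the same loop.
static float timeIntDivNonzero(PlaydateAPI* pd)
{
    volatile int32_t a = 123456789, b = 7, r;
    pd->system->resetElapsedTime();
    for (int32_t i = 0; i < TRIALS; i++)
        r = a / b;
    return pd->system->getElapsedTime();
}
```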

All in all, I'm pretty stoked about this result. It means floating-point computations aren't going to drag down the performance of a program that uses them. Code that goes out of its way to avoid floats (such as fixed-point alternatives) will, if anything, run slower. Best of all, any Lua code that works with float numbers isn't going to be any slower purely on account of the data type.


I did another profiling test on the same project. I found that floating-point 2ˣ on the Playdate takes 1.67102× as much time as floating-point multiplication. This was tested via exp2f().
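
For anyone who wants to reproduce that comparison, it can be set up along these lines; same harness and caveats as the earlier sketch, and the operand values are arbitrary:

```c
#include <math.h>

// Float multiplication baseline.
static float timeFloatMul(PlaydateAPI* pd)
{
    volatile float a = 1.0001f, b = 0.9999f, r;
    pd->system->resetElapsedTime();
    for (int32_t i = 0; i < TRIALS; i++)
        r = a * b;
    return pd->system->getElapsedTime();
}

// exp2f() trial; the reported ratio is timeExp2(pd) / timeFloatMul(pd).
static float timeExp2(PlaydateAPI* pd)
{
    volatile float x = 0.5f, r;
    pd->system->resetElapsedTime();
    for (int32_t i = 0; i < TRIALS; i++)
        r = exp2f(x);
    return pd->system->getElapsedTime();
}
```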