I'd like to develop tools that run as Playdate apps in order to enable creation of assets in the Playdate SDK environment (e.g. using the PD synths, which you couldn't replicate in a third-party standalone app). It would be really nice to allow for extra inputs that are only available on the computer (keyboard/mouse), which would make these tools much more usable. I know there are debug keys, and that's really nice, but it would also be awesome to get mouse input so we can make nice GUI tools (e.g. a mesh editor or a paint app).
So FWIW I had a similar need and have sketched out what I'm going to attempt; you may be able to do the same. It's probably only practical if you're using C, though.
The idea is to simply spawn a window with raylib or similar in the native build of the game. That window can be used for debug information, but could also handle mouse controls. You can even use the PD API to render your game into a bitmap context, then get the bitmap data and render that bitmap onto the raylib window; there's a rough sketch of this below.
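Here's a minimal sketch of that idea, assuming a native (simulator) build where raylib has been linked in alongside the Playdate C API — the build-system wiring to make both coexist is left out, and the helpers `mirror_init` / `mirror_update` are hypothetical names you'd call from your event handler and update callback, not part of either API:

```c
#include "pd_api.h"
#include "raylib.h"

#define PD_W 400
#define PD_H 240

static LCDBitmap* target;       // offscreen bitmap the game renders into
static Texture2D mirrorTex;     // raylib texture the bitmap is copied to
static Color rgba[PD_W * PD_H]; // unpacked pixels for UpdateTexture()

// Hypothetical helper: call once at startup (e.g. on kEventInit).
void mirror_init(PlaydateAPI* pd)
{
    target = pd->graphics->newBitmap(PD_W, PD_H, kColorWhite);

    InitWindow(PD_W, PD_H, "debug mirror");
    Image img = GenImageColor(PD_W, PD_H, BLACK);
    mirrorTex = LoadTextureFromImage(img);
    UnloadImage(img);
}

// Hypothetical helper: call once per frame from the update callback.
void mirror_update(PlaydateAPI* pd)
{
    // 1. Render the game into the offscreen bitmap via a bitmap context.
    pd->graphics->pushContext(target);
    // ... your normal drawing calls here ...
    pd->graphics->popContext();

    // 2. Unpack the 1-bit packed Playdate bitmap into RGBA for raylib.
    int w, h, rowbytes;
    uint8_t* data;
    pd->graphics->getBitmapData(target, &w, &h, &rowbytes, NULL, &data);

    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
        {
            int set = data[y * rowbytes + x / 8] & (0x80 >> (x % 8));
            rgba[y * PD_W + x] = set ? WHITE : BLACK;
        }

    UpdateTexture(mirrorTex, rgba);

    // 3. Draw one raylib frame and read the mouse.
    BeginDrawing();
    DrawTexture(mirrorTex, 0, 0, WHITE);
    EndDrawing();

    Vector2 mouse = GetMousePosition(); // pixel coords in the mirror window
    pd->system->logToConsole("mouse: %.0f, %.0f", mouse.x, mouse.y);
}
```

One caveat on the structure: both the simulator and raylib expect to own a main loop, so rather than running raylib's usual `while (!WindowShouldClose())` loop, the sketch drives exactly one raylib frame per Playdate update callback and leaves the loop to the simulator.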
Direct debug mouse support in the simulator would be nice though, similar to the debug bitmap that renders in colour. Can't fault the suggestion.