Hello! I just received my Playdate and planned to create a game based on music, so I'm doing experiments with audio features.
But I can't get Instrument to work. I may have misunderstood its usage though.
I'm coding in Lua, on Windows, and the result is the same for both simulator and console.
Consider this code sample:
-- Create synths used for the instrument
local customSynth1 = playdate.sound.synth.new(playdate.sound.kWaveSine)
local customSynth2 = playdate.sound.synth.new(playdate.sound.kWaveTriangle)
local customSynth3 = playdate.sound.synth.new(playdate.sound.kWaveSquare)

-- Create the instrument
local chordInstrument = playdate.sound.instrument.new()

-- Add synths as voices for the instrument
chordInstrument:addVoice(customSynth1, 0, 127, 0)
chordInstrument:addVoice(customSynth2, 0, 0, 12)
chordInstrument:addVoice(customSynth3, 0, 0, 24)

local noteA <const> = 60
local noteB <const> = 48

function playdate.update()
    -- Play C4 note when A is pressed, C3 note when B is pressed
    if playdate.buttonJustPressed(playdate.kButtonA) then
        chordInstrument:playMIDINote(noteA)
    elseif playdate.buttonJustReleased(playdate.kButtonA) then
        chordInstrument:noteOff(noteA)
    end
    if playdate.buttonJustPressed(playdate.kButtonB) then
        chordInstrument:playMIDINote(noteB)
    elseif playdate.buttonJustReleased(playdate.kButtonB) then
        chordInstrument:noteOff(noteB)
    end
end
So, as I understand it, these lines should create the synths, each one with its own channel. I create an instrument and add the synths as voices, so when I play a note, all three synths should play at the same time. And because of the transpose parameters, when I press B to play the C3 note, I should hear a sine wave at C3, a triangle wave at C4, and a square wave at C5, all at once.
When testing this code, however, I hear only the first synth, for both the A and B buttons.
Are my code and expectations correct? Am I missing a step to set up the instrument? Is it a bug? If you are able to answer any of these questions, it would be really helpful!
ohhhhhhh yes. I totally see now why you'd expect it to work that way. What the docs don't make clear at all is that addVoice() is there to support polyphony: when you play a note on the instrument, it scans its pool of voices for an inactive voice (or the least recently used, if they're all active) whose range includes that note, and plays the note on that voice. In your demo, synths 2 and 3 have a range of 0-0, so they'd only fire on MIDI note 0. If you change that range to 0-128 and then hold A and B down at the same time, you can hear that the two notes are using different synths (though there's also a lot of clipping, which changes the tone, because they're both playing at full volume). So the first thing is that I need to fix the docs to make that clear.
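To sketch that fix against the original sample (same synths and instrument, only the addVoice ranges changed): with every voice covering the full MIDI range, the instrument can hand any incoming note to any free synth.

```lua
-- Widen every voice's note range to the full MIDI range so the
-- instrument can allocate any free synth to an incoming note.
-- Note: with identical ranges this spreads simultaneous notes
-- across voices (polyphony) -- it does NOT layer them.
local chordInstrument = playdate.sound.instrument.new()
chordInstrument:addVoice(playdate.sound.synth.new(playdate.sound.kWaveSine), 0, 127, 0)
chordInstrument:addVoice(playdate.sound.synth.new(playdate.sound.kWaveTriangle), 0, 127, 0)
chordInstrument:addVoice(playdate.sound.synth.new(playdate.sound.kWaveSquare), 0, 127, 0)
```

Holding A and B together then plays the two notes on two different synths, which is how you can hear the voice allocation at work.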
The second thing is adding a method to support that kind of layering. You can do it right now by using multiple instruments, but that's a bit of a hack. I'll file both of these and think about the best way to make this work. Thanks for pointing this out!
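The multiple-instrument workaround might look something like this (a hedged sketch, reusing the wave types and transpose values from the original sample):

```lua
-- One single-voice instrument per layer; playing the same MIDI note
-- on each of them produces the layered sine/triangle/square sound the
-- original code was aiming for. Transposes (0, 12, 24) match the sample.
local waves = { playdate.sound.kWaveSine, playdate.sound.kWaveTriangle, playdate.sound.kWaveSquare }
local transposes = { 0, 12, 24 }
local layers = {}
for i = 1, 3 do
    local inst = playdate.sound.instrument.new()
    inst:addVoice(playdate.sound.synth.new(waves[i]), 0, 127, transposes[i])
    layers[i] = inst
end

local function playLayered(note)
    for _, inst in ipairs(layers) do
        inst:playMIDINote(note)
    end
end

local function stopLayered(note)
    for _, inst in ipairs(layers) do
        inst:noteOff(note)
    end
end
```

Calling playLayered(48) should then give C3 on the sine, C4 on the triangle, and C5 on the square at once, at the cost of managing three instruments by hand.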
I think I can help you with that, at least on the design side (not so sure about the implementation, my bad). I'm planning to create a game (well, more of a tool, to be honest) with basic synth and sequencing features (so far less fun and simple than Boogie Loops, which already seems very technically impressive to me). So I think I will have more feedback like this in the future...
As for this issue, it's not a problem for me to use multiple instruments "manually" to get a similar result; I just didn't understand the class's purpose.
But knowing that, I'm questioning the existence of the Instrument class. Since I have far more control over Synth objects, why use Instruments at all?
I mean, we could just pass a Synth object as a parameter when creating a Track, instead of passing an Instrument, with an optional parameter defining the number of voices. Instrument would then possibly be a structure used internally to bind a synth to a number of voices.
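That proposal might look something like this sketch (a purely hypothetical API, not in the SDK -- today setInstrument() takes an instrument object, and the voice-count argument is invented here for illustration):

```lua
-- Hypothetical API sketch (NOT in the SDK): let a track accept a synth
-- plus a voice count, and build the Instrument internally.
local track = playdate.sound.track.new()
local lead = playdate.sound.synth.new(playdate.sound.kWaveSawtooth)
track:setInstrument(lead, 4)  -- hypothetical second argument: 4-voice polyphony
```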
Another solution, to me, would be to make Instrument clearly meant for layering. It feels strange (yet interesting) to be able to set different synths as polyphony voices on the same instrument, since they may not have the same waveform. Most VSTs and hardware synths use the same source, replicated across voices, for polyphony. Here things can get really wild: you can play three different notes on the same track, with the same instrument, and still get a different sound for each note.
It's a wonky way of doing things, I know. Ideally you'd be able to say "here's a synth for this track" and then a setPolyphony() function would make that many copies of it, but as you say, you might want to make an instrument out of multiple synths. And it's not immediately clear what you'd do if the synth has modulators: we could add a "shared" flag indicating that the same modulator would be used for all the copies, and otherwise it would be copied... and then custom modulators would need some API for dealing with that... It got complicated fast, so I figured it was more flexible and more transparent to let you do it yourself in code.
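Doing it yourself in code could look something like this sketch, assuming a single synth with no custom modulators and using synth:copy() to duplicate it:

```lua
-- A DIY "setPolyphony": build an instrument from n copies of one synth.
-- Each copy covers the full MIDI range, so up to n notes can sound at once.
local function makePolyInstrument(synth, n)
    local inst = playdate.sound.instrument.new()
    for _ = 1, n do
        inst:addVoice(synth:copy(), 0, 127, 0)
    end
    return inst
end

-- Usage: a 4-voice sine instrument
local poly = makePolyInstrument(playdate.sound.synth.new(playdate.sound.kWaveSine), 4)
```

This sidesteps the shared-modulator question entirely: whatever state the source synth carries is simply duplicated per voice at build time.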
Which is not to say you're wrong in any way, just an explanation of why things work the way they do currently. I look forward to expanding the synth capabilities on Playdate, particularly making the API more usable and less confusing. Your feedback is super useful!