Hi! I've been playing around with FMOD for a few days while evaluating it as a possible migration target for our game, and it's been awesome.
For our game we need the ability to slow down or speed up multiple audio files in real time, beyond the limits of the workaround most people use to timestretch (from 10% of the original playback speed up to 1000%). This is already a live feature in the current version of the game, which uses another audio library.
So we are looking into implementing the RubberBand time-stretching library, running in its real-time mode, for this. We've got a DSP plugin up and running with the slowdown part working almost perfectly; the main issue, and the part where I need help, is with speed-ups.
When speeding up audio, the library (as described in the Rubber Band Audio Time Stretcher Library docs) outputs fewer samples than I feed it in the DSP read callback (e.g., FMOD gives me 512 samples, but I only get 170 back from the timestretch library). The usual way of working around this is to request more samples from the audio engine in a while loop within the same call, feeding the time-stretching library until enough output samples are available to avoid buffer underruns. But after going through the FMOD docs for two full days, it doesn't seem possible to do that.
Is there a way to get more samples than what FMOD gave me, mid DSP read, while hooked to a ChannelGroup in real time? Or is there an alternative way to implement this library? I saw that you could in theory get the base Sound object and use Sound::read, but the issues with that are: 1. It doesn't seem like you can do that from inside the DSP plugin. 2. We play multiple tracks at the same time that all need to be timestretched, and we would probably run into issues timestretching each sound individually instead of the ChannelGroup output, which is the mix of all those sounds.
Thanks in advance!
Karla