How can I schedule an event to play at a specified DSP time in the future?


I’m working on a project with some tight audio timing requirements. I’m trying to play an event at a specific point in the future, in sync with a piece of music. It’s easy enough to determine what time in the future to play it given the BPM of the piece of music and the current DSP time, but I can’t for the life of me figure out how to play an event at a very specific time.

Any pointers?


Bumping this. Still haven’t figured anything out.

Hi Toms,

We don’t have any sample-accurate event scheduling exposed in the FMOD Studio API, unfortunately.
You can, however, set up sample-accurate event scheduling in FMOD Studio using instrument Trigger Conditions, and use parameters to trigger the instrument.
As an example, here is how you could set up an event to trigger a beat-quantized referenced event on a parameter change:

  1. Add a parameter sheet to your music event, with a new discrete parameter ranging from 0 to 1.

  2. Add a referenced event in the “1” section of the parameter sheet.

  3. Configure the referenced event instrument to trigger on quarter notes.

  4. Add a command instrument to immediately set your new parameter back to “0”.

With this setup, while the event is playing you could call EventInstance.setParameterByName("Explode", 1) and trigger a beat-quantized referenced event (in this case an explosion) on the next quarter beat.

If you want more referenced events to choose from, you could increase the range of your discrete parameter and assign a different referenced event to each value.

It doesn’t necessarily scale well if you have a large number of events to choose from. If this is a problem for you, please provide more information on your specific use case (such as how many events you need to select from, whether they need to be random, or any other relevant details you can think of) and I might be able to help you come up with alternatives.

Hmm, it’s an interesting approach, but I have my doubts that it would work for my use case.

A bit more about the project I’m working on.

Music is loaded dynamically, along with tempo information in the form of MIDI (the tempo can fluctuate, especially for live music).

We then load a series of events that we want to trigger in time with the music, with a supplied ‘beat’ value being how we know when to play them.

The supplied beat is a floating-point number and can be anything arbitrary (e.g. quarters: 0.250, eighths: 0.125, thirds: 0.333, arbitrary values: 3.145). Given the beat and the tempo information, we can work out exactly when in time (and therefore on what sample) we want to trigger the sound.

The project is musical in nature, and therefore reducing audio latency directly improves the user experience.

We’re currently reducing audio latency by using Unity’s PlayScheduled function to queue the sample for playback at a specified time.

Given the lack of a consistent tempo and the arbitrariness of our beat parameter, I’m not sure the approach you mentioned would work.

That is a tricky problem. If you could get away with just triggering individual FMOD.Sound instances instead of event instances, then you could stitch them together with sample-accurate timings using ChannelControl::setDelay. We have an example of how to achieve this in our gapless_playback C++ Core API example.

As for scheduling an event instance, getting EventInstance.start to begin playback as quickly as possible is as good as it gets at the moment. Some ways to bring that latency down are:

Thanks for the additional information. I’m doing some integration tests later this week so I’ll have a chance to try both of these approaches out. FMOD already demonstrates lower round trip latency in my performance testing, so there’s a good chance the second approach here would be enough.