Syncing oneshot events perfectly on beat

Hi,

I have a piece of music with a constant BPM. I can calculate the interval between beats by dividing 60 seconds by the BPM. By reading the DSP clock I can then, presumably, know exactly in code when a beat occurs. I want to trigger oneshot events that play exactly on these beats, but I am experiencing inconsistencies - sometimes the events are on beat, sometimes a bit early or late.

My oneshot events are created up front; I then call start() on them when I hit a beat.
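For reference, a simplified version of my triggering code looks roughly like this (the event path, BPM, and field names are just placeholders):

// Simplified sketch of my current approach: poll the master DSP clock and fire on beat boundaries.
FMOD.Studio.EventInstance oneshot = FMODUnity.RuntimeManager.CreateInstance("event:/SFX/Hit");

FMODUnity.RuntimeManager.CoreSystem.getSoftwareFormat(out int sampleRate, out _, out _);
FMODUnity.RuntimeManager.CoreSystem.getMasterChannelGroup(out FMOD.ChannelGroup master);

double bpm = 120.0;                                      // my track's tempo
ulong samplesPerBeat = (ulong)(sampleRate * 60.0 / bpm);

// Checked every frame from Update(); nextBeatClock is a ulong field initialised from the current clock.
master.getDSPClock(out ulong dspClock, out _);
if (dspClock >= nextBeatClock)
{
    oneshot.start();
    nextBeatClock += samplesPerBeat;
}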

Is there a more reliable way to sync the playback of these events? I was going to try setDelay, but this seems to exist on a channel and not an event - I don’t understand why.

I tried a timeline beat callback - triggering events from this has the same issue, the timing is not reliable.

Another path has been to experiment with the schedule delay property - I think this improves my situation somewhat, but since I don’t know whether my oneshot events will have latency before playing, it feels a bit shaky.
instance.setProperty(FMOD.Studio.EVENT_PROPERTY.SCHEDULE_DELAY, someValue);
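For completeness, this is roughly how I derive someValue (I’m only assuming the property is measured in DSP clock samples, which may well be wrong):

// My guess at computing someValue: the number of output samples remaining until the next beat boundary.
FMODUnity.RuntimeManager.CoreSystem.getMasterChannelGroup(out FMOD.ChannelGroup master);
master.getDSPClock(out ulong dspClock, out _);

ulong samplesPerBeat = (ulong)(sampleRate * 60.0 / bpm);  // sampleRate and bpm as in the snippet above
float someValue = samplesPerBeat - (dspClock % samplesPerBeat);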

So I guess my question is, what approach should I be taking here to achieve sample-accurate playback of my oneshots?

Thanks


When you play an Event, the command isn’t dispatched until Studio::System::update is called. Processing of the Studio update is asynchronous; when it occurs, the Event start is scheduled. Once scheduled, the Event will start at the beginning of the next mixer update. These delays and update quantization will likely cause your Events to play out of lock-step with the desired beat.

The easiest way to sync playback is to do it within Studio, via nested Events: you can use quantization and timeline tempo markers to ensure things are played back on the beat. Internally we take all the delays into account to ensure sample accuracy.

Doing it manually is a little tricky. Under the hood everything is managed with setDelay and clocks, but we don’t expose those at the Studio API level (only in the Core API) - perhaps we should? Manipulating the setDelay of the head DSP of the Event would be the way to hack it, but there are no guarantees when messing with the Core API behind the Studio engine’s back.
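To illustrate, a very rough Unity C# sketch of that hack might look something like the following (untested; everything outside the FMOD calls is an assumption, and the channel group only exists once the asynchronous Studio update has actually created the instance):

// Untested sketch: schedule the event's channel group to start on the next beat boundary.
instance.start();

if (instance.getChannelGroup(out FMOD.ChannelGroup eventGroup) == FMOD.RESULT.OK)
{
    eventGroup.getDSPClock(out _, out ulong parentClock);

    ulong samplesPerBeat = (ulong)(sampleRate * 60.0 / bpm);  // sampleRate and bpm assumed known
    ulong startAt = parentClock + (samplesPerBeat - (parentClock % samplesPerBeat));
    eventGroup.setDelay(startAt, 0, false);                   // 0 = no stop clock
}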


Hi Mathew, thanks for your reply!

I’m not sure using Studio is an option if some of these sounds are player driven, but it’s worth a shot. I’m wondering if the following is possible:
  • Create a song event as the main event
  • Create a nested event with quantized playback (is it possible to loop the same sample on beat?)

If something like this is possible to set up, I could raise/lower the volume of the nested events as an alternative solution - likely with some artifacts from cutting the sound, but I think I can live with those.

Is there an example somewhere of setting up nested quantized events?

Regarding the Core API, is this not available through Unity managed code, or would I need to write a C++ plugin?

Thanks!

The full Core API is available via C# and thus Unity; you wouldn’t need to write a C++ plugin.
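For example, the Unity integration exposes the Core system directly; a minimal sketch:

// Reaching the Core API from Unity C# via the FMOD for Unity integration - no native plugin needed.
FMOD.System coreSystem = FMODUnity.RuntimeManager.CoreSystem;
coreSystem.getSoftwareFormat(out int sampleRate, out _, out _);
coreSystem.getMasterChannelGroup(out FMOD.ChannelGroup masterGroup);
masterGroup.getDSPClock(out ulong dspClock, out ulong parentClock);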

I don’t think we have any examples to assist but I’ll get someone from QA to look over this thread in case they have something that might help.

If you are willing to use Studio, you can allow user input to trigger individual instruments in an event by assigning them parameter trigger conditions and having the user’s input set the value of their controlling parameters, and you can quantize individual instruments such that they always play on the beat.

I’m not sure I understand exactly why you need to use nested events instead of other instruments, so I’m unable to suggest a specific way to implement the behavior you want using nested events. Why are nested events necessary?

Thanks for your replies!

How would I go about creating a trigger condition that plays on the beat? That sounds like what I’m after. If I understand the flow correctly, this would be something like:

  • One event which plays music at some fixed BPM
  • When the user e.g. presses a button that should trigger a sound I want to synchronise, I set some trigger condition from code
  • Somewhere in an SFX event this is triggered by my trigger condition, and it is synchronized to the BPM of the music in my first event

The only problem is that I currently have no idea how to set this up.

If I were to do this only from code, I presume the flow would be:

  • Some music event is playing
  • Code reads the DSP time from this event
  • When the player presses a button early, I want to schedule an SFX event to play on the next beat

Would this involve using setDelay on the mixer bus via the Core API, or what would be the approach?
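To make it concrete, the rough code-only version in my head is something like this (almost certainly incomplete):

// Very rough idea - read where the music currently is and work out the time to the next beat.
musicInstance.getTimelinePosition(out int positionMs);        // current music position in milliseconds

double msPerBeat = 60000.0 / bpm;                             // bpm is known up front
double msUntilNextBeat = msPerBeat - (positionMs % msPerBeat);

// On an early button press: somehow schedule sfxInstance to start msUntilNextBeat from now?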

Super grateful for your help so far!

To set up the behavior in FMOD Studio:

  1. Add a preset parameter to your project. A continuous user parameter with local scope should be fine. Name it something like “recently received input.”
  2. Set the preset parameter’s velocity to a negative number. This ensures that parameters based on that preset parameter gradually ramp down to the preset parameter’s minimum value over time whenever they are left alone.
  3. Add a tempo marker to the beginning of your music event’s timeline, and ensure it defines the correct BPM and time signature for that event.
  4. Take the audio that is supposed to play only on the beat when the user provides some input, and put it in an instrument in your music event. (The type of instrument doesn’t matter; use whichever is most appropriate for the audio.)
  5. Click on the instrument to select it. In the deck, click the instrument’s “add condition” button, and add a parameter condition to the instrument using the parameter you added earlier. Set the range of the parameter condition such that the instrument only triggers when the parameter is at or near its maximum value.
  6. Still in the deck, set the instrument’s quantization interval to whichever interval is most appropriate.

The above steps will create an instrument that plays only on the beat, and only when it is both overlapped by the playback position and the parameter used by its trigger condition is at or near its maximum value.

When the user provides the appropriate input, use Studio::EventInstance::setParameterByName to set the value of the parameter to its maximum value. This allows the instrument to trigger; the parameter’s negative velocity will then cause the value of the parameter to move out of the range of the instrument’s parameter condition. (Alternatively, you could move the parameter value out of the parameter trigger condition’s range by calling Studio::EventInstance::setParameterByName again instead of using velocity, if you prefer.)
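In code, that is a single call when the input arrives; a minimal sketch, assuming the parameter uses the default 0-1 range and that musicInstance is your event instance:

// When the user provides input, push the parameter to its maximum so the
// quantized instrument is allowed to trigger on the next beat.
musicInstance.setParameterByName("recently received input", 1.0f);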


The Core API has no concept of FMOD Studio events; they’re only used by the Studio API. Could you clarify what you mean by “event,” and whether you’re using the Core API or the Studio API?

Thank you, Joseph! I will give your Studio example a try; it seems promising.

Regarding the Core API, I think I’m getting some parts of Studio/Core mixed up.

From an FMOD EventInstance I can do the following:

myInstance.getChannelGroup(out theChannelGroup);

ChannelGroup is, according to the docs, part of the Core API. Now, what I’m trying to achieve is sample-accurate playback of myInstance (which I assume belongs to the ChannelGroup). According to the docs, ChannelGroup::setDelay seems to be what I’m after. I guess I just don’t quite understand what this delay is relative to.

The docs say “Sets a sample accurate start (and/or stop) time relative to the parent ChannelGroup DSP clock” - what is my default here?

If I e.g. create a new event in FMOD Studio and instantiate it from code as an EventInstance, what can I expect the ChannelGroup to be, and what is its parent in that case?
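To try to make sense of it, I’ve been logging the clocks like this (not sure I’m even reading the right things):

// Logging both clocks to try to understand what "parent" refers to here.
myInstance.getChannelGroup(out FMOD.ChannelGroup eventGroup);
eventGroup.getDSPClock(out ulong groupClock, out ulong parentClock);
UnityEngine.Debug.Log("group clock: " + groupClock + ", parent clock: " + parentClock);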

Thank you so much for taking the time to educate me :)

The Studio System and Core API are not designed to be used together except in certain specific ways, and attempting to use them in this fashion will likely result in undesirable behavior.

Specifically, the Studio system already sets delays when scheduling content played as events, and using ChannelControl::setDelay would override those delays, potentially throwing off other scheduling.

If you are planning to use FMOD Studio events in your project, we recommend using FMOD Studio’s built-in mechanisms for scheduling and synchronizing content (referenced events, tempo markers, and quantization) and not the Core API; if you’re planning to use the Core API, we recommend not using an FMOD Studio project, as FMOD Studio content is designed to be handled through the Studio API and not the Core API.

Thank you for clarifying, Joseph. I will take the FMOD Studio approach you described earlier and give it a whirl within the next couple of weeks. I’ll report back with results in case anyone else is interested.