Building a rhythm game with accurate beat events

Hey,

I’ve been working on a rhythm game for a while using the built-in Unity Audio System.

In short, I have a beat tracker script that sends out an event whenever a beat (or a “half” beat) is played. A lot of things can react to these beats, for example an animation starting to play, a particle system being triggered, and so on.
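
Simplified, the beat tracker does roughly this (half beats left out, names are just from my project):

```csharp
using System;
using UnityEngine;

// Simplified beat tracker: fires an event every time the audio DSP clock
// passes a beat boundary, so other systems can react to the beat.
public class BeatTracker : MonoBehaviour
{
    public static event Action<int> OnBeat; // beat index since start

    [SerializeField] private float bpm = 120f;

    private double nextBeatDspTime;
    private int beatIndex;

    private void Start()
    {
        nextBeatDspTime = AudioSettings.dspTime;
    }

    private void Update()
    {
        double beatLength = 60.0 / bpm;
        while (AudioSettings.dspTime >= nextBeatDspTime)
        {
            OnBeat?.Invoke(beatIndex);
            beatIndex++;
            nextBeatDspTime += beatLength;
        }
    }
}
```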

The problem I’m having is that I’d like some event instances to play exactly at the same time as the beat of the “main” event instance, which is the one sending out these beat events. If these events are even slightly off-beat, it’s very noticeable.

With the Unity setup, I have a character playing a 3D drum sound exactly on the next beat using the Unity AudioSource’s PlayScheduled method: I take the last beat’s DSP time and add the length of one beat to it. The result is good enough most of the time.
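
The drum scheduling itself is roughly this (simplified):

```csharp
using UnityEngine;

public class DrumCharacter : MonoBehaviour
{
    [SerializeField] private AudioSource drumSource; // 3D AudioSource with the drum clip assigned

    // Schedule the drum hit to land exactly on the next beat, given the DSP time
    // of the last beat and the length of one beat in seconds.
    public void PlayOnNextBeat(double lastBeatDspTime, double beatLength)
    {
        double nextBeatDspTime = lastBeatDspTime + beatLength;
        drumSource.PlayScheduled(nextBeatDspTime);
    }
}
```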

What would be the best approach, using FMOD, to schedule the playing of an event instance so that it plays exactly on a beat?

Hi,

The easiest way to accurately schedule event instance behavior would be to handle it at design time in the FMOD Studio application. This can be done using referenced events on a timeline, alongside FMOD Studio’s quantization features, so that the Studio system handles the scheduling for you. Parameters can be used to control which referenced events play, their volume, and so on.

It is theoretically possible to do this via the Studio API instead, but it isn’t recommended - while it typically may not be a problem, the Studio system can’t make any guarantees about the exact time that event instances start playback. This is due to the nature of the Studio system’s asynchronous processing model: API commands you make are enqueued in a buffer and then submitted to the Studio system on Studio.System.update() to execute asynchronously. There are ways to make the execution of event instances triggered from the API somewhat more consistently synchronized with a beat, but setting it up in FMOD Studio is the simplest and best way of handling it.
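
To give a concrete example of working with beats from the API: one way to get beat timing information out of an event is a timeline beat callback. Here’s a rough sketch in Unity C# (the event path is just a placeholder, and note the callback is invoked from the Studio system’s update, not the Unity main thread):

```csharp
using System;
using System.Runtime.InteropServices;
using FMOD.Studio;
using UnityEngine;

public class MusicBeatCallback : MonoBehaviour
{
    // Hypothetical event path for illustration.
    [SerializeField] private string musicEventPath = "event:/Music/MainTheme";

    private EventInstance musicInstance;
    private EVENT_CALLBACK beatCallback;

    private static int lastBeat; // written from the callback, read from the game thread

    private void Start()
    {
        musicInstance = FMODUnity.RuntimeManager.CreateInstance(musicEventPath);

        // Keep a reference to the delegate so it isn't garbage collected.
        beatCallback = new EVENT_CALLBACK(OnTimelineBeat);
        musicInstance.setCallback(beatCallback, EVENT_CALLBACK_TYPE.TIMELINE_BEAT);

        musicInstance.start();
    }

    [AOT.MonoPInvokeCallback(typeof(EVENT_CALLBACK))]
    private static FMOD.RESULT OnTimelineBeat(EVENT_CALLBACK_TYPE type, IntPtr instancePtr, IntPtr parameterPtr)
    {
        // Invoked from the Studio system's update, not the Unity main thread.
        var beat = (TIMELINE_BEAT_PROPERTIES)Marshal.PtrToStructure(parameterPtr, typeof(TIMELINE_BEAT_PROPERTIES));
        // beat.bar, beat.beat, beat.position (ms) and beat.tempo are available here.
        lastBeat = beat.beat;
        return FMOD.RESULT.OK;
    }
}
```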

Hey, thanks for the reply!

The referenced events sound like something to look at!

Do you think it’s reasonable to use these referenced events in this scenario, where there could be 10-20 different sound effects that I’d like to play to the beat?

Let’s say one character playing an instrument and another one grunting the rhythm, and maybe a bird chirping every 7th beat.

Yes, that sounds reasonable, especially if you know which sound effects you want to play at design time. If you want to control which of the sound effects play at run time, it’s a little more complex; that would involve setting parameters from code to control which sound effects play, for example by setting trigger conditions on the event instruments, or by automating the volume of the corresponding audio tracks.
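
As a rough sketch of the run-time side, assuming a hypothetical parameter named “BirdChirp” that you’ve added to the parent event and used as a trigger condition on the bird instrument:

```csharp
using FMOD.Studio;
using FMODUnity;
using UnityEngine;

public class RhythmLayers : MonoBehaviour
{
    private EventInstance musicInstance;

    private void Start()
    {
        // Hypothetical event path for illustration.
        musicInstance = RuntimeManager.CreateInstance("event:/Music/MainTheme");
        musicInstance.start();
    }

    // Enable or disable the bird-chirp layer; the instrument itself stays
    // quantized to the beat by Studio, so the timing is still handled there.
    public void SetBirdChirpActive(bool active)
    {
        musicInstance.setParameterByName("BirdChirp", active ? 1f : 0f);
    }
}
```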

Thanks for the reply.

I think the challenge is that we want to consistently play a 3D sound of a character playing an instrument to the rhythm, so that if you’re further away you won’t hear it, but when you get closer it plays along with the “main” music.

Is this something that can be done with the design-time implementation?

Yes, it can be handled at design-time.

What you’ll want to do is create a single parent event that plays your music as well as all of the other audio you want to sync with it, plus a number of separate events for each piece of synced audio. The synced audio will be muted in the parent event and instead sent to those separate events using Transceiver effects, where it can be spatialized.

Here’s a quick mockup of the parent event, playing two SFX tracks (a single looped instrument vs multiple instruments) alongside a music track:

I have placed my synced audio assets on separate audio tracks from the music and set their faders to -∞ so they don’t produce any audio directly. Then, pre-fader, I’ve used a Transceiver effect set to transmit the signal from these tracks elsewhere, with each SFX track sending on a different Transceiver channel.

Then, I’ve created a separate event with a Transceiver set to “receive”, and placed a Spatializer on the event so the synced audio it receives can be spatialized:

Instances of this event will all produce the same audio, played in sync by the parent event, but each instance can be spatialized individually.

The only other thing to note is that using a Transceiver introduces a single mixer block’s worth of latency between the source and the destination - if you find that this leaves your audio a little too out of sync for your liking, you can also route the music track through a Transceiver so that everything is at the same latency.

Let me know if you have any other questions.

Hey, just as an update.

We managed to use Transceivers for most of our cases, to play spatialized events to the beat of the music.

However, I found out that with ChannelGroups you can use setDelay to achieve the same outcome as with Unity’s PlayScheduled. I just had to get the dspClock for this, and it wasn’t too difficult.

So we’ll probably be using a mix of these, as it takes far less development time for me to replace the PlayScheduled implementations with setDelay than to move everything into design-time FMOD (since the whole implementation would have to be changed).
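
For reference, a simplified version of what I’m doing (event path, class names, and the flushCommands call are just how I’ve set it up, not necessarily the “right” way):

```csharp
using FMOD;
using FMOD.Studio;
using FMODUnity;
using UnityEngine;

public class ScheduledSfx : MonoBehaviour
{
    // Start a standalone SFX event so that its audio becomes audible
    // at an absolute DSP clock value (in output samples).
    public void PlayAtDspClock(string eventPath, ulong startDspClock, Vector3 position)
    {
        EventInstance instance = RuntimeManager.CreateInstance(eventPath);
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(position));
        instance.start();

        // The instance's channel group only exists once the async Studio
        // update has processed the start command, so flush first.
        RuntimeManager.StudioSystem.flushCommands();

        if (instance.getChannelGroup(out ChannelGroup group) == RESULT.OK)
        {
            // Hold the channel group's output back until the target clock.
            group.setDelay(startDspClock, 0, false);
        }

        instance.release();
    }

    // Computes the DSP clock of the next beat from the last beat's clock.
    public ulong NextBeatDspClock(ulong lastBeatDspClock, double bpm)
    {
        RuntimeManager.CoreSystem.getSoftwareFormat(out int sampleRate, out SPEAKERMODE mode, out int raw);
        ulong samplesPerBeat = (ulong)(sampleRate * 60.0 / bpm);
        return lastBeatDspClock + samplesPerBeat;
    }
}
```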

It’s good to hear that you’ve found a solution! I can definitely understand wanting to handle this without fundamentally changing your whole implementation.

As always, since the Studio system assumes responsibility for its underlying Core system, directly interacting with the Core API while using the Studio API - for example, messing with an event’s Channels or DSPs - can cause unintended behavior. Depending on what you’re doing with your events, you may run into some weirdness, but if it is all working as expected then it’s probably not too much cause for concern in this case.

Currently the events that I’m delaying are separate from the music event, so they’re just standalone SFX events. I’m also creating many EventInstances for this setDelay purpose, so that each has its own ChannelGroup (at least from testing it seemed like that). In this scenario, is there some known unintended behaviour that could happen?

This is correct - each instance of an event has its own master channel group that essentially corresponds to its master audio track in Studio.

The Studio system uses its own delays and fade points on the event master channel group to execute some behaviors like scheduling, pop-free pausing, etc. The concern would be that your own setDelay calls on the event master channel group could interfere with this. However, since the SFX events are standalone, this shouldn’t affect your separate music event.

I just had a chance to return to this matter. It seemed to work just fine last time, but as soon as pausing the game is taken into account, weird things start to happen. By weird, I mean setDelay no longer being “in sync” and the delayed sounds not playing when I expect them to.

Another scenario where I’ve previously used Unity’s scheduling is playing an “example beat sequence” to show the player what kind of beat they should play. I’ve used scheduling for this to make it precisely match the music. The time when this sequence starts can also vary by around 1-3 seconds (depending on where the beat is at the point of scheduling), which is why I’m using scheduling - in code I have that knowledge.

We tried to implement this kind of thing in Studio, but it seems like quite a lot of work, since a beat sequence can contain different sounds in different orders, which means a lot of unique sequences. Furthermore, we will have multiple songs in the game, so having to implement this for each music event seems very tedious and error-prone. Are we approaching the Studio implementation the wrong way?

EDIT: got things working better after switching to use the DSP clock from the channel group of the music event instead of the master channel group.
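
For reference, the sequence scheduling now looks roughly like this (simplified; it reuses the setDelay helper from my earlier post, and the event path is a placeholder):

```csharp
using FMOD;
using FMOD.Studio;
using FMODUnity;
using UnityEngine;

public class ExampleBeatSequence : MonoBehaviour
{
    [SerializeField] private ScheduledSfx scheduledSfx; // setDelay helper from the earlier post

    // Read the current clock from the music event's channel group rather than
    // the master channel group (the change mentioned in the edit above).
    public static ulong MusicDspClock(EventInstance musicInstance)
    {
        musicInstance.getChannelGroup(out ChannelGroup musicGroup);
        musicGroup.getDSPClock(out ulong dspClock, out ulong parentClock);
        return parentClock; // setDelay start times are specified against the parent clock
    }

    // Schedule each hit of the example sequence a whole number of beats after the next beat.
    // lastBeatDspClock is read via MusicDspClock when our beat tracker fires a beat.
    public void ScheduleSequence(ulong lastBeatDspClock, int[] beatOffsets, double bpm, Vector3 position)
    {
        RuntimeManager.CoreSystem.getSoftwareFormat(out int sampleRate, out SPEAKERMODE mode, out int raw);
        ulong samplesPerBeat = (ulong)(sampleRate * 60.0 / bpm);
        ulong nextBeatClock = lastBeatDspClock + samplesPerBeat;

        foreach (int offset in beatOffsets)
        {
            // Hypothetical event path; each hit in the sequence could use a different one.
            scheduledSfx.PlayAtDspClock("event:/SFX/ExampleHit", nextBeatClock + (ulong)offset * samplesPerBeat, position);
        }
    }
}
```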

Ideally, you wouldn’t handle this via the Core API, and we recommend against it for the reasons stated previously about Studio assuming total responsibility for its underlying Core system - while it is possible to find some success with it, you’re largely on your own.

The specificity of your needs (beat syncing + spatialized sounds + existing code using PlayScheduled) makes it difficult to suggest an exact solution on the Studio implementation side of things without more familiarity with your project as a whole, which would fall outside the scope of forum support. That said:

If it’s working for your purposes, and you aren’t running into any further issues, then it seems fine.
