Rhythm game advice needed: lining up SFX with music

Hi there!

My team and I are developing a VR game in Unreal Engine with a rhythm component. Objects spawn and arrive at specific musical intervals, and the player is supposed to interact with them on the beat, as they would in a game like Beat Saber or Pistol Whip.

While the music track plays from an FMOD Event, we trigger a sound directly in Unreal using the native “Spawn Sound 2D” function. If the player successfully interacts with an incoming object, the sound cue plays. If they miss, then we play a separate “miss” sound.

Now, our main problem is that these sounds are supposed to sound rhythmically precise in time with the music. We were using Marker Callback events to trigger the sounds while the FMOD music Event is playing, and all of our Destination Markers are aligned to the grid in Studio. However, I think we are running into latency issues with trying to call these sounds via marker callbacks. The triggered SFX sometimes play a little too late, and the unquantized feeling of it kind of ruins the experience.

My question is: besides triggering sound cues in Unreal, is there an optimal way to set up these hit sounds in FMOD and make sure they’re quantized to our music timeline?

We’ve tried queuing the sounds via delays, but as those are frame-rate dependent they can sometimes be slightly off. I’ve also been looking at triggering async sounds via Nested Events inside a music Event, and playing them when a Destination Marker is reached. But the results have been pretty mixed – sometimes a sound plays right away, and sometimes not at all.
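To illustrate the delay approach we tried, here's a minimal sketch of the timing math (plain C++ with no FMOD calls; the tempo value and function name are just examples, not anything from our project):

```cpp
#include <cassert>
#include <cmath>

// Milliseconds until the next beat boundary, given the tempo and the
// current timeline position of the music event (e.g. as reported by
// Studio::EventInstance::getTimelinePosition). A frame-rate-dependent
// delay can only approximate this interval, which is why our queued
// sounds sometimes land slightly off the beat.
double msToNextBeat(double timelineMs, double bpm)
{
    const double beatMs = 60000.0 / bpm;            // one beat at this tempo
    const double intoBeat = std::fmod(timelineMs, beatMs);
    return (intoBeat == 0.0) ? 0.0 : beatMs - intoBeat;
}
```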

The Unreal Engine’s native audio doesn’t really interact with the FMOD Engine, so there’s no way to accurately quantize audio played using the Spawn Sound 2D function to music playing through the FMOD Engine.

Rather than using Unreal’s native audio, have you considered using quantized instruments in the same FMOD Studio event that’s playing the music? An instrument that’s triggered by a parameter and quantized to only play on specific bars and beats should be perfectly sync’d to those beats when triggered. To create an instrument that’s triggered by a parameter, add it to that parameter’s sheet in the event. An instrument’s quantization properties can be found in its trigger behavior drawer in the deck when the instrument is selected.
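To make the behavior concrete, here's a toy model of how such an instrument behaves (plain C++, not the FMOD API; `QuantizedTrigger`, `setParameter`, and `update` are illustrative names): the game can set the parameter at any frame time, but the sound only fires on the next beat boundary of the music timeline:

```cpp
#include <cassert>
#include <cmath>

// Toy model of a parameter-triggered instrument quantized to the beat.
// These names are illustrative, not part of the FMOD API: in Studio,
// the trigger condition and quantization live on the instrument itself.
struct QuantizedTrigger {
    double bpm;
    bool   armed = false;                  // parameter condition currently met

    void setParameter(bool on) { armed = on; }

    // Called once per game frame with the timeline position (ms) at the
    // previous and current frame; returns true when the instrument fires.
    bool update(double prevMs, double nowMs) {
        if (!armed) return false;
        const double beatMs = 60000.0 / bpm;
        const bool crossedBeat =
            std::floor(nowMs / beatMs) > std::floor(prevMs / beatMs);
        if (crossedBeat) armed = false;    // fire once, then disarm
        return crossedBeat;
    }
};
```

In the actual event, the quantization interval comes from the instrument's trigger behavior settings rather than from BPM math in game code, so the timing is handled entirely inside the FMOD Engine.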


Yeah, I’d agree that your best bet is to handle all the audio within FMOD. I haven’t come across marker callbacks causing latency, and we’re using a ridiculous number of markers.


Oh, interesting! I didn’t even consider that FMOD could trigger things on specific bars/beats if you set them all as parameters. This seems like an awesome solution! Now I need to figure out how to convert the hundreds of markers we use into parameters… :joy: Thank you, Joseph!

Follow-up question: does FMOD Studio have a way to internally read marker names / strings? That’s how we’ve been triggering them in Unreal – if our parser sees a marker named “Hit” or whatever, it plays the hit sound. I’m just trying to think of ways to avoid manually entering all the marker positions in the parameter sheet.

Haha, that’s awesome. Our latency issues might be tied more to our game’s performance – it has to run on a mobile graphics card, and sometimes the visuals run behind the SFX by a smidge :sweat_smile: :sob:

The answer to this question is somewhere between yes and no, depending on what exactly you mean and how you’re planning to use it.

  • Can you get a marker callback when the playback position passes a destination marker? Yes. However, the nature of marker callbacks is that they can only be dealt with in code, not within the internal logic of the event, so this is probably not the solution you want.
  • Can you continue using the events you have already been using with no modification? Yes and no. Those events are designed to do something (generate marker callbacks outside the event) that’s very different to what you now want them to do (trigger instruments within the same event), so you will have to substantially change them to achieve this new behavior. However, see the next point.
  • Can you use FMOD Studio to automatically make the necessary changes to your existing events? Yes, by creating an FMOD Studio script that reads the content of the event and generates parameters, instruments, and automation as needed. Your events sound fairly simple in how they currently operate, and the behavior you’re trying to achieve is likewise relatively simple, so writing a script that reads the markers in your events and adds quantized instruments with trigger conditions on parameter sheets should be a relatively straightforward coding task.
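As a rough sketch of the bookkeeping such a script would perform (written here in plain C++ rather than FMOD Studio's JavaScript scripting language; `Marker` and `groupMarkersByName` are illustrative names, not Studio API calls): collapse the flat list of destination markers into one entry per distinct marker name, with the timeline positions each quantized instrument would need to cover:

```cpp
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Group a flat list of destination markers (name + timeline position
// in ms) by name, so each distinct name ("Hit" is the example from this
// thread, not a fixed FMOD identifier) can become one parameter with
// one quantized instrument on its sheet.
struct Marker { std::string name; int positionMs; };

std::map<std::string, std::vector<int>>
groupMarkersByName(const std::vector<Marker>& markers)
{
    std::map<std::string, std::vector<int>> byName;
    for (const auto& m : markers)
        byName[m.name].push_back(m.positionMs);    // one parameter per name
    return byName;
}
```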

In that case, I definitely recommend disabling Unreal’s built-in audio. If you leave it enabled, it will run in parallel with the FMOD Engine, and having two audio systems running in parallel is never as resource-efficient as running just one.