Syncing parameter changes to music?

We want laser beams in our game to emit a constant hum that harmonises with the music. Currently we’re trying to do this with a global parameter called “LaserNote” that has labeled values for each of the 12 note names. The laser beam event is a hum loop that responds to the parameter with pitch automation, and the music event contains command instruments that set LaserNote on the beats where the harmony changes. I had to nudge the command instruments slightly off the beat, though, as otherwise they seem to trigger the change too early.
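For reference, each Set Parameter command instrument here is effectively doing the same thing as this call through the FMOD Unity integration (a sketch assuming FMOD 2.02+, where labeled parameters can be set by name; the note label is just an example):

```csharp
// What each command instrument effectively does: set the global
// "LaserNote" parameter to one of its 12 note labels (example label).
FMODUnity.RuntimeManager.StudioSystem.setParameterByNameWithLabel("LaserNote", "F#");
```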

The problem is that in the final build on mobile, the sync between the music and the lasers is off, particularly in places where the harmony changes more rapidly and the command instruments are denser. I can’t reproduce the problem with live update in Unity, though, so I’m wondering if there’s a more robust way to do this that doesn’t behave differently on different platforms.

I’m considering enabling Quantization on the command instruments, but in a quick test the behaviour seems counter-intuitive in live update: it triggers the changes earlier rather than delaying them to land on the beat.

Because there may be multiple lasers at once and they need to be spatialised, the lasers need to be in their own events instead of being a stem in the music event.
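In case it’s relevant, each beam spawns its own instance of the hum event and we keep its 3D position updated; since LaserNote is global, every instance retunes together without any per-instance parameter sets. Roughly like this (the component and event path are just illustrative):

```csharp
using UnityEngine;
using FMODUnity;

// Illustrative per-laser component: each beam owns a spatialised instance
// of the hum loop. "LaserNote" is global, so all beams retune together.
public class LaserHum : MonoBehaviour
{
    FMOD.Studio.EventInstance hum;

    void OnEnable()
    {
        hum = RuntimeManager.CreateInstance("event:/Weapons/LaserHum"); // path is ours
        hum.set3DAttributes(RuntimeManager.To3DAttributes(transform));
        hum.start();
    }

    void Update()
    {
        // Keep the spatialiser fed with the beam's current position.
        hum.set3DAttributes(RuntimeManager.To3DAttributes(transform));
    }

    void OnDisable()
    {
        hum.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        hum.release();
    }
}
```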

Have you set the priority to highest for both events? You could also try marker callbacks, but I suspect using a parameter is the best method.
Also, how does your automation work? If it’s a jump in the timeline, it can be slightly delayed. If it’s track volume automation or a parameter sheet, I’d guess it reacts faster.

Does event priority affect processing/timing? I thought it was just for voice stealing and virtualisation.

If nothing else works we can try using marker callbacks, but I’d rather stick to the tools available in FMOD Studio, especially since the manual says that using command instruments behaves like using the FMOD API anyway.
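For the record, if we do end up going that route, my understanding of the standard pattern from the FMOD Unity scripting docs is roughly this: name a destination marker in the music event at each harmony change (e.g. “Note:F#”) and set the global parameter when the timeline passes it. A sketch (the event path, marker naming scheme, and class are ours):

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;
using FMODUnity;

// Sketch of the marker-callback alternative: destination markers named
// "Note:<label>" in the music event drive the global "LaserNote" parameter.
public class MusicNoteSync : MonoBehaviour
{
    FMOD.Studio.EventInstance music;
    FMOD.Studio.EVENT_CALLBACK markerCallback;

    void Start()
    {
        // Keep a reference to the delegate so it isn't garbage collected.
        markerCallback = new FMOD.Studio.EVENT_CALLBACK(OnTimelineMarker);
        music = RuntimeManager.CreateInstance("event:/Music/Level1"); // path is ours
        music.setCallback(markerCallback, FMOD.Studio.EVENT_CALLBACK_TYPE.TIMELINE_MARKER);
        music.start();
    }

    [AOT.MonoPInvokeCallback(typeof(FMOD.Studio.EVENT_CALLBACK))]
    static FMOD.RESULT OnTimelineMarker(FMOD.Studio.EVENT_CALLBACK_TYPE type, IntPtr instancePtr, IntPtr parameterPtr)
    {
        var marker = (FMOD.Studio.TIMELINE_MARKER_PROPERTIES)Marshal.PtrToStructure(
            parameterPtr, typeof(FMOD.Studio.TIMELINE_MARKER_PROPERTIES));
        string name = (string)marker.name;
        if (name.StartsWith("Note:"))
        {
            // The Studio API is thread-safe, so setting the global parameter
            // from the callback thread should be fine.
            RuntimeManager.StudioSystem.setParameterByNameWithLabel(
                "LaserNote", name.Substring("Note:".Length));
        }
        return FMOD.RESULT.OK;
    }

    void OnDestroy()
    {
        music.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        music.release();
    }
}
```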

The laser beam event is indeed set up in a parameter sheet. It’s not using a timeline at all.

Update: we’ve fixed it by

  1. setting all relevant sound assets to load into memory instead of streaming, and

  2. enabling Quantization on the Set Parameter command instruments and repositioning them slightly earlier than the target beat.

I’m not sure if both of these were necessary, but it’s synced now!
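One related note for anyone trying the same thing: once the assets are set to load into memory rather than stream, it’s also possible to preload an event’s sample data up front so nothing is fetched at play time. A minimal sketch (the event path is ours, and we haven’t confirmed this step is strictly necessary):

```csharp
// Preload the music event's (now non-streamed) sample data at load time
// so starting the event doesn't wait on I/O.
var music = FMODUnity.RuntimeManager.GetEventDescription("event:/Music/Level1");
music.loadSampleData();
```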