Synchronize Multi Instrument event instances?


We are developing in Unity (probably not directly pertinent, but mentioned in case it's relevant).

Our racing game has a track announcer. Up to now, the announcer has played as a non-positional event. Each announcer event in Studio represents a “phrase bucket”, and a Multi Instrument in that event contains the actual phrases. So we can trigger a “Last Lap” event, and the actual phrase spoken by the announcer is picked randomly from the Multi Instrument. This has worked well so far, and the Multi Instrument lets us specify the percentage chance that each sample will play, which is functionality we need.

I am trying to bring the announcer into world space, with multiple loudspeaker emitters placed around the track. The problem is that if I create a new event instance for each emitter, their Multi Instruments won’t be synchronized: I can tell all the emitters to play a “Last Lap” phrase, but each may choose a different sample from the Multi Instrument. I need a way to synchronize the Multi Instruments.

Audio tables almost seem like they could work for me. In code I could determine which sample to play and then manually play it in each instance, doing away with the Multi Instruments entirely. Ideally, though, we could add phrases to the announcer just by adding them to the audio table, without requiring any code changes. If there were a way to query the list of available keys, that would work for me (I could encode the phrase type and play percentage into the key name, and I could find out how many samples exist for each phrase), but I don’t see any way to do that in the API.
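To make the idea concrete, here is a rough Python sketch of the selection logic I have in mind, assuming keys could be enumerated. The `<PhraseType>_<PlayWeight>_<Index>` key-naming scheme is purely a hypothetical convention of mine, and in Unity this would really be C# feeding a programmer instrument:

```python
import random

# Hypothetical key format: "<PhraseType>_<PlayWeight>_<Index>",
# e.g. "LastLap_70_01" is a "LastLap" phrase with a 70% relative play weight.
def pick_phrase_key(keys, phrase_type, rng=random):
    """Weighted random pick among the audio-table keys for one phrase type."""
    candidates = []
    for key in keys:
        name, weight, _index = key.rsplit("_", 2)
        if name == phrase_type:
            candidates.append((key, int(weight)))
    if not candidates:
        return None
    # Roll once against the total weight, then walk the candidates.
    total = sum(w for _, w in candidates)
    roll = rng.uniform(0, total)
    for key, weight in candidates:
        roll -= weight
        if roll <= 0:
            return key
    return candidates[-1][0]  # guard against floating-point edge cases
```

The single chosen key would then be handed to every emitter’s programmer-sound callback, so all loudspeakers play the same sample.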

Am I missing something? Do you have any better suggestions for how to approach this?


Looking a bit closer at Multi Instruments, I see I can specify a trigger condition for each contained sample. I suppose I can create a discrete parameter (say, in the range 0–100) and assign a unique trigger range to each sample. As long as we never have more than 100 samples in a Multi Instrument (a safe assumption for our purposes), I can then select a random integer in that range in code and assign it to the parameter when I start the same event on each emitter.
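The bookkeeping for that could be sketched roughly like this, assuming the per-sample play percentages sum to 100. This is just the range math in Python; in Unity the final step would be setting the parameter (e.g. via `EventInstance.setParameterByName`) on every emitter’s instance:

```python
import random

def assign_trigger_ranges(weights):
    """Map each sample's play percentage to a contiguous slice of a 0-100
    parameter. `weights` is a list of integer percentages summing to 100.
    Returns (lo, hi) pairs, with hi exclusive."""
    ranges = []
    lo = 0
    for w in weights:
        ranges.append((lo, lo + w))
        lo += w
    return ranges

def pick_shared_value(rng=random):
    # One value chosen in code, then assigned to the parameter on every
    # emitter's event instance so all Multi Instruments agree on a sample.
    return rng.randint(0, 99)
```

Because every instance receives the same parameter value, every Multi Instrument resolves to the same trigger range, and therefore the same sample.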

We have a tool that imports these samples into FMOD Studio, so we would not have to assign the trigger conditions manually; the tool could do that for us. (Phew!)

Would that work, or am I missing something?

I think an easier solution to your problem would be to put a Transceiver effect on your announcer event, transmitting its audio to an empty event (set to persistent in the macro controls) that has a Transceiver effect set to receive on the same channel. In Unity, create just one EventInstance of the announcer event; each loudspeaker “copy” is then an EventInstance of the empty receiver event. They should all play the same Multi Instrument in sync. 🙂

Interesting! I’m not familiar with the Transceiver effect; I’ll look into it.

Thanks for the super-fast suggestion!

After reading that thread, yes, that looks exactly like what I want! One event to handle all the logic, the rest as “dumb” receivers just repeating it at different locations.


Yeah, it is pretty useful for positioning the same event at different locations. Another example would be a 2D music event whose individual tracks you want to position in 3D space: just apply the Transceiver to the tracks and transmit to several events that have a Spatializer effect applied. 🙂
