In my Unity project, there are multiple events with loop regions that I’d like to play simultaneously and in sync with one another. Unfortunately, no matter what I do, they’re often audibly out of sync with one another.
The closest I came to making them play in sync was by using the getTimelinePosition method on the event used as the reference, and then applying the received value with setTimelinePosition to all the other events that are supposed to be synchronized. But even after I did that, the tracks were still audibly out of sync.
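For illustration, the approach I tried looks roughly like this (a sketch against the FMOD Unity integration; the class and field names are placeholders from my project, not anything FMOD provides):

```csharp
using FMOD.Studio;

// Sketch: push the reference event's timeline position to the
// follower events. All instances are assumed to already be playing.
public class EventSync
{
    public EventInstance reference;
    public EventInstance[] followers;

    public void Resync()
    {
        // getTimelinePosition reports the position in milliseconds.
        reference.getTimelinePosition(out int positionMs);
        foreach (var follower in followers)
        {
            follower.setTimelinePosition(positionMs);
        }
    }
}
```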
Using getTimelinePosition to compare their timings doesn’t seem to return correct values, either. Once the events have started playing, constantly logging their positions shows an unchanging difference between each of them. However, whenever I change parameters on some of those events (using setVolume or setParameterByName), they audibly drift further out of sync – yet the logged timeline position values continue to report the same difference that was there before. (One more thing to note: the events seem to automatically get back in sync once they reach the end of the loop, but even then it’s usually not as accurate as I’d like.)
Now, I’m not sure if it’s the events that are getting desynchronized or if it’s perhaps different instruments within those events that drift, but in either case – I’m left with a difference that can’t be measured or adjusted for.
Is there any way to make sure that a few different events (along with all instruments within them) are perfectly in sync with one another?
What you describe sounds like standard vertical layering. Isn’t it possible to redesign your events as a single event, with instruments synchronized to a single timeline? In my opinion, that’s the only way to be sure they will never go out of sync.
You have to use the Transceiver effect. All tracks have to be in the same event, and on each track you put a Transceiver effect in send mode. Then create a separate event for each instrument, and in each one put a Transceiver effect in receive mode. You will need to start all the events at the same time. Hope it helps.
You’re right, it’s certainly possible. The reason I’m trying to avoid creating a single event is that there will be hundreds of different tracks that can make up this loop (and each track will have 4 different variants that can be controlled through custom parameters). So, if it were possible to synchronize different events, I’d only need to create a single event with 4 instruments in FMOD Studio, set up all the parameters for it, and then keep duplicating it (each time simply switching out the 4 tracks inside). But if my only choice is creating a single event, I’ll need to set up separate parameter values for each and every instrument across all of the different tracks. That process is a lot more time consuming, and having so many instruments in a single event sometimes makes FMOD Studio stutter. Hence, I was hoping to avoid it.
Are you streaming all of them? If so, try preloading instead (not efficient for sure, but depending on your use case it might be viable).
Another thing I would try is pausing each event as soon as it is started, while keeping track of the events you are looking to sync; you can do this through FMOD callbacks. Once all the events in the list have started (and are paused, so no sound is heard up to this point), unpause them all.
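The idea above could be sketched like this (a simplified version that starts and pauses in one pass instead of waiting on the STARTED callback; `SyncStarter` is just a placeholder name):

```csharp
using FMOD.Studio;

// Sketch: start every instance paused, then unpause them together
// so no audio is heard until all of them are ready.
public static class SyncStarter
{
    public static void StartTogether(EventInstance[] events)
    {
        foreach (var e in events)
        {
            e.start();
            e.setPaused(true); // silence it until everyone has started
        }
        // A stricter version would wait for the STARTED callback on
        // each instance (via EventInstance.setCallback) before this.
        foreach (var e in events)
        {
            e.setPaused(false);
        }
    }
}
```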
Unfortunately, while the FMOD Engine does support starting multiple events at the same time with sample-accurate timing so that they are perfectly in sync, it is not currently possible to do so in Unity. This is because of the way Unity handles API calls.
As such, if you want to start multiple events at the same time in a Unity project, they must be triggered as event instruments by the same parent event instrument.
The instruments getting increasingly out of sync over time sounds like a separate, unrelated issue, most likely caused by the assets being set to stream or by the channels being virtualized and restarted by our virtual voice system.
The nature of streaming audio prevents sample-accurate scheduling. Setting the affected assets to a loading mode other than streaming in FMOD Studio’s assets browser, then rebuilding your banks, may make them less out of sync.
If your events are getting virtualized, and include asynchronous instruments, it is not surprising that they are getting out of sync. When a virtualized channel returns to being non-virtual, any asset it was playing is restarted from the beginning. This would explain why the instruments go back to being in-sync at the end of a loop. Setting the relevant instruments to be synchronous instead of asynchronous should prevent them from going out of sync when virtualized; if that’s not an option, preventing the event and its channels from being virtualized should allow you to avoid the issue.
Right, that answers my question regarding multiple events, then.
When I first ran into this problem, I saw some of the other forum posts where you explain how the desynchronization of the instruments could be explained by virtualization. So, I made sure that none of the assets are set to Streaming mode, and none of the events include asynchronous instruments, yet this problem still persists.
Right, so it seems that some of the instruments are getting virtualized due to the VOL0 Virtual feature. Lowering their volume to -79.9dB instead of completely muting them solves the issue; however, that leaves the tracks slightly audible when they should be completely muted.
As I understand, in order to disable this feature, I’d have to change the FMOD_INIT_VOL0_BECOMES_VIRTUAL flag. But according to this post it is impossible to change the init flags of the Unity integration.
Are there any alternative ways of globally preventing sounds from getting virtualized?
You could potentially edit RuntimeManager.cs, locate the line advancedSettings.vol0virtualvol = 0;, and change that 0 to -1. This property represents the volume at or below which events are considered silent for the purposes of virtualization. The audibility calculations use a scale of zero to one, so setting vol0virtualvol to -1 ensures that even totally silent events are still too loud to be virtualized.
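In other words, the relevant edit in RuntimeManager.cs would look something like this (surrounding initialization code omitted; only the vol0virtualvol value changes):

```csharp
// In RuntimeManager.cs, where the FMOD advanced settings are configured:
FMOD.ADVANCEDSETTINGS advancedSettings = new FMOD.ADVANCEDSETTINGS();
// Originally: advancedSettings.vol0virtualvol = 0;
// Audibility is calculated on a 0..1 scale, so -1 means no event is
// ever considered quiet enough to be virtualized.
advancedSettings.vol0virtualvol = -1;
```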
That being said, we recommend against doing this. Globally disabling virtualization would result in your game’s events consuming CPU even when totally inaudible, potentially increasing your game’s resource consumption by a significant amount if your game contains a large number of silent events. In addition, disabling volume 0 virtualization in this way would break the “virtualize” stealing behavior of events, meaning that any events that use it would need to be redesigned.
Yeah, adding a new line advancedSettings.vol0virtualvol = -1; next to the other advancedSettings lines in RuntimeManager.cs fixed the problem.
Understood, that’s a fair point.
If anyone comes here with a similar problem in the future: I ended up fixing it by changing the Priority setting to Maximum on the Master return track of the event that needed its instruments to be synced perfectly. As I understand it, maximum priority disables VOL0 Virtual for that particular event alone.