iOS Logic Tracks Go Out of Sync After Suspend

I was testing our game on mobile and noticed that if I suspend the application and come back, the logic tracks go slightly out of sync.

Is there some special handling required for OnApplicationPause to prevent the sync issues from happening, or is this a bug?

Bump

It depends on what you mean by “logic tracks”, but usually you will need to ensure two things with the event:

  1. Set the priority in the event macros to “Highest”
  2. Ensure none of the assets being used in the event are set to “streaming”

Can you see if this helps?

Thanks very much for your reply! So you surmise that pausing the game causes multiple buffering streams to go out of sync, and the only way around that is to load all streams into memory. Is that correct?

Just want to make sure I understand so that I’m not loading a bunch of stuff into memory for no reason.

Yes, that is correct. If an asset is low priority (relative to the rest of the assets being loaded) and is set to streaming, there will be a slight buffering delay whilst the event is loaded and seeked to a location, and then the underlying asset in the triggered instrument is also loaded and seeked. Events with time-sensitive instruments should avoid streaming assets, because that slight delay can knock the event out of sync with itself.
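
For reference, here is a minimal sketch of pre-loading an event’s sample data through the Unity integration, so that its non-streaming assets are resident in memory before playback. The event path and class name are placeholders only:

using FMOD.Studio;
using FMODUnity;
using UnityEngine;

// Sketch only: pre-load a music event's sample data so nothing has to buffer at play time.
public class MusicPreloader : MonoBehaviour
{
    // Placeholder path - substitute your own music event
    const string MusicEventPath = "event:/Music/MainTheme";

    EventDescription musicDescription;

    void Start()
    {
        musicDescription = RuntimeManager.GetEventDescription(MusicEventPath);
        musicDescription.loadSampleData(); // loads the event's non-streaming sample data into memory
    }

    void OnDestroy()
    {
        if (musicDescription.isValid())
        {
            musicDescription.unloadSampleData(); // release the memory once the music is no longer needed
        }
    }
}

Note that loadSampleData only affects non-streaming assets; anything still marked as streaming will be read from disk at play time regardless.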

Hey Richard, I had a chance to test this out some more. I changed all tracks to be non-streaming and I still noticed this happening. The audio tracks go out of sync on low-power devices like my old iPhone 5SE.

Is there anything else I need to do to make it so the tracks don’t go out of sync now that I’m not streaming anything?

Can you confirm that you are already using System::mixerSuspend and System::mixerResume?

https://www.fmod.com/resources/documentation-api?version=2.02&page=platforms-ios.html#handling-interruptions
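
If you are handling this yourself rather than relying on the integration, the calls in Unity look roughly like the sketch below. This uses the integration’s RuntimeManager.CoreSystem handle; on iOS specifically, the audio session interruption callback is the proper trigger rather than OnApplicationPause (see the documentation linked above):

using FMODUnity;
using UnityEngine;

// Sketch only: suspend/resume the FMOD mixer around an application pause.
public class FmodSuspendHandler : MonoBehaviour
{
    void OnApplicationPause(bool pauseStatus)
    {
        if (pauseStatus)
        {
            // Stop the mixer thread and relinquish use of the audio output
            RuntimeManager.CoreSystem.mixerSuspend();
        }
        else
        {
            // Restart the mixer thread and reacquire the audio output
            RuntimeManager.CoreSystem.mixerResume();
        }
    }
}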

I don’t think “I” am doing that, but maybe the Unity integration does behind the scenes? Is this something I need to be doing?

I’m seeing this in RuntimeManager:

#if (UNITY_IOS || UNITY_TVOS) && !UNITY_EDITOR
        [AOT.MonoPInvokeCallback(typeof(Action<bool>))]
        static void HandleInterrupt(bool began)
        {
            if (Instance.studioSystem.isValid())
            {
                // Strings bank is always loaded
                if (Instance.loadedBanks.Count > 1)
                    PauseAllEvents(began);

                if (began)
                {
                    Instance.coreSystem.mixerSuspend();
                }
                else
                {
                    Instance.coreSystem.mixerResume();
                }
            }
        }
#else
        void OnApplicationPause(bool pauseStatus)
        {
            if (studioSystem.isValid())
            {
                PauseAllEvents(pauseStatus);

                if (pauseStatus)
                {
                    coreSystem.mixerSuspend();
                }
                else
                {
                    coreSystem.mixerResume();
                }
            }
        }
#endif

Is that not doing what you’re suggesting? That documentation seems to suggest it’s incumbent upon the developer, but then there’s that code…?

bump

I’ve tested using several versions of Unity and the FMOD integration, and all of them seem to handle suspending and resuming just fine, with no desynchronisation issues.

Are you able to send the event in question over for further investigation?

Sure thing! What file exactly do you want? I’m not sure how to export a single event. Did you need a project with a single event? What’s the best way to share?

Right-click on the event in question and select Copy. Open a new, blank FMOD Studio project and paste the event into the Events browser; it should bring over the assets and any mixer bus routing it uses. Save and close the project, then zip up the folder it resides in and send that over.

OK, no problem. I just uploaded one of the events to my account. Let me know if there’s anything else I can provide. I was testing on an older Samsung S5, if that’s any help.

In my testing, I could hear an ever-so-slight desync if I really hammered suspending and resuming the Unity editor, but it went away once I set all assets to non-streaming. I could not get a reproduction on our test devices (an older Samsung Galaxy and an iPhone 5).

I might suggest chopping the music up into smaller synchronous instruments, rather than one long stem per track. This would likely help the music resync in the event that it does drift.

Hmm, very odd! I have all my tracks set to non-streaming, and the desynchronization I’m observing is not subtle.

When it happens, it stays out of sync until an entire loop of the song completes, and then the stems sync back up. I’m not sure what the difference is between our two setups.

When you say chopping it up into smaller instruments, do you mean having even more tracks? I’m not sure how we’re supposed to get around the length of the stems. The stems are as long as the song is, and we can’t really truncate them.

I meant chopping each stem into 4-bar sections, or whatever works for the song. Having multiple shorter instruments makes the event less likely to go out of sync, and if it does, it can resync when the next instrument triggers rather than waiting for the whole song to loop.

I will DM you the already chopped up assets I’ve rendered for you to test with.

EDIT: Also, please could you confirm you have set the Priority event macro to “Highest”?

Thanks for sharing this and for the suggestions! I’ll try setting the priority to “Highest” for all of our music and see if that helps. Chopping up our entire soundtrack would be a lot of work (it seems strange that this would be required; if it’s necessary, why isn’t it done behind the scenes?).

I’ll let you know if either of these work. Thanks for taking a look!