Similar to some other posts from the last couple of years, I’m experiencing latency in some situations when I change parameters. Please check out this video walking through it:
I can’t understand why this much latency is introduced when I work this way, and it seems like it could be a serious problem down the line. For now, I could probably find another way to work around this, but if this approach worked without the latency, it would be a very nice and smooth way to do things.
I’m using version 2.01.09. None of the relevant sounds are streaming, and all assets are WAVs. I have also tried moving the relevant instruments to different places on the timeline, playing with source/destination regions, and reducing the area shown in the attached image to its minimum (zero) length, with no change.
I can’t be totally certain without seeing your event in person, but I suspect that the latency may be related to the fact that your multiply nested events make use of parameter trigger conditions.
There are three details that I suspect may be combining to produce the behavior you’ve observed.
As mentioned in the threads you’ve linked, there is always a small, unavoidable amount of latency whenever an event instance’s behavior depends on a parameter; this gives our scheduling system time to schedule the affected audio.
Nested event instances are not created until the event instrument associated with the nested event is triggered. In other words, a nested event instance simply does not exist until the instrument that creates it has triggered.
For obvious reasons, a nested event instance’s parameters can only be set once the nested event instance exists.
Each of the nested events in your video has parameter-based behavior, meaning that each of them incurs a parameter delay. Because there are multiple levels of nesting and each level is subject to this latency, you’re hearing those event instances’ parameter-based latencies in sequence, and the same latency three times in a row is much more obvious than hearing it just once.
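To make the stacking effect concrete, here is a toy sketch (not FMOD API code; the 20 ms per-level figure is a made-up placeholder, not a real engine constant) of how per-instance parameter delays accumulate across nesting levels, because each nested instance can only be created, and have its parameters set, after its parent’s instrument has triggered:

```python
# Toy model: a nested event instance is only created when its parent's
# event instrument triggers, so each level's parameter-evaluation delay
# is paid sequentially rather than in parallel.

PER_LEVEL_DELAY_MS = 20.0  # hypothetical per-instance scheduling delay


def audible_delay_ms(nesting_depth: int,
                     per_level_delay_ms: float = PER_LEVEL_DELAY_MS) -> float:
    """Total delay heard at the deepest nested event.

    Level 1 starts after one delay; level 2 can only start once level 1's
    instrument has triggered, and so on, so the delays add up.
    """
    return nesting_depth * per_level_delay_ms


print(audible_delay_ms(1))  # -> 20.0 (a single, un-nested event)
print(audible_delay_ms(3))  # -> 60.0 (three levels of nesting)
```

The point of the sketch is only that the total is the sum of the per-level delays, which is why flattening the nesting reduces the audible latency.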
Thanks for your reply. That makes sense, and I’ll keep it in mind and try to avoid nesting events too much in the future.
I have a confession to make, though: it looks like one of my files had accidentally been marked as streaming (the engine start one-shot; no idea when or how that happened), and because it was referenced in a later event, it caused the whole event to suffer from latency. Removing the event didn’t remove the latency immediately, either; the latency only went away after I removed the event and then deselected and reselected the whole event, which threw me off a bit. Still my fault, though!
This occurs because FMOD Studio loads the assets of an event when the event is selected for display in a tab of the editor pane of the event editor window, and unloads them when the event is deselected and stops being displayed in that tab. This loading behavior is only used to audition events in FMOD Studio, and has no impact on in-game behavior.
I should mention that whenever you import an audio file that’s at least ten seconds long into an FMOD Studio project, FMOD Studio automatically sets it to stream. This is because assets of that length are usually dialog or music, and thus likely to benefit from streaming and unlikely to be played at the same time as each other. If you want to change the length threshold at which audio files are automatically set to stream, or disable this behavior entirely, you can do so in the “Assets” tab of the preferences window.