FADING, TIMELINES AND PARAMETERS -- How to fade between a few different events all with different loop times -- and to random points?


I want to know how to use FMOD Studio for a pretty particular use case.

I’m trying to make a game with 9 different “dimension” settings. Each dimension has its own timeline-based event, a collection of looping sounds, sometimes with scatter instruments thrown in for good measure.

The dimensions are: -4, -3, -2, -1, 0, 1, 2, 3, 4.
The player can shift from one event to another, but only in increments of +1 or -1. So a player can go from -4 to -3, or from 1 to 0, for instance, but never below -4 or above 4. Just to give you an idea of the game logic.
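For reference, the game-side shift rule is just a clamped increment. A minimal sketch (the `Shift` method and `dimension` names are placeholders, not from my actual project):

```csharp
// Sketch of the dimension-shift rule described above.
// delta is always +1 or -1; the result is clamped to [-4, 4].
static int Shift(int dimension, int delta)
{
    int target = dimension + delta;
    if (target > 4) target = 4;
    if (target < -4) target = -4;
    return target;
}
```

So shifting down from -4 stays at -4, and shifting up from 3 lands on 4.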

What I really want to know is: how would I model this in FMOD? I can't do this the way most people build adaptive music, since my "dimension" settings are discrete, not continuous. And I have no idea how to fade in or out when stopping one event and starting another. Ideally, the solution would live in a single event, even if that requires sub-events.

Please let me know what I can do to make this work.

It sounds like you could make an event containing a discrete or continuous user parameter named “dimension,” create one referenced event instrument for each of your existing dimension events and lay those instruments out along that parameter, and then place an AHDSR modulator on each event instrument’s volume to ensure they crossfade when the value of the “dimension” parameter changes.

That is what I am doing – and that would be all well and good, but even with AHDSR modulators with long release times on all my sub-events and the master events, I get absolutely no release time in-game.

Here’s my code:


  // Shift the polarity by delta (+1 or -1), then push the new value to FMOD
  targetPolarity = stateObject.polarity + delta;

  dimensionSound.setParameterByName("Polarity", targetPolarity);


And here’s how it sounds in-game.

Whereas, if I instead start the sound from the Start() function and change things so the code above only sets the FMOD parameter, no sound plays at all.

If you are stopping the event, it is entirely expected that the AHDSR modulators on the instruments aren’t triggered; as described in our documentation on this topic, stopping an event does not untrigger any of the instruments in that event under normal circumstances.
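One related detail: how you stop the event matters for whether fades are audible at all. A minimal sketch of the two stop modes in the FMOD for Unity API, assuming `dimensionSound` is an `FMOD.Studio.EventInstance` as in your code:

```csharp
// ALLOWFADEOUT lets event-level AHDSR release stages and fade-out
// curves run to completion before the event actually stops.
dimensionSound.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);

// IMMEDIATE cuts the event off instantly, skipping any release or fade.
dimensionSound.stop(FMOD.Studio.STOP_MODE.IMMEDIATE);
```

Note that even with ALLOWFADEOUT, this only applies to modulators that respond to the event stopping; as described above, instruments inside the event are not untriggered by the stop itself.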

That seems strange, especially as it produces sound when you stop and start the event in the example you gave above.

Is the parameter’s “Hold value during playback” checkbox checked in the edit parameter window? If so, uncheck it. That property prevents the parameter’s value from changing while the event is playing - which is useful in some circumstances, but not in this one.

Otherwise, is the event’s "persistent" property in the macro controls set to "on"? If not, the event may be reaching a natural end and stopping between shifts; setting it to persistent should prevent this.

Playing the sound once and shifting it works now – with all the events set to "persistent", and "Hold value during playback" unchecked.
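For anyone finding this thread later, the working pattern looks roughly like this – a sketch assuming the FMOD for Unity integration; the event path and class name are placeholders:

```csharp
using UnityEngine;

public class DimensionAudio : MonoBehaviour
{
    // The event is set to "persistent" in FMOD Studio, and the "Polarity"
    // parameter has "Hold value during playback" unchecked.
    private FMOD.Studio.EventInstance dimensionSound;

    void Start()
    {
        // Create and start the event once; it keeps playing the whole time.
        // "event:/Dimensions" is a placeholder path.
        dimensionSound = FMODUnity.RuntimeManager.CreateInstance("event:/Dimensions");
        dimensionSound.start();
    }

    public void ShiftDimension(int targetPolarity)
    {
        // Only move the parameter; the referenced event instruments crossfade
        // via their AHDSR modulators, with no stop/start needed.
        dimensionSound.setParameterByName("Polarity", targetPolarity);
    }

    void OnDestroy()
    {
        dimensionSound.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        dimensionSound.release();
    }
}
```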

Thank you. Although I am curious as to why the sounds have what sounds like 100ms of lag before playing…

I’m glad to hear you were able to get it working.

The various causes of latency are covered in more detail in this thread, but the short version is that in some circumstances latency is physically impossible to avoid, and changing a parameter's value in order to change an event's behavior sits at the intersection of several of those circumstances.