Mute instrument / region on parameter set

Hi,
I have a timeline where I would like to be able to mute and unmute individual instruments/tracks depending on whether a parameter is set for that instrument, but I want them to only mute or unmute at the beginning of each region. I’ve been able to get the unmute to work using quantization and automation on the track level, but when I want the track to mute, it does so straight away without waiting for the beginning of the region. Is there a way to make FMOD wait for the region to finish playing before applying the automation?

Can you post a screenshot of your event?

image

Here’s a small part of it; the full event contains quite a few more tracks and loop regions, but it shows the basic form I’m currently working with. I want FMOD to respond to events in-game by muting or unmuting tracks and/or regions, but to wait to mute until the current loop region has finished playing.

Example:
Current loop region is A1
Player does X.
Set parameter to tell FMOD to start playing track PianoMelody.
Player does nothing for some amount of time.
Set parameter to tell FMOD to stop playing track PianoMelody, but wait until current loop region (A1) has finished playing current cycle or next loop region (A2) starts playing.
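
On the game side, those “set parameter” steps would just be Studio API calls, something like this (a minimal sketch; the event path and the parameter name “PianoMelodyPlay” are placeholders, and I’m assuming a local event parameter):

```cpp
#include "fmod_studio.hpp"

// Start the music event once and hold on to the instance.
// "event:/Music/MainLoop" is a placeholder path.
FMOD::Studio::EventInstance* startMusic(FMOD::Studio::System* studio)
{
    FMOD::Studio::EventDescription* description = nullptr;
    studio->getEvent("event:/Music/MainLoop", &description);

    FMOD::Studio::EventInstance* music = nullptr;
    description->createInstance(&music);
    music->start();
    return music;
}

// Player does X -> setPianoMelody(music, true); later, to stop the track,
// setPianoMelody(music, false). *When* the change is actually heard is up
// to the timeline logic (quantization / trigger conditions) in FMOD Studio.
void setPianoMelody(FMOD::Studio::EventInstance* music, bool play)
{
    // "PianoMelodyPlay" is a placeholder parameter name (0 = stop, 1 = play).
    music->setParameterByName("PianoMelodyPlay", play ? 1.0f : 0.0f);
}
```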

Is there any way to achieve this?

Ok, thanks for the screenshot. The simplest way to achieve that would be to set all your single instruments to “async” and “loop”. That way, the default behavior when an instrument is untriggered is to finish playing in full, which is what you want.

image

And of course, condition the trigger of the instrument with your parameter.
Note that the “loop” button is a workaround to make your instrument continue to play on each timeline loop, but it means the instrument loops on itself, asynchronously from the main timeline, which could theoretically cause some desync over time if the game stays on the same loop for a long time. Shortening your instrument so it doesn’t completely fill the region would solve that problem and resync it at each loop (with no more need to check the “loop” button):
image
With this technique, the musical content can even be a bit longer than the loop length, letting a reverb tail blend into the next loop.

But now that I’ve written that, I realize your loops are 16 bars long, and there’s no 16-bar quantization (8 bars max). So I guess the unmute solution you found isn’t ideal. Anyway, my async tip for muting will work independently of the quantization.

If I’m understanding what you’re trying to achieve, it might be worth utilizing event instruments with trigger conditions to do this.

In the child event you can set up the timeline with your PianoMelody instrument(s) and a loop region with a parameter condition so that once the parameter value is met it will finish playing the rest of the track before untriggering.

Essentially there will be two parameters - one to trigger the child event and one to untrigger the child event’s loop region.

If I’m not explaining this well please let me know.

Thanks for the reply!
Yeah, I started out experimenting with async and different loop lengths, but I couldn’t get it to work satisfactorily. I might try your suggestion again and see if I can get it to work.
Regarding the region lengths being 16 bars, I’ve thought about chopping them up into 8-bar regions inside FMOD, since that would theoretically make quantization work more in line with the way it’s set up in FMOD. I’m having a hard time visualizing what else might change when doing that, though, so I’m a bit hesitant to try it in case it ends up being a lot of manual labor that doesn’t work properly.

For the 16/8 bar problem, I just had an idea: halve your tempo marker and use the 8-bar quantization! (For example, in 4/4 at 120 BPM a 16-bar loop lasts 32 seconds; at 60 BPM that same 32 seconds spans only 8 bars, so the 8-bar quantization lands exactly on your loop boundary.)

It should work by simply checking the async button and reducing the length of the instrument. What issue did you encounter?

Thank you for your reply!
I tried your solution and it solved the “muting in the middle of the loop region” issue. However, it did not solve the quantization issue with instruments starting in the middle of loop regions. The event instruments are really powerful for implementing complex flows, but they’re not very well suited to my project with 12 different instruments, as the nesting becomes a bit cumbersome.

Good idea!

Mixing async and sync instruments gets weird results when you’re trying to have it all synced. Most of the time my async instruments started playing from the beginning while the accompaniment instruments were in the middle of the loop with different harmonies, making it all clash.

Sorry for spamming this thread with replies, but I did actually manage to solve this myself so I wanted to share my findings.

My solution to this problem was to use two sets of parameters per instrument; in my case there are 12 instruments, including piano, upright bass, drums, vibraphone, etc.

The setup is as follows:

  • 1 “external” parameter for each audio track/instrument. This is what will be controlled by other parameters in-game. Set to true or false. Mine are called “InstrumentPlay”.

  • 1 “internal” parameter for each audio track/instrument. This is used as a trigger condition for each instrument region. Set to true or false. Mine are called “InstrumentPlayCheck”.

At the beginning of each loop region I added two command instruments on each of the tracks. One checks whether the external parameter is set to true and, if so, sets the internal parameter to true; the other checks for false and sets the internal parameter to false.
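
On the game side, only the external parameters ever get touched directly. A minimal sketch, assuming they are local event parameters and the naming scheme above (e.g. a hypothetical “PianoPlay”):

```cpp
#include <string>
#include "fmod_studio.hpp"

// The game only flips the "external" <Instrument>Play parameters; the
// command instruments at the start of each loop region copy their values
// into the internal <Instrument>PlayCheck parameters that actually gate
// the tracks.
void setInstrumentActive(FMOD::Studio::EventInstance* music,
                         const std::string& instrument, bool active)
{
    const std::string param = instrument + "Play";  // e.g. "PianoPlay" (placeholder)
    music->setParameterByName(param.c_str(), active ? 1.0f : 0.0f);
}

// Usage:
//   setInstrumentActive(music, "Piano", true);       // bring the piano in at the next region boundary
//   setInstrumentActive(music, "Vibraphone", false); // drop the vibraphone at the next region boundary
```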

It’s not a very pretty solution, but it works.

Yeah, I also had this solution in mind, using two sets of parameters, but it sounded overcomplicated (due to the number of parameters involved) compared to the other solution.

This is probably due to the 16/8 bar quantization problem. Since it’s solved by the tempo trick, it shouldn’t be a problem anymore.
