Hello! I have a technical conundrum I'm trying to wrap my head around.
I have a vision of building a small environment with a jazz track playing in 2D. Around you are the four different instruments of the quartet, and if you walk close to an instrument (for example the piano), the 2D mix gets quieter (preferably with the piano cutting out fully) and a 3D piano track plays with spatialization, so you can walk around it. Going back to the middle fades the full mix back up, getting close to the drums does the same thing, and so on.
The problem is, if I want the specific 3D parts of the track localized at different positions and the 2D mix to be constant, they don't sync up perfectly, which leads me to believe I need a single event. But that ruins the individual localization of the events being spread out in a room.
Is someone more experienced than me able to help me think this through?
I'm trying to make it work in Unreal Engine 5.2 with FMOD 2.02.20, on Windows!
The easiest way to do this would be to use transceiver effects. Place each of your four instruments on a different audio track of your music event, set those tracks' faders to -oo dB, then use pre-fader transceiver effects to send the signals of those instruments to four instrument-specific events that contain receiving transceiver effects and spatializer effects, and no other content. This results in all the instruments using the same event's timeline, but in each being spatialized based on the position of its corresponding event instance.
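In case it helps to sanity-check the routing, here's a rough Python model of why this stays in sync (illustrative gain math only, not FMOD API code): the pre-fader transceiver send and the fader both tap the same track of the same event, so there's only one timeline, and the -oo dB fader mutes the track's own output without starving the send.

```python
import math

def fader_gain(db):
    """Convert a fader level in dB to linear gain; -oo dB mutes entirely."""
    if db == -math.inf:
        return 0.0
    return 10 ** (db / 20)

def track_outputs(sample, fader_db):
    """One audio track of the music event: the pre-fader transceiver send
    taps the signal before the fader, so the -oo dB fader doesn't affect it."""
    pre_fader_send = sample                          # full signal to the 3D event
    post_fader_out = sample * fader_gain(fader_db)   # what this track itself outputs
    return pre_fader_send, post_fader_out

# With the fader at -oo dB, the track contributes silence to the music
# event's own output, but the transceiver still carries the full signal:
send, out = track_outputs(0.5, -math.inf)
```

The key point is that the transceiver effect sits pre-fader, so muting the track doesn't mute what the instrument-specific events receive.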
Then comes the tricky part: making sure the 2D version of the music is ducked when the 3D version is audible. To do that, you'll need to ensure all four of your instrument-specific music events are routed into the same group bus in the mixer, and put a sidechain effect on that bus. Use that sidechain to control a global parameter such that the parameter's value drops further the louder the signal reaching the sidechain gets, and then automate the master track volume of the event carrying your 2D mix on that global parameter.
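The level-to-parameter-to-volume mapping could look something like this sketch (the threshold, range, and floor values here are hypothetical — in practice you'd shape the equivalent curves in Studio's modulation and automation editors):

```python
def duck_parameter(sidechain_db, threshold_db=-40.0, range_db=30.0):
    """Map the sidechain's detected level to a 0..1 global parameter:
    1.0 while the spatialized events are quiet, falling toward 0.0 as
    their level rises past the threshold."""
    over = sidechain_db - threshold_db
    return max(0.0, min(1.0, 1.0 - over / range_db))

def mix_volume_db(param, floor_db=-60.0):
    """Automation curve on the 2D mix's master volume: full level at
    parameter = 1.0, fading down to floor_db at parameter = 0.0."""
    return floor_db * (1.0 - param)

# Distant instruments: sidechain reads a quiet -60 dB, so the parameter
# stays at 1.0 and the 2D mix plays at full volume. Walking up to an
# instrument pushes the sidechain level up and ducks the 2D mix.
```

Whether the 3D events should also cut the 2D mix fully near an instrument (as you described for the piano) is then just a matter of how steep you make those curves.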