Multiple reverb zones depending on where the sound is

Good Afternoon,

I am in need of some help with a system in our project and its relation to FMOD. We are using FMOD Studio V1.10.19 and Unity 2018.4.16f1.

We are currently implementing Reverb Zones and have come to the conclusion that using snapshots for the reverbs is not viable, as it limits us to applying only the reverb of the zone that one of the players is in.
Multiple players (in multiplayer) and AI (in singleplayer) can be in different reverb zones, and should be heard with those reverbs.

Does anyone have any idea how one could achieve this? We’ve already tried the following approaches; for each, we note why it did not work well:

  • Transceiver: it does not allow automating the channel it sends to.
  • Manually setting up sends in every event, with a parameter selecting which send is active: not feasible, as it would require a huge amount of manual setup given the number of audio events and reverbs. Events would need over 20 sends to reverbs, since they are used across multiple levels, each with its own reverb zone(s). Unfortunately, preset effect chains are not available in this version of FMOD.
  • Automating parameters of the Reverb effect on each event: a high performance cost, as each event would have its own reverb effect instead of a single one on the relevant mixer bus.
  • Snapshots: only one snapshot effectively applies at a time (multiple active snapshots blend together), meaning the only reverb that is active is that of the zone the main player is in.

Thank you,

There is no easy way to do this without preset effect chains, short of creating multiple possible routing paths for each event and switching between them via sends and automation, or giving each event its own reverb effect controlled by automation. You could potentially automate your snapshots’ intensities so that they apply the average reverb across all your listeners; each event instance would then be subject to the average of the reverbs it would experience were it audible to all your listeners. However, this will only make the issue less obvious rather than solve it.
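To illustrate the snapshot-intensity averaging idea, here is a minimal, self-contained Python sketch (not FMOD API code; the zone names and the listener-count weighting scheme are assumptions for illustration). Each reverb zone's snapshot is given an intensity proportional to the share of listeners currently inside it, so when the snapshots blend, the result is the average reverb across all listeners:

```python
def snapshot_intensities(listener_zones):
    """Given the reverb zone each listener occupies, return a
    0-100 intensity per snapshot, so that blended snapshots
    average the reverbs across all listeners."""
    counts = {}
    for zone in listener_zones:
        counts[zone] = counts.get(zone, 0) + 1
    total = len(listener_zones)
    # Each zone's snapshot intensity is its share of listeners.
    return {zone: 100.0 * n / total for zone, n in counts.items()}

# Example: two listeners in a cave, one in a hall -> the cave
# snapshot gets roughly twice the intensity of the hall snapshot.
print(snapshot_intensities(["cave", "cave", "hall"]))
```

In practice you would recompute these values each frame (or whenever a listener crosses a zone boundary) and feed them to the corresponding snapshot instances' intensity settings. As noted above, this blurs the problem rather than solving it: every event hears the averaged reverb rather than the reverb of its own zone.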

In a perfectly realistic simulation, each event instance would be routed into a unique reverb effect whose behavior depended on the location of the emitter relative to the listener. Unfortunately, this kind of perfectly accurate simulation is prohibitively expensive, so the usual method of approximating it is to use a single reverb effect for all event instances, and determine its properties based on the position of the listener. This method works well in most cases, but as you’ve observed, it does not mix well with splitscreen multiplayer.
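The listener-based approximation described above boils down to a simple lookup: find which zone contains the listener and apply that zone's settings to the single shared reverb effect. A minimal Python sketch, with hypothetical axis-aligned zone bounds and a hypothetical `decay_time` preset value (a real implementation would use the engine's trigger volumes and the full reverb property set):

```python
from dataclasses import dataclass

@dataclass
class ReverbZone:
    name: str
    # Axis-aligned bounding box (min_x, min_z, max_x, max_z);
    # purely illustrative, not how FMOD or Unity define zones.
    bounds: tuple
    decay_time: float  # hypothetical reverb preset value, seconds

def zone_for(position, zones, default):
    """Return the first zone containing the listener position,
    falling back to a default (e.g. outdoors) zone."""
    x, z = position
    for zone in zones:
        min_x, min_z, max_x, max_z = zone.bounds
        if min_x <= x <= max_x and min_z <= z <= max_z:
            return zone
    return default

outdoors = ReverbZone("outdoors", (-1e9, -1e9, 1e9, 1e9), 0.3)
zones = [ReverbZone("cave", (0, 0, 10, 10), 4.5)]

# The single shared reverb effect takes its settings from the
# listener's current zone.
print(zone_for((5, 5), zones, outdoors).name)   # → cave
```

With one listener this works well, which is exactly why it is the usual approximation; the problem in the question arises because several listeners (or audible AI) can resolve to different zones at once, while there is still only one shared reverb effect to configure.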