I understand the UE4 integration is young, but I was wondering what the best method is for handling reverb areas (known as reverb volumes in vanilla UE4). The only method I can think of is to trigger a mixer snapshot event to change the reverb effects based on location. This, however, would depend on listener (player) position rather than source position. Is there a best practice for handling reverb based on source position, perhaps by re-routing an event to a different aux bus?
Secondly, are there any plans to integrate the passing of obstruction/occlusion values into the UE4 integration?
Using mixer snapshots is the standard way of working with Studio, but you are correct that the effects will then be driven by player location rather than by sound-emitter position.
Other projects have done dynamic rerouting of event instances by putting multiple sends in the event, then setting up automation to control which sends are active based on a parameter. I concede, however, that this is not a very good workflow for that case.
We’re looking into better solutions for dynamic routing, but so far we don’t have anything coming in the immediate future.
At this time, snapshots are still the primary way to alter your mix in reaction to changing in-game circumstances.
What’s the drawback of having multiple sends in the event? I assume some may find the current snapshot system not very intuitive for handling reverb-environment tasks.
@Ilya There’s no particular drawback to having multiple sends in an event, though of course each send does cost a small amount of system resources. (If you have follow-up questions, I recommend asking them as new questions, as we get notifications about them that way.)