I am integrating FMOD into an engine and am now trying to get the environmental effects working. I am a complete noob when it comes to audio design, so my understanding of FMOD Studio is close to zero, which is probably where my problem comes from.
I have my listener and events working nicely, and I have a looping sound playing. I have two Reverb3D instances with min and max distances of 10 and 20, using some of the environment presets (Underwater etc.). I would expect the effect to become more or less audible as the listener moves in and out of the reverb volumes. So far, though, nothing happens. It seems I am simply missing some part needed to enable the effect on my sound events.
I assume I have to configure the event in FMOD Studio so that the reverb volume values actually affect the sound? I was unable to find any clue in the documentation (neither the programmer's API nor the FMOD Studio docs), and Google etc. have not been helpful so far.
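For reference, the code-side setup being described looks roughly like this. A minimal sketch against the FMOD Core API, using the distances and preset mentioned above; the `system` pointer and zone position are placeholders, and error-code checking is omitted:

```cpp
#include <fmod.hpp>

// Sketch: create a virtual reverb zone with the FMOD Core API.
// Assumes an already-initialized FMOD::System* called `system`.
void createUnderwaterZone(FMOD::System *system)
{
    FMOD::Reverb3D *reverb = nullptr;
    system->createReverb3D(&reverb);

    // Apply one of the built-in environment presets.
    FMOD_REVERB_PROPERTIES props = FMOD_PRESET_UNDERWATER;
    reverb->setProperties(&props);

    // Place the zone in the world: full effect inside mindistance (10),
    // fading out to nothing at maxdistance (20).
    FMOD_VECTOR pos = { 0.0f, 0.0f, 0.0f };
    reverb->set3DAttributes(&pos, 10.0f, 20.0f);
}
```

Note this requires linking against the FMOD Core SDK; it will not build standalone.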
When you say that you have two “Reverb3D instances” are you referring to sends or snapshots?
If you are looking at creating reverb zones, it would be best to use snapshots. Create a snapshot in your Mixer and scope in the Reverb3D return. Ensure that the reverb effect only has Wet applied to it, and reduce the Dry level to -80dB.
Then add a Reverb3D send to all groups and events that you wish to be affected by the reverb. Any groups/events without this send will not be affected by the reverb snapshot.
Snapshots act like events, which you will need to trigger and play from your game engine.
Set all sends to 0dB, the base Reverb3D return to -80dB, and the snapshot-scoped Reverb3D to 0dB. When the snapshot is enabled (i.e. played), the Reverb3D return bus will increase in volume and apply reverb to everything with a send to it.
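Since snapshots are triggered like events, the engine-side code is a pair of Studio API calls. A sketch assuming an initialized `FMOD::Studio::System*` and a snapshot named `ReverbZone` in the Studio project (the name is just a placeholder); error handling omitted:

```cpp
#include <fmod_studio.hpp>

// Sketch: start a reverb-zone snapshot when the listener enters the zone,
// and stop it (with fade-out) on exit.
void enterReverbZone(FMOD::Studio::System *studio,
                     FMOD::Studio::EventInstance **outInstance)
{
    FMOD::Studio::EventDescription *desc = nullptr;
    studio->getEvent("snapshot:/ReverbZone", &desc);
    desc->createInstance(outInstance);
    (*outInstance)->start();
}

void leaveReverbZone(FMOD::Studio::EventInstance *instance)
{
    instance->stop(FMOD_STUDIO_STOP_ALLOWFADEOUT);
    instance->release();
}
```

Note this requires linking against the FMOD Studio SDK; it will not build standalone.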
Thanks for the elaborate answer. Unfortunately I seem to be coming at this from the opposite end, and most of your explanation is like Star Trek technobabble to me.
I am a game engine programmer, and I know little about how to create sounds in the FMOD Studio application. I am currently looking mostly at the code side and trying to integrate everything so that actual audio designers can later use the full feature set.
So these are spheres that you place in a 3D level to define the audio environment. I am currently only trying out the built-in environments, which give you effects for small rooms, large rooms, bathrooms, sewers, underwater, etc.
Now I have placed two such Reverb3D spheres in my game world, and when I move the listener between these volumes, I expect the sounds to get these room effects. As far as I can tell from the documentation, there is nothing else to do on the code side. I couldn't find any flags on the sound events or the listener that would enable it.
So I assume it has to be configured on the sound event inside the FMOD Studio application, and that is where I run into stuff that I know nothing about. The documentation doesn't mention this either; it gives the impression that everything should work out of the box, without additional setup on each sound event.
It’s difficult to post code, as it is integrated into a larger engine. I have double and triple checked it, and I don’t see what could be wrong on the code side. The Reverb3D properties are updated every frame and the values look correct.
Note that the original code was written about 1.5 years ago and worked back then, but it wasn't in use for a while. Now I have come back to it and upgraded to the latest version, which also upgraded my FMOD Studio project file. Now I cannot seem to get Reverb3D working anymore, and I thought maybe the project upgrade changed some configuration (or something else is new) that I now need to fix on the data side. Looking at a completely new project, I cannot see any difference, though (I thought it might be a setting on the master bus or something).
But as far as I understand you, the Reverb3D instances created on the code side should always affect all 3D event instances, without additional configuration on the data side. Is that correct?
Position values look fine. They work on the events and the listener, and that's the same code. I am pretty certain some configuration is simply missing, but the documentation doesn't say anything. I would really like to confirm with some sample app that the built-in reverb objects aren't simply broken and that I am not missing some step, but as far as I can tell there is no sample that uses this feature.
Also very confusing is that there are 4 reverb sends for which you can set the strength. The documentation doesn't clarify what this means at all. Playing around with multiple Reverb3D instances, I currently assume that when they overlap, the system picks up to 4 reverb volumes to mix the final sound, each getting a different strength, because only when I assign a value of 1.0f to more than one index do I seem to get blended reverb effects.
Am I also correct to assume that the 4 reverbs are sorted by strength or distance? So that I can pick how much CPU to invest by setting 1 to 4 of the indices to 1.0f or 0.0f?
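For context, the per-index send strength being described is set through the Core API's `ChannelControl::setReverbProperties`, which controls how much of a channel feeds each of the four physical reverb DSP instances (indices 0-3). A minimal sketch, assuming an `FMOD::Channel*` obtained from `System::playSound`; the particular levels are arbitrary illustration values:

```cpp
#include <fmod.hpp>

// Sketch: set how strongly one channel feeds each of the four
// physical reverb instances (wet levels are linear, 0.0 to 1.0).
void setReverbSends(FMOD::Channel *channel)
{
    channel->setReverbProperties(0, 1.0f); // full send to reverb instance 0
    channel->setReverbProperties(1, 0.5f); // half send to reverb instance 1
    channel->setReverbProperties(2, 0.0f); // instances 2 and 3 unused
    channel->setReverbProperties(3, 0.0f);
}
```

For Studio event instances, the analogous call is `FMOD::Studio::EventInstance::setReverbLevel(index, level)`. This sketch requires the FMOD SDK to build.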
I’m happy to hear that the above helped resolve your issue.
Regarding the reverb sends: you are able to set up as many Reverb3D instances as you want; there is no limit on this.
As the listener moves around within the min-max area, this affects the wet level of the reverb DSP. When the listener moves between two overlapping Reverb3D instances, the wet levels smoothly transition between the two (or more), similar to how FMOD Studio automatically crossfades two instruments in an audio track.
The reverb instances aren't really sorted; they just have their respective wet levels set by the distance to the listener. As you've mentioned, when the listener is between two reverb instances these are automatically mixed to avoid clipping.