Meta Quest 3 headsets - Spatial FMOD settings

We’re looking to develop spatial audio for the Meta Quest 3 headsets in Unity. We have the Oculus FMOD plug-ins, but we’re wondering what the optimal spatial FMOD settings would be for 3D events and the global mixer, including project preferences?

Can I get you to elaborate on what exactly you mean by “optimal” in this case? What outcome(s) are you specifically trying to achieve?

Thanks for getting back to me. We want to develop VR prototypes for the Meta Quest 3 headsets, to be experienced either through additional headphones or through the headset’s built-in speakers, so the output would need to be binaural. We’re using Reaper and the dearVR immersive software with Supperware head trackers to model the audio prior to implementing it in FMOD.

Firstly, what would be the best settings in Preferences? Someone advised us to set the build to ‘Mobile’ and 5.1 (FADPCM), but while we can see the 5.1 option under the ‘Desktop’ build section, we can’t see an option for ‘Mobile’. I assume that setting to 5.1 allows a channel-based approach in individual events, and that the Oculus plug-ins then convert to binaural audio ready for headphone monitoring. Alternatively, would it be better to set the build to the headphones option from the start if the audio from Reaper is already encoded for binaural sound?

Secondly, for 3D positional events, if set to 5.1, presumably the Oculus plug-in goes on the master track only, but I’m not sure how this would affect sounds that are already encoded binaurally?

Finally, what about the global mixer - the output channel settings and plug-ins on the master track? The Oculus reverb doesn’t seem to do anything, unlike FMOD’s own reverb. We haven’t got a working session file to send at this stage; we’re just checking the basics beforehand. Thanks.

I’ve released games made specifically for Quest set up both ways: 5.1 and stereo. In the end, FMOD will detect if the number of outputs on the device is less than what’s specified in Preferences, and will mix down accordingly. Yes, set it to Mobile, FADPCM, 48 kHz (the Quest’s native sample rate; setting this avoids the CPU cost of resampling in real time).
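If you want to sanity-check what the mixer actually initialised with on the device, a minimal sketch like this (class name is just for illustration, assuming the FMOD for Unity integration) logs the output format at startup:

```csharp
using UnityEngine;
using FMODUnity;

// Drop this on any GameObject in a test scene to confirm the runtime mixer format.
public class FmodFormatCheck : MonoBehaviour
{
    void Start()
    {
        // CoreSystem is the low-level FMOD system behind the Studio system.
        RuntimeManager.CoreSystem.getSoftwareFormat(
            out int sampleRate,
            out FMOD.SPEAKERMODE speakerMode,
            out int numRawSpeakers);

        Debug.Log($"FMOD mixer: {sampleRate} Hz, speaker mode {speakerMode}, raw speakers {numRawSpeakers}");
    }
}
```

If that doesn’t report 48000 Hz on the headset, the platform settings probably aren’t being applied to your mobile build.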

If you have any audio pre-processed as binaural, then yes, using the Oculus Spatializer on those events, or on the track those files are on, will negate what you’ve baked in, because the spatializer downmixes to mono before processing. If you want to make an event that uses both simultaneously, you can put the pre-processed material on one track and the material to be processed in real time on another track with the Oculus Spatializer. I do this all the time - it makes for amazing-sounding directional ambiences, or objects that need both directionality and width! Just know that setting it up this way will stop FMOD’s Virtualize and Distance voice stealing from working correctly, so use it carefully.

Last note: every binaural plugin sounds different, and to me, dearVR and the Oculus Spatializer sound about as far apart as any two binaural algorithms can. So anything you’re not baking in the DAW with a binaural effect, and are only auditioning in Reaper to hear how it will sound when spatialized later in FMOD with the Oculus Spatializer, will sound very different once it’s in the engine. dearVR has nice reverbs and hugely different coloration; with Oculus you’ll need to finesse and tailor the reverb sends per event. I always automate the send with a distance parameter to make things feel right at any distance, and each event may use a different distance curve or preset.
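In Studio I automate the send on a distance parameter, so no code is strictly needed. If you’d rather drive a custom curve from the game side instead, a rough sketch like this could do it (assuming a recent FMOD for Unity version; the event reference and the “Distance” parameter name are placeholders, not anything built in):

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Drives a user-defined "Distance" parameter from the game so a custom
// reverb-send curve can be authored against it per event in FMOD Studio.
public class DistanceDrivenSend : MonoBehaviour
{
    [SerializeField] private EventReference spatialEvent; // placeholder event
    [SerializeField] private Transform listener;          // your listener transform

    private EventInstance instance;

    void Start()
    {
        instance = RuntimeManager.CreateInstance(spatialEvent);
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
        instance.start();
    }

    void Update()
    {
        // Keep the emitter position up to date and feed the listener distance in.
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
        float distance = Vector3.Distance(transform.position, listener.position);
        instance.setParameterByName("Distance", distance);
    }

    void OnDestroy()
    {
        instance.stop(FMOD.Studio.STOP_MODE.ALLOWFADEOUT);
        instance.release();
    }
}
```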

Also, yep, never used Oculus’ reverb. FMOD’s algo and convolution reverbs work great!

Thanks so much for your detailed advice, that’s really helpful - I really appreciate you taking the time to reply!

@magomusica has graciously answered a lot of your questions, but to address some standout points:

  • “Mobile” in this case refers to the project’s platform, which you can find in Preferences → Build → Project Platforms. These are customizable, and “Desktop” is the default platform; right-clicking on the list of platforms will allow you to add a “Mobile” one.
  • Oculus’ reverb only affects audio that is being routed through an Oculus Spatializer, and it uses an attenuation curve that appears to mute the reverb signal when the spatialized event is located in the same position as the listener. The effect is easiest to observe when the positions are slightly different, and optionally when the “reverb send” level on the Spatializer and the “wet level” on the Spatial Reverb are increased (see the sketch after this list).
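As a quick way to hear that second point, something along these lines (a minimal sketch assuming the FMOD for Unity integration; the event reference is a placeholder for an event routed through the Oculus Spatializer) starts an instance a couple of metres in front of the listener so the reverb contribution is audible:

```csharp
using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Starts a test event slightly away from the listener so the Oculus
// Spatial Reverb's distance attenuation doesn't silence the reverb signal.
public class SpatialReverbTest : MonoBehaviour
{
    [SerializeField] private EventReference testEvent; // placeholder event
    [SerializeField] private Transform listener;       // your listener transform

    void Start()
    {
        Vector3 emitterPosition = listener.position + listener.forward * 2f;

        EventInstance instance = RuntimeManager.CreateInstance(testEvent);
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(emitterPosition));
        instance.start();
        instance.release(); // fire-and-forget; the instance frees itself once stopped
    }
}
```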

If you have any further questions, please feel free to ask.

Thank you for the additional information and support - that clarifies things.
