Hello Andrew,
Apologies for the delay in replying to you – I have been away on leave. Our Unity tech lead is still on leave, but I have emailed him asking for confirmation of the Unity/FMOD versions and plug-ins used. As soon as he gets back to me, I will pass on the information and the build log – we would be very appreciative of any help and advice. We have always found FMOD support services to be excellent – in marked contrast to Meta!
The page you mention was one of the ones we looked at. We went through all the pages relating to FMOD and Meta and followed the guidance to the letter, although the guidance was not always clear, and the embedded, disjointed links didn’t always clarify matters because they pointed to generic content rather than Meta Quest-specific guidance. The support resources suggest that, with a few plug-ins in FMOD and a few set-up tweaks in Unity, audio can be spatialised for the Meta headsets, but in reality implementing FMOD, particularly with the Meta audio library, has been highly problematic. Our tech lead found implementing audio without FMOD much more straightforward, and found that the room geometry etc. worked well – hence the pressure to drop FMOD, which isn’t something I want to do as the audio lead.
When using FMOD, if we don’t use the Meta audio library for room acoustic modelling, the headset and desktop builds generally work (although there have been challenges getting audio playback in some cases), but the sound from the emitters isn’t contained within the geometry of the spaces. If we try to use the Meta audio library within Unity, we can get the desktop build to model the sounds correctly, but there’s no audio from the mobile build.
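To give you a concrete picture, a stripped-down emitter along the lines of the sketch below reproduces the symptom for us: it plays and is spatialised correctly on the desktop build, but is silent on the Quest build once the Meta audio library is involved. The script is illustrative rather than our actual project code – it assumes a recent FMOD for Unity integration (the EventReference style), and the names are placeholders; I’ll confirm our exact versions once our tech lead is back.

using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Minimal mono positional emitter (names are placeholders).
// Desktop build: plays and spatialises as expected.
// Quest 3 Android build: silent whenever the Meta audio library is enabled.
public class RoomEmitter : MonoBehaviour
{
    [SerializeField] private EventReference emitterEvent; // event with a single mono asset

    private EventInstance instance;

    private void Start()
    {
        instance = RuntimeManager.CreateInstance(emitterEvent);
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
        FMOD.RESULT result = instance.start();
        Debug.Log($"[Audio] emitter start: {result}"); // logging the result code to rule out a failed start
    }

    private void Update()
    {
        // Keep the 3D position up to date if the emitter moves.
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
    }

    private void OnDestroy()
    {
        instance.stop(STOP_MODE.ALLOWFADEOUT);
        instance.release();
    }
}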
Currently the example linked from the support pages is the Go Karting example, which is a standard FMOD/Unity implementation. What would be helpful is a link to an example showing the step-by-step stages for a dedicated Meta Quest 3 Android Unity build, with screenshots at each stage and troubleshooting steps for common issues such as no audio output. It wouldn’t have to be complicated (not a full game), just three different sound event examples: one mono positional sound emitter, one ambisonic environment (3D) and one non-ambisonic environment (3D). It would be good to include use of the Meta audio library for acoustic modelling and ray tracing as part of the example.
The reason I mention the need for guidance on non-ambisonic environments is that the Meta plug-ins don’t really account for this case – environments wouldn’t be modelled using the positional sound-source plug-in, and people might not want to use ambisonics for environments either, so the Meta ambisonic plug-in wouldn’t be used. That really leaves one option – using the FMOD spatialiser for non-ambisonic environments – so some guidance on what to do for environmental events that don’t use the Meta plug-ins would be helpful to ensure correct spatialisation for the Meta Quest 3.
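In case it’s useful to see what I mean, this is roughly the pattern we’d expect to fall back on for a non-ambisonic environment bed – a looping 3D event with the standard FMOD spatialiser on its master track (no Meta plug-ins), parked at the room centre with its min/max distance widened to cover the space. The event reference and distance values below are placeholders, not our actual set-up:

using UnityEngine;
using FMODUnity;
using FMOD.Studio;

// Non-ambisonic environment bed spatialised with the standard FMOD spatialiser
// rather than the Meta plug-ins. Distance values are illustrative.
public class RoomAmbienceBed : MonoBehaviour
{
    [SerializeField] private EventReference ambienceEvent;
    [SerializeField] private float minDistance = 1f;   // placeholder
    [SerializeField] private float maxDistance = 25f;  // sized to the room

    private EventInstance instance;

    private void Start()
    {
        instance = RuntimeManager.CreateInstance(ambienceEvent);

        // Override the event's min/max distance so the bed covers the whole room.
        instance.setProperty(EVENT_PROPERTY.MINIMUM_DISTANCE, minDistance);
        instance.setProperty(EVENT_PROPERTY.MAXIMUM_DISTANCE, maxDistance);

        // Park the bed at the room centre (this GameObject's position).
        instance.set3DAttributes(RuntimeUtils.To3DAttributes(transform));
        instance.start();
    }

    private void OnDestroy()
    {
        instance.stop(STOP_MODE.ALLOWFADEOUT);
        instance.release();
    }
}

If there’s a better-supported pattern for this on the Quest 3, we’d happily adopt it – that’s exactly the sort of thing it would be good to see covered in the example.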
We’ve also had intermittent issues when validating the FMOD projects: validation sometimes flags the Meta plug-ins as unexpected. I’m not sure whether this is a version issue, but it can happen when there’s nothing wrong with the FMOD set-up. I do think it would be helpful if the Meta plug-ins could be supported by FMOD in the way the Resonance plug-ins are (built into the software) – particularly given the place of Meta headsets in the marketplace.
Many thanks for your help,
Helen