In my rhythm-based Unity game, I want enemies to have percussive sound effects that are quantized to the BPM of the currently playing music event. I've experimented with the nested event solution discussed in the thread Syncing oneshot events perfectly on beat, but unfortunately:
- I have many music events (with different BPMs) and many SFX events. Adding hundreds of referenced SFX events to every music event, and wiring up hundreds of parameters, does work, but it quickly becomes an explosion of duplicated work and complexity as the project grows.
- For each SFX event, there are several enemies on screen that should emit that event with correct spatialization. I don't think I have fine control over the position of each nested event unless I add yet more parameters.
My dream solution would be some way for Unity to trigger an FMOD event, quantized to an already playing FMOD event, so I can keep my sound effects modular and loosely-coupled. I gather this isn’t straightforward, but is there any workaround, in FMOD or Unity?
For example, I was wondering if Unity's AudioSource.PlayScheduled() could work for SFX alongside FMOD music playback, just for the sound effects that need beat precision.
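To illustrate what I mean, here's a rough sketch of the quantization math I'd want to drive PlayScheduled() with. NextBeatTime is a hypothetical helper of my own, not an FMOD or Unity API; it assumes I know the DSP time at which the music started and its BPM:

```csharp
// Hypothetical helper: given when the music started (in DSP time) and its
// BPM, compute the next beat-aligned DSP time at which a one-shot SFX
// could be scheduled via AudioSource.PlayScheduled().
static double NextBeatTime(double musicStartDspTime, double bpm, double nowDspTime)
{
    double beatLength = 60.0 / bpm;                        // seconds per beat
    double elapsed = nowDspTime - musicStartDspTime;       // time since the music began
    double beatsPassed = System.Math.Ceiling(elapsed / beatLength);
    return musicStartDspTime + beatsPassed * beatLength;   // next beat boundary
}
```

Then in Unity I'd do something like `sfxSource.PlayScheduled(NextBeatTime(musicStart, 120.0, AudioSettings.dspTime));` — but that only works if I can reliably read the music start time and BPM out of FMOD, which is part of what I'm unsure about.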
Any advice would be appreciated!