Hi, I’m a student sound designer currently working on an indie project using Unity 6 and FMOD 2.03.
I’ve been teaching myself FMOD and technical sound design, and I’m trying to figure out the most professional way to build a haptic architecture.
Recently, I’ve been experimenting with Meta Haptics Studio to convert audio into haptic signals. My current workflow is to drag the generated haptic assets directly onto FMOD tracks (each on a separate track within the same event) so they trigger alongside the sound.
I have a few questions regarding this:
1 - Is this a viable professional workflow? Is it common to manage haptics as tracks within FMOD events, or is it generally better to handle haptic triggers on the engine (Unity) side?
2 - Platform Compatibility: Does this FMOD-track-based approach work reliably when porting to consoles like the PS5 (DualSense), or is it mostly optimized for Meta Quest/Android?
3 - Performance: Are there any known issues regarding synchronization or CPU overhead when running haptics through FMOD events like this?
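For reference, here’s the kind of engine-side alternative I have in mind, written as a toy Python sketch just to show the idea (the function names and the RMS-to-amplitude mapping are my own invention, not any FMOD or Meta API): derive a rumble amplitude from an audio clip’s short-time RMS envelope, which in Unity would then feed something like `Gamepad.current.SetMotorSpeeds()`.

```python
import math

# Toy engine-side haptics sketch: split a clip into short windows,
# take the RMS of each window, and clamp the result to the [0, 1]
# range a gamepad motor expects. Purely illustrative, not production code.

def rms_envelope(samples, window=4):
    """Return the RMS of each fixed-size window of samples."""
    env = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        env.append(math.sqrt(sum(s * s for s in chunk) / len(chunk)))
    return env

def to_motor_amplitudes(envelope, gain=1.0):
    """Map an RMS envelope to motor amplitudes clamped to [0, 1]."""
    return [min(1.0, max(0.0, gain * e)) for e in envelope]

if __name__ == "__main__":
    # A fake 16-sample "impact" clip: loud attack, quick decay.
    clip = [0.9, -0.8, 0.7, -0.6, 0.4, -0.3, 0.2, -0.2,
            0.1, -0.1, 0.05, -0.05, 0.02, -0.02, 0.01, -0.01]
    env = rms_envelope(clip)
    amps = to_motor_amplitudes(env, gain=1.2)
    print(amps)  # amplitudes decay along with the sound
```

Is something like this (fired from gameplay code next to the FMOD event call) what studios actually do, or do they keep it all inside FMOD?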
Since I’m learning this on my own, any advice on industry-standard haptic vs. sound architecture would be incredibly helpful. Thanks!