Separate sounds for multiple listeners (not split-screen)

Hi to all!

Can anyone give me some advice on how to add different, isolated sounds for 2 listeners?

I’m working on an old Resident Evil-style game with static cameras in rooms, and so on.
I want to create a system in which the camera that looks at the player hears his footsteps, while the listener on the player hears the sounds of the environment.
That is, I need to separate the sound groups and isolate them from the different listeners, but I still don’t quite understand how to do this, either in FMOD or in Unity by editing the listener script.

Thanks,
Tim

Hum, that’s interesting. So you’d like the environment sounds to be heard by a listener placed on the player, and the player sounds (footsteps) to be heard by a listener placed on the camera, is that right? Unfortunately, it doesn’t seem possible to mute some events or mixer groups on a specific listener; but I’d also like to know if there’s a workaround.
It may be possible to achieve this by faking the location of the player footsteps, though…

Yep, correct!

I’m thinking about tags and editing the listener script; maybe it’s possible to make the listener on the camera react only to sound coming from the player, using attenuation or events from animations.

I watched the GDC talk from the It Takes Two developers (although they used Wwise), where they separated the sounds for the two players on a split screen. It seemed interesting to me, so I’m trying to do the same in my project.

You can set an event instance to only be spatialized by certain listeners with Studio::EventInstance::setListenerMask.
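For reference, the listener mask is a bitfield in which bit N enables the listener with index N. Here’s a minimal sketch (plain C++, no FMOD headers; the helper name is mine, not part of the FMOD API) of how those mask values are composed before being passed to `setListenerMask`:

```cpp
#include <cstdint>
#include <initializer_list>

// Hypothetical helper (not part of the FMOD API): build a listener mask
// from a list of listener indices. Bit N of the mask corresponds to
// listener index N, so 0b01 means "listener 0 only" and 0b11 means
// "listeners 0 and 1".
std::uint32_t listenerMask(std::initializer_list<int> indices) {
    std::uint32_t mask = 0;
    for (int i : indices)
        mask |= 1u << i;
    return mask;
}

// In your scenario, assuming the player listener is index 0 and the
// camera listener is index 1, you would call something like:
//   footstepsInstance->setListenerMask(listenerMask({1}));  // camera only
//   ambienceInstance->setListenerMask(listenerMask({0}));   // player only
```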

That being said, there might be a better way to achieve the behavior you want… Depending on what that behavior is, of course.

Before I can talk about other ways of doing things, though, I need to talk a little about how listeners work.

It’s easy to think of a listener as a kind of “virtual microphone” in the game world, but that’s actually not what they are: they don’t pick up audio signals. In fact, they can’t process audio at all. It may seem counter-intuitive, but listeners don’t actually listen.

Instead, a listener is just a set of co-ordinates that describes a point in space. The FMOD Engine passes these co-ordinates to each event instance in your game, and each event instance uses the co-ordinates to calculate how its effects and parameters should affect that instance’s output.
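To make that concrete, here’s a rough sketch (plain C++ with no FMOD types, all names mine) of the kind of calculation an event instance performs with the listener co-ordinates it receives, using distance to drive attenuation:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// The distance between the listener position and the event position --
// the value a spatializer's distance attenuation is based on.
float distanceTo(const Vec3& listener, const Vec3& event) {
    float dx = event.x - listener.x;
    float dy = event.y - listener.y;
    float dz = event.z - listener.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// A simple linear rolloff: full volume at or inside minDist, silent at
// or beyond maxDist. (FMOD's spatializer offers several curve shapes;
// this is just the general idea.)
float linearGain(float distance, float minDist, float maxDist) {
    if (distance <= minDist) return 1.0f;
    if (distance >= maxDist) return 0.0f;
    return 1.0f - (distance - minDist) / (maxDist - minDist);
}
```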


So, now that we’re on the same page… What do you actually want to achieve by having different events use different listeners?

I ask because there are two different points in space you want to be important (the avatar’s head and the camera), and two different processes involved in spatialization (attenuation and panning). While it’s occasionally useful to use different attenuation for different events, it’s extremely rare to want to use different panning, especially when the positions used for panning are in close proximity.
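To illustrate why, here’s a sketch (plain C++, invented names, no FMOD types) of the direction calculation panning is based on. Note that the result depends only on direction, never on distance:

```cpp
#include <cmath>

struct Vec2 { float x, z; };  // horizontal plane

// Panning depends on direction, not distance: the signed angle between
// the listener's forward axis and the direction to the sound determines
// how far left or right the sound sits in the speaker field.
float panAngle(Vec2 listenerPos, Vec2 listenerForward, Vec2 soundPos) {
    Vec2 toSound{soundPos.x - listenerPos.x, soundPos.z - listenerPos.z};
    // atan2 of the 2D cross and dot products gives the signed angle.
    float cross = listenerForward.x * toSound.z - listenerForward.z * toSound.x;
    float dot   = listenerForward.x * toSound.x + listenerForward.z * toSound.z;
    return std::atan2(cross, dot);
}
```

A sound straight ahead pans to the centre whether it is 1 unit or 100 units away; that’s why two panning positions in close proximity (avatar’s head vs. camera) produce nearly the same pan, while the same two positions can produce very different attenuation.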


I am looking to achieve a similar effect for a top-down 2D game. I want certain sounds’ attenuation to be affected by our default listener (positioned at the centre of the screen), and some other sounds to be affected by their distance from the player’s character, which can be anywhere in the 2D space.

What would be the best approach to tackle this problem?

Off the top of my head, if you set the distance attenuation mode of a spatializer effect to “off,” that spatializer effect will no longer attenuate the signal based on the event’s distance from the listener. This can be used in conjunction with a gain effect whose property is automated in order to create custom attenuation based on anything you choose, including (for example) the distance between the event instance and an arbitrary point calculated by your game’s code.
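As a concrete sketch of what such a custom curve might look like (plain C++ with illustrative names; in the actual event you’d draw this shape as automation on the gain effect, driven by a parameter your game code updates each frame, e.g. via Studio::EventInstance::setParameterByName):

```cpp
#include <algorithm>

// A hypothetical custom attenuation curve: 0 dB of gain inside minDist,
// fading linearly down to silenceDb at maxDist. The "distance" here can
// be measured from any point your game code chooses -- e.g. the player
// character rather than the listener.
float customAttenuationDb(float distance, float minDist, float maxDist,
                          float silenceDb = -80.0f) {
    float t = (distance - minDist) / (maxDist - minDist);
    t = std::clamp(t, 0.0f, 1.0f);  // requires C++17
    return t * silenceDb;
}
```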

It is of course possible to apply this custom automation to a track’s fader volume instead of to a gain effect, but using a gain effect means that the gain effect can be made a preset effect (or part of a preset effect chain along with the spatializer effect) and used in multiple events.