Why does the scatter even spatialise when no spatialisation insert is present?

I’ve tried both. Yeah, that’s what I figured too.

I want the scattered sounds to be placed in the 3D world, so that I can hear which way they’re coming from if I turn around. By vague I just mean that I’d only be able to pinpoint their location to a general direction within, say, 120 degrees, and not within 10 degrees. That’s what the “minExtent” knob seems to do, right? And, with the min distance, I was after having no sounds spawn so close that I would pass through them when I turn around.

I don’t have an advanced setup, and I can start over: I’ll just tell you the scenario, and you’ll probably know the best way to do it: I’m trying to scatter bird sounds in the distance around the player. (Close sounds are made separately by actual bird objects.) If scattered sounds spawned close too, they might spawn in empty air, with no bird mesh in sight…

I probably spawn it the wrong way too. If I have an emitter stuck to my player, the spatialization is lost, and if I play the event as 2D, it works - but if I walk away from the location, the sounds don’t spawn around my new location…

Your understanding is correct: The extent of a spatialized signal determines how broadly the signal is spread over multiple speakers when it is a distance away from the listener, and min extent determines the lowest possible value that extent can have.
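
If it helps to picture it, here’s a toy model of that relationship (plain Python, not FMOD’s actual formula; deriving extent from a “sound size” radius and the function name are just illustrative):

```python
import math

def effective_extent(distance, sound_size, min_extent):
    # Toy model: treat extent as the angle (in degrees) subtended by a
    # source of radius sound_size at the given distance, floored at
    # min_extent. Not FMOD's exact formula.
    if distance <= 0.0:
        return 360.0  # listener is inside the source: fully surrounded
    extent = math.degrees(2.0 * math.atan2(sound_size, distance))
    return max(extent, min_extent)
```

So a distant source collapses toward a point, but min extent stops it from ever narrowing past the floor you set.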

Hmm… Besides extent, how are you spatializing the signal of each individual spawn? The primary means by which spatialization is achieved are:

  • Attenuating the sound based on the emitter’s distance from the listener, such that more distant sounds are quieter and thus sound like they’re coming from further away.
  • Panning the sound based on its position relative to the listener, such that sounds seem to come from the direction of the emitter as heard from the listener’s position.
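
The two bullet points above can be sketched in a few lines (plain Python, a deliberately simplified model with a linear attenuation curve and 2D left/right panning; the function name is made up, and FMOD’s real curves are configurable):

```python
import math

def spatialize(listener_pos, emitter_pos, min_dist, max_dist):
    # Positions are (x, z) tuples. Returns (gain, pan).
    dx = emitter_pos[0] - listener_pos[0]
    dz = emitter_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dz)
    # Linear attenuation: full volume inside min_dist, silent at max_dist.
    if dist <= min_dist:
        gain = 1.0
    elif dist >= max_dist:
        gain = 0.0
    else:
        gain = 1.0 - (dist - min_dist) / (max_dist - min_dist)
    # Pan from the horizontal direction: -1 = full left, +1 = full right.
    pan = 0.0 if dist == 0.0 else dx / dist
    return gain, pan
```

Both outputs depend only on where the emitter is relative to the listener, which is the key property: move the listener and the result changes.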

Can I assume you have taken steps to ensure that these changes are applied to the sounds spawned by your scatterer instrument - perhaps by adding spatializer effects to the events referenced by the event instruments in your scatterer instrument’s playlist?

Wait, you mean the scatter instrument doesn’t do that itself? It only provides virtual locations? And then the attenuation and/or panning needs to be applied by you?

I have got one event with a scatterer instrument that has nested events with spatialization applied, and that one seems to work the way I want, so that is the required setup then?

Those virtual locations, though. I want them to keep spawning around my player (follow player), but not move once spawned (not follow player), or, even worse, rotate with the player once spawned. What setup would allow for that?

As you surmise, the scatterer instrument doesn’t normally spatialize the sounds it spawns; it only generates random positional offsets, and passes them to the sounds it spawns. (As a convenience feature, it does add a hidden spatializer effect with default settings to single instruments in its playlist, but this benefit only applies to single instruments and has some limitations that prevent it from being useful in your use-case.)
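
For intuition, the offset generation might be sketched like this (plain Python; illustrative only, not FMOD’s actual sampling code):

```python
import math
import random

def scatter_offset(min_distance, max_distance):
    # Sample a random 2D offset in a ring between min_distance and
    # max_distance, the way a scatterer instrument (conceptually)
    # offsets each spawned sound from its parent event instance.
    angle = random.uniform(0.0, 2.0 * math.pi)
    # Square root keeps the sampling uniform over the ring's area
    # rather than clustering spawns near the center.
    r = math.sqrt(random.uniform(min_distance ** 2, max_distance ** 2))
    return (r * math.cos(angle), r * math.sin(angle))
```

The scatterer hands each spawned sound a position like this; what the sound then *does* with that position (attenuate, pan) is up to the effects on the sound itself.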

For playlist entries that are event instruments, the easiest way to spatialize them is to put a spatializer effect on the referenced event’s master track; for other instrument types, the easiest way is to wrap the instrument in a nested event and put a spatializer effect on the referenced event’s master track.

That should work, provided the nested events contain spatializing behavior of some kind. Spatializer effects on their master tracks would do it.

Replacing the event instruments in the scatterer instrument’s playlist with “start event” command instruments would do it. Event instances spawned by command instruments inherit the parent event’s position (with the offset generated by the scatterer instrument) when they’re created, but do not move with the parent event thereafter.

Note that you can’t target nested events with a “start event” command instrument, so you’ll have to convert your nested events into non-nested events if you want to do this. To convert a nested event into a non-nested event, select the nested event in the events browser and drag it to any location outside of its parent event.
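
To make the “inherits position at spawn time, but doesn’t follow afterwards” distinction concrete, here’s a toy model (plain Python; the class and method names are made up, not FMOD API):

```python
class SpawnedSound:
    def __init__(self, parent_pos, offset, follows_parent):
        self.offset = offset
        self.follows_parent = follows_parent
        # Command-instrument-style spawns snapshot their world position
        # once, at creation time.
        self.fixed_pos = tuple(p + o for p, o in zip(parent_pos, offset))

    def position(self, parent_pos_now):
        if self.follows_parent:
            # Event-instrument-style spawn: keeps moving with the parent.
            return tuple(p + o for p, o in zip(parent_pos_now, self.offset))
        return self.fixed_pos
```

With `follows_parent=False`, the spawn stays where it was created even as the parent (attached to the player) moves on, which is exactly the “spawn around me, then stay put” behavior.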


Many thanks for the answer!
I should be able to get that working now. It’s more steps to do a (in my mind basic) thing than I expected from my first experience with FMOD. But it’s cool that nesting allows you to make more complex things too.

It’s because there’s no One True Way to spatialize a sound. The spatializer effect suits many use-cases, but “most” isn’t “all,” so we can’t assume that everyone wants their scatterer instruments’ spawned sounds to be spatialized that way.

While this thread was very helpful to understand a few concepts, I still can’t get my Scatterer Instrument working the way I want to.

I’m working on a 3D isometric game. I have a 2D Ambience event that plays a bed loop and another track with a Scatterer Instrument. This Scatterer Instrument has an Event Instrument in its playlist referencing a 2D event with the sounds I want to play.

At first I tried the solution I used for the 2D games I worked on: randomizing the panning of the referenced event. While that works for spawning the sounds in different positions, I want the spawns to stay in place as the player moves around them, so that the direction they’re heard from changes.

I tried adding a Spatializer to the referenced event like it was discussed here but as soon as I do that, the sounds stop playing. If I toggle the Spatializer off I can hear the sounds again.

I tried adding the Command Instrument like Joseph mentioned, but same issue.

What am I missing? Is it something with the referenced event’s Min & Max Distances? The Scatterer Instrument Min & Max Distances? Both?

Managed to make the sounds audible by tweaking the distances a bit more.

But the spawned sounds don’t change as the player moves in the world; they sound in the same place in the panorama from start to finish. So they effectively work like the 2D event with the randomized panning solution.

I tried using the Command Instrument and “play event” but the same thing happens, and weirdly enough, the sounds all play in the right speaker now.

Any way to make them spawn randomly in the world, with the player hearing the change in direction while they move around them? Or does this have to be done in code?

Where is the event instance that contains the scatterer instrument relative to the listener? I ask because this information is key to how the scatterer instrument works.

The scatterer instrument provides a random offset to each sound that it spawns, “scattering” them randomly in an area centered on the event instance that contains the scatterer instrument. Thus, if the event instance containing the scatterer instrument is in the same location as the player, it will scatter its spawned sounds in an area centered on the player; whereas if the event instance containing the scatterer instrument is at a fixed position in the game world, it will scatter its spawned sounds around that position in the world.

The 3D positions generated by the scatterer are used to determine the final positions of the spawned sounds for the purposes of those spawned sounds’ position-dependent behavior (such as spatialization).

As you have observed, this will not work. This is because the randomly-generated panning adjustments you are applying to the nested event instances are not based on the positions of those nested event instances relative to the listener in 3D space, and so do not change when the position of the nested events changes relative to the listener.

If you want the panning and attenuation applied to a sound to change when the listener moves, the amount of panning and attenuation applied to that sound must be dependent on the position of the event instance relative to the listener. To ensure the panning and attenuation applied to the event instance is dependent on the position of the event instance relative to the listener, it must either be applied by a spatializing effect or be automated on a 3D parameter.

This is probably because the event instance containing the scatterer instrument is too far away from the listener.

The spatializer effect attenuates the volume of an event instance depending on the distance between the event instance and the listener: The further the distance, the more the volume is attenuated. If the distance between the listener and the event instance is greater than the max distance property of the event, that event instance is attenuated all the way to silence, to simulate the sound being too distant to hear.

Check the max distance property of the event (or, if the spatializer effect is set to override the event’s max distance, check the max distance of the spatializer effect), and the distance between the event emitter and the listener in your game world. If you want at least some of the sounds spawned by the scatterer instrument to be audible, the distance between the listener and the event emitter must be less than the sum of the spatializer’s max distance and the scatterer’s max distance; and if you want all of the sounds spawned by the scatterer instrument to be audible, the spatializer’s max distance must be greater than the sum of the scatterer’s max distance and the distance between the listener and the emitter.
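
Those two conditions can be written down directly (plain Python; the function names are just for illustration):

```python
def any_spawn_audible(listener_to_emitter, spatializer_max, scatterer_max):
    # The closest possible spawn sits up to scatterer_max nearer than the
    # emitter, so some spawn can be heard if the listener is within the
    # sum of the two max distances.
    return listener_to_emitter < spatializer_max + scatterer_max

def all_spawns_audible(listener_to_emitter, spatializer_max, scatterer_max):
    # The farthest possible spawn sits scatterer_max beyond the emitter,
    # and even that one must fall inside the spatializer's max distance.
    return spatializer_max > listener_to_emitter + scatterer_max
```

With an emitter 1.18k units away, for example, a typical default max distance of a few dozen units fails both checks, which matches the “silent until I raised the distances” symptom.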

The purpose of using a command instrument instead of an event instrument in a scatterer instrument’s playlist is to allow the sounds spawned by a scatterer instrument to remain where they are even when the event instance containing the scatterer instrument moves. Replacing the event instrument with a command instrument does not otherwise change the position of the spawned sounds, and so will not cause sounds that are too far away to hear to become audible.

If this is the case, and the event instances spawned by the scatterer instrument contain spatializing effects, it is likely that at least one of the following is true:

  • The event instance containing the scatterer instrument is moving in sync with the listener. This is likely if they are both attached to the camera. (If this is the case, then using command instruments should solve the problem.)
  • The listener or event instance’s 3D attributes are not being updated every frame. This can happen if the listener is not being attached to the camera, or if it is being attached to the camera but the listener or event instance’s 3D attributes are not being updated by your game’s code when they move.

Of course. That’s exactly what the scatterer instrument is for. We just need to identify what is preventing it from working correctly in this case.

First of all, thank you so much for the quick and very detailed reply! And sorry I couldn’t reply sooner.

Where is the event instance that contains the scatterer instrument relative to the listener?

So, this event is 2D (no spatializer) with 3 tracks: One consisting of just a single instrument loop and the other two containing the scatterer instruments (with referenced events containing spatializer effects inside them).
I just checked via profiler and this main event seems to be at a distance of about 1.18k from the listener.

I managed to tweak the distances in the scatterer and spatializer on the referenced events and the sounds are audible now.

I still need to make them change attenuation related to the player though.

  • The listener or event instance’s 3D attributes are not being updated every frame. This can happen if the listener is not being attached to the camera, or if it is being attached to the camera but the listener or event instance’s 3D attributes are not being updated by your game’s code when they move.

This might be it, maybe? Our game is isometric, so the Listener is not attached to the Camera. It is placed above the center of the screen.

Attaching the main 2D event to the camera should solve it?

Can the player “scroll” the view in order to view different parts of the map? (Moving their avatar within the game world counts, if the view follows them around as they move.) If so, when the view scrolls to show a different part of the map, does the listener move to remain above the center of the screen, or does it remain above the section of the map it started at while the view moves away?

I ask because the movement of the listener relative to the event emitter is what’s used to achieve the change in attenuation that you want. If you want the attenuation of the event to change as the player or camera moves around, the listener must move around as the player or camera does. The listener does not necessarily need to be at the exact same location or attached to the same game object as the camera, but it does need to move around in the same way.
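
In code terms, the per-frame update might look like this (plain Python sketch; in a real game you’d feed the result to your audio engine’s set-3D-attributes call every frame, and the names here are illustrative):

```python
def listener_position(camera_pos, offset):
    # Recompute the listener from the camera every frame, keeping the
    # same fixed offset (e.g. "above the center of the screen"). As long
    # as this runs each frame, the listener scrolls with the view, so the
    # attenuation and panning of world-positioned sounds update with it.
    return tuple(c + o for c, o in zip(camera_pos, offset))
```

The point is only that the listener’s position is derived from the camera’s each frame, rather than set once at startup.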