Hi there everyone.
I hope this is not too much of a newbie question, but: I'd like to create two separate atmospheres that would not change when the player moves or rotates the camera, but would slowly crossfade into each other when the player reaches a certain height in the level. Is this possible? I am just slowly getting into FMOD and I am no programmer, so the easier the solution, the better.

I know how to work with a 3D timeline event, and I understand how the distance parameter attenuates other parameters by creating envelopes, etc. But if I understand it correctly, you simply cannot "walk away" from a 2D stereo atmosphere that loops all over the place and is not attached to any specific object. I know how I would crossfade several sounds/effects depending on my distance IF they were 3D events. But what about 2D? How do I tell FMOD/Unreal Engine to start fading in another 2D event as soon as I reach a certain height (for example, the top floor of a room, or the top of a hill)? Or is it something I set up in Unreal Engine (like creating some kind of box representing an area, which would trigger another event and slowly fade out the first one)?
I hope I wrote it understandably; sorry if my English is a little chaotic.
Hi, the best way IMO would be to send height as a parameter of the 2D event instance and set up the desired automation on that parameter in FMOD (as you said, 3D events aren't made to use only one of the three coordinates).
If your two ambiences don't have any timing relationship, you could instead easily use UE's built-in sound engine, but since you're using FMOD, stick with it.
As Alcibiade says, a sensible option would be to automate the volumes of your two atmospheres on a game parameter that represents in-game altitude.
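To make the suggestion concrete, here is a minimal sketch (plain C++; the parameter name, fade bounds, and function names are my own assumptions, not anything from FMOD) of the math those two volume automation curves perform: one normalized "Altitude" parameter drives an equal-sum crossfade between the two atmospheres.

```cpp
#include <algorithm>  // std::clamp (C++17)

// Gains for the two atmosphere tracks. In FMOD Studio these would be the
// volume automation curves drawn on each track, driven by one parameter;
// this is just the behaviour in code form.
struct CrossfadeGains {
    float low;   // e.g. ground-level atmosphere
    float high;  // e.g. top-floor / hilltop atmosphere
};

// altitude01 is the game parameter, normalized to 0..1.
// fadeStart/fadeEnd (assumed values) are where the crossfade begins and ends.
CrossfadeGains GainsForAltitude(float altitude01,
                                float fadeStart = 0.4f,
                                float fadeEnd   = 0.6f) {
    float t = std::clamp((altitude01 - fadeStart) / (fadeEnd - fadeStart),
                         0.0f, 1.0f);
    return {1.0f - t, t};  // linear, equal-sum crossfade
}
```

In FMOD Studio you would draw these as two mirrored volume curves over the same parameter; the code is only here to show what shape to draw.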
Thank you guys, I guess I understand what you mean. But yeah, I've spent one whole day looking for a parameter that would actually give the player character's altitude, with no luck. I am level zero in Unreal; I thought it was going to be something simple, like "player location Z" or something like that. But there is nothing like that anywhere, and all the forums I checked say something about ray-tracing stuff, which is way beyond my knowledge, or complicated event graph combinations, which read like an alien language to me…
It’s not that complicated. I’m gonna help.
Your FMOD event should look like this:
In Unreal, there are several ways to do it, but here's mine. I've implemented it in the GameMode blueprint, to which I've added an FMOD component (the music event).
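For anyone who prefers reading code to Blueprint screenshots, the per-tick logic boils down to something like this sketch (plain C++; `minZ`/`maxZ` are hypothetical level-specific bounds you would tune, and the result is what you would feed to the FMOD component's Set Parameter node):

```cpp
#include <algorithm>  // std::clamp (C++17)

// Map the player's world Z to a normalized 0..1 "Altitude" parameter.
// minZ/maxZ are assumptions: the lowest and highest points of your level.
// Each tick, the result would be passed to SetParameter("Altitude", value)
// on the FMOD component; that engine call is omitted here.
float NormalizeAltitude(float playerZ, float minZ, float maxZ) {
    if (maxZ <= minZ) return 0.0f;  // guard against degenerate bounds
    return std::clamp((playerZ - minZ) / (maxZ - minZ), 0.0f, 1.0f);
}
```

Normalizing to 0..1 keeps the FMOD parameter range independent of the level's actual dimensions, so the same event works in differently sized maps.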
It works perfectly.
By the way, could the staff confirm: since the Blueprint node "Set Parameter" asks for the parameter name, I presume it corresponds to the API's "set parameter by name". So for a parameter that is updated continuously every tick (as it is here), it would be less optimized than calling "set parameter by ID" in C++. Is that right?
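In outline, the caching pattern being asked about looks like this. In the real Studio API you would fetch the `FMOD_STUDIO_PARAMETER_ID` once via `Studio::EventDescription::getParameterDescriptionByName` and then call `Studio::EventInstance::setParameterByID` each tick; the sketch below stands in for those calls with a hypothetical `FakeEvent` (not real FMOD API) purely to show where the one-time lookup goes:

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>

// Hypothetical stand-in for an FMOD event instance. The string-keyed map
// mimics the by-name lookup; nameLookups counts how often we pay for it.
struct FakeEvent {
    std::unordered_map<std::string, uint32_t> paramIds{{"Altitude", 7}};
    int nameLookups = 0;

    uint32_t GetParameterIdByName(const std::string& name) {
        ++nameLookups;  // this is the cost we want to pay only once
        return paramIds.at(name);
    }
    void SetParameterById(uint32_t /*id*/, float /*value*/) {}
};

// Resolve the ID once, outside the per-tick loop, then reuse it.
int TickManyTimes(FakeEvent& ev, int ticks) {
    const uint32_t altitudeId = ev.GetParameterIdByName("Altitude");  // once
    for (int i = 0; i < ticks; ++i)
        ev.SetParameterById(altitudeId, 0.5f);  // cheap per-tick call
    return ev.nameLookups;
}
```

The point is simply that the string lookup happens once at setup instead of once per tick; whether the difference matters in practice depends on how many parameters you update per frame.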
Thank you so much, this seems like the way to go for me, but I have a problem with this solution. When I set up the "GetActorTransform" node, all I can see there is "Target" and "Return Value", but no values like X, Y, Z; there is simply nothing like that. What am I doing wrong?
You have to right-click on "Return Value" and choose "Split Struct Pin". Then also split "Location".
In fact, a more optimized version would be something like this. The parameter isn't updated every tick anymore, but at a rate you consider sufficient. It's less reactive, but that could be enough for your use case. You should then set an appropriate seek speed in FMOD for that parameter.
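A minimal sketch of that rate-limiting idea (plain C++; the struct and interval are illustrative, not from the Blueprint shown): accumulate delta time each tick and only push the parameter once per interval, letting FMOD's seek speed smooth between the sparser updates.

```cpp
// Throttle per-tick work to a fixed rate. In Unreal this state would live
// in your GameMode; when ShouldUpdate returns true, you would call the
// FMOD component's SetParameter (omitted here).
struct ThrottledUpdater {
    float interval;            // seconds between parameter pushes, e.g. 0.25f
    float accumulated = 0.0f;  // time since the last push

    bool ShouldUpdate(float deltaSeconds) {
        accumulated += deltaSeconds;
        if (accumulated >= interval) {
            accumulated -= interval;  // keep the remainder for accuracy
            return true;
        }
        return false;
    }
};
```

Subtracting the interval (rather than resetting to zero) keeps the long-run update rate accurate even when tick lengths don't divide evenly into it.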
Our colleagues here have proposed the best way to approach this. Driving the logic with a game parameter is always a great choice, as it gives you more control overall.
@Alcibiade nice logic there and thanks for sharing.
Allow me to share a tweak on Alcibiade's solution that makes it compatible with varied terrain. You don't need it if your world is kinda flat, but it can save you hours of sound design if your world has diverse terrain.
When I was audio lead for an MMORPG, we had the same problem to solve that @Tomas_F is facing. The thing is that if you check only the height relative to the world (only Z), you don't get the difference from the terrain, which may or may not be OK, depending on how you populate the ambience of your world.
If we implement a simple vertical downward ray cast from the player (listener), filtered so it only hits the terrain (by object type or tag), the distance from the player to the hit point gives us the "real" altitude relative to the game level (equivalently, subtract the hit point's Z from the player's Z). That way the ambience can change to "Rooftop" only when the player is actually on a rooftop, even if that rooftop is on a building that sits on a hill; and if the player runs up a hill, the altitude stays zero instead of entering a "Rooftop" zone. It's not complicated, and it lets your algorithm work in games with any terrain.
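The arithmetic behind that ray cast is tiny; here is a sketch (illustrative names, assuming the ray is filtered so it only hits terrain-tagged geometry and ignores buildings):

```cpp
// The downward ray, filtered to terrain-tagged geometry, hits the ground
// at terrainHitZ. The player's altitude relative to the terrain is simply
// the difference: a player standing on a hill gets ~0, while a player on
// a rooftop gets roughly the building's height, regardless of the hill's
// own elevation.
float AltitudeAboveTerrain(float playerZ, float terrainHitZ) {
    return playerZ - terrainHitZ;
}
```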
You can also expand the logic by using both the player's altitude_from_terrain and altitude_from_sea_level (probably the full Z value, depending on how you set up your levels). That way you can also create a second ambience layer to use together with the primary one.
The primary layer will contain anthropophonic sounds like traffic, walla, etc., which change according to the altitude_from_terrain parameter, and the second layer will contain geophony and biophony like strong winds, soaring eagles, etc., which change according to altitude_from_sea_level.
That way you will continue to hear different ambience from street level to rooftop level, but if the rooftop is located on a mountain, you will also hear stronger winds and soaring eagles.
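As an illustration of the two-layer idea, here is a sketch in code; all thresholds, labels, and names below are made up for the example, not taken from any real project:

```cpp
#include <string>

// Two independent ambience layers driven by two altitude parameters:
// the primary (anthropophonic) layer follows altitude above the terrain,
// the secondary (geophonic/biophonic) layer follows absolute altitude.
struct AmbienceChoice {
    std::string primary;    // driven by altitude_from_terrain
    std::string secondary;  // driven by altitude_from_sea_level
};

AmbienceChoice ChooseAmbience(float altFromTerrain, float altFromSeaLevel) {
    AmbienceChoice c;
    c.primary   = (altFromTerrain  > 1000.0f) ? "Rooftop"     : "Street";
    c.secondary = (altFromSeaLevel > 5000.0f) ? "StrongWinds" : "Calm";
    return c;
}
```

So a rooftop at sea level gives "Rooftop" + "Calm", while the same rooftop on a mountain gives "Rooftop" + "StrongWinds"; in practice you would crossfade on the parameters rather than switch on hard thresholds.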
Whatever solution you implement, don't forget to put a timed transition between the different altitude ambience loops, to achieve smoother playback if your player is free-falling.
Kind of strange example but you get the picture.