How to go about spatializing points in Ambisonics in response to a 3D virtual space in Unreal Engine.

Bryan Barrientos

Hello!

I am doing a school project that uses Max in conjunction with incoming Unreal Engine 4 OSC data. I'm hoping someone familiar with this topic can point me toward research or established math formulas where x, y, z and 360-degree rotational data have been used with ambisonics externals.

For example, recreating the position of an in-game object, including its distance, with ambisonics using pak aed can be done relatively easily along a single axis, let's say the x-axis. If the object is placed at 100 on the x-axis in the virtual space, and the player's position is received as they approach the object from the starting point (0-100), that interaction can be scaled to a range of 1 to 0 and made to work with ambisonics that way.
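Just to make that one-axis example concrete, the mapping I have in mind looks roughly like this (a quick Python sketch of the logic rather than my actual Max patch; the object position and the 0-100 range are just the placeholder values from the example above):

```python
OBJECT_X = 100.0  # the object's fixed position on the x-axis, as in the example

def closeness(player_x: float) -> float:
    """Scale the player-to-object distance on the x-axis into a 1..0 value."""
    distance = abs(OBJECT_X - player_x)              # raw distance in game units
    return max(0.0, min(1.0, distance / OBJECT_X))   # 1 at the start, 0 at the object

print(closeness(0.0), closeness(50.0), closeness(100.0))  # 1.0 0.5 0.0
```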

My issue is that I am generally not familiar with these sorts of interactions, as there are many between the way the player's camera is facing and their movement along the x and y axes. If anyone can point me in the right direction so that I can make these objects reflect their fixed positions in space, that would be very helpful.

Roman Thilenius


I assume the task was given to you to program it yourself?

Because otherwise there are Max objects and VST plugins that do B-format panning right out of the box.

If head tracking is all you need, a simple stereo signal and two instances of these processors will already do it.

For positioning many different virtual sound sources there are a few other technologies besides B-format panning, depending on the speaker setup, the output format, and what the sound sources look like.

Bryan Barrientos

Hi Roman,

Thank you for your response.

No, I don't necessarily have to program it from scratch. Any object or VST would be very welcome!

At the moment I have been attempting it with the ICST Ambisonics package. While it has a mode (aed) that uses azimuth, elevation, and distance, I am having trouble figuring out how to achieve my desired result.

I used the player's rotation, which I believe could be considered head tracking, as the value for the azimuth inlet of the package's aed format, and used a distance function on the incoming x and y coordinates for the player to drive the distance inlet. In combination with my scaling, which reflects the object's position in Unreal Engine's game space, this creates an effect where, as I walk towards my object, the ambimonitor in Max reflects it and my sound source comes towards me (or moves away). It also reflects my in-game camera rotation correctly... except for when it doesn't, which is when I walk to the side of the object in the game. When I do that, the rotation is thrown off by a 90-degree offset and thrown off further as I keep walking.

In a sense, the ambimonitor and my current conversion have achieved essentially the reverse of what I want. It's as if my player has become the sound source in the interaction, because it reflects my movements perfectly, with the object in the game (not the player) sitting dead center in the ambimonitor.
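To spell out what I'm doing, my conversion amounts to roughly this (a simplified Python sketch of the logic, not my actual patch; the object coordinates are placeholders):

```python
import math

OBJECT_X, OBJECT_Y = 100.0, 0.0  # placeholder object position in game space

def current_mapping(player_x, player_y, player_yaw_deg):
    """Roughly what I feed the aed inlets right now."""
    dx = OBJECT_X - player_x
    dy = OBJECT_Y - player_y
    distance = math.hypot(dx, dy)  # player-to-object distance -> d inlet
    azimuth = player_yaw_deg       # camera yaw used directly  -> a inlet
    elevation = 0.0                # ignoring z for now        -> e inlet
    return azimuth, elevation, distance
```

Since the azimuth only ever sees the camera yaw and never the direction from the player to the object, I suspect that is where the offset creeps in when I walk off to the side.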

If by any chance you know of tools that make this simpler, or anything really, that would be great, as I've yet to fully explore other ambisonic solutions. I might attempt to use MASI if I can figure it out, although it seems more suited to the Unity engine. Also, I believe this might just be a question of correctly mapping things in a virtual space, so I apologize, as my intention is not to ask someone to teach me math! I believe that the way I'm using my OSC messages and data is probably my main issue here.
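In case it helps anyone spot my mistake, my current guess is that the azimuth should be the object's bearing relative to the camera rather than the camera yaw on its own, something like the sketch below. The sign and axis conventions for UE4's yaw and the ICST aed inlets are assumptions I still need to verify against the real data.

```python
import math

def relative_aed(player_x, player_y, player_yaw_deg, obj_x, obj_y):
    """Guess at the mapping: the object's bearing in the camera's frame.

    Axis and sign conventions are assumptions and need checking against
    the actual UE4/OSC data and the ICST aed format.
    """
    dx = obj_x - player_x
    dy = obj_y - player_y
    distance = math.hypot(dx, dy)                     # player-to-object distance
    bearing_deg = math.degrees(math.atan2(dy, dx))    # world-space angle to the object
    azimuth = (bearing_deg - player_yaw_deg) % 360.0  # rotate into the camera's frame
    return azimuth, 0.0, distance                     # elevation still ignored
```

If that's right, walking to the side of the object should swing the azimuth smoothly instead of producing a fixed 90-degree offset, but I haven't tested it yet.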

Roman Thilenius


"except for when it doesn't, which is when I walk to the side of the object in the game."

You mean when you move as close as possible? Maybe the distance is -0 then, or the range is folded, or something like that.

Always monitor the data you get from somewhere into Max using number boxes, print, whatever. In my experience one often assumes something is 7 or 9 when in fact it is not. The input should always be checked before you start repairing your code. :)
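Or, if you want to see the raw stream before it even reaches Max, a few lines of Python will dump every incoming message; this assumes the data arrives as plain UDP OSC, and the port below is just an example.

```python
# Quick OSC dump for sanity-checking the incoming values (requires python-osc).
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

dispatcher = Dispatcher()
dispatcher.set_default_handler(lambda address, *args: print(address, args))

server = BlockingOSCUDPServer(("127.0.0.1", 7400), dispatcher)  # example port
server.serve_forever()  # prints every incoming message until interrupted
```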

Roman Thilenius


By the way, if the audio sources are mono you could just replace B-format panning with something else; B-format's power is in rotating sources that are not point sources. On the other hand, if it works already, then don't replace it. :)