FFT-based binaural panner

The FFT-based binaural panner is not an external, but rather a project that consists of a couple of Max patches and 43 pairs of HRTF subject files to be read by jit.matrix.

It was created in 2011 as part of a thesis on the possibilities of binaural synthesis combined with head-tracking, in an aesthetic, experience-oriented context. In the thesis project, the FFT-based binaural panner is a small but essential part of a larger construction in Max involving sensor-based head-tracking, room simulation, configurable sound and listener positions, etc.

The project is inspired by other Max projects such as Head in Space (http://sites.google.com/site/dariopizzamiglio/projects/head-in-space) and Binaural Tools (http://www.ece.ucdavis.edu/binaural/binaural_tools.html). The FFT-based binaural panner was created to reduce the substantial CPU load that comes with convolution in the time domain. It uses pfft~ to perform an equivalent but 'cheaper' multiplication in the frequency domain.
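The saving rests on the convolution theorem: multiplying two spectra bin by bin is equivalent to convolving the corresponding signals in the time domain, but far cheaper for long impulse responses. A toy sketch in plain Python (naive DFT, made-up signals; this is only an illustration of the identity, not the patch's actual logic):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2)); fine for a demo."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning real parts (the inputs here are real signals)."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def circular_convolve(x, h):
    """Direct circular convolution in the time domain."""
    n = len(x)
    return [sum(x[m] * h[(t - m) % n] for m in range(n)) for t in range(n)]

# A toy signal and a toy "HRTF" impulse response, zero-padded to equal length.
x = [1.0, 0.5, -0.25, 0.0, 0.0, 0.0, 0.0, 0.0]
h = [0.8, 0.3, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0]

# Time-domain convolution vs. frequency-domain multiplication:
time_result = circular_convolve(x, h)
freq_result = idft([X * H for X, H in zip(dft(x), dft(h))])

# Both paths give the same samples (up to rounding error).
assert all(abs(a - b) < 1e-9 for a, b in zip(time_result, freq_result))
```

With a fast FFT (as pfft~ uses) the frequency-domain path costs O(n log n) per block instead of O(n^2), which is where the CPU saving comes from.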

The panner uses the impulse responses from the CIPIC HRTF database (http://interface.cipic.ucdavis.edu/sound/hrtf.html), but in a frequency-domain, interpolated version.

The project also includes an HRTFSubjectMatcher subpatch, which relies on a Java external ('HRTFSubjectMatcher.class'). The HRTFSubjectMatcher tries to match the entered listener measurements to the subjects in the CIPIC database. It must be mentioned that it is by no means scientifically validated, but it might serve as a rudimentary tool for selecting an appropriate HRTF file set.
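The matching idea can be pictured as a nearest-neighbour lookup over anthropometric measurements. Everything below is illustrative: the field names, numbers and subject ids are invented, and the real Java class may use more measurements or weight them differently:

```python
import math

# Hypothetical CIPIC-style subjects with made-up measurements (cm).
SUBJECTS = {
    "subject_003": {"head_width": 14.5, "pinna_height": 6.2},
    "subject_010": {"head_width": 15.8, "pinna_height": 6.6},
    "subject_021": {"head_width": 15.1, "pinna_height": 5.9},
}

def match_subject(listener, subjects=SUBJECTS):
    """Return the subject id whose measurements lie closest (Euclidean
    distance) to the listener's entered measurements."""
    def distance(measurements):
        return math.sqrt(sum((measurements[k] - listener[k]) ** 2 for k in listener))
    return min(subjects, key=lambda sid: distance(subjects[sid]))

# A listener of 15.0 / 6.0 lands closest to subject_021 in this toy data.
print(match_subject({"head_width": 15.0, "pinna_height": 6.0}))
```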

Best regards
Jakob Hougaard Andersen

Dec 12 2011 | 8:58 am

Fantastic idea!

Sep 29 2012 | 7:11 pm

Hey Jakob –
Thanks for the great patch.
The code for the DirectionAndDistanceHandler seems a bit off to me – I was wondering if anyone else has had a similar problem. The left-ear gain comes out in whole numbers, usually >50, whereas the other values seem to scale proportionately below 1.

Oct 03 2012 | 9:38 am

Hi m.
Thanks for your comment.
You are perfectly right – there is a problem. It is because I had included an earlier version of the DirectionAndDistanceHandler, in which the second outlet was the elevation for the panner. In the correct version, azimuth and elevation come out as a list from the first outlet. I am sorry for this mistake, but it has now been corrected. It is important to overwrite or delete the old DirectionAndDistanceHandler.class file, so that Max uses the new one instead. Be sure to write again if there are other bugs.
Best regards, Jakob

Jun 19 2015 | 4:34 pm

Hi Jakob,

Thank you for this wonderful tool. Could you let me know how you convert the CIPIC database HRTF files from .mat form to .jxf format? That would be really helpful.


Jun 20 2015 | 5:16 am

Those HRTF files were originally PCM audio (impulse responses). To convert them to FFT filter settings, you would just play a spike (impulse) or noise through them and record the result.
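One way to picture that conversion (a hedged sketch under assumed conventions, not the actual script behind the project's .jxf files): take the time-domain HRIR, zero-pad it to the FFT size, transform it, and keep one real/imaginary pair per bin as the filter data:

```python
import cmath

def dft(x):
    """Naive DFT, sufficient for a small demo."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

# Toy HRIR (made-up samples), zero-padded to an FFT size of 8.
hrir = [0.9, 0.4, -0.2, 0.05] + [0.0] * 4

# Store real/imaginary pairs per bin – roughly the layout a pfft~-based
# convolver needs its filter coefficients in.
spectrum = dft(hrir)
filter_rows = [(bin_.real, bin_.imag) for bin_ in spectrum]
```

The DC bin's real part equals the sum of the impulse-response samples, which is a quick sanity check when converting real data.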

Oct 25 2015 | 10:10 am

Hi Jakob and everyone else,
I am using this patch for a VR test with the Oculus Rift and Leap Motion.
I have implemented [p virtualSoundPositioning], but I have some problems, I think.

In the 3D world I have built, x = left/right, y = elevation, and z = depth (front/back).
In this patch, x = left (0) to right (10), y = front (10) to back (0), and z = elevation, 0 to 5 m.

To make it work, should I just swap my coordinates around, e.g. so that y becomes z? I have also added listenerRot (using the Oculus quat values); do I need to convert anything for that?

The problems I am having:
When I navigate around the object (via the Oculus position) and go around its left side, I hear the sound in the left ear rather than the right ear (but the object is on my right).
When I move the object while the Oculus remains still, the panning seems to be correct. When I move my head up and down, the sound goes right (looking up) and left (looking down).

I have tried to send the coordinates to the nodes so I can see what happens, but it seems I can only control one node at a time?

Could you please help me with this, just to check whether I am doing something wrong?

Thank you guys.
Best, F

Oct 25 2015 | 11:13 am

Re quat: here is what the patch says:
… optional listener rotation is given with unit quaternions with the scalar part as first argument

What I did was send the Oculus quat via [pack listenerRot f f f f] to [s toDirectionAndDistanceHandler1].
I am not sure what "the scalar part as first argument" means – please help.

Thank you.
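For reference, "scalar part first" means the quaternion is ordered (w, x, y, z), with w the scalar (real) component. Unity, for instance, stores quaternions as (x, y, z, w), so the values need reordering before packing. A minimal sketch (the function name is just illustrative):

```python
def to_scalar_first(q_xyzw):
    """Reorder a quaternion from (x, y, z, w) to (w, x, y, z)."""
    x, y, z, w = q_xyzw
    return (w, x, y, z)

# e.g. an identity rotation coming from a scalar-last convention:
print(to_scalar_first((0.0, 0.0, 0.0, 1.0)))  # -> (1.0, 0.0, 0.0, 0.0)
```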


Dec 02 2015 | 4:30 am

The DirectionAndDistanceHandler mxj object works with the x–y plane as the horizontal plane and the z axis pointing up. In many (most) 3D tools the y and z axes are the other way around, so that y is up and z is forward. I don't know why I built it the other way around, but that is how it works.
So you might have to swap some of the elements of the quaternions and positions coming from other 3D tools such as the Oculus in order to make it work correctly.
I made a test in which I sent rotation quaternion values from Unity (the game engine) to Max/MSP for use in the DirectionAndDistanceHandler, and there I had to swap the y and z values to make it work – which makes good sense, since in Unity z is forward and y is up. So the order of values in the sent quaternion would be w x z y, and if I were to send position coordinates as well, the order would naturally be x z y.
Best regards, Jakob
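The remapping described above can be written as two tiny helpers (hypothetical names; a sketch of the swap only, not code from the project):

```python
def unity_quat_to_panner(q_xyzw):
    """Unity supplies (x, y, z, w); the panner wants the scalar first and
    the y/z axes swapped (Unity: y up, z forward; panner: z up), i.e.
    (w, x, z, y)."""
    x, y, z, w = q_xyzw
    return (w, x, z, y)

def unity_pos_to_panner(p_xyz):
    """The same axis swap for positions: (x, y, z) -> (x, z, y)."""
    x, y, z = p_xyz
    return (x, z, y)
```

Either function's output can then be packed straight into the listenerRot or position messages in the order the panner expects.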
