FFT-based binaural panner

The FFT-based binaural panner is not an external, but rather a project consisting of a couple of Max patches and 43 pairs of HRTF subject files to be read by jit.matrix.

It was created in 2011 as part of a thesis on the possibilities of binaural synthesis combined with head-tracking, in an aesthetic, experience-oriented context. In the thesis project, the FFT-based binaural panner is a small but essential part of a larger construction in Max, involving sensor-based head-tracking, room simulation, configurable sound and listener positions, etc.

The project is inspired by other Max projects such as Head in Space (http://sites.google.com/site/dariopizzamiglio/projects/head-in-space) and Binaural Tools (http://www.ece.ucdavis.edu/binaural/binaural_tools.html). The FFT-based binaural panner was created to reduce the substantial CPU load that comes with convolution in the time domain. It uses pfft~ to perform an equivalent but ‘cheaper’ multiplication in the frequency domain.
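The equivalence this relies on is the convolution theorem: multiplying spectra in the frequency domain gives the same result as convolving in the time domain. A minimal NumPy sketch (a made-up toy filter stands in for an HRIR; this is not the actual patch logic):

```python
import numpy as np

# Time-domain convolution of a signal with a toy "impulse response"
signal = np.random.default_rng(0).standard_normal(64)
hrir = np.array([1.0, 0.5, 0.25, 0.125])  # toy stand-in for an HRIR

time_domain = np.convolve(signal, hrir)

# Equivalent frequency-domain version: zero-pad both to the full
# convolution length, multiply the spectra, transform back.
n = len(signal) + len(hrir) - 1
freq_domain = np.fft.irfft(np.fft.rfft(signal, n) * np.fft.rfft(hrir, n), n)

assert np.allclose(time_domain, freq_domain)
```

For long impulse responses the FFT route is much cheaper, since an N-point convolution costs O(N²) multiplies while the FFT approach costs O(N log N).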

The panner uses the impulse responses from the CIPIC HRTF database (http://interface.cipic.ucdavis.edu/sound/hrtf.html), but in an interpolated, frequency-domain version.

The project also includes an HRTFSubjectMatcher subpatch, which relies on a Java external (‘HRTFSubjectMatcher.class’). The HRTFSubjectMatcher tries to match the entered listener measurements with the subjects in the CIPIC database. It must be mentioned that the matching is by no means scientifically valid, but it might serve as a rudimentary tool for selecting an appropriate HRTF file set.

Best regards
Jakob Hougaard Andersen

December 12, 2011 | 8:58 am

Fantastic idea!

September 29, 2012 | 7:11 pm

Hey Jakob –
Thanks for the great patch.
The Java external for the DirectionAndDistanceHandler seems a bit off to me – I was wondering if anyone else had a similar problem. The Left Ear Gain comes out in whole numbers, usually >50, whereas the other values seem to scale proportionately under 1.

October 3, 2012 | 9:38 am

Hi m.
Thanks for your comment.
You are perfectly right – there is a problem. It is because I had included an earlier version of the DirectionAndDistanceHandler, where the second outlet was elevation for the panner. In the correct version, azimuth and elevation come out as a list from the first outlet. I am sorry for this mistake, but it should now be corrected. It is important to overwrite or delete the old DirectionAndDistanceHandler.class file, so that Max uses the new one instead. Be sure to write again if there are other bugs.
Best regards, Jakob

June 19, 2015 | 4:34 pm

Hi Jakob,

Thank you for this wonderful tool. Could you let me know how you convert the CIPIC database HRTF files from .mat form to .jxf format? That would be really helpful.


June 20, 2015 | 5:16 am

Those HRTF files were originally PCM audio. To convert them to an FFT filter setting, you would just play a spike or noise through them.
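That measurement idea can be sketched in NumPy (a short made-up FIR filter stands in for an HRIR): playing a unit impulse (the ‘spike’) through the filter reproduces the impulse response itself, and its FFT is the frequency-domain filter.

```python
import numpy as np

hrir = np.array([0.9, -0.3, 0.2, 0.05])  # made-up stand-in for one HRIR

# A unit impulse ("spike")
impulse = np.zeros(16)
impulse[0] = 1.0

# Playing the spike through the filter just reproduces the impulse response
measured = np.convolve(impulse, hrir)[:len(hrir)]
assert np.allclose(measured, hrir)

# The FFT of that measurement is the filter's frequency-domain description
spectrum = np.fft.rfft(measured, 64)
```

With noise instead of a spike, you would divide the output spectrum by the input spectrum to recover the same frequency response.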

October 25, 2015 | 10:10 am

Hi Jakob and everyone else,
I am using this patch for a VR test using the Oculus Rift and Leap Motion.
I have implemented the [p virtualSoundPositioning], but I have some problems, I think.

In the 3D world I have built, x = left/right, y = elevation, and z = depth (front/back).
In this patch, x = left (0) to right (10), y = front (10) to back (0), and z = elevation (0 to 5 m).

In order to make it work, shall I just swap my coordinates around, e.g. y becomes z? Also, I have added the listenerRot using the Oculus quat values; do I need to convert anything for that?
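The swap being asked about could be sketched like this (pure guesswork based only on the axis conventions described above; any offsets or range scaling between the two frames are ignored):

```python
def world_to_panner(x, y, z):
    """Map a hypothetical VR world frame (x = left/right, y = elevation,
    z = depth) onto the patch's quoted convention (x = left/right,
    y = front/back, z = elevation): just exchange y and z."""
    return (x, z, y)

# The world's depth axis becomes the panner's y, elevation becomes z
assert world_to_panner(1.0, 2.0, 3.0) == (1.0, 3.0, 2.0)
```

Whether the axis directions (e.g. front = 10 vs. back = 0) also need flipping would have to be checked against the patch itself.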

The problems I am having:
When I navigate around the object (via the Oculus position) and I go around the left side of the object, I hear the sound in the left ear rather than the right ear (but the object is on my right).
When I move the object while the Oculus remains still, the panning seems to be correct. When I move my head up and down, the sound goes right (looking up) and left (looking down).

I have tried to send the coordinates to the nodes so I can see what happens, but it seems I can only control one node at a time?

Please could you help me on this, just to check if I am doing something wrong.

Thank you guys.
Best F

October 25, 2015 | 11:13 am

Re: quat – here is what the patch says:
… optional listener rotation is given with unit quaternions with the scalar part as first argument

What I did is send the Oculus quat via [pack listenerRot f f f f] to [s toDirectionAndDistanceHandler1].
I am not sure what "the scalar part as first argument" means – please help.

thank you.
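On the ordering question: "the scalar part as first argument" means the w (scalar) component of the unit quaternion comes first, i.e. (w, x, y, z). Many SDKs instead expose quaternions vector-first as (x, y, z, w), so the components may need reordering before the pack. A minimal sketch, assuming a vector-first source (check your SDK's actual ordering):

```python
def to_scalar_first(x, y, z, w):
    """Reorder a vector-first quaternion (x, y, z, w) into the
    scalar-first form (w, x, y, z) that the patch documents."""
    return (w, x, y, z)

# Identity rotation: vector-first (0, 0, 0, 1) becomes (1, 0, 0, 0)
assert to_scalar_first(0.0, 0.0, 0.0, 1.0) == (1.0, 0.0, 0.0, 0.0)
```

So the pack would receive w first, then x, y, z.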

