Forums > Jitter

Oculus Rift

May 7, 2013 | 2:15 pm

anyone else here got an oculus rift? (I bet some folks do!)

I’ve been playing with a stereoscopic patch shared on the forums by Perry Hoberman:
and I’ve modified it for the Oculus Rift. It’s still a work in progress, but it includes a mostly finished shader to compensate for the Rift’s optics.
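For anyone wondering what such a shader compensates for: the Rift’s lenses introduce a pincushion distortion, so the rendered image is pre-warped with an opposite barrel distortion. Here’s a minimal Python sketch of the idea, with illustrative coefficients (not the ones from this patch; the function name and values are made up for the example):

```python
# Sketch of a DK1-style barrel distortion: scale each lens-centered
# texture coordinate by a polynomial in the squared radius. The
# coefficients below are illustrative assumptions, similar in spirit
# to early OVR SDK sample defaults.
K = (1.0, 0.22, 0.24, 0.0)  # k0..k3, assumed values

def warp(x, y):
    """Push a point outward by the distortion polynomial in r^2."""
    r2 = x * x + y * y
    scale = K[0] + K[1] * r2 + K[2] * r2 * r2 + K[3] * r2 * r2 * r2
    return x * scale, y * scale

# Points farther from the lens center get pushed outward more:
print(warp(0.5, 0.0))  # -> (0.535, 0.0)
```

A shader does the same per-fragment, sampling the rendered eye texture at the warped coordinate.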

No head tracking yet – I managed to see the head tracker as a HID device, but it wasn’t spitting anything out. Haven’t dug any further.
Would love to connect with anyone else experimenting with this!


May 26, 2013 | 6:46 pm


I’m not currently experimenting with the Rift, but hope to be soon. I’m especially interested in the head tracking, so if anyone does get some data from the tracker, I’d love to hear about it!



July 9, 2013 | 7:22 am

any progress?

July 18, 2013 | 8:59 pm

I’m currently occupied with other projects. Hope the patch is a head start for anyone who wants to investigate further.

August 6, 2013 | 3:42 pm

I have an oculus dev kit.

Really just looking to get the head tracking data in msp.

Anyone have any luck or somewhere they can point me?


January 13, 2014 | 4:19 pm

Hi, I just got the kit today and would like to play with it a little using Jitter. Did you make any progress with the head tracking?

January 15, 2014 | 3:42 pm

Nope, haven’t dug any further. Only a start to stereoscopic rendering and display.

January 22, 2014 | 10:34 pm

It’s been over a week, so a quick check: has anyone made progress on head tracking since the last post? If not, I will take a decent stab at it over the next short while, and will post a patch and instructions here if I manage to get it.

I have a good track record: I got Kinect and, recently, PS Move data coming into Max (Windows only for now, with Mac OS ports planned).

Thanks to Zeal for the stereoscopic starting point. We have a really fun project planned that anyone can try if they get all the same hardware we’ll be using.

February 4, 2014 | 2:46 pm

Just adding an update: the work is not all done yet, but I am confident I will be able to share a small utility that gets the head tracking data into Max. Using Zeal’s shared patch as a test, the Rift head tracking data will drive the rotations about the XYZ axes via an OSC stream hacked onto the Rift SDK’s SensorBoxTest.
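The OSC side of a hack like this is small. As a sketch (in Python rather than the C++ SensorBoxTest, and with a made-up address pattern, not anything from the SDK), an orientation quaternion packs into a raw OSC message like so:

```python
import struct

def osc_pad(b):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_quat_message(w, x, y, z):
    """Pack a quaternion as a raw OSC message (address is an assumption)."""
    msg = osc_pad(b"/rift/quat")      # hypothetical address pattern
    msg += osc_pad(b",ffff")          # type tag: four floats
    msg += struct.pack(">4f", w, x, y, z)  # OSC arguments are big-endian
    return msg

print(len(osc_quat_message(1.0, 0.0, 0.0, 0.0)))  # -> 36 bytes
```

A datagram like this, sent over UDP each frame, is all [udpreceive] or a CNMAT-style OSC object on the Max side needs.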

For C# hacks I tended towards the Ventuz.OSC.dll for simplicity, but in this case I will be trying out this oscpack utility:

I don’t see any more obstacles in the way of getting this, just a matter of time. Since I have a project to get this working for, it’s safe to expect that I will share the utility along with a modified version of Zeal’s patch showing it all working together.

Can anyone volunteer to help me test it, so I don’t accidentally release it in a way where it only works on my machine?

And sorry that this will be a bit of a hack; the reason is that I would rather take my time integrating it into a larger project I’m doing called "GestureLab", a central interfacing application for HCI devices (Wiimote, Kinect, PS Move, Rift, etc.) that will have Max in mind as its initial output destination, but could eventually go on to support other destinations as well (music applications, Unity, etc.)

February 4, 2014 | 4:17 pm

I can help test. sounds great. you can get me at bob(at)zealousy(dot)com.

February 9, 2014 | 2:37 am

Thanks! Looks like I will definitely need a tiny bit of help with the tail end of this, but maybe I can send you what I have so far and we can probably work out the rest.

I have the option of sending quaternions or converted Euler angles into Max, which is still a step away from the nice degree-based rotatexyz option your patch defaults to (though I notice other options such as quat, which look promising).

The trick here is to get the data where it needs to go with the minimum number of conversions, so as to prevent loss of precision (for example, notice the note for the quat2euler object here: , a warning that goes as far back as SIGGRAPH 1985; see here, where the same caution appears: ).
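For reference, a minimal quaternion-to-Euler conversion of the kind being weighed here might look like the following Python sketch. The axis conventions and degree output are assumptions, and the pitch clamp papers over the gimbal-lock singularity that the quat2euler warning is about:

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees.
    Assumed convention: roll about X, pitch about Y, yaw about Z."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    # Clamp to [-1, 1] to avoid domain errors near +/-90 degrees pitch,
    # where the Euler representation becomes ambiguous (gimbal lock).
    s = max(-1.0, min(1.0, 2 * (w * y - z * x)))
    pitch = math.asin(s)
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

print(quat_to_euler(1, 0, 0, 0))  # identity rotation -> (0.0, 0.0, 0.0)
```

Doing this conversion exactly once, as late as possible in the chain, is what keeps the compounding rounding error to a minimum.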

I am a CS animation student at the moment, so I hesitate to claim to know the best course of action.

Shall I send you what I have so far? (also, do you have a VS2010 environment already set up by chance?)

February 9, 2014 | 3:00 am

Sounds good, shoot me what you’ve got. I don’t have VS2010 set up right now, but I need to get it going for other Max external work anyway, so yes, send me the source as well.

Just got one of these little fellas – – so I’m keen to get back into some rift experimentation.


February 9, 2014 | 3:09 am

Should have held out a little longer before my last message – success! I have managed to pipe the quaternion data directly without any conversions.

I’ll send it all over momentarily, and cool, I didn’t know there were 3rd party accessories for the Rift already!

February 17, 2014 | 12:34 pm

Hi Oculo-Maxists,

I’m also interested in helping out if you need more hands. I’m on a Mac setup, if that still matters. That ovr vision business is pretty rad, btw.

I’m at tennie[dot]here[at]gmail.

Let me know what I can do to help.


April 2, 2014 | 2:06 pm

Hey mattt. Do you have Bootcamp set up on your Mac?

Do you have a Kinect by chance? Either way, for the Rift alone we are working on adding a slight bit of overlap at the far left/right edges of the displays.

With Kinect involved I’m also working on a slightly trickier problem.

April 15, 2014 | 8:35 am


I am a student in interactive visual arts working on different projects using the Oculus Rift, and I would like to implement its usage in Max/MSP. The only problem is that I have no idea how to get the head tracking data into Max.

If anyone has made some progress on getting the head tracking data into Max/MSP, please send me a message or share your patch here. This seems to be the only thread on the subject anywhere, as far as I know.

You can reach me at guillaumebou (at) gmail (dot) com

Thank you

April 25, 2014 | 1:00 am

I am thinking of buying the Rift, but only if it can work in Max with its full feature set. I am hoping someone will make this possible… pretty please? This is like the future, right?

May 29, 2014 | 9:49 am

Hello guys, I have no idea if this topic is still being followed…

One of our clients asked us if it was possible to control some physical objects with the Oculus Rift… we came up with this idea:

In Unity (our programmer is very comfortable with it) we developed a UDP client that dumps the Oculus XYZ. This was developed today, for Mac.

Get the bridge here:

Compiled for OS X x86 and working on OS X 10.8.5 on a PowerBook.

First box is the IP, second box the port, then save… the default values are

We did just a small test, and the values get dumped with this code:

(I remind you this is just a small test, very very beta, not an official release or anything, just a workaround that works for our challenge…)

"boxes" : [ {
    "box" : {
        "maxclass" : "newobj",
        "text" : "mxj net.udp.recv @port 8051",
        "fontsize" : 12.0,
        "numinlets" : 1,
        "patching_rect" : [ 291.0, 134.0, 165.0, 20.0 ],
        "id" : "obj-4",
        "numoutlets" : 2,
        "outlettype" : [ "", "" ],
        "fontname" : "Arial"
    }
}, {
    "box" : {
        "maxclass" : "newobj",
        "text" : "print",
        "fontsize" : 12.0,
        "numinlets" : 1,
        "patching_rect" : [ 297.0, 184.0, 34.0, 20.0 ],
        "id" : "obj-2",
        "numoutlets" : 0,
        "fontname" : "Arial"
    }
} ],
"lines" : [ {
    "patchline" : {
        "source" : [ "obj-4", 0 ],
        "destination" : [ "obj-2", 0 ],
        "hidden" : 0,
        "disabled" : 0
    }
} ],
"appversion" : {
    "major" : 6,
    "minor" : 1,
    "revision" : 6,
    "architecture" : "x86"
}

Have Fun


May 29, 2014 | 11:15 pm

Hi all,

I’ve been working on an oculus Max object over the last couple of days, and it seems to be working well. I wondered if I could reach out to you Rift owners for some feedback, especially about the rendering distortion (and thanks to Bob ZEAL, and Perry Hoberman, for inspiration on how to do the Jitter patching for that). The OVR SDK documentation is pretty obscure, so there was some guesswork involved. Currently I only have it built for OS X, but it should be possible to build it on Windows.

Binary attached, code here:



May 29, 2014 | 11:27 pm

I should add that I was pleasantly surprised to get 60fps on my MacBook Pro with its mediocre Radeon 6490M GPU, though the scene is fairly simple. It’s not easy to do complex scenes for the Rift because of the multi-pass rendering. (Plus some effects that work well in 2D don’t translate well to stereoscopic display, e.g. many screen-based effects.) But I think there are still some improvements that could be made to the rendering pass / distortion stuff in the Max patch, which might speed it up a bit more.

  1. Screen-Shot-2014-05-30-at-2.59.22-PM

June 2, 2014 | 12:10 pm

Updated to LibOVR version 0.3.2, which resolved most of the ambiguities, and now uses the faster mesh-based rendering, and implements chromatic aberration correction:

June 3, 2014 | 4:13 am

This works quite well on my Mac Pro.

The only thing I can’t get working is the orientation.
Does it work?

Amazing job you’ve got here…

June 3, 2014 | 5:52 am

Orientation works, though I haven’t gone to town with the prediction/timewarp stuff… I’m not sure if that’s going to be possible in Jitter.

Just built for Win32 too, also working. I’m testing with the DK1 headset, is that what you are using?

June 3, 2014 | 6:55 am

I am sorry, my bad; I had a problem with the USB cable, and now everything is working very well… I’m going to connect a servo so I can control it with the headset…

Amazing work you’ve done here.

June 11, 2014 | 12:21 am

For anyone still reading, I’ve moved this object to its own repository here:

June 25, 2014 | 4:05 am

Great work!!! Thx for sharing.

August 26, 2014 | 12:24 pm

Fantastic work! Any plans to support dk2 in this object?

August 26, 2014 | 1:35 pm

Yes of course, as soon as my DK2 arrives, hopefully by October. I can try to port sooner, but of course I can’t test it… do you have a DK2?

August 26, 2014 | 1:50 pm

Yes of course, as soon as my DK2 arrives, hopefully by October.

The SDK that supports the DK2 is currently only in beta. It also depends on a global service running in the background, which currently means users would need the SDK installed. I guess there’ll probably be a separate runtime installer available once it gets out of beta.

August 26, 2014 | 4:27 pm

My DK2 arrived two days ago. Would be happy to help with testing.

I installed the latest runtime and it did install the global background service on its own, without the SDK. Mountain Lion or higher is now required.

September 10, 2014 | 10:32 am

Today I got access to a DK2, so I’ll be updating the project now :-)

September 10, 2014 | 9:16 pm

Well, I managed to get the new SDK working with position tracking, but the performance is disappointing. It might be something I’ve done wrong, so I won’t upload yet, but on a GTX 780 (Windows 7, Nvidia driver 335.23) I was only getting ~40fps with the Oculus full screen and a minimal scene.

Obviously the DK2 has twice the resolution, but with the DK1 I had a consistent 60fps even with highly complex scenes and multiple additional displays. I can’t believe that the resolution alone is to blame; I’ve probably made some silly mistake somewhere. Will keep digging to see if I can figure it out.

September 11, 2014 | 12:59 am

Thanks for the update, and for your work on this. I look forward to hearing what you figure out — I have heard that there are issues with using extended desktop mode with dk2. On my Mac it will only work in mirrored mode using the supplied Unity demo.

September 25, 2014 | 12:43 pm


I have a DK2 (not a DK1). Does this patch work with the DK2 yet? I can’t get the patch to recognise the DK2.


September 25, 2014 | 5:17 pm

I’ve been swamped with other work and will be for a couple more weeks, but I will be looking at it again after that. What I was seeing was very poor performance on a pretty high-end Windows machine + GPU. I may need to try the non-extended desktop mode on Windows, but that would break compatibility with Mac; I need more time to experiment. OTOH, it might be necessary to change how it works anyway for the time warp stuff to be usable.

FWIW, the updated code is up on GitHub (in the sdk 0.4.2 branch), but no binaries yet.

September 25, 2014 | 8:06 pm

I’d be happy to take a look, now that I have my DK2, Graham. I should really have posted the DK1 code I had at the time here, but frankly it was all I could do just to complete my animation project and move on; the DK1 really made me dizzy and a little nauseous every time I used it.

At least I will have both your codebase and my own to work from, so maybe I can get some insight as to where the bottleneck is that’s preventing 60 FPS.

Also remembering to check the Notify Me checkbox in case I forget to log in to the forum for months at a time again ;)

October 9, 2014 | 3:23 pm

Here is a quick update:

I just received my VR Developer Mount for the Leap Motion, and affixed it to my DK2. I am going to be working on a project for the next little while that combines the usage of these two devices with a Kinect for Windows (v2) in an avateering context.

I’ll try and set up some kind of a blog to show progress, so I will post a link once I’ve gotten far enough to be able to show something.

I did already have the Kinect and Rift working together in Jitter on a previous project, but there were plenty of issues remaining to work around. Hopefully it will be possible to get a really nice stereoscopic image via the DK2, and be able to toggle between Rift and Kinect-based positional head-tracking. I will try to complete a scene graph that has toggles like this – i.e. another one to toggle between Kinect and Leap Motion for driving skeletal nodes in the hand.

October 9, 2014 | 4:38 pm


I have Max/Msp/Jitter/Gen, Rift DK2, Leap Motion (+ vr developer mount), Kinect, a Mac and a Windows computer (spec’d highly for the dk2 and beyond).

Count me in for testing out DK2 + max integration.



October 9, 2014 | 9:57 pm

Ditto. I did a project a couple of months ago pairing a couple of Kinects (not v2) with a Rift (DK1), using an Aruco marker for tracking, which worked pretty well but not perfectly. As well as the Rift external, I have Kinect, Aruco, and some OpenCV & PCL externals in development, but they’re not fully stable yet. I’m planning to open-source these later in the year (when my schedule relaxes a bit) and would be very happy to co-develop if anyone is interested.

October 13, 2014 | 12:47 pm

DK2 max 6.1.8 on osx 10.8.5.

Thanks Graham, the perspective correction looks amazing. No luck with the head tracking yet, though. I can run the Tuscany demo and it works with that. I also installed the latest beta from OculusVR. Does anyone have it working, or what’s up? When I go to configure, it looks like it finds the headset but isn’t grabbing rotation values. Attached is what Max prints when I click the configure message.

  1. Screen-Shot-2014-10-13-at-3.42.06-PM
  2. Screen-Shot-2014-10-13-at-3.42.23-PM

October 13, 2014 | 12:56 pm

Good to know – In the spring we had head tracking working but perspective correction was a bit broken.

I know exactly what is needed to implement head tracking (it should still be there in my old modification of the cube rotation demo), so I will try to submit a pull request with the code.

October 22, 2014 | 10:36 am

Does anyone have the DK2 working with MAX? I’d love to try it out!

October 22, 2014 | 9:39 pm

Having the Kinect 2 and DK2 working in Max has me very excited to get my hands dirty in max again! Thanks for all the info so far.

October 22, 2014 | 11:03 pm

Received my DK2 finally, but will be tied up with moving country for the next few weeks, and won’t have time to get back to the oculus external until probably mid-November. Will post as soon as I do.

@KCOUL, did you make any progress?


Viewing 44 posts - 1 through 44 (of 44 total)