
Oculus Rift

May 7, 2013 | 2:15 pm

Anyone else here got an Oculus Rift? (I bet some folks do!)

I've been playing with a stereoscopic patch shared on the forums by Perry Hoberman:
and I've modified it for the Oculus Rift. It's still a work in progress, but it includes a mostly finished shader to compensate for the Rift's optics.

No head tracking yet – I managed to see the head tracker as an HID device, but it wasn't spitting anything out. Haven't dug any further.
Would love to connect with anyone else experimenting with this!


May 26, 2013 | 6:46 pm


I’m not currently experimenting with the Rift, but hope to be soon. I’m especially interested in the head tracking, so if anyone does get some data from the tracker, I’d love to hear about it!



July 9, 2013 | 7:22 am

any progress?

July 18, 2013 | 8:59 pm

currently occupied by other projects. Hope the patch is a head start for anyone who wants to investigate further.

August 6, 2013 | 3:42 pm

I have an Oculus dev kit.

Really just looking to get the head tracking data into MSP.

Anyone have any luck or somewhere they can point me?


January 13, 2014 | 4:19 pm

Hi, I just got the kit today and would like to play with it a little bit using Jitter. Did you make any progress with the head tracking?

January 15, 2014 | 3:42 pm

Nope, haven’t dug any further. Only a start to stereoscopic rendering and display.

January 22, 2014 | 10:34 pm

It's been over a week, so a quick check: has anyone made progress on head tracking since the last post? If not, I will take a decent stab at it over the next short while, and will post a patch and instructions here if I manage to get it working.

I have a good track record: I got Kinect and recently Move data coming into Max (Win only for now, with Mac OS ports planned for the future).

Thanks to Zeal for the stereoscopic starting point. We have a really fun project planned that anyone can try if they get all the same hardware we’ll be using.

February 4, 2014 | 2:46 pm

Just adding an update: the work is not all done yet, but I am confident I will be able to share a small utility to help get the head tracking data into Max. Using Zeal's shared patch as a test, we will be able to see Rift head tracking data driving the rotations about the XYZ axes via an OSC stream hacked onto the Rift SDK's SensorBoxTest.

For C# hacks I tended towards the Ventuz.OSC.dll for simplicity, but in this case I will be trying out this oscpack utility:
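
To give a rough idea of the approach, here is a minimal sketch of the sending side using oscpack – just an illustration, and the address pattern "/rift/quat" and port 7400 are placeholder choices of mine, not necessarily what the finished utility will use:

#include "osc/OscOutboundPacketStream.h"
#include "ip/UdpSocket.h"

// Send one head-tracking quaternion to Max as an OSC message over UDP.
// "/rift/quat" and port 7400 are placeholders for illustration only.
void sendQuat(float x, float y, float z, float w)
{
    static UdpTransmitSocket socket(IpEndpointName("127.0.0.1", 7400));
    char buffer[256];
    osc::OutboundPacketStream p(buffer, sizeof(buffer));
    p << osc::BeginMessage("/rift/quat") << x << y << z << w << osc::EndMessage;
    socket.Send(p.Data(), p.Size());
}

On the Max side, a [udpreceive 7400] into [route /rift/quat] would then spit out the four values.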

I don’t see any more obstacles in the way of getting this, just a matter of time. Since I have a project to get this working for, it’s safe to expect that I will share the utility along with a modified version of Zeal’s patch showing it all working together.

Can anyone volunteer to help me test it, so that I don't accidentally release it in a way where it only works on my machine?

And sorry that this will be a bit of a hack; the reason is that I would rather take my time integrating this into a larger project I am doing called "GestureLab", which will be a central interfacing application for HCI devices (Wiimote, Kinect, PS Move, Rift, etc.) with Max in mind as its initial output destination, but which could eventually go on to support other output destinations as well (music applications, Unity, etc.).

February 4, 2014 | 4:17 pm

I can help test. Sounds great. You can get me at bob(at)zealousy(dot)com.

February 9, 2014 | 2:37 am

Thanks! Looks like I will definitely need a tiny bit of help with the tail end of this, but maybe I can send you what I have so far and we can probably work out the rest.

I have the option of sending quaternions or converted Euler angles into Max, which is still a step away from the nice degree-based rotatexyz option your patch defaults to (though I notice other options, such as quat, which look promising).

The trick here is to get the data where it needs to go with the minimum number of conversions, so as to prevent loss of precision. (For example, notice the note for the quat2euler object here: , which goes back as far as SIGGRAPH 1985, see here, where the same warning is given: )
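
Just to illustrate what that conversion involves, here is a sketch of a standard quaternion-to-Euler conversion in C++ (angles out in degrees; the function and variable names are mine, not from the SDK). The asin() term is exactly where precision suffers and things degenerate near ±90° pitch:

#include <cmath>

// Illustrative helper, not part of the Rift SDK.
struct EulerDegrees { double roll, pitch, yaw; };   // rotations about X, Y, Z

// Convert a unit quaternion (w, x, y, z) to Euler angles in degrees.
EulerDegrees quatToEuler(double w, double x, double y, double z)
{
    const double RAD2DEG = 180.0 / 3.14159265358979323846;
    EulerDegrees e;
    // roll: rotation about X
    e.roll = std::atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y)) * RAD2DEG;
    // pitch: rotation about Y; clamp to avoid NaN from rounding, degenerate (gimbal lock) at +/-90
    double s = 2.0 * (w * y - z * x);
    if (s > 1.0) s = 1.0;
    if (s < -1.0) s = -1.0;
    e.pitch = std::asin(s) * RAD2DEG;
    // yaw: rotation about Z
    e.yaw = std::atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z)) * RAD2DEG;
    return e;
}

So sending the raw quaternion and letting the patch consume it via the quat option avoids this step entirely.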

I am a CS animation student at the moment, so I hesitate to claim to know the best course of action.

Shall I send you what I have so far? (also, do you have a VS2010 environment already set up by chance?)

February 9, 2014 | 3:00 am

Sounds good. Shoot me what you've got. I don't have VS2010 set up right now, but I need to get it going for other Max external work anyway, so yes, send me the source as well.

Just got one of these little fellas – – so I’m keen to get back into some rift experimentation.


February 9, 2014 | 3:09 am

Should have held out a little longer before my last message – success! I have managed to pipe the quaternion data directly without any conversions.

I’ll send it all over momentarily, and cool, I didn’t know there were 3rd party accessories for the Rift already!

February 17, 2014 | 12:34 pm

Hi Oculo-Maxists,

I'm also interested in helping out if you need more hands. I'm on a Mac setup, if that still matters. That ovr vision business is pretty rad, btw.

I’m at tennie[dot]here[at]gmail.

Let me know what I can do to help.


April 2, 2014 | 2:06 pm

Hey mattt. Do you have Bootcamp set up on your Mac?

Do you have a Kinect by chance? Either way, we are trying to work on adding a slight bit of overlap on the far left/right edges of the displays for the Rift alone.

With the Kinect involved, I'm also working on a slightly trickier problem.

April 15, 2014 | 8:35 am


I am a student in interactive visual arts working on different projects using the Oculus Rift, and I would like to be able to use it in Max/MSP. The only problem is that I have no idea how to get the head tracking data into Max.

If anyone has made progress on getting the head tracking data into Max/MSP, please send me a message or share your patch here. This seems to be the only thread about the subject anywhere, as far as I know.

You can reach me at guillaumebou (at) gmail (dot) com

Thank you

April 25, 2014 | 1:00 am

I am thinking of buying the Rift, but only if it can work in Max with the full feature set. I am hoping someone will make this possible… pretty please? This is like the future, right?

May 29, 2014 | 9:49 am

Hello guys, I have no idea if this topic is still being followed…

One of our clients asked us if it was possible to control some physical objects with the Oculus Rift… we came up with this idea:

In Unity (our programmer is very comfortable with it) we developed a UDP client that dumps the Oculus XYZ; this was developed today for Mac.

Get the bridge here:

Compiled for OSX x86 and working on OSX 10.8.5 on a PowerBook.

First box: IP, second box: port, then save… the default values are

We did just a small test, and the values get dumped with this patch:

(I remind you this is just a small test, very very beta, not an official release or anything, just a workaround that works for our challenge…)

"boxes" : [ {
"box" : {
"maxclass" : "newobj",
"text" : "mxj net.udp.recv @port 8051",
"fontsize" : 12.0,
"numinlets" : 1,
"patching_rect" : [ 291.0, 134.0, 165.0, 20.0 ],
"id" : "obj-4",
"numoutlets" : 2,
"outlettype" : [ "", "" ],
"fontname" : "Arial"

, {
"box" : {
"maxclass" : "newobj",
"text" : "print",
"fontsize" : 12.0,
"numinlets" : 1,
"patching_rect" : [ 297.0, 184.0, 34.0, 20.0 ],
"id" : "obj-2",
"numoutlets" : 0,
"fontname" : "Arial"

"lines" : [ {
"patchline" : {
"source" : [ "obj-4", 0 ],
"destination" : [ "obj-2", 0 ],
"hidden" : 0,
"disabled" : 0

"appversion" : {
"major" : 6,
"minor" : 1,
"revision" : 6,
"architecture" : "x86"


Have Fun

May 29, 2014 | 11:15 pm

Hi all,

I've been working on an oculus Max object over the last couple of days, and it seems to be working well. I wondered if I could reach out to you Rift owners to get some feedback, especially about the rendering distortion (thanks to Bob ZEAL and Perry Hoberman for inspiration on how to do the Jitter patching for that). The OVR SDK documentation is pretty obscure, so there was some guesswork involved. Currently I only have it built for OSX, but it should be possible to build it on Windows.

Binary attached, code here:


May 29, 2014 | 11:27 pm

I should add that I was pleasantly surprised to get 60fps on my MacBook Pro with its mediocre integrated Radeon 6490M GPU, though the scene is fairly simple. It's not easy to do complex scenes for the Rift because of the multi-pass rendering. (Plus, some effects that work well in 2D don't translate well to stereoscopic display, e.g. many screen-based effects.) But I think there are still some improvements that could be made to the rendering pass / distortion stuff in the Max patch, which might speed it up a bit more.

  1. Screen-Shot-2014-05-30-at-2.59.22-PM


June 2, 2014 | 12:10 pm

Updated to LibOVR version 0.3.2, which resolved most of the ambiguities, and now uses the faster mesh-based rendering, and implements chromatic aberration correction:

June 3, 2014 | 4:13 am

This works quite well on my Mac Pro.

The only thing I can't get working is the orientation – does it work?

Amazing job you've got here…

June 3, 2014 | 5:52 am

Orientation works, though I haven’t gone to town with the prediction/timewarp stuff… I’m not sure if that’s going to be possible in Jitter.

Just built for Win32 too, also working. I’m testing with the DK1 headset, is that what you are using?

June 3, 2014 | 6:55 am

I am sorry, my bad – I had a problem with the USB cable, and now everything is working very well… I'm gonna connect a servo so I can control it with the headset…

Amazing work you've done here.

June 11, 2014 | 12:21 am

For anyone still reading, I've moved this object to its own repository here:

June 25, 2014 | 4:05 am

Great work!!! Thx for sharing.

August 26, 2014 | 12:24 pm

Fantastic work! Any plans to support dk2 in this object?

August 26, 2014 | 1:35 pm

Yes of course, as soon as my DK2 arrives, hopefully by October. I can try to port sooner, but of course I can’t test it… do you have a DK2?

August 26, 2014 | 1:50 pm

Yes of course, as soon as my DK2 arrives, hopefully by October.

The SDK that supports the DK2 is currently only in beta. It also depends on a global service running in the background, which currently means users would need the SDK installed. I guess there’ll probably be a separate runtime installer available once it gets out of beta.

August 26, 2014 | 4:27 pm

My DK2 arrived two days ago. Would be happy to help with testing.

I installed the latest runtime and it did install the global background service on its own, without the SDK. Mountain Lion or higher is now required.

September 10, 2014 | 10:32 am

Today I got access to a DK2, so I’ll be updating the project now :-)

September 10, 2014 | 9:16 pm

Well, I managed to get the new SDK working with position tracking, but the performance is disappointing. Might be something I’ve done wrong so I won’t upload yet, but on a GTX 780 (Windows 7, Nvidia driver 335.23) I was only getting ~40fps with the oculus full screen and a minimal scene.

Obviously the DK2 has twice the resolution, but with the DK1 I had a consistent 60fps even with highly complex scenes and multiple additional displays. I can't believe that the resolution alone is to blame; I've probably made some silly mistake somewhere. Will keep digging to see if I can figure it out.

September 11, 2014 | 12:59 am

Thanks for the update, and for your work on this. I look forward to hearing what you figure out — I have heard that there are issues with using extended desktop mode with dk2. On my Mac it will only work in mirrored mode using the supplied Unity demo.

September 25, 2014 | 12:43 pm


I have a DK2 (not a DK1) – does this patch work with the DK2 yet? I can't get the patch to recognise the DK2.


September 25, 2014 | 5:17 pm

I’ve been swamped with other work and will be for a couple of weeks more, but will be looking at it again after that. What I was seeing was very poor performance on a pretty high-end Windows machine + GPU. I may need to try the non-extended desktop mode for Windows, but this will break compatibility with Mac; need more time to experiment. OTOH it might be necessary to change how it works for the time warp stuff to be usable anyway.

FWIW the updated code is up on github though (in the sdk 0.4.2 branch), but no binaries yet.

September 25, 2014 | 8:06 pm

I'd be happy to take a look now that I have my DK2, Graham. I should really have posted the DK1 code I had at the time in here, but frankly it was everything I could do just to complete my animation project and move on – the DK1 really made me dizzy and a little nauseous every time I used it.

At least I will have both your codebase and my own to work from, so maybe I can get some insight as to where the bottleneck is that’s preventing 60 FPS.

Also, I'm remembering to check the Notify Me checkbox this time, in case I forget to log in to the forum for months at a time again ;)

October 9, 2014 | 3:23 pm

Here is a quick update:

I just received my VR Developer Mount for the Leap Motion, and affixed it to my DK2. I am going to be working on a project for the next little while that combines the usage of these two devices with a Kinect for Windows (v2) in an avateering context.

I’ll try and set up some kind of a blog to show progress, so I will post a link once I’ve gotten far enough to be able to show something.

I did already have the Kinect and Rift working together in Jitter on a previous project, but there were plenty of issues remaining to work around. Hopefully it will be possible to get a really nice stereoscopic image via the DK2, and be able to toggle between Rift and Kinect-based positional head-tracking. I will try to complete a scene graph that has toggles like this – i.e. another one to toggle between Kinect and Leap Motion for driving skeletal nodes in the hand.

October 9, 2014 | 4:38 pm


I have Max/Msp/Jitter/Gen, Rift DK2, Leap Motion (+ vr developer mount), Kinect, a Mac and a Windows computer (spec’d highly for the dk2 and beyond).

Count me in for testing out DK2 + max integration.



October 9, 2014 | 9:57 pm

Ditto. I did a project a couple of months ago pairing a couple of Kinects (not v2) with a Rift (DK1), using an Aruco marker for tracking, which worked pretty well but not perfectly. As well as the Rift external, I have Kinect, Aruco and some OpenCV & PCL externals in development, but they're not fully stable yet. I'm planning to open source these later in the year (when my schedule relaxes a bit) and would be very happy to co-develop if anyone is interested.

October 13, 2014 | 12:47 pm

DK2 max 6.1.8 on osx 10.8.5.

Thanks Graham, the perspective correction looks amazing. No luck with the head tracking yet, though. I can run the Tuscany Demo and it works with that. I also installed the latest beta from OculusVR. Does anyone have it working, or what's up? When I go to configure, it looks like it finds the headset but it isn't grabbing rotation values. Attached is what prints from Max when I click the configure message.

  1. Screen-Shot-2014-10-13-at-3.42.06-PM


  2. Screen-Shot-2014-10-13-at-3.42.23-PM


October 13, 2014 | 12:56 pm

Good to know – In the spring we had head tracking working but perspective correction was a bit broken.

I know exactly what is needed to implement head tracking (should still be there in my old modification of the cube rotation demo) so I will try and submit a pull request with the code.

October 22, 2014 | 10:36 am

Does anyone have the DK2 working with MAX? I’d love to try it out!

October 22, 2014 | 9:39 pm

Having the Kinect 2 and DK2 working in Max has me very excited to get my hands dirty in max again! Thanks for all the info so far.

October 22, 2014 | 11:03 pm

Received my DK2 finally, but will be tied up with moving country for the next few weeks, and won’t have time to get back to the oculus external until probably mid-November. Will post as soon as I do.

@KCOUL, did you make any progress?


November 6, 2014 | 3:20 am


For those who can’t wait (like me) I’ve got Graham’s object working for DK2 and SDK 0.4 – many thanks to Graham for sharing the code. The Mac binary can be downloaded here. It works on Mavericks with DK2, but not tested on anything else – use at own risk. The help patch demo runs at ~34FPS here on my 2.3 GHz i7 MBP with GT650 1GB. I need to tidy up my edits to the source code, but then happy to share.

I’m also using this with Leap (V2 SDK), Kinect etc. I’m getting around 63 FPS max with animation of Leap data – using automatic mode to work with the Oculus, + to animate the finger joints and tips, connecting lines using, screenshot attached (fullscreen it on the Oculus display to view in stereo). Any tips to further optimise GL scenes welcome…


ps if you’re interested in what I’m doing with this stuff, see my occasionally updated blog at

  1. leapOculusMax


December 2, 2014 | 12:17 am

First of all, thank you guys! Both versions of the object work in Max 6. In Max 7, though, one "eye" is darkened, making it unusable. So I will keep working with it in Max 6, but I thought you should know – also, does anyone have any idea why? Here is a screenshot…

December 2, 2014 | 12:21 am

Hi Chris,

The reason for this is likely due to the DK2 rendering in portrait rather than landscape. I am working on a new version as well, so hopefully I will have something to share soon.

December 2, 2014 | 3:47 am

I get the same problem here. It seems to be an issue with capturing to texture with the 2 cameras in the same world, which is done in the oculus_render sub-patch – whichever is instantiated first seems to render correctly, and the second appears darkened. Or if capture to texture is turned off on one camera, the other appears correctly. I've not explored further yet, but at first glance this seems to me like a Max 7 OpenGL bug.

December 2, 2014 | 6:06 am

I've made a test patch which demonstrates that this is a rendering problem in Max 7, and have reported it to Cycling '74. No workaround found at this point.

— Pasted Max Patch —
December 3, 2014 | 10:53 am

Ah, this explanation makes sense – it could be a combination of both things: the OpenGL interface in Max 7 might not have a case to deal with unusual resolutions like the DK2's, where the "screen" is in portrait orientation. If the eye that's dark is the one that's higher up in that orientation, this would make sense. Also, if someone has a monitor they could set to portrait orientation to repro with, then we would know this is the real reason for sure.

Either way, I guess we will have to wait for a fix for Max 7. At least Max 6 works in the meantime.

I am still working on my patch, which I will try to find optimizations for to get the frame rate a bit higher.

December 4, 2014 | 3:32 pm

That indeed illustrates the problem, and to confirm, it has not been fixed in Max 7.0.1. I wonder if it happens on Windows? And I wonder if there is a small chance it could be related to the much poorer performance on Mac vs. PC?

December 5, 2014 | 11:38 am

I doubt the portrait orientation is the issue, but I’ve also asked to see what might be going wrong. Hopefully something with my patch, but I’m suspecting some OpenGL state is leaking, or maybe the depth buffer. If you really need to use Max 7, one workaround could be to use different nodes for each eye (but that implies a lot of ugly patching/scripting…)

BTW I have time again (finally) to work on the oculus object. RP, would you mind sending me whatever changes you made (ideally as a pull request on github, or just the individual edits/edited files if not)? Thanks!

December 7, 2014 | 7:38 am

Just to let you know this will be fixed in the next update (thanks to Rob Ramirez!)

December 9, 2014 | 5:52 am

Great news that it will be fixed in the next update!

I’ve attached the edited source file here – all changes should be commented (labelled *RP* to locate easily).

(Here I also had to change at least one MacOS framework in the project due to compiling on Mavericks/Xcode 6).


December 18, 2014 | 2:00 pm

That is great news! How is the update coming? I'm getting more and more dependent on Max 7 every day.

December 21, 2014 | 12:54 am

Hi all,

I’ve forked the project and updated it to the latest 0.4.4 SDK, and also applied RP’s changes. I’ve just submitted a pull request for these changes. I just realized though – the original repo had only the single LibOVR directory, which I assumed was the Mac SDK. So, my pull request included the Mac version of the SDK.

Perhaps in a second pull request I could re-link the Visual Studio solution on the Windows side to a Windows SDK that could sit side by side with the Mac SDK? I can't immediately see how to do it, though – the dependencies are represented in Visual Studio as one giant folder of External Dependencies with no apparent separation of the LibOVR folder – and when I swapped in the new SDK on the Windows side, it just seemed to reconnect to the updated files.

Christopher – Max 7 compatibility will need to come from the Cycling ’74 side of things, not us. Hopefully the bug will be fixed in Max 7.0.2!

On the Mac side there were some issues with the latest SDK: I had to remove some private files (OVR_OSX_FocusObserver and OVR_OSX_FocusReader) that were added for some reason, and in CAPI_GLE_GL.h I had to comment out the following:

//#if defined(__gltypes_h_)
// #error gltypes.h should be included after this, not before.
//#endif

Finally, I had to replace

typedef ptrdiff_t GLintptr;
typedef ptrdiff_t GLsizeiptr;

with

typedef intptr_t GLintptr;
typedef intptr_t GLsizeiptr;

But after this it seems to compile.

On the Windows side, I’ve updated the Windows project files to be compatible with VS2013, but should probably leave those changes out of the repo as the repo should really only contain either the Mac or Windows SDK. Since it seems like it was the Mac SDK that was there before, people building on Windows can just overwrite the LibOVR folder and build the .lib files. Here’s a good trick for getting around linker errors:

This only helped me build in debug mode – on two machines, I can’t seem to get around some linker errors in Release mode. Not sure if it’ll be a big issue, will wait to see how high the FPS is.

January 12, 2015 | 5:55 pm

I'm a little lost with all the various posts and forks and whatnot.

I'd like to use the oculus external in Max 6 on Windows 7 (32 or 64-bit).

Can someone give me a prod in the right direction?


January 12, 2015 | 6:19 pm

APS502: here’s a recap from my understanding:

I've just helped to update the code to be compatible with the latest LibOVR, but noticed that the code structure makes it hard to maintain both Mac and Win against their respective Oculus SDKs unless another commit separates out the Mac and Win LibOVR directories (did there used to be only a single SDK for Mac and Win?).

You don't have to wait – just download from Graham's repo and then download the Win Rift SDK, delete the LibOVR folder from the codebase's folder, drop in the one from the Win SDK, and fix whatever's broken in Visual Studio.

If you’d rather wait to have this all done, I can probably add another pull request to map the VS solution to a parallel folder for the Win LibOVR folder and then you could just download that and it would probably work OOB.

Keep in mind that to totally sync with the project you’ll have to be using the same year of Visual Studio. I was planning to build against VS2013 but if anyone prefers VS2012…!

January 12, 2015 | 6:29 pm

KCOUL, as it happens I'm using VS2012, so that would be preferred.

I think I'll wait, as that sounds a bit beyond me at this point!

Isn't it better to target the oldest VS, since the newer versions will be able to convert the project – or is this a naive assumption on my part?


January 19, 2015 | 9:32 am

Any news on the Windows .mxe?

January 19, 2015 | 9:37 am

You can build the .mxe easily by following the steps I provided.

APS502: it's only a matter of switching the build tools in the project properties and a few other things. It's good practice to be acquainted with such settings.

To reiterate:

On Windows:
Download Git code.
Download Oculus Win SDK.
Replace LibOVR folder.
Adjust build settings in project properties.

We will need to wait for Graham's decision on whether we can set up his repo to be compatible on Mac and Win OOB.

January 21, 2015 | 2:40 pm

I'm working on it. Today I updated to the 0.4.4 SDK (ironically still in the 0.4.2 branch…) and tried things out on Mac; it builds & runs fine after a couple of small tweaks. Shouldn't be a problem to merge with Windows.

That said, on the Mac there isn't an easy way to rotate the displays, so the rendering sub-patch needs to be modified (it's a simple fix of adding @rotatexyz 0 0 90 to both the meshes). Also confirmed that the rendering is working fine again in the Max 7 development head. Still a sickening ~30fps on my laptop, and even more laggy tracking, but hey — don't use a DK2 on a MacBook…

For Windows I’d really like to have a go at rendering direct to rift, since that seems to be the recommended way going forward (and sounds like that’s what they plan for OSX too). The main task is sharing an OpenGL texture or two with the Rift display, and being careful about the threads; I think that should be possible from Jitter. Hopefully that will also reduce some of the overhead affecting tracking latency.

But first I'll get this repo cleaned up and a working .mxe again, and then swap this branch with master. Will update on progress ASAP.

(BTW @KCOUL, I’ve been manually merging the mac oculus SDK into the windows SDK. The only differences are in the /Lib and the /Src/Displays folders. Seemed to make more sense than having two copies of the SDK in the repo…)

January 21, 2015 | 3:02 pm

That is wonderful news! Really looking forward to it, and many thanks to all for your work!

January 23, 2015 | 12:40 pm

OK, with a couple of project tweaks it builds fine in VS2012. I'm building this against the VS2010 SDK though, since that's what is currently recommended by the Max SDK. Uploaded a new .mxo and .mxe to with a few changes — there are now two more outlets, for position and tracking status. Runs fine on Windows 7 (Max in 32-bit) here at 60fps. The next beta/update will fix the issue in Max 7 of no lights in the right eye.

TODO: show the health & safety warning, try predictive tracking, time warp in shaders, rendering direct to rift.

January 23, 2015 | 12:45 pm

OH — and for the DK2 you’ll probably want to apply @rotatexyz 0 0 90 to the two meshes. I’ll add something to the help file for this.

January 24, 2015 | 10:28 am


I find that riftshader.jxs used in the help patch needs a bit of modification to make it work with the DK2 + SDK 0.4.4 on Win 8.1 + HD4000 graphics.

Firstly, I had to change every variable named "flat" to something else to get around syntax errors.
Then I had to comment out "uniform sampler2D scene;". With these changes it works perfectly. Please check, cheers.

Takashi Watanabe

January 25, 2015 | 5:32 pm

Graham, thanks so much! It works fine in Win 7 with the DK2 after I made the shader changes described by Takashi. Looking forward to the next Max update now!

January 26, 2015 | 8:58 am

Thanks, I missed that comment. I’ve made those changes and pushed to the repo.

(Oddly the shader code wasn’t a problem on my Win7 machine. Must be a graphics driver difference.)

January 28, 2015 | 4:05 am

While I wait for it to download: I presume that I do not need to build the external from source unless I want to make specific changes to it? Can I otherwise simply use the pre-built external in the repo for Win 7, Max 6 and Oculus SDK 0.4.4?

I opened the oculus.maxhelp file, and below are the messages I got after I clicked the qmetro button. The display was all distorted, but I don't know what I should be seeing in the first instance; however, the distorted display does at least respond to physical movement of the headset, and the fps is at 66. I've also posted an image of the distorted view as viewed on the desktop.

Jitter initialized
oculus: initialized LibOVR 0.4.4
oculus: 1 HMDs detected
oculus: serial 204WXA05T9MM
oculus: hmdType ovrHmd_DK2
oculus: Manufacturer Oculus VR
oculus: ProductName Oculus Rift DK2
oculus: Firmware 2 12
oculus: CameraFrustumHFovInRadians 1.291544
oculus: CameraFrustumVFovInRadians 0.942478
oculus: CameraFrustumNearZInMeters 0.4
oculus: CameraFrustumFarZInMeters 2.5
oculus: DisplayDeviceName \\.\DISPLAY3\Monitor0
oculus: DisplayId -1
oculus: Resolution 1920 1080
oculus: EyeRenderOrder 1
— START GLSL INFO LOG: vp —
0(18) : error C1008: undefined variable "flat"
— END GLSL INFO LOG: vp —
GLSL program failed to compile.
error deleting GLSL program object: GL Error: Invalid value
— START GLSL INFO LOG: vp —
0(18) : error C1008: undefined variable "flat"
— END GLSL INFO LOG: vp —
GLSL program failed to compile.
ob3d_draw_preamble render lights: GL Error: Invalid value
warning: cannot contain lights – adding to parent render context
— START GLSL INFO LOG: vp —
0(18) : error C1008: undefined variable "flat"
— END GLSL INFO LOG: vp —
GLSL program failed to compile.
— START GLSL INFO LOG: vp —
0(18) : error C1008: undefined variable "flat"
— END GLSL INFO LOG: vp —
GLSL program failed to compile.


January 28, 2015 | 1:55 pm

Sorry about that — typo in the shader. I’ve uploaded the fix to the repo. If you don’t want to download it all again, you can just edit line 29 of riftshader.jxs, to change "flat" to "flattened".

Note that currently the oculus object doesn't support direct-to-rift, so you'll need to have the Rift driver running in extended mode. You can either rotate the window in the Windows display driver, or tick the box for portrait mode in the Max patch.

January 29, 2015 | 3:19 am

That seemed to do the trick – I can now see that I'm in a wireframe cage with 3 doughnuts. Head tracking is working, as is positional data, at a solid 66 fps!

Hopefully you could clarify a few points for me:

(1) When I turn my head very slowly, the wireframe cage appears to shimmer with colour, much like a CD does when reflecting light. Is this the same for everyone, or do I have a graphics issue?

(2) When the mouse is focused on the render window (by clicking and holding), the mouse movements do not rotate the scene; instead the scene moves a little bit initially, but then judders and springs back to its original position when I let go of the mouse button. Is this intentional as well?

(3) In the world patch, changing the @shape attribute from "opencube" to "sphere" does not result in a sphere enclosure being perceived in the scene. Instead the top and bottom of the "sphere" are elongated a lot, much like a rugby ball shape. Is this the same for everyone else?


January 31, 2015 | 4:27 pm

Graham, this is awesome. Do you have any idea how I could improve my fps? Right now it's around 40.

my specs:
Oculus Rift DK1
Firmware 0.18
Processor 2,3 GHz Intel Core i7
Memory 16 GB 1600 MHz DDR3
Graphics NVIDIA GeForce GT 750M 2048 MB
Software OS X 10.9.5

January 31, 2015 | 6:20 pm


(1) This is due to the combination of the chromatic aberration distortion in the shader and the chromatic aberration induced by the Oculus Rift's lenses – the shader distortion is needed to cancel out the lens aberration, but it's never perfect, and it's most visible with moving lines. I should have given the example world a skybox instead…

(2) Not intentional. I need to fix up the combination of and the oculus data.


Right — the specs of my mac laptop are almost the same, and I also get about 40fps. The Rift really wants to use a gamer GPU. (Also the Oculus Rift SDK definitely considers Windows before mac, because of the principal audience no doubt…)

One thing you could try is reducing the texture dim in the renderer patch. I get 60fps on the help example with the texture dim reduced to 1024×1024, but it also makes things a bit more blurry.

January 31, 2015 | 8:10 pm

Had a bit of fun improving the example world in the maxhelp. Funnily enough, I’m now getting 60fps at full resolution! You’ll want to download the skybox.png separately from

— Pasted Max Patch —
February 4, 2015 | 10:36 am

No lights now in 6.1 either. I just installed Max 6.1 on my new iMac in order to use the Oculus there. I don't know if the light issue crept into your external, or if it is related to the newer computer. On my 3-year-old PowerBook the lights worked in 6.1 but not in Max 7; I haven't had a chance to check it on that machine since the last update. I'm starting to get worried, as I am using the Oculus in a piece in 8 weeks. Hopefully Max 7.0.2 will come out soon and fix this problem, but now I am suspicious, seeing that it is not just limited to Max 7 but seems to affect my whole machine (new iMac 5K).

February 4, 2015 | 10:42 am

Works again in Max 6.1! I unplugged and replugged all the Oculus cords, and lights now work on my iMac in 6.1 (still not in Max 7). Go figure…

Thank you for your work!

February 4, 2015 | 11:29 am

This may be of interest to people watching this thread: a simple UDP-based bridge between Oculus, Unity and Max that also sends out your character's location in the virtual world…

February 4, 2015 | 2:02 pm

@Christopher, I’m glad to hear the lights are working. They also work in 7.0.2, which will be released very soon.

@APS502, thanks for sharing. Nice to have if your main platform for Rift graphics development is Unity but the audio, for example, is in MSP. (You might still get better latency using the Max oculus external for orientation though — if latency is important.)

February 6, 2015 | 10:55 am


Good point regarding latency, although I haven't run into any issues myself – but I'm not doing anything that is latency-critical (within reason). Yes, re. visuals in Unity and audio in MSP – this is exactly why I put it together!

February 11, 2015 | 4:08 pm

One question regarding the lights: when moving the head, the light also seems to move. I have a scene where I placed 5 lamps at the top, which I want to light static areas (later they'll move). It somehow works, but one light always seems to follow the head movement. Is there any parameter I can adjust?

February 12, 2015 | 12:23 pm

First of all I would like to thank all of you who participated in making the Oculus possible in Jitter, especially Graham for the external!
Much, much thanks, mates :-)

I just got an oculus DK2 about 2 hours ago

Here is my input:

Mbp(oct 2014)
Mac osx 10.9.5 Maverick
2.8Ghz i7
Memory 16G

In Max 7:

I had a lot of problems at first when trying to get it to work in Max 7.0.1, but when I tried it in Max 6 everything worked like a charm on the first shot.

Then when I tried it again in Max 7.0.1 it worked, but not perfectly. I have the same dark left eye problem. Also, the head tracking is a lot more jittery/shaky than in Max 6. :-(

– fps 35

– Graham, I tried your patch posted on January 31, 2015 | 8:10 pm, and I still sadly get 35 fps and lots of jitter.

Funny that I get no head-movement shaking in Max 6.1.9.

Anyways, I have it (Dk2) now and I’ll keep following
this thread and posting my results

Many Thanks again


February 12, 2015 | 12:41 pm


Hi, thanks for the detailed feedback.

The dark left eye problem is due to an issue with in Max 7.0.1, which is already resolved for Max 7.0.2. Max 7.0.2 is going to be released imminently.

I’m curious about the head tracking difference between Max 6.1.9 and Max 7. I wonder if that is related to the dark eye problem? I’ll have to look into that here.

The Windows machine I have is pretty strong; I get 60fps with the pyramid example patch with Max 7.0.2. (The older example is slower, presumably because of all the antialiasing on the cube lines.)


I’m not sure — could you send me a simple example patch? I’m not convinced I’ve got the jit.anim.node stuff right yet, this could definitely use some work; maybe that’s related?


February 12, 2015 | 6:24 pm

@graham, just a little update to say that I had not yet tried your patch posted on "January 31, 2015 | 8:10 pm" in Max 6.1.9 – I get a solid 66fps!!

Just to let you know :-)

Thanks again for the amazing work.
wow! will be such a practical tool for me.


February 23, 2015 | 4:05 pm

Update on the direct-to-rift mode path: initial research suggested that this is not possible with the Oculus SDK's current design, and I've just had that confirmed on the Oculus dev forum:

Until that changes, Oculus in Max is limited to using extended desktop mode (if you are a Mac user, that doesn't matter). I'm going to put that to one side and look at other things on the TODO list.

March 12, 2015 | 9:53 am

Max 7.0.2 was just released, and works great with the oculus external (it includes the lighting fix that resolves the dark left eye bug mentioned above).

March 18, 2015 | 9:30 am


Is anyone experiencing judder in the demo scenes? It gets worse the faster you move.

I've tried it both on Max 6, Windows 64-bit, with a GeForce GTX 780, and on a brand new MacBook Pro (with ALL the trimmings) running Windows 7 and Max 7. Both systems run at 66 fps reliably.

Any input is appreciated. The Configuration Demo and Tuscany Demo work very nicely on both systems.


March 18, 2015 | 1:41 pm

Are the demo patchers running at 60fps consistently? If you can, try the patcher below; it has a sub-patcher with more useful fps tracking — the fpsgui might only be showing an average.

Also try reducing the resolution of the frame buffer — there are some variants on the right-hand side of [p oculus_render]. If reducing the resolution helps, then the issue is GPU bound.

It might also be worth trying turning @sync off on the jit.window, see if that helps?

One thing that has not yet been implemented for Max is the time warping, which could have a big impact on reducing judder; that might well be the difference. I’m working on it…

Another thing is, to make a fair comparison, you should run non-Max demos in extended display mode (not direct-to-rift). It might make a difference. Unfortunately, the current Oculus SDK just isn’t compatible with running direct-to-rift or software rendering from a plugin (which is what we need for Max), so extended display + client rendered is our only option for now. Not sure if this makes a big difference to performance or not.

March 18, 2015 | 1:42 pm

Here’s the patcher

March 21, 2015 | 5:10 am

Hi Graham,

It's great to see you working so hard to bring the Oculus Rift into Max!
I'm considering buying an Oculus, but I'm a little hesitant at the moment.
I mainly work in Jitter and would love to experiment with the Oculus – is it easy to get it working with your current patch on Mac OS?
Are you planning on maintaining development until the current version is finished?
I'd just like to know it's going to work before I spend that much money :)
Thanks !

March 21, 2015 | 6:01 am

Works fine on OSX, but it would need to be a high-end machine for Oculus work in general (not just for Oculus on Max, but for Unity etc. also). I have a fairly new high-end macbook pro, and it can just about handle it, so long as the scene isn’t too complex. I.e. good enough for development, but just not quite good enough for regular use.

Yes I’ll be continuing to maintain it, and hopefully soon other HMDs too (albeit not at a breakneck pace…)

March 21, 2015 | 8:44 am

Thanks for the info Graham. I've got a MacBook Pro that's about 5 years old, with a 1GB graphics card and 8GB of RAM.
Hopefully that's enough to pull it off.

March 23, 2015 | 12:57 pm

I've banged my head against this for a while and haven't come close… I want to create a regular view of the oculus world in another OpenGL context, for viewing on a projector so the audience can see what the wearer is seeing – with the best frame rate possible, and with the camera view settings synced to the oculus world. Has anybody done this yet who would be willing to share? Otherwise, at least let me know how I might do this.

March 23, 2015 | 1:19 pm

It's not really a solution, but I've been thinking about this as well. I'm planning on doing it with Unity though, and to make life easier I'll be controlling another camera in another instance of the virtual world on a second machine via UDP messages.

Maybe a similar method would work in MAX?

I’d be glad to know how you get on either way.


March 23, 2015 | 3:19 pm

A couple of different ways of doing it.

A) One way is to run two Max applications, synchronized via network commands. This works well if the content behaves deterministically or is derived from a database that can be accessed by both applications, or application state is simple enough to be shared via local network messages. The second screen viewpoint can be different than the oculus viewpoint.

B) Otherwise, you can use a second window with a shared context. Set the [jit.window oculus] to have @shared 1. Create a 2nd window, also shared, e.g. [jit.window 2nd @shared 1], and create the corresponding renderer as [ 2nd @shared_context oculus]. This will let you share textures between the two windows. To get the oculus view into this second window, there are a couple of options:

1) The easiest way is to attach the outlet of the [ world] (inside the [p oculus_render] sub patcher) to a [ 2nd @transform_reset 2]. You could also change @scale to fit nicely into your window dimensions.

2) Alternatively, create a new [ world @capture 1] and hook that up to your [ 2nd @transform_reset 2]. This gives you more flexibility, including having the 2nd-person view having a different location/view than the oculus HMD, but will be more expensive as it requires an additional render of the scene. You probably also want to set the camera resolution to match the 2nd display, e.g. [ world @capture 1 @adapt 0 @dim 1024 768] etc.

Hope that helps.

March 23, 2015 | 3:21 pm

Here’s a replacement oculus.maxhelp that uses option B1 above.

  1. oculus.maxhelp
March 27, 2015 | 12:06 am

Thank you. It is working great!

April 3, 2015 | 4:51 am

Hi Graham,

Got the patch working just fine – constant FPS between 60 and 66 with sync off.

– I get the famous "head jitter effect". Refresh rate is 75 Hz, resolution 1920 1080 (1080p).
Any tips on how to reduce this??

I've got an AMD Radeon HD 6750M 1024 MB, 8GB of RAM, Mac OSX 10.9.5.
All tips welcome, and great work by the way!!

April 3, 2015 | 1:36 pm

Hi Andro,

I was also seeing more judder than I think there should be, even though the FPS is good. I streamlined the navigation subpatcher and now it looks a bit smoother to me; could you try downloading again and let me know how it is for you?

I’m hoping that adding the timewarping will improve the experience even more. I will be working on it next week.


April 5, 2015 | 5:55 am

Hi Graham, I've got the oculus for 3 more days. I'll be able to test it out tomorrow. I already tried running the patch in presentation mode, with the preview window turned off. No joy though. I think rendering directly to the rift is key here.
Doesn't the oculus have to run at 75 FPS to eliminate judder?
I also tried lowering the resolution of from 2048 x 2048 to 1024 x 1024. Playback seemed a lot smoother, but I think the shader doesn't support the resolution, as I get glitches during playback.
Is it possible to make a simple toggle function to halve the resolution for testing purposes?
Having a low/med/high quality setting would allow people to do a more detailed benchmark test on different machines.
I'll also be testing this patch on my desktop PC tomorrow to measure the differences.

April 5, 2015 | 11:49 am

Hi Andro,

Running at 75Hz is preferable, but unless your primary monitor display is also running at 75Hz (unlikely) your GPU might be doing juddery things to the DK2. A lot of DK2 users have reported this issue. Here are some of the common workarounds to try:

– If your primary monitor can do 75Hz (sometimes possible at a lower resolution), that would be the first thing to try.
– On Windows, Aero forces all monitors to vsync to the primary display, perhaps disabling it might help.
– Alternatively you could try setting both monitor and DK2 at 60Hz and see if that is better (though you might see flicker… varies from person to person).
– Another option is to use the rift as primary (or only) display, but that’s a rather drastic solution, as handling the desktop in the HMD is almost impossible.

I’ve also just pushed another update that adds the low persistence option as an attribute. This might also help improve the experience — if you have time to try this and/or the resolution/refresh rate options it would be good to hear your results.

Direct to rift might be preferable, but I've also seen developers complain about the 60/75Hz judder in that mode too. In any case, until Oculus update the SDK this just won't be possible via Max, because their support for OpenGL is just not there yet. I'll keep following the Oculus developer updates and try again if there's a change. There was an SDK update on Mar 26, but I'm hearing mixed responses so far, so I'll wait a bit before updating.

About the resolution question: "I also tried lowering the resolution of from 2048 x 2048 to 1024 x 1024. Playback seemed a lot smoother but i think the shader doesn’t support the resolution as i get glitches during playback." — I can’t reproduce any error here. I’ve just updated the oculus.maxhelp again and added a more friendly umenu for trying out resolutions — can you let me know if it works? Thanks!

And finally, I will also work on timewarp later this week, which may also help.


April 6, 2015 | 9:42 am

Okay, I just tested on my Windows desktop in the office, and can confirm these conditions of judder / no judder:

– main display at 75Hz, rift at 75Hz -> no judder
– main display at 60Hz, rift at 75Hz -> horrible judder
– main display at 60Hz, rift at 60Hz -> no judder, but some faint blur/flicker

So overall it is essential that the main display and rift are set to the same refresh rate. Your main display might support 75Hz at lower resolutions.

Enabling the low persistence attribute definitely helps keep things in focus when turning the head. The motion blur becomes more like a faint ghosting effect. This is probably the same thing as the "black smear" mentioned in

April 6, 2015 | 12:53 pm

Alright, I'll be running all my tests tomorrow, Graham. I don't have the foggiest clue how to change the refresh rate of my MacBook Pro though! I just cannot find an option to change it :P
Any tips? I took a quick look on Google but couldn't find anything.

April 12, 2015 | 7:25 am

@Andro, sorry – I was just looking for the same thing myself and failed to find a solution. Maybe the Retina displays are fixed at 60Hz?

By the way, I just found Tobias’ project using the oculus external on the C74 projects page: So nice to see it being used!

@Tobias, if you’re reading, I’d love to hear how it went, e.g. what was easy vs. what could have made things better, etc. Thx.

April 12, 2015 | 8:57 am

Hi Graham,

Yeah, it was a real pleasure to work with your external and the Oculus in Max! I definitely continue to use it for new projects (and some of my students in Shanghai do as well…). I got a stable 60fps (with a new Windows gaming laptop) and only random crashes (1 in 10) when not closing the Max app after each turn, but I guess that is more related to Max itself… So in the end I always restarted Max for each turn, and that way it never crashed again. I programmed it first in Max 6, but since the 7.0.2 update I used Max 7 to present it. Thanks to the help patch, the external itself is very easy to use… I am running at 60Hz and don't experience any blur/flicker… The preview monitor you added in the recent help patch makes testing much easier (I had built one myself before). I worked together with a musician (Ableton Live) and we did some localization of objects/sound sources with hoa~, which worked in general, but it was sometimes buggy and always had quite some latency… I will try to figure it out a bit better, and if I succeed I will post here again…
I had some struggle with light: it somehow seemed to move with the head rotation – I also experienced that with the help patch you posted before, with the 3 gridshapes in the world box… In the end I gave up on the idea of using light very precisely (I had altogether only 3 weeks for the video), but I might also just need to work my way better into all that lighting, shadow + material stuff…
In general, thanks so much for the external! It is really great and easy to use!

April 12, 2015 | 9:30 am

I also tried some patching to detect if objects are in view (to trigger events based on that); that was also working, but I'm not sure if my approach was very clever. I will have a look into that as well and post once I've worked it over…

April 12, 2015 | 10:01 am

Thanks Tobias for the feedback.

I can’t reproduce your problem with the lights unfortunately, tried multiple lights, directional, point and spot, everything looked OK. If you can send me an example where it doesn’t work I will look into it.

Updating the Cosm stuff to the Max 7 world is high on my agenda, especially now as they will fit nicely into the oculus world. I have some ambisonic spatialization stuff done, mostly via gen~, already. With much on my plate, a release will probably be during summer. There should be virtually no latency for spatialization (unless it is added for Doppler distance effects).

I also have some students here in Toronto working with the oculus… it’s an interesting experience!


April 12, 2015 | 10:04 am

I’m also working on a new 3D audio library that uses Kinect for Windows 2 to try and find a good set of HRTF filters for a given user. With a lot of luck I’ll also have this working at some point during the summer, there were good early results in the lab.

April 20, 2015 | 10:07 am

Update: just tried the oculus object and downloaded the new library from GitHub.

Big thanks for all the new updates in the .maxhelp: preview monitor, fps in the oculus monitor, etc.
Really well done :-)

Just to report:

I'm on an MBP Retina (late Oct 2014), OSX 10.9.5 Mavericks / 2.8GHz i7 / 16GB RAM.
I am getting really bad judder, and as you and Andro said, I cannot find how/where to change the monitor refresh rate.
Couldn't find it in Mac forums either. <– Do you guys have any update on this??

I will be using this in a project really soon, will let you know how it goes.

Thanks again for the great work


April 20, 2015 | 10:44 am

Yes, I did some research on this. I could not find any way to change the refresh rate of MBP retina displays, nor disable them. I’m not even sure whether you can drop the Rift down to 60Hz via the OSX Displays Preferences (don’t have it handy to check) — 75Hz would be better, but having both screens matched is more important. If you find a way, please let us know!

Also watch out which port you use on the MBP, I think one of them is attached to the integrated GPU rather than the Nvidia one… you definitely want to use the Nvidia one! I think it is the thunderbolt/displayport port, not the HDMI one, but I’m not 100% sure.

Also disable "automatic graphics switching" in the Energy Saver preferences.

Otherwise, I recommend using a desktop machine with an attached display that can support 75Hz for the best experience. It really does make a big difference.

Just to be clear (for anyone else reading), this is not a Max issue, it is an OSX/Oculus issue.

April 20, 2015 | 9:45 pm

Thanks for the great tips Graham, will definitely let you know if I find anything.
I tried both USB inputs and I am getting the same results on both sides.

Again will let you know if I find anything


April 22, 2015 | 1:53 pm

Mac OSX and Windows 7 Bootcamp / Oculus DK2 testing results

Mac OSX tests

MaxMSP setup
– the best settings with no judder have been:
– lowpersistence 0
– dynamic prediction 0

System Preferences

– In Preferences / Gather Windows, you do have access to change the refresh rate on the Oculus to 60Hz.
I did as Graham suggested and tried to match the MBP Retina's 60Hz, but the judder remained.

The best results were:
– gather displays / Rift DK2 / Scaled 1080 948 / refresh automatically goes to 120Hertz

I read the best situation would be to have "Direct HMD Access from Apps" – direct to the GPU, bypassing the OS –
but as the Oculus configuration utility indicates, it is presently not available for Mac.

Using an HDMI to Thunderbolt adapter:

– no difference in judder
– could not reset the resolution in System Preferences / I get a glitchy window.

Windows 7 tests (Bootcamp)

– In the configuration panel, I cannot change the display to anything other than 60Hz for the MBP monitor.
– In the Oculus configuration utility, having the Oculus set to "Direct HMD Access from Apps" creates crazy flickering. <— I decided to set it to "Extend Desktop to the HMD" and set the refresh rate to 60Hz. The result was… JUDDER ;-(

– Tried testing with "Direct HMD Access from Apps" on the Tuscany Demo <— still JUDDER.
Almost barfed on this one.

There's one last thing left to test if you guys want, but I'll leave it until later – I've been at it for 4 hours now.


Conclusion for now:

For me the best results = Mac with Resolution 1080 X 948 setup in system preferences  

April 22, 2015 | 6:33 pm

Hi Phiol, thanks for the detailed feedback!

In your "gather displays /Rift DK2 /Scaled 1080 948 /refresh automatically goes to 120Hertz" configuration, is judder actually reduced? Is the oculus display actually showing 120fps, or 60fps (make sure the oculus window has @sync 1 and show the FPS with the ‘f’ key.) I guess I’m a bit doubtful whether the macbook retina can actually drive a display above 60hz — but happy to be wrong!

FWIW Direct to HMD mode is not currently an option for Oculus in Max, because of the way the Oculus SDK is designed. (Also, although it is ‘recommended’, there have been a lot of reports on Oculus forums of it being less stable than extended mode…) I guess this must be something they’re working on for the release.


April 28, 2015 | 8:23 pm

Hello Graham

To answer your questions:
>>In your "gather displays /Rift DK2 /Scaled 1080 948 /refresh automatically goes to 120Hertz" configuration, is judder actually reduced?
– The final answer is no, it wasn't really. And yes, the Retina does seem to be able to go beyond 60Hz.


I tried another approach, using Rob's with a HAP codec video. And voilà, I was surprised to see
no judder! Sometimes very, very slightly, but 97% no judder!

Patch and file setup:
– I textured a video to an open cylinder @scale 50 50 50. @dim 20 20.
– I needed the sound to pan based on the head Y position. Also, I used your [quat2euler].
So for sound, I decided that MSP would drive the frame position, because [spigot~] was simply not working at all, even when using @engine qt.
– Video file: dim = 4096 × 2048 <– HUGE / type: HAP codec @ medium <– converted using QuickTime Pro 7 / file size: 6.31 GB
– Sound type: AIFF, also exported using QuickTime Pro 7.

The original video file I was using was AVC-coded H.264 <– the hardest one I know.
I never tried ProRes 422; I will though.
Of course I had tried with the much smaller chickens.mp4 and the legacy files, but I was still getting judder.

Now everything runs smoothly at a constant 60fps. Maybe 30fps for the first 10 seconds,
then everything gets into gear (60fps), with hardly ever any judder. MSP syncs with no judder.

Thanks again for all the hard work you put into this.
Keep 'em comin' and I'll keep a-testin' and reportin' ;-)


April 28, 2015 | 8:50 pm

OK — so are you saying that the judder was actually the inability to sync from the audio file at 60fps?

April 28, 2015 | 9:01 pm

No, because initially I never had MSP driving anything. I only added that last night.

I'm in Max 7, so I tried both @engine avf and qt, and automatic 1 and automatic 0 <– with an explicit bang from the master qmetro, and @colormode uyvy and rgba. But the file was always H.264.
I tried quite a few combos, but the best ended up being the Hap codec.

I don't know if this is reliable enough to draw any conclusions, but I tried it on a friend's older MBP retina and he's getting 75fps with the same patch, with a different Oculus unit. But he doesn't get the display screen; it's all black on his computer.

We both get the error :
jit.displays: display not found at index
jit.displays: display not found at index

Which doesn’t seem to cause any issues on my computer.

There you go

April 29, 2015 | 5:56 am

I forgot to mention that I have the oculus set to 60Hz in the system preferences

May 1, 2015 | 7:23 am

Hello all, thank you for your excellent work. I am new to both the Oculus and Max 7, and very excited to get Max 7 running with my Oculus Rift DK2. But after I downloaded the package into my folder:

\Documents\Max 7\Packages

and opened \help\oculus

the Max 7 console keeps saying:

newobj oculus: No such object

Could anyone help with that?



May 1, 2015 | 10:09 am

Hi QiangPan Chen,

Beware, things have changed since Max 7.
Any time you use a 3rd-party external (an object that doesn't come with the Max 7 libraries), you have to place the object or the entire folder in a specific location. In this case the folder is "max_oculus-oculus_package". You place it in MacintoshHD/users/shared/Max7/Library/max_oculus-oculus_package

Now try it and start flying in a free fake world :-)


May 1, 2015 | 5:18 pm

Hmm, that shouldn't be necessary — it should be enough to place the entire max_oculus-oculus_package folder into the standard packages folder, which on OSX is ~/Documents/Max 7/Packages/ (i.e. Macintosh HD/Users/<yourusername>/Documents/Max 7/Packages/), and on Windows is My Documents/Max 7/Packages.

(You might have to create the Packages folder, but the Max 7 folder should already exist)

Are you running on Windows or Mac? If on Windows, you need to use the 32-bit version of Max (I haven’t built & tested for 64-bit yet).

May 5, 2015 | 8:24 am

Hi all,

I am wondering if this Max project can get the data from the Oculus Rift DK2? I looked into the project, but I could not find anything yet, maybe because I am not really advanced with all of this. Can someone give me tips?

Thank you in advance!


May 5, 2015 | 10:34 pm

Yes, DK2 supported, both for tracking and for display.

See README for current status:

May 15, 2015 | 7:10 am

Hi @Graham, my name is Franky. I am doing a BA project using the Oculus and Max/MSP.
I wanted to ask you (this may sound stupid): how do I implement 3D sound in Max/MSP using the new Oculus Audio SDK?

Please let me know. You seem to be the expert on this.

Thank you so much.

Best, Franky

May 16, 2015 | 3:42 am

Hi Franky,

I have not yet integrated the new audio features, as they are not publicly available for C/C++. I can make a request ("The optional OVRAudio C/C++ SDK is available to qualified developers by contacting developer support directly."), but I doubt that the request will be fulfilled, as I would need to distribute it. Perhaps they are doing this because the API is not yet stable, or perhaps it is because it is licensed code? Either way I can’t integrate it yet.

In the meantime, there have been *many many* projects for spatializing audio using Max/MSP, and it should be fairly straightforward to use them with the oculus external. I wrote a few MSP externals for spatializing audio via ambisonics, which I may integrate into the oculus example via gen~; but really for an HMD you probably want to use an HRTF.


May 16, 2015 | 4:00 am

Note that Oculus *is* providing VST plugins, which can be hosted in MSP via [vst~], though on OSX only for 64-bit mode.

AFAICT these plugins do not take into account head orientation, so you would need to derive the relative position within the patcher. And you'd need an instance of the plugin for each sound in your world, and you'd get a separate reverb for each one too (wasteful). I would still very much recommend using a pure MSP solution rather than the Oculus audio SDK at this time. If they open up the C/C++ headers to the public, I will see if they can be better integrated with MSP.
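
To illustrate what "derive the relative position" means in practice, here is a rough C++-style sketch of the math (purely illustrative, with made-up names, and assuming a -Z-forward convention like Jitter's): rotate the head-to-source vector into head space using the inverse of the HMD quaternion, take an azimuth from it, and turn that into equal-power stereo pan gains. The same arithmetic ports easily to expr or gen~ inside the patcher.

    #include <cmath>

    struct Quat { float x, y, z, w; };   // HMD orientation (unit quaternion)
    struct Vec3 { float x, y, z; };

    // Rotate v by the conjugate (inverse) of unit quaternion q,
    // i.e. bring a world-space direction into head space.
    Vec3 rotateByConjugate(const Quat& q, const Vec3& v) {
        Quat c{ -q.x, -q.y, -q.z, q.w };
        Vec3 t{ 2*(c.y*v.z - c.z*v.y), 2*(c.z*v.x - c.x*v.z), 2*(c.x*v.y - c.y*v.x) };
        return { v.x + c.w*t.x + (c.y*t.z - c.z*t.y),
                 v.y + c.w*t.y + (c.z*t.x - c.x*t.z),
                 v.z + c.w*t.z + (c.x*t.y - c.y*t.x) };
    }

    // Equal-power stereo gains from the head-relative direction of a source.
    // (Rear sources fold to the front; a plain stereo pan can't do better.)
    void panGains(const Quat& headQuat, const Vec3& headPos, const Vec3& srcPos,
                  float& gainL, float& gainR) {
        Vec3 rel{ srcPos.x - headPos.x, srcPos.y - headPos.y, srcPos.z - headPos.z };
        Vec3 local = rotateByConjugate(headQuat, rel);
        float azimuth = std::atan2(local.x, -local.z);   // 0 = straight ahead (-Z forward)
        float p = 0.5f * (1.0f + std::sin(azimuth));     // 0 = hard left, 1 = hard right
        gainL = std::cos(p * 1.5707963f);
        gainR = std::sin(p * 1.5707963f);
    }

This only covers the orientation/position bookkeeping; the actual rendering (HRTF, ambisonic decode, reverb) is where the existing MSP spatialization projects come in.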

May 19, 2015 | 7:05 am

Hi Graham, Thank you for the reply..
Yes you are right.. I have been advised at school to use MAX for spatializing the sounds.

Btw, your Oculus patch already integrates the head tracking from the Oculus, right? I have not tried it yet.

Please let me know,

Thank you

May 19, 2015 | 7:21 am
May 19, 2015 | 7:31 am

@Pelo, yes it is.. but I am getting the DK2, which is fully supported on Mac ;)

May 19, 2015 | 6:10 pm

DK2 is fully supported on OSX, and in Max/MSP. The blog article is about the commercial version of the Rift, due early next year:

OSX and Linux development is "on hold" for this model — basically they're up against a deadline of what they promised to deliver and are concentrating on the major platform first. No surprises here. I would expect OSX and maybe Linux support to return a couple of quarters later.

Also no surprise that almost no laptop is currently powerful enough to drive the commercial Rift — dedicated 90Hz output at that resolution is pretty tough to achieve without a high-end discrete GPU. They want to make sure that VR is successful as an industry, and not fail (as it did in the 90's) due to underpowered technology that makes people feel sick. They want to avoid nausea-inducing VR a la Google Cardboard, which I totally get.

I had a chance to try the Crescent Bay prototype last week at an Unreal summit in Seoul, and it was amazing — definitely a big step up from the DK2 experience. This is much closer to what the commercial model will be like. For the next year or so *at least*, this will only be possible from tower PCs.

Of course you can still develop and prototype stuff on a laptop! I have a high-end Mac laptop and that's barely enough to run the DK2 (it doesn't get me the 75Hz the DK2 really needs for a smooth experience). But it's useful to develop via the laptop and then use the DK2 from my Windows tower.

What I’m more concerned about is whether they drop the extended mode for the Rift — because direct-to-HMD isn’t really compatible with Max/MSP as they currently implement it. Hopefully there’s enough other developers needing the extended mode that they will keep it supported.


May 22, 2015 | 11:59 am

Hello Graham

Is there some way to contact you outside of this forum about a matter unrelated to the Oculus? My address is


May 27, 2015 | 3:59 am

Hi Graham, Thank you for the reply.

Graham, could you advise me on something, please? I am currently reading an OpenGL book (given to me at uni), but I am struggling to see how I can use it for my project in Max/MSP/Jitter. I am looking for a guide that gives me an understanding of "creating an interactive 3D audio environment" in Max/MSP/Jitter. The book I am reading is called "OpenGL Programming Guide v3.0 and 3.1"; it seems great, but I can already create 3D stuff in Max without knowing anything at all, so I am struggling to understand how it can help me. It is quite heavy and boring, actually.

This page seems much, much better:

I know you have lots of experience on this subject. Could you please advise me on any books or net resources I could use for my studies?

btw feel free to email me on:

Thank you for your time Graham,



May 28, 2015 | 9:53 pm

Hi Franky,
I'm not exactly sure how to respond to the question — it's too broad a topic ("creating an interactive 3D audio environment" is a bit vague), and definitely more general than the Oculus Rift discussion (this thread).

What I can say is that all Jitter 3D is built on OpenGL, and knowing something about OpenGL can help in understanding Jitter (and vice versa). The books you have been given are reference texts, meant to be consulted while coding, which is perhaps why they seem "boring". (Also, OpenGL itself does not provide any support for audio, so you will not find any guide to creating an audio environment in those books.) 3D graphics and immersive world-making are truly vast areas of expertise with a lot to master. Don't be discouraged; just make do with what works until you need to understand more. It's more fun that way.


June 11, 2015 | 7:28 pm

Warning: Oculus released an updated 0.6 SDK for Windows (likely everything oculus is going to be Windows-only for the next 9 months…) **For now this won’t work with the current [oculus] max object — you’ll need to stick with the 0.4.4 driver for Max.** (But you will have to wait for them to fix their website so that you can actually get to the download link… sigh).

Looking ahead, there’s some good news in this new driver, which I’ll try to add support for as soon as I can. First, they got rid of the silly OpenGL driver shims, so we might be able to get full support in Jitter again, including direct-to-rift mode, timewarp, etc. Second, they removed ‘application distortion rendering’ and moved all distortion rendering into the driver itself, via shared textures. In plain terms this means that max won’t be doing the distorting anymore (removing the need for the shader). All I’ll need to figure out is how to get Jitter to talk to the SDK-created textures smoothly — hopefully there’s a way to do that without having to modify Jitter itself :-)
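
For what it's worth, the shape of what I'm hoping for is roughly the sketch below: once the SDK hands over a texture ID, Jitter's rendered output just needs to be copied into it with a plain framebuffer blit, and the driver takes care of distortion and presentation. This is only a sketch with placeholder names (sdkTextureId, jitterTextureId, and a GLEW-style loader are all assumptions), not working code against the 0.6 API.

    #include <GL/glew.h>

    // Copy a Jitter-rendered texture into a texture owned by the Oculus driver.
    // No distortion shader on our side: the driver does that after submission.
    void copyIntoSdkTexture(GLuint jitterTextureId, GLuint sdkTextureId, int w, int h) {
        GLuint readFbo = 0, drawFbo = 0;
        glGenFramebuffers(1, &readFbo);
        glGenFramebuffers(1, &drawFbo);

        glBindFramebuffer(GL_READ_FRAMEBUFFER, readFbo);
        glFramebufferTexture2D(GL_READ_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, jitterTextureId, 0);

        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, drawFbo);
        glFramebufferTexture2D(GL_DRAW_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, sdkTextureId, 0);

        // straight 1:1 copy of the already-rendered eye textures
        glBlitFramebuffer(0, 0, w, h, 0, 0, w, h, GL_COLOR_BUFFER_BIT, GL_NEAREST);

        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glDeleteFramebuffers(1, &readFbo);
        glDeleteFramebuffers(1, &drawFbo);
    }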

June 11, 2015 | 7:53 pm

Thanks a lot for the heads up Graham.

So basically what this might mean is: less heavy lifting for Jitter == better performance and thus less judder.
But we won't know this for the next 9 months. ("likely everything oculus is going to be Windows-only for the next 9 months…") <– why do you think this is so?

thanks again Graham :-)


June 11, 2015 | 10:10 pm

Yes, less heavy-lifting for Jitter, and **possibly** improved performance. The distortion pass isn’t an expensive part of the process compared to rendering the scene twice, but it might shave off a little by effectively moving this out of Jitter’s swap interval.

I would expect Windows-only for the next 9 months because Oculus is racing to get the commercial Rift out by Q1 2016, and has already stated that initially it will be Windows-only. There are also numerous comments on the forums to the effect that Mac OSX development has been "paused", e.g.

Also, note that the commercial model will run at 90Hz — more demanding than the DK2. The recommended PC specs are not conservative, see

June 11, 2015 | 10:50 pm

I recently tried to publish as an app via Oculus Share, but a couple of days ago I got this message:

"Unfortunately, we have rejected "Whiter Room " for the following reason(s):"Crashes on startup (after black screen for awhile.) System specs: Windows 7 64-bit, Oculus Runtime 0.6.0 "

I forgot to tell them that Quicktime needs to be installed for the app to work, but might this also be related to:

"Warning: Oculus released an updated 0.6 SDK for Windows (likely everything oculus is going to be Windows-only for the next 9 months…) **For now this won’t work with the current [oculus] max object — you’ll need to stick with the 0.4.4 driver for Max.** "

I have run my app on 3 different Windows machines without problems…

June 11, 2015 | 11:00 pm

Yeah — I’ve got a bunch of stuff running very nicely on Windows too, but it won’t be accepted since they mention "Oculus Runtime 0.6.0" in their test system specs. I wouldn’t expect anything made in Jitter to be accepted to their app store until the oculus external works with their latest SDK, and that’s hard for me to keep up with because they change things so often. Not a criticism — this **is** a developer kit, it’s expected that things will change — and the 0.6 SDK is actually going to be better for Jitter integration once I get it working. Hopefully sooner than later.

Hmm, I thought Quicktime was no longer necessary for Jitter, but apparently I was wrong.

June 11, 2015 | 11:13 pm

Ah ok, I don't know about Quicktime then; I just thought that this could have caused the problem (I have some videoplanes / objects playing Quicktime movs at the beginning), but now I understand…

Thanks for that explanation and for all your efforts for the jitter community!


June 12, 2015 | 6:12 am

Just to say that you can still get the 0.4.4 beta and the newer beta on the revamped website – go to Downloads on the developer site, select PC first, then you can choose these versions from the list (for Mac, Win and Linux). If you have "any" platform selected, it won't work for some reason.

I've been using the newer beta and it seems to work – is there a reason to go back to the 0.4.4 beta instead?


June 12, 2015 | 12:44 pm

Hi Graham, your Max object works great here with the runtime (OSX Yosemite). I had to use that runtime because 0.4.4 won't work with WebVR such as:

Anyway.. Thank you so much for updating this forum. ..and thank you guys too.. looking forward to any updates.


June 15, 2015 | 12:43 pm


Could someone, probably Graham, clarify whether the Oculus external works with Max 6?

If so, I presume it needs runtime 0.4.4 (maybe 0.5) to run, based on the discussion above?


June 15, 2015 | 6:56 pm

Yes, the external works with Max 6. A couple of objects in the help patcher will need to be replaced with equivalents (or just remove the FPS counter display).

June 17, 2015 | 11:43 am


Something's not right, not sure what. I changed one object to an equivalent and removed the FPS counter display. This stopped the error messages. However, I only see circling blocks around a pyramid on one side of the viewport, and I get the following messages on loading the modified patch:

can't find file skybox.png
oculus: 1 HMDs detected
Gen working in runtime mode
texdim: bad number
DistortedViewport: bad number
ViewAdjust: bad number
FOV: bad number
uvScaleAndOffset: bad number
warning: cannot contain lights – adding to parent render context

Any suggestions?

June 18, 2015 | 7:34 pm

@APS502 — the missing-file errors mean packages don't work in your version of Max 6 — you'll need to update to the most recent version of Max 6. Or maybe it means you need to put the oculus_package folder inside Documents/Max rather than Documents/Max 7? See
However, it might be that although the oculus external works in Max 6, the rendering might not — the bug of only seeing the scene in one eye (mentioned earlier in this thread) was fixed in Jitter relatively recently, and I'm not sure whether the fix was incorporated into Max 6 (sorry, I don't have a way to check that right now). If not, then you'll probably need to use Max 7 after all.


September 9, 2015 | 4:39 am

Hi @Graham, I hope you are great. I started my BA project a few weeks ago, and I am having some problems trying to make the Leap Motion interact with phys.body objects in Max.

The hands are drawn with an object that renders a GL visualisation of the incoming data from the Leap. They are not a phys body themselves, therefore they cannot collide with anything (is this correct? ...and what do I need to make them interact with other phys.body objects?)

Please let me know if you know anything about this. Thank you.

(two screenshots attached)


September 21, 2015 | 7:06 am

Hi guys.
I have a quick question: I am using Graham's Oculus patch (working fine btw) with the Rift, and I have designed physical hands (using a Leap Motion) to interact with objects within a world.
Everything works great. I have made 6 jit.phys bodies which represent my palm position and tip positions. Now I can grab / move things and do other interactions in the world.

The main problem I have right now is that when I move / navigate around, my hands (the visualization of the Leap Motion) do not follow the head-tracking position, so they remain in the same place when I navigate within the world.

I know I can get the head position (quat / position) from Graham's oculus object, but I am not sure how or where to send those values so that the hands follow the character position (navigation / head position).

Please let me know if anyone can help.
Thank you so much.

September 21, 2015 | 8:07 am

Hi sounds great, would love to see it. I’ve been working on a leap external too, but hadn’t got as far as integrating the two.

I presume that the coordinates coming from the leap motion are relative to the sensor itself. So you need to rotate them by the HMD orientation and offset them by the HMD position to put them in world space. I would guess that a relatively easy way to do that would be using a jit.anim.node (or jit.phys.node) to group all the leap-based objects. Send the oculus quat/position to the group node. Space transformations are sensitive to order-of-operation, so you might need to try chaining two nodes (one for rotation, one for position) and switching their order. You might also need a further rotation/position node for finer calibration — especially since the sensor is actually ~10cm forward from the middle of the head!
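
To make the transform concrete, here is a rough C-style sketch of the math that node chain would be doing (illustrative only; the names, axis conventions and the exact calibration offset are assumptions you would need to tune):

    #include <cmath>

    struct Quat { float x, y, z, w; };
    struct Vec3 { float x, y, z; };

    // Rotate v by unit quaternion q (apply the head orientation to a local point).
    Vec3 rotate(const Quat& q, const Vec3& v) {
        Vec3 t{ 2*(q.y*v.z - q.z*v.y), 2*(q.z*v.x - q.x*v.z), 2*(q.x*v.y - q.y*v.x) };
        return { v.x + q.w*t.x + (q.y*t.z - q.z*t.y),
                 v.y + q.w*t.y + (q.z*t.x - q.x*t.z),
                 v.z + q.w*t.z + (q.x*t.y - q.y*t.x) };
    }

    // Leap coordinates are relative to the sensor mounted on the HMD, so:
    // 1) apply a fixed calibration offset in head space (the sensor sits roughly
    //    10cm in front of the eyes; the value here is an assumption),
    // 2) rotate by the HMD orientation, 3) add the HMD position.
    Vec3 leapToWorld(const Vec3& handLeap, const Quat& hmdQuat, const Vec3& hmdPos) {
        Vec3 calibrated{ handLeap.x, handLeap.y, handLeap.z - 0.1f };
        Vec3 rotated = rotate(hmdQuat, calibrated);
        return { rotated.x + hmdPos.x, rotated.y + hmdPos.y, rotated.z + hmdPos.z };
    }

A jit.anim.node hierarchy is meant to do this bookkeeping for you in the patcher; the sketch is just to show the order of operations (offset, then rotate, then translate).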

Please post an example if you get it working!


September 21, 2015 | 8:15 am

Hi Graham,
Fantastic. Thank you so much. I will work on what you mentioned this week.

Btw, I mainly replaced elements of your world with mine (p world), as suggested in your patch.
I will def post the patch once I have managed to do it.

Thank you again for this.


September 22, 2015 | 4:02 pm

Hi Graham,

Maybe you can confirm something for me that I currently suspect – if I were to fork your code and try and adapt it to support the 0.7 SDK, it wouldn’t work, as they have removed support for Extended Monitor mode in this version of the SDK.

I understand that due to the incompatibility between the Max plugin system and the direct-to-Rift mode’s need to be closer to the metal wrt OpenGL calls and call ordering, the best we can hope for is a stable version for the 0.6 SDK, and that we should still be using the 0.4.4 SDK for the time being.

Is this right?

If so, it’ll give me a clear direction of how I can best contribute, as it seems like you were already working on updating it to support 0.6

Franky – at a theoretical level, you face an additional challenge given that the hands are not directly connected to the head – there is of course the whole torso and arms in between.

Do you have a Kinect for Windows unit by chance? I've been trying to solve a similar problem, one that involves more hardware but removes the problem you have – a Kinect will be able to track these missing hierarchical joints, and you can apply some basic animation principles (i.e. kinematics / inverse kinematics) to track an entire avatar by carefully combining the data coming in from the Rift, Kinect, and Leap Motion.
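
To give a flavour of the "basic animation principles" part: an arm is essentially a two-bone chain, and the classic analytic solution is just the law of cosines. A rough sketch (illustrative only; a real rig also needs a swivel / pole vector to fix the elbow plane):

    #include <algorithm>
    #include <cmath>

    // Analytic two-bone IK: given upper-arm and forearm lengths and the current
    // shoulder-to-wrist distance, recover the elbow interior angle and the amount
    // the upper arm bends away from the shoulder->wrist line.
    void twoBoneIK(float upperLen, float foreLen, float shoulderToWrist,
                   float& elbowAngle, float& shoulderBend) {
        // clamp to a reachable distance so acos stays in range
        float d = std::clamp(shoulderToWrist,
                             std::fabs(upperLen - foreLen) + 1e-5f,
                             upperLen + foreLen - 1e-5f);
        elbowAngle   = std::acos((upperLen*upperLen + foreLen*foreLen - d*d)
                                 / (2.0f * upperLen * foreLen));
        shoulderBend = std::acos((upperLen*upperLen + d*d - foreLen*foreLen)
                                 / (2.0f * upperLen * d));
    }

In practice you would feed a Kinect shoulder position and a Leap/Rift-space wrist position into something like this to reconstruct a plausible elbow when neither sensor sees it directly.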

The end result is an avatar that can see its own body/finger movements in realistic detail, pick up objects with any thumb/finger combination, could make a snap sound if thumb/finger was detected as colliding and then moving in opposite direction with rapid velocity, etc.

I had been working on this some time ago in Max, broken into several modes to try to tackle all the issues involved from different perspectives.

Mode 1 is simply rendering your avatar some distance away from you and facing you, as though it were a mirror reflection.
Mode 1.5 is mirror mode: you looking at your (avatar) self in a mirror; the near plane is far enough away from the camera that you don't get anything rendered except what you can see in the mirror.
Mode 2 is direct mode: you are looking through the eyes of the avatar; some adjustments to the near plane were badly needed the last time I was working with this mode.

The hardest part is going to be dealing with cases where the Kinect can still see what’s going on but the Leap Motion can’t, and vice versa. Finding ways of creatively dealing with that seems to be the key (experiences "on rails" seem to work best with current hardware).

September 23, 2015 | 9:06 am


Yep, I've been working on the 0.6 branch, though now actually the 0.7 SDK, trying to get this to work. In theory it should give us better performance and a cleaner interface in Jitter (at the expense of the OSX port being "forthcoming"), but unfortunately I've not yet been able to actually get images to transfer to the Rift. Even a simple example of rendering a colored quad into an FBO backed by the SDK-provided textures is doing nothing. It should work — I've confirmed that the Cinder DK2 project builds and works fine with the same hardware, so it's either a mistake in my own code, or perhaps an issue with the older OpenGL library provided by the Jitter SDK. (Or, looking at the Cinder project's history, perhaps it's related to an Oculus SDK bug that is flushing the context prematurely; I need to look into it more.) If you're interested in giving it a shot, let me know and I'll talk to you about the current progress in more detail. Otherwise, yes, the current status of the oculus external is tied to the 0.4.4 driver/SDK.
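
For anyone poking at this kind of "renders but shows nothing" problem, the usual generic GL sanity checks apply (nothing Oculus-specific here, and the loader header is an assumption):

    #include <cstdio>
    #include <GL/glew.h>

    // Generic OpenGL debugging: confirm the currently bound FBO is complete and
    // drain any pending GL errors. Call around the suspect draw to narrow it down.
    void checkGL(const char* where) {
        GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
        if (status != GL_FRAMEBUFFER_COMPLETE)
            std::printf("%s: FBO incomplete: 0x%x\n", where, status);
        for (GLenum err; (err = glGetError()) != GL_NO_ERROR; )
            std::printf("%s: GL error: 0x%x\n", where, err);
    }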

w/r/t combining with the leap & kinect, I’m actually also in the process of developing an artwork that depends on all of them (in the style of your Mode 2). For the application I’m doing I don’t need more than hands and forearms, which in theory Leap already provides, but indeed the Kinect can be used to help reduce the errors, provide depth/volume, and also keep the various spaces a little more calibrated. Early stages yet, but hope to have something more to share soon. Kinect v1 currently, though I may also try the v2.

Anyway, keep me posted on what you’re doing!

