Forums > Jitter

Oculus Rift


Sep 08 2016 | 6:53 am

I have access to a Windows machine as well, and I'm totally fine using it for the VR development (my Jitter patches will work on Win/Mac).
I get the same issue when I run max_oculus on Windows: it says it cannot detect the HMD (I have the DK2). The Windows machine I have access to has an Nvidia graphics card, and all the demos from Oculus work well. It's Windows 10 and the latest version of Max 7. If you can tell me what to install on Windows, I would be grateful; I'll follow it precisely and report back.

Sep 08 2016 | 7:45 am

On Windows I recommend you use the [oculusrift] external from the Max Worldmaking package, which you can download from the link below. This is the one that is in active development:

https://github.com/worldmaking/Max_Worldmaking_Package

Sep 08 2016 | 8:17 am

What I think is going on is that I do not have the external power supply that was previously included with the Oculus development kit, so the machine does not see the Oculus as a "second monitor", which I assume is crucial. When I plug the Oculus into a Windows machine running the software, the Oculus works fine, but it does not show up as a second monitor on Windows or Macintosh. I cannot confirm this, of course, but it seems to be why it does not appear.

Sep 08 2016 | 8:51 am

If you are using the [oculus] object (from max_oculus), did you set the Oculus config utility to treat the HMD as a second display (i.e. disable "direct mode")?

Otherwise, if you are using the [oculusrift] object (from Max_Worldmaking), you do need to set it to direct mode. Honestly, that is what I would recommend: it works with the latest Oculus drivers, and it has better performance too.

Sep 14 2016 | 4:14 pm

Just checking if anyone can replicate this: I'm finding the 360video example can no longer autoload (or load) the comp_bg.tif image.

However, the one on this site works: http://panotool.com/panotool/images/ShinjukuEquirectangular.jpg

So I’m wondering if there may be a bug regarding .tif support in the latest Max version?

Oct 11 2016 | 2:04 pm

Trying the HTC Vive object help patch with a new laptop that has a GTX1060 GPU. I’ve tested this with other VR apps and have had great performance, including the SteamVR performance test, which rates the GPU as "high quality" and reports all performance over 90fps.

Unfortunately I can’t achieve greater than 45fps in the help patch — any recommendations as to how I might improve the frame rate? This is consistent between the 64bit and 32bit versions of Max. OS is Win10 Professional, 64bit.

Oct 11 2016 | 4:07 pm

As far as I can tell, the 1060 is right on the fence, as in, it may be fine as a user/runtime card but maybe not as a developer card.

It's incredibly easy to drop to 45fps with the Vive object in Max, simply because Max is so non-optimized and non-optimizable (even in standalone builds).

One thing I've noticed for sure: any media playback will do better off an SSD. Certain media-playback issues also tend to disappear under 64-bit, which means extra memory is useful. Again for media playback, the CPU will matter for any codec besides Hap.

For pure CGI computation and rendering, I suppose it still mainly comes down to the video card. Have you had a look at overclocking it?
That could make the difference for a card on the fence between user- and developer-appropriate.

Oct 11 2016 | 9:43 pm

I tried overclocking the card tonight, but it does not appear to be possible with this model.

I’m testing with the included help patch for the htcvive object, so I’m not playing any media. The SSD I’m using is quite fast, so I don’t think that would be a bottleneck in any case.

From what I understand the 1060 is faster than a borderline card, but apparently not fast enough to run at 90fps in Max. FWIW, TechPowerUp GPU-Z reports only 25% GPU load when running the patch. I'm not sure if that's a true indicator, but it may mean that Max is not fully utilizing the card. Not sure why that would be.

When I run Unigine Valley as a benchmark I see a GPU load of 50-75% depending on quality settings. I'm wondering if there's something limiting the GPU load for Max, or perhaps in the htcvive object itself?

Oct 12 2016 | 12:30 am

Hi

I recently spent quite a bit of time hunting for fps in my Vive patch, and found a couple of tricks that helped with optimizing and staying as close as possible to the much-desired 90 fps:

– htcvive.maxhelp: in this help patch, the bottleneck is without doubt the 512 "rock boxes": try reducing the number of cubes, or simply delete that jit.gl.multiple, and you'll get your 90 fps!

– displaylist: the displaylist attribute (available on almost all 3D objects in Jitter) is very, VERY useful!
("Cache in displaylist flag (default = 0). This feature may be used to speed up rendering time by creating and storing a list of GL drawing commands on the graphics card. This will have no effect if matrixoutput is turned on.")
Enabling displaylist really optimized my patches A LOT (especially when using big meshes/models).
(Of course, don't use it if your object is constantly modified/redrawn, e.g. a jit.gl.mesh receiving a new vertex matrix on each frame; it works fine if the object is just moved/scaled/rotated.)
For example, in htcvive.maxhelp, enabling displaylist on the "rock boxes" gridshape:
jit.gl.gridshape vive_world @automatic 0 @shape cube @name cubik @dim 8 6 @displaylist 1
should give a much better framerate.
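The idea behind @displaylist can be sketched generically (a hypothetical Python sketch, not Jitter API): compile the expensive sequence of draw commands once, replay the cached result on every frame, and invalidate the cache only when the geometry actually changes.

```python
# Generic sketch of display-list caching (the pattern behind
# @displaylist, i.e. glNewList/glCallList); names are hypothetical.
class CachedDraw:
    def __init__(self, build_commands):
        self._build = build_commands  # expensive: records the GL commands
        self._cache = None

    def draw(self):
        if self._cache is None:
            self._cache = self._build()  # "compile" once
        return self._cache               # cheap replay on every frame

    def invalidate(self):
        # Call when vertices change (e.g. a new matrix each frame).
        # Constant invalidation defeats the cache, which is why
        # displaylist should stay off for per-frame-updated meshes.
        self._cache = None
```

Moving/scaling/rotating the object doesn't touch the cached commands (the transform is applied outside the list), which is why those stay cheap with displaylist on.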

– CPU: most of the time, Max does not take advantage of the multiple cores of our recent CPUs (it's a different story in MSP / poly~ ...).
That means that if you have a quad-core CPU, Max will only use one of them, which is why you probably won't reach more than 25% CPU load.

Hyper-Threading: enabled by default on all Intel CPUs:
"For each processor core that is physically present, the operating system addresses two virtual (logical) cores and shares the workload between them when possible."
For example, I have a quad-core i7-6700K: Windows "sees" 8 cores (8 threads), so Max (when running only a pure Jitter patch) can only use 12.5% of my total CPU resources!
I tried disabling Hyper-Threading in my BIOS settings and ran some FPS tests: it's better!
I can't really tell "how much better", but for sure I got a better framerate
(not twice as much! but sometimes only a couple of FPS make the difference and let you run a patch at 90 fps instead of 45...)
and Max is now able to use 25% of my CPU.
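That last point (a couple of FPS turning 45 into 90) falls out of how VR compositors sync to the 90 Hz display: miss the ~11.1 ms deadline even slightly and the frame is held for a second refresh. A rough sketch of the effect, ignoring reprojection tricks:

```python
import math

REFRESH_HZ = 90.0
FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~11.1 ms per frame

def effective_fps(render_time_s):
    """Delivered fps when every frame takes render_time_s seconds:
    a frame that misses the vsync deadline is held for the next refresh."""
    intervals = max(1, math.ceil(render_time_s / FRAME_BUDGET))
    return REFRESH_HZ / intervals

print(effective_fps(0.010))  # 10 ms, under budget: 90.0
print(effective_fps(0.012))  # 12 ms, just over: 45.0
```

So shaving even one millisecond off the frame time can double the displayed frame rate if it gets you back under the budget.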

… and in order to use the remaining 75% of the CPU:
split processes as much as possible between several instances of Max and/or standalones.

If anyone has other advice in the same vein, it’s very welcome !

Mathieu

Oct 12 2016 | 10:27 am

Hi Mathieu –

I’ll look into the @displaylist attribute, that’s interesting. Frankly I don’t think a scene with 512 cubes should trouble this card, but I’ll check to see if removing or using the displaylist helps.

Regarding CPU load, I’m very familiar with these techniques. I was referring to GPU load in my post.

It's interesting to hear that disabling hyperthreading resulted in higher fps for you. I'm not sure why that would be, but perhaps there's overhead in maintaining the virtual cores. I suppose the choice would depend on what other processes you have running alongside Max.

Best, Jesse

Oct 12 2016 | 10:51 am

The hyperthreading fix may or may not come down to saving the trouble of making bogus requests between CPU cores, given that Max can't make use of multiple cores. Disabling it may ensure that Max claims the full allocation of a single core at process start-up, likely the most underused core at that time.

Mathieu's comments on CPU may still apply in this case: there is less of a guarantee in these Max patches that the bulk of the work is being done on the GPU, as it all depends on which version of OpenGL Max is making use of. There is all the control-level code arbitrating the various jit.gl objects, and that can become a bottleneck for the work which is actually done on the GPU, whereas an optimized C++ project in pure, recent OpenGL has very little of this.

Graham had a few tips when I asked a similar question perhaps a page back in this thread; see if you can find it.

If you strip down even the example patch to the bare minimum, perhaps a totally empty scene, that may help pinpoint the exact addition to the scene which is causing the frame-rate drop. Doing this would be a great help, as these issues can be hard to find when Graham and I are both developing on machines which do not tend to drop to 45fps. I'll have another look today and see if I can do something small that triggers the drop.

Oct 31 2016 | 10:20 am

Hi,
I tried to get the Max_Worldmaking_Package to run, but Max throws an "Error 126 loading external htcvive". I guess there is a mismatched (or missing?) openvr_api.dll. I tried several combinations of the 32-bit and 64-bit versions and even downloaded the version from https://github.com/ValveSoftware/openvr, but no luck.
My SteamVR version is from Oct 11, version 1476136918, if that is of any relevance.
cheers, marius.

Oct 31 2016 | 2:42 pm

Hey All,

Camera mode for the Vive object depends on @projection_mode frustum, but this ignores near and far clipping. I have a particularly LARGE scene, and this is causing objects that are far away to be cut off or to disappear entirely. Any tips for giving my scene a deeper depth range than what frustum provides?

Nov 01 2016 | 7:24 am

Near and far clipping must be set using htcvive's near_clip and far_clip attributes.
Apparently, the message "configure" must then be sent to htcvive in order to apply the new clipping...
Seems quite unusual... maybe it's a bug? Graham?

Nov 01 2016 | 7:56 am

@MARIUS.SCHEBELLA: The [htcvive] object is currently built against the OpenVR 1.0.2 SDK, I guess there could be some breaking changes with the latest driver, but I don’t see anything listed in https://github.com/ValveSoftware/openvr/releases nor any related issues on https://github.com/ValveSoftware/openvr so I doubt it. The openvr_api.dll is in the worldmaking package’s support folder, and should be picked up automatically by Max if the package is placed in My Documents/Max 7/Packages — but that may depend on using Max 7.2.5 or later for it to work, I’m not sure. If you already are, could you try installing the VS2015 runtime libraries (they’re free) from https://www.microsoft.com/en-us/download/details.aspx?id=48145, restart, and tell me if that fixes it? If so then there’s something I’ve set wrong in the VS project and I should fix that.

@THOMAS JOHN MARTINEZ / @MATHIEU CHAMAGNE: Yeah, that looks like a bug; it should be fixable by triggering configure whenever those attributes are set. I've added the issue to https://github.com/worldmaking/Max_Worldmaking_Package/issues/15. Hopefully I'll get some time this week to look at outstanding issues. In the meantime, as you say, sending the 'configure' message should be enough.

Nov 01 2016 | 9:25 am

Hi Graham and Mathieu, that works for me. Thanks!

Nov 02 2016 | 10:07 am

Hi Graham, thanks for your reply. I am using the 64bit version of Max 7.3.1 and installed the Microsoft Visual C++ 2015 Redistributable (x64) package. Still the same error. Is it possible to get more information from Max – like which dll is causing the error or additional dependencies?

Nov 06 2016 | 12:54 pm

Any tips on rendering vive_world to a secondary monitor so people in the installation can see what the VR headset is seeing, with just one camera instead of two? I created another rendering context and window and can get the scene on a videoplane, but only by running it through a matrix object, which decreases performance (and still uses two cameras). Does anyone have an example of this?

Nov 07 2016 | 5:11 am

The easier and cheaper solution is to display only half of the picture in a square window, using @tex_scale_x 2 in the jit.gl.videoplane used to monitor the mirror texture:

— Pasted Max Patch —

And if you don't like a square window (or don't have a square screen!), you can use a third jit.gl.camera that you can place wherever you like (or have it copy the position & orientation of the HMD).
With this solution, you can tweak the lens angle and camera position, and add shaders/slabs before displaying to the external monitor:

— Pasted Max Patch —

Nov 07 2016 | 9:30 pm

Mathieu, I'm using the second patch you attached. Perfect. It was easy to get lost in node land, but you sorted me out.

Attached is a version that allows you to fullscreen this "audience" display. The normal fullscreen message to the window object was messing up the rendering and stalling Max; this version takes advantage of the "pos" message instead. It works on a 1080p display or projector, so adjust to fit your setup, but this should be pretty standard.

Thanks again.

— Pasted Max Patch —

Nov 08 2016 | 7:17 am

Hi Thomas, Mathieu,

The glitch with fullscreen is because on Windows going into fullscreen re-creates the OpenGL context, so the Vive's internal textures become invalid. This can be fixed by sending a "connect" message to the [htcvive] object after entering or exiting fullscreen.

For showing only a single eye on the desktop display, the best thing is to re-scale the videoplane to fit the window. Adding another camera is probably best avoided if possible, as it means rendering your scene yet another time (using up 33% of your rendering time), increasing the chance of getting a lower frame rate.

Both of these are added to the patcher below:

— Pasted Max Patch —

Nov 08 2016 | 7:35 am

@Marius,

I've just pushed a new build of the package which might fix your troubles. Could you try downloading it from this link (the development version): https://github.com/worldmaking/Max_Worldmaking_Package/tree/devel

Thanks!

Nov 08 2016 | 7:50 am

By the way, this might be a handy tip for anyone struggling with 45fps performance on the Vive. It's not a solution, but it helps. The latest SteamVR beta (instructions on how to get it in the link below) has a new interleaved reprojection method that does a lot to smooth the experience when the scene can't render at 90fps. After installing the beta, pop open the SteamVR settings and, under Performance, enable "Allow interleaved reprojection".
https://www.reddit.com/r/Vive/comments/4kk9he/how_to_download_beta_steam_vr/

Nov 17 2016 | 9:24 am

Hey Graham

Is there any support for haptic feedback from the controls? Would be awesome to have some vibration for collisions!

Nov 22 2016 | 7:39 am

Not yet, but I’ve ticketed that and will see if I can add it. Shouldn’t be too hard.

Nov 22 2016 | 8:46 am

And indeed it wasn’t — ‘vibrate <hand> <intensity>’ message now added to htcvive :-)

Nov 22 2016 | 8:57 am

oooh baby

Thanks Graham, that was fast! Will try it out in the next couple of days.

Nov 22 2016 | 9:35 am

Pushed a few updates to [htcvive] too: in addition to controller vibrations, now there’s support for getting the camera feed, and the controllers output tracking data in world space as well as tracking space (so they properly follow you as you move around), with a navigation example.

Nov 24 2016 | 12:16 am

Graham
Many thanks for this update and for your last messages!
Everything is smoother with the interleaved reprojection trick,
getting the HMD camera as a GL texture is much more efficient than using jit.grab,
and the new controller output in tracking/world space is definitely VERY useful! (I had to 'emulate' this feature with complicated and ugly combinations of anim nodes and lots of localtoworld messages... it worked for me, but I never took the time to share it and ask for a better solution... now it's much cleaner and more efficient! Thanks, I can tidy up my patches :)
+ the controller vibrations are quite fun to use!!
A small FR: would it be possible to get the battery state of the controllers? (So I could have a notification in my patch/installation saying it's time to plug in the controller to charge it...)

many thanks !

Mathieu

Nov 24 2016 | 8:54 am

Cheers. Added a ticket for battery state to the github. (In future, everyone please feel free to add issues/tickets there!)

Nov 30 2016 | 7:28 am

Unless I misunderstood something, I think we're missing one more small thing in the new navigation and tracked-controller data: HMD position & quat.
For the controllers we now have:
position + tracked_position + quat + tracked_quat
but from the 5th outlet (HMD) we only get tracked_position and tracked_quat.
So when using the (very cool!) navigation-example patch, where the camera is driven by jit.anim.node vive @name nav,
to get the world coordinates of the HMD we currently need to combine the position and quat from jit.anim.node vive @name nav with tracked_position and tracked_quat
(adding anim.node.position + tracked_position, and multiplying anim.node.quat * tracked_quat using jit.quat).

One other method would be to get the camera positions from the 2nd and 3rd outlets (left and right cameras):
HMD.position = (left.eye.position + right.eye.position) / 2
HMD.quat = left.eye.quat

But having HMD position and quat coming from the 5th outlet would be much more straightforward :-)

If you agree, could I add this as a FR on GitHub?

thanks

Mathieu

Nov 30 2016 | 9:23 am

Hi,

Took me a moment to figure out what you were asking, but yes, this makes sense. The ‘quat’ and ‘position’ messages have now been added to the 5th outlet, and respect navigation in the same way as the controllers. (no need for FR!)

Other thoughts:
– Now that this object has a lot of outlets and messages to route, I'm wondering if it wouldn't make sense to break the object up into several sub-objects, e.g. htcvive.controller, htcvive.hmd, htcvive.render, htcvive.camera, etc. What do you all think?
– I’m also wondering if I might be able to merge a lot of the htcvive/oculusrift features into a generic hmd object that can support both…

Graham

Dec 01 2016 | 6:15 am

Thanks Graham for this new update !
(HMD position & quat + controller battery: everything works fine! and my navigation patch is now much cleaner :-)

Breaking apart the external: I'm not convinced it would bring any significant improvement to our patches.
As it is, I guess we all do the same thing: connecting the different outputs (HMD/controller positions & quats, camera texture, ...) and inputs to other Max patches with standard named send/receive pairs.
For me, it works fine like this, and I can't see the benefit of doing the same thing through dedicated externals.
But maybe I'm wrong? Could you imagine a scenario where it would make a real difference?
Of course, it would make patches cleaner and easier to share... but is it really worth the time & energy you would spend on it?

I can think of an intermediate and maybe simpler solution:
providing a patch in the Max_Worldmaking_Package with ready-to-use named send and receive objects connected to all inlets and outlets...
a kind of wrapper,
maybe including a jit.gl.render context, or jit.world?
(but no stone donuts or cloudy sky! :-)
This patch could then be the starting point of any VR project.
It could be instantiated as an abstraction, and all parameters and data could be set and retrieved with send/receive (plus via inlets/outlets, and why not with attributes as well...).

That would allow providing different additional patches demonstrating various techniques separately
(I mean: not including everything in the .maxhelp, but as distinct patches to open or instantiate like plugins):
– navigation
– using controllers
– adding a Leap Motion
– playing with physics
– ...

Well, that's my modular way of thinking and patching... maybe not the universal/ideal solution!
I think it lets users quickly test different things without leaving their main development patcher (and without reopening the main htcvive.maxhelp all the time, potentially leading to context-name conflicts, crashes, ...).

I could post an example/draft, if anyone thinks it's a good idea...

About merging htcvive/oculusrift:
sure, one external to rule them all is a great idea!
But will the Oculus Touch and Vive controllers have the same number of outputs, and the same ranges and names...?
This will probably imply some sort of naming convention and range normalization.
And more generally: does it make sense to open a patch made, for example, for a room-scale setup (HTC Vive) with another HMD that provides "only" seated or standing VR?
That patch will need some adaptation, for sure... so renaming one external shouldn't be the most complicated thing to do ;-)

Dec 01 2016 | 7:03 am

You’re right, a basic template would be a great thing to have. Or a couple really, e.g. for walking vs. flying type navigation. Simple enough to add a /templates folder to the package and drop them in there, then they’ll appear in File > New From Template. Actually that would be pretty awesome. A draft would be great, thanks!

Actually, dedicated externals aren't that much work, but I guess the advantage of saving a couple of send/receive and route objects isn't that much either. I'll punt on that for now.

Main benefit of a unified [hmd] object is that patchers wouldn't need much editing, including the templates. The basic sets of properties of the Rift/Vive are similar enough at the moment, and OSVR is converging that way too; the seated/standing distinction doesn't seem to matter much in terms of the external's interface. There are just some extra features available for each (e.g. camera & chaperone for the Vive), and some differences in the buttons on the hand controllers. That said, the [htcvive] object should just work with the Oculus already, since it's using OpenVR/SteamVR. Maybe I should rename it [openvr] or [steamvr].

Jan 23 2017 | 2:01 pm

Hey everyone,

Does anyone have Oculus Touch controllers? I've just committed code updates to the package at https://github.com/worldmaking/Max_Worldmaking_Package to add support for the Oculus Touch controllers, but I don't have access to the hardware to test them on right now.

Cheers,

Graham

Feb 14 2017 | 7:58 am

Hi,

Could that change things for Mac Max users and Oculus?
https://cindori.org/vrdesktop/

It would be great if some owner of the required Oculus (Oculus Rift DK2) could report some tests!

Cheers

matteo
