Max VR and 360 Photo/Video Viewing

Brandon's icon

Is it possible to view 360 photos/videos in Max? I have 360 files I want to view and manipulate live but I'm not sure where to start.

Also, is it possible to view them using VR equipment, such as the Vive or Rift?

pseudostereo's icon

This should get you started.

Max Patch
Copy patch and select New From Clipboard in Max.

Graham Wakefield's icon

Here's a patcher demonstrating viewing a 360 panorama image on a Vive or Rift.

You'll need to install the vr package from Max's package manager.

Max Patch
Copy patch and select New From Clipboard in Max.

Sergejs Dombrovskis's icon

Just wanted to say thank you for both suggestions!

The code by @pseudostereo does exactly what I needed. I am still tweaking it to perfection:
among other things, the original code eats a lot of GPU and displays the image with unnatural colors, but I think I have already solved both of these issues. It would have taken me forever to get such code working on my own, including the mouse interaction.

And it is great to know that VR viewing is an option straight from Max as well :)

Markus de Seriis's icon

Hey Sergejs,
I am trying to use this patch too. Can you please tell me what changes you made in order to show the video in normal colors?

Thanks a lot, Markus

Sergejs Dombrovskis's icon

Hi, the trick that worked for me was to enable shadows.

But resource usage is then extremely high. I also reduced the framerate to 0.1 FPS when the view is unchanged to make it use fewer resources, but that is not ideal either. Perhaps with the VR extensions it is possible to take a more resource-efficient approach (even though this method totally works, if resources can be wasted).

Max Patch
Copy patch and select New From Clipboard in Max.

nycto511's icon

@Graham Wakefield @pseudostereo How do I get the position of the headset? I just need the spherical coordinates of the VR headset's movement to be sent to a spatial audio patch.

Graham Wakefield's icon

The [vr] object's 4th outlet sends all kinds of information about the current tracking data. Have a look in the [p tracking] subpatcher of the [vr] helpfile for some patching usage examples.

Since you are interested in the head, you'd want to [route head] first.

Then you could look at the position & quat messages that come out of that.

tracked_position/tracked_quat are in tracking-space coordinates (in the real world, relative to the Vive beacons).
position/quat are in world-space coordinates (in the virtual world, and thus will be affected by navigation in the world, i.e. by changes to @position etc. of the [vr] object).
It's up to you which one makes more sense.

Orientation (rotation) is given as a quaternion via the "quat" or "tracked_quat" message. You can use jit.quat2euler to convert that into the spherical angles typically used for spatial audio. The angles will be in radians (multiply these by 180/pi if you want degrees). You might need to set the @rotate_order attribute of jit.quat2euler to get what you want; e.g. typically you'd want orientation about Y (azimuth) first, then orientation about X (elevation), which, if I recall correctly, means using [jit.quat2euler @rotate_order yxz]. You should experiment to verify this.
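For reference, here is a rough Python sketch of the same conversion, just to show the math a YXZ Euler extraction does; it assumes a unit quaternion in (x, y, z, w) order and the conventions above, so double-check the signs and ordering against what actually comes out of the [vr] object and jit.quat2euler:

    import math

    def quat_to_azimuth_elevation(x, y, z, w):
        # YXZ Euler extraction from a unit quaternion (x, y, z, w):
        # azimuth = rotation about Y, elevation = rotation about X, in degrees
        azimuth = math.atan2(2.0 * (x * z + w * y), 1.0 - 2.0 * (x * x + y * y))
        elevation = math.asin(max(-1.0, min(1.0, 2.0 * (w * x - y * z))))
        return math.degrees(azimuth), math.degrees(elevation)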

Positions are in meters. If you're just doing 360 audio, with no distance-based stuff, you don't need the position data; but if you want to do positional stuff, then knowing the relative position from head to sound source location will be essential to compute the distance and relative angle to the sound (and if you're modeling a radiation pattern, the relative orientation of the sound object). That would involve a bit of math patching -- vexpr is handy for that.
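As a rough illustration of that math (the arithmetic, not the Max patching), here is a Python sketch. The function and variable names are made up for the example; it assumes a unit quaternion in (x, y, z, w) order and that -Z is "forward", as in Jitter's OpenGL-style coordinates, so treat it as a starting point to verify rather than a drop-in solution:

    import math

    def rotate_by_conjugate(qx, qy, qz, qw, vx, vy, vz):
        # Rotate vector v by the conjugate (inverse) of unit quaternion q,
        # i.e. bring a world-space offset into the head's local frame.
        qx, qy, qz = -qx, -qy, -qz          # conjugate: negate the vector part
        tx = 2.0 * (qy * vz - qz * vy)      # t = 2 * (q x v)
        ty = 2.0 * (qz * vx - qx * vz)
        tz = 2.0 * (qx * vy - qy * vx)
        rx = vx + qw * tx + (qy * tz - qz * ty)   # v' = v + w*t + (q x t)
        ry = vy + qw * ty + (qz * tx - qx * tz)
        rz = vz + qw * tz + (qx * ty - qy * tx)
        return rx, ry, rz

    def source_relative_to_head(head_pos, head_quat, source_pos):
        # Distance (meters) and azimuth/elevation (degrees) of a sound source
        # relative to the listener's head.
        dx = source_pos[0] - head_pos[0]
        dy = source_pos[1] - head_pos[1]
        dz = source_pos[2] - head_pos[2]
        dist = math.sqrt(dx * dx + dy * dy + dz * dz)
        lx, ly, lz = rotate_by_conjugate(*head_quat, dx, dy, dz)
        azimuth = math.degrees(math.atan2(lx, -lz))   # 0 = straight ahead, + = to the right
        elevation = math.degrees(math.asin(ly / dist)) if dist > 0 else 0.0
        return dist, azimuth, elevation

In a patcher you'd do the same subtraction and rotation with vexpr and jit.quat (or similar), then feed the resulting distance and angles to your spatial audio objects.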

Hope that helps!

nycto511's icon

Sorry @Graham Wakefield, I only saw your reply now - and yes, I did exactly that in the end and everything works fine :) I presented my project at the last IRCAM Forum!

Graham Wakefield's icon

Great! Will you share the project publicly?

nycto511's icon

@Graham Wakefield I wrote a paper about the technical part for a conference, and I'm continuing to develop the project so I can hopefully take it out into the world - I'll send you the link to the paper as soon as it is out, either via the conference or my blog. The next step is to send the global position of the user from Unreal Engine to Spat in Max, and I want to try sending the particle system data to CataRT in Max so as to have a particle system that works as a sounding granulator, but I'm very much at the initial stages.

marleynoe's icon

@MARTA NOONE I would be interested in this, too!