Presenting 360º videos immersively on Meta Quest Pro - PsychXR?

PsychoPy version: 2022.2.5
OS: Microsoft Windows 11 Home


I am currently programming an experiment where I want to present an immersive video (i.e., 360º video) using the VR device “Meta Quest Pro.”

When I use the headset, I can run the experiment from start to finish, using the headset “as a screen”. However, the video is not presented immersively; instead, it is shown on a screen within the VR viewer.

What I want is for the video (.mp4) to be played immersively when presented. The video has been downloaded to be viewable in 360 degrees, but I have not found a way to do this in PsychoPy yet.

I have found that there might be ways to achieve this using PsychXR, but it seems this is not compatible with the current version of PsychoPy, is that correct? Additionally, I am not sure if these solutions are compatible with the “Meta Quest Pro.” Are there other solutions available for this type of device?

Any help or guidance on this matter would be very useful.

Thank you very much.

Hi @mdc , I realized that, as the lead developer of PsychXR and a software developer on the PsychoPy staff, you might be able to help me. Thank you very much, and sorry for tagging you if that wasn’t the right thing to do.

I am trying to figure out the same thing. Have you found a solution?

PsychXR has no means to handle video; the developer must handle video decoding and rendering themselves via OpenGL. All PsychXR does is provide a means of passing textures to the VR device swap chain and obtaining sensor data. It might be possible to do this using MovieStim in PsychoPy; however, the VR interface isn’t compatible with it.
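To make the sensor-data part concrete: a VR runtime typically reports head orientation as a quaternion, which you convert to a rotation matrix before applying it to your view or scene transform. A minimal pure-Python sketch (the function name and layout are my own, not part of the PsychXR API):

```python
import math

def quat_to_mat3(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix
    (row-major, row vectors as rows). Normalizes the input first, since
    sensor quaternions can drift slightly from unit length."""
    n = math.sqrt(w * w + x * x + y * y + z * z)
    w, x, y, z = w / n, x / n, y / n, z / n
    return [
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y)],
        [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y)],
    ]

def rotate(mat, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return tuple(sum(mat[r][c] * v[c] for c in range(3)) for r in range(3))
```

In a real experiment you would feed the head-pose quaternion from the HMD into this conversion each frame and use the resulting matrix in your OpenGL view transform.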

If you want to do something like this, consider the following steps:

  1. Generate a sphere mesh (‘’ does this as an example)
  2. Decode video frames and set the texture for that mesh
  3. Get the sensor values and apply transformations to the mesh according to head movement
  4. Render the scene to the framebuffer and submit it to the compositor
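As an illustration of step 1, here is one way to build a UV sphere whose texture coordinates map an equirectangular 360º frame onto its inside surface (a plain-Python sketch; the function name and parameters are my own, and in a real program you would upload these arrays to OpenGL buffers and texture the sphere with each decoded video frame, with the viewer at the centre):

```python
import math

def make_uv_sphere(radius=10.0, rings=16, segments=32):
    """Generate vertices, texture coordinates and triangle indices for a
    UV sphere. Texture u runs with longitude and v with latitude, which
    matches the equirectangular layout used by most 360-degree videos."""
    vertices, texcoords, triangles = [], [], []
    for r in range(rings + 1):
        phi = math.pi * r / rings              # polar angle, 0 at the top
        for s in range(segments + 1):
            theta = 2.0 * math.pi * s / segments
            x = radius * math.sin(phi) * math.cos(theta)
            y = radius * math.cos(phi)
            z = radius * math.sin(phi) * math.sin(theta)
            vertices.append((x, y, z))
            texcoords.append((s / segments, r / rings))
    for r in range(rings):
        for s in range(segments):
            a = r * (segments + 1) + s         # index into the vertex grid
            b = a + segments + 1               # same column, next ring down
            triangles.append((a, b, a + 1))
            triangles.append((b, b + 1, a + 1))
    return vertices, texcoords, triangles
```

Steps 2-4 then amount to updating the sphere's texture from the movie each frame, orienting the view with the head-pose data, and submitting the rendered eye buffers to the compositor.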

Thanks for the answer. I’m going to dig a little deeper into what you’re suggesting, because there are still things I’m not quite sure how to solve (for example, going from the 2D view to the immersive view I’m looking for). I’ll post any progress in this thread, as I think there are a lot of people who could benefit from knowing how to do this.