
Stereo display support

screen

#1

Hello,

I’m developing libraries for PsychoPy to provide stereo display support on systems that don’t support stereo through the graphics driver. I’ve made lots of progress over the last few weeks and I’m starting to finalize the specification for the rendering pipeline. I hope to have a release in the next few months, after extensive testing. The first release will be simple and function much like the current GL_STEREO extension (set buffer and draw), but will support anaglyph, interleaved, side-by-side, page-flipping, and similar modes.
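As a rough illustration of what an anaglyph mode does under the hood, here is a plain-Python sketch; the function name `make_anaglyph` and the nested-list image format are mine for illustration, not part of the planned API. The red channel comes from the left eye’s image and the green/blue channels from the right eye’s:

```python
def make_anaglyph(left, right):
    """Combine two RGB images (rows of (r, g, b) tuples) into a red-cyan
    anaglyph: red channel from the left eye, green/blue from the right."""
    return [
        [(l[0], r[1], r[2]) for l, r in zip(left_row, right_row)]
        for left_row, right_row in zip(left, right)
    ]

# Toy 1x2 images: a reddish left-eye frame and a cyan-ish right-eye frame.
left = [[(200, 10, 10), (180, 20, 20)]]
right = [[(5, 150, 160), (5, 140, 170)]]
print(make_anaglyph(left, right))
# [[(200, 150, 160), (180, 140, 170)]]
```

In a real renderer the same effect is usually achieved with color write masks rather than per-pixel Python, but the channel selection is identical.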

I would like to run an informal survey in this thread about what technology people use to conduct stereo experiments (shutter glasses, multi-display stereoscopes, VR headsets, etc.) so I can focus my development and testing efforts. Your thoughts and ideas are also welcome.


#2

I am using shutter glasses (Nvidia 3D Vision 2) controlled via Matlab + Psychophysics Toolbox.

Thanks for your efforts, having such libraries would definitely be helpful!

Cheers


#3

I am using a passive stereo display with polarized glasses. I have already conducted several studies using PsychoPy by creating side-by-side (top-and-bottom) encoded movie files and presenting them as regular movies. The display performs the 2D (side-by-side) -> 3D conversion itself.

Looking forward to seeing your library and its capabilities.


#4

Hello,

Thanks for the replies so far.

I’m quite impressed by this result; I’ve never seen side-by-side displays used with PsychoPy before. What display are you using to show the content?


#5

It is a 55-inch stereo TV connected to our PC. So it is quite a simple setup: present a T&B-encoded 3D movie and treat it as a usual 2D movie in PsychoPy, then use the TV remote to tell the TV that we are presenting T&B-encoded material. The TV interlaces the stereoscopic information so that it can be watched with a pair of polarized glasses.
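The TV-side decoding described above is simple to mimic; here is a sketch of the T&B layout (the function name and the row-list frame format are mine, purely for illustration):

```python
def split_top_bottom(frame):
    """Split a top-and-bottom (T&B) encoded frame into the two eye views.
    Each half is vertically squashed; the display stretches it back to
    full height before routing it to the matching eye."""
    h = len(frame)
    return frame[: h // 2], frame[h // 2 :]

# A toy 4-row frame: rows 0-1 encode one eye, rows 2-3 the other.
frame = [[10, 10], [11, 11], [20, 20], [21, 21]]
left, right = split_top_bottom(frame)  # ([[10, 10], [11, 11]], [[20, 20], [21, 21]])
```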


#6

Hello, I am really interested in the libraries you mention here. Did you have any success in finalising them? I am new to PsychoPy and hoping to run a stereo experiment.
Cheers, Kirsten


#7

Hi Kirsten,
depending on your setup, you could consider creating side-by-side pictures and using PsychoPy as usual… You would then only need to tell your 3D display to present the side-by-side images stereoscopically.
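Frank’s suggestion amounts to horizontal concatenation of the two eye images; a minimal sketch (names and the row-list image format are illustrative, not any library’s API):

```python
def make_side_by_side(left, right):
    """Pack left/right eye images into one frame: left half, right half.
    A 3D display in side-by-side mode stretches each half back to full
    width and shows it to the corresponding eye."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

left = [[1, 2], [3, 4]]   # toy 2x2 left-eye image
right = [[5, 6], [7, 8]]  # toy 2x2 right-eye image
sbs = make_side_by_side(left, right)  # [[1, 2, 5, 6], [3, 4, 7, 8]]
```

In practice you would render each eye’s stimulus into half of the window (or pre-compose movie frames this way) and let the display do the rest.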
Frank.


#8

Hello,

Development has slowed since I’m deep into my PhD dissertation. I did make lots of progress on libraries that automatically handle various stereo formats; I’ve been able to test some of them internally with excellent results. However, the API is too unstable for general release and would likely break experiments built on it.

I’ve been given approval to dedicate more time to this, so expect development to speed up in the next few weeks. In the meantime, if you specify what type of stereo display you are using, I might be able to offer support.


#9

Dear Frank and Matthew, thanks for replying.

At the moment my setup is crude. I am using an iMac (Retina 5K, 27-inch, Late 2015) to run PsychoPy and display the stimulus, so no 3D display monitor. To pilot the experiment as a proof of concept, I am creating red/blue anaglyphs for viewing with glasses (not great, I know). However, if all goes well, I hope to find a little money to improve the setup.

I guess I’m looking for the simplest (and hopefully cheapest) setup. I have used a True 3Di monitor before (with Matlab), but now I am wondering whether a TV or VR headset might be the more economical way to go. The TV post above is interesting. I am really new to PsychoPy, but I’m discovering that it is quite nice.

Thanks for taking the time to post during your PhD, mdc! A busy time indeed. Cheers, Kirsten.


#10

Hello Kirsten,

Consumer TVs tend to apply filtering to produce a picture that appeals to viewer preferences rather than a correct one. In some cases that’s okay; I’ve seen experiments conducted using them. However, you can run into problems when high fidelity is needed. VR headsets are becoming popular in some labs; they tend to be useful where a wide FOV is needed or the researcher is interested in head movement/placement. However, their coarse rasters make them unsuitable for experiments testing the acuity limits of stereo vision.
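The “coarse raster” point can be made concrete with a back-of-the-envelope angular-resolution estimate. All numbers below are illustrative, not measurements of any particular device:

```python
import math

def fov_deg(width_cm, distance_cm):
    """Horizontal field of view subtended by a screen at a viewing distance."""
    return math.degrees(2 * math.atan(width_cm / (2 * distance_cm)))

def pixels_per_degree(h_resolution_px, h_fov_deg):
    """Rough average angular resolution across the horizontal FOV."""
    return h_resolution_px / h_fov_deg

# Illustrative figures: an HMD panel (~1080 px per eye spread over ~90 deg)
# versus a 1920-px-wide monitor (~52 cm wide, viewed from ~60 cm).
hmd_ppd = pixels_per_degree(1080, 90)                   # 12 px/deg
monitor_ppd = pixels_per_degree(1920, fov_deg(52, 60))  # ~41 px/deg
```

With stereoacuity thresholds measured in arcseconds, the monitor’s several-fold advantage in pixels per degree matters for acuity-limit experiments.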

I would recommend a decent monitor and 3D shutter glasses for a low-budget setup, but a TV can work fine if fidelity is not your primary requirement.

I have basic anaglyph support working for PsychoPy. I can PM you details on how to set it up, but it might not yet be suitable for production environments.


#11

Hello,

Just an update.

I have been developing/using the stereo libraries for some time in my own research and have managed to create a stereo window class that supports a wide variety of stereo displays, including HMDs. There are also custom stimulus classes for presenting 3D models with textures/materials/lighting, and a basic graphics math library to eliminate the need for fixed-function/immediate-mode OpenGL rendering.
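For a flavor of the math such a stereo window and graphics math library have to handle, here is a sketch of the classic off-axis (asymmetric frustum) projection used to converge both eyes on the screen plane. The function name and sign conventions are mine, not the library’s API:

```python
def stereo_frustum_lr(eye, ipd, screen_dist, screen_half_width, near):
    """Left/right clip bounds of an asymmetric ('off-axis') frustum at the
    near plane for one eye, so both eyes' view volumes converge on the
    screen plane. eye is -1 for the left eye, +1 for the right."""
    shift = eye * ipd / 2.0     # eye offset from the central (cyclopean) axis
    scale = near / screen_dist  # project the screen edges onto the near plane
    left = (-screen_half_width - shift) * scale
    right = (screen_half_width - shift) * scale
    return left, right

# 6 cm IPD, screen 60 cm away and 60 cm wide, near plane at 1 cm.
l, r = stereo_frustum_lr(+1, 6.0, 60.0, 30.0, 1.0)  # (-0.55, 0.45)
```

These bounds would feed something like OpenGL’s `glFrustum`; note the two eyes’ frustums are mirror images of each other, which is what makes the projection “off-axis” rather than simply rotating the cameras inward.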

I’ve been really busy this last year with my dissertation and coursework, so testing and documenting such an enormous code base for public release is difficult. However, I should soon have support for this process. No ETA yet, but certainly before the research gets published.


#12

Hi, thanks for the effort! Are you going to release the stereo classes soon? I actually need them right now. Thanks.


#13

Hi,

There has been some recent effort to implement expanded stereo support in PsychoPy, including additional tools for multiview and 3D model rendering (undocumented, but they’re there). Things have been slow for general stereo support since the Oculus Rift was a priority for me; however, much of the code needed for additional stereo display modes is already in PsychoPy. I just need to find the time to pull it all together.

May I ask what type of stereoscopic display you are using? I’m trying to prioritize which to add first.


#14

Thanks for the reply! Have you had time recently to send a pull request to PsychoPy? Otherwise, I will have to write it myself, essentially going through the same process you did. We actually want to use both anaglyph and multi-screen approaches, but we are interested in other modes as well. If possible, could you please also post an example code snippet? Thanks, I highly appreciate your help!


#15

Actually, I tried to get the version of PsychoPy with stereo support, but I never found the stereo class you mentioned in the GitHub issue tracker.

You could also send me a link or the code, whichever is more convenient for you. Thank you!


#16

Hello,

To clarify, there are no stereo classes yet, only supporting functions for stereoscopic rendering, which are a bit involved to use on their own.


#17

Thank you. Please let me know once you send the pull request.