VPIXX hardware class?

I’ve got some time to look into this over the next few days. Hopefully I make some progress (if only to realise that the scope of the challenge is beyond me).

I have a couple of requests from my reading so far:

A script showing the current canonical use of a Bits#?

Looking through the API, there are a few deprecated methods. For example, the visual.Window class contains a deprecated bitsMode argument, whose entry now says

DEPRECATED in 1.80.02. Use BitsSharp class from pycrsltd instead.

However, I can’t see the pycrsltd library being loaded anywhere. Instead, there’s the hardware.crs.BitsSharp class within psychopy/hardware/crs/bits.py. The example for this class shows

        from psychopy import visual
        from psychopy.hardware import crs

        # we need to be rendering to framebuffer
        win = visual.Window([1024,768], useFBO=True)
        bits = crs.BitsSharp(win, mode = 'mono++')
        # You can continue using your window as normal and OpenGL shaders
        # will convert the output as needed

Is that the current canonical method for using Bits#?

Ultimately I’d like to emulate something like this for the VIEWPixx, but this week I think I’d be content to get high-bitdepth greyscale output working.

The VPIXX python library

The VPIXX software tools package, available from here (may require a user account), contains a Python interface library, pypixxlib. This is currently at version 1.5 and seems quite full-featured as a hardware interface. The documentation for it is here.

Using their class-oriented API allows you to do things like this:

        from pypixxlib.viewpixx import VIEWPixx3D
        # viewpixx and VIEWPixx3D would need to be replaced
        # by the appropriate module and device class
        my_device = VIEWPixx3D()  # opens and initialises the device
        my_device.setVideoMode('M16')  # set the high-bitdepth greyscale video mode
        my_device.updateRegisterCache()  # update the device

which enables high-bitdepth greyscale mode on a VIEWPixx 3D. VPIXX write:

If you create an object that instantiates a class for a device you have connected, all possible functions attached to the object are guaranteed to work and to be applied on the correct device, assuming the operations are permitted.

That seems simple, but I expect it would then be difficult to pass this device object to PsychoPy’s Window class in a way that it understands (e.g. ensuring the LUT and gamma tables are correct, and that bits are passed correctly in the special video modes). I haven’t checked this yet (I’m not in the lab right now) but I should have more info tomorrow.

There is also a lower-level wrapper module _libdpx. VPIXX write:

If you are programming for something more permissible or if you do not know which device will be used, you can use the _libdpx Wrapper Module, which does not guarantee the functions will work, but does guarantee all the functions will be called on the correct devices if you provide the appropriate commands.

These decisions aside, it seems to me that using VPIXX’s pypixxlib is probably the best way to interface with the hardware, since we can then rely on VPIXX to maintain the low-level functionality.

A question here: how should I include a dependency on pypixxlib within PsychoPy? Obviously we don’t want this to be required for all users; rather, we would like the system to check for it only if VPIXX hardware is used. How should I implement that in an import statement?
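One common pattern for this kind of optional dependency (just a sketch — the helper names and the error message are mine, not PsychoPy conventions) is to attempt the import once at module load and only raise when the hardware is actually requested:

```python
import importlib


def optional_import(name):
    """Try to import a module by name; return None if it is unavailable."""
    try:
        return importlib.import_module(name)
    except ImportError:
        return None


# pypixxlib is only needed when VPIXX hardware is requested
pypixxlib = optional_import("pypixxlib")


def require_pypixxlib():
    """Return pypixxlib, or raise a helpful error if it is not installed."""
    if pypixxlib is None:
        raise ImportError(
            "pypixxlib is needed for VPIXX hardware support; "
            "it is distributed with the VPIXX software tools."
        )
    return pypixxlib
```

That way `import psychopy.hardware` would still succeed on machines without pypixxlib, and only constructing the VPIXX device class would fail, with an informative message.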

MonitorCenter

Could someone check my high-level summary of how MonitorCenter works? My understanding is that the MonitorCenter GUI wraps the functionality in calibTools.py. A monitor object is created and saved, which can contain calibration data if a calibration has been run. When an experiment script is run with a monitor name specified, the calibration data and other information for that monitor are loaded, and gamma correction is applied from them.
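If I have that right, the correction step itself amounts to inverting the monitor's response. A minimal sketch, assuming a simple power-law model L = v**gamma (the function name is mine, not from calibTools.py):

```python
import numpy as np


def gamma_correct(value, gamma):
    """Invert an assumed power-law monitor response L = v**gamma,
    so that requested values come out linear in luminance.
    value: normalized gun value(s) in [0, 1]; gamma: fitted exponent."""
    return np.asarray(value, dtype=float) ** (1.0 / gamma)
```

For example, on a gamma-2.0 monitor a requested mid-luminance of 0.25 would be sent to the display as 0.5.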

Does this description differ in any important way for the Bits#? I have read a bit about the difficulty of finding the identity LUT but haven’t examined that carefully yet.

I’m basically hoping to start with a really simple calibration script that would allow me to test high-bitdepth modes, but I want to keep MonitorCenter in mind as an eventual integration goal.
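For that simple calibration script, the fitting step might look something like this — a sketch assuming a power-law model and a photometer reading at each gray level (the function name is hypothetical, not PsychoPy's API):

```python
import numpy as np


def fit_gamma(levels, luminances):
    """Fit gamma in the model L = L_max * v**gamma by linear
    regression in log-log space, where log(L) = gamma*log(v) + log(L_max).
    levels: normalized gun values in (0, 1]; luminances: measurements in cd/m^2.
    Returns the fitted gamma exponent."""
    levels = np.asarray(levels, dtype=float)
    luminances = np.asarray(luminances, dtype=float)
    gamma, _log_lmax = np.polyfit(np.log(levels), np.log(luminances), 1)
    return gamma
```

Measure a handful of gray levels, fit gamma, then use the inverse power law when drawing; later this could feed into a saved monitor object instead.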

Hopefully more info later this week.