I would like to create an experiment in PsychoPy that uses volumetric 3D images as stimuli and allows participants to rotate and zoom in/out on them interactively.
This is not my video, but we are also working with 3D images of luggage, and this is exactly the kind of thing I would like to achieve in PsychoPy: https://www.youtube.com/watch?v=EslqbxdbDWc
I’ve seen that there is now functionality in PsychoPy to load Wavefront .OBJ files, but these don’t allow for the transparency that I require (it’s important that participants can see through the 3D volume to the objects located within). The images I have are currently saved as .VTI files, but they are essentially just a structured 3D grid of data points that I’m able to manipulate in NumPy.
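For reference, this is roughly how I’m getting the volume into NumPy at the moment (a minimal sketch assuming pyvista is available, the .vti holds a single scalar field, and the filename is just a placeholder):

```python
import numpy as np
import pyvista as pv  # can read VTK .vti (ImageData) files

# Load the .vti volume (placeholder filename) and grab the active scalar field
grid = pv.read("bag_scan.vti")      # pyvista ImageData object
scalars = grid.active_scalars       # flat 1D array of voxel values

# Reshape into an (nx, ny, nz) NumPy volume; VTK stores point data in
# Fortran (x-fastest) order relative to grid.dimensions
volume = np.asarray(scalars).reshape(grid.dimensions, order="F")

print(volume.shape, volume.dtype, volume.min(), volume.max())
```

So the question is really whether an array like this can be rendered as a semi-transparent, rotatable volume inside a PsychoPy window.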
Does anyone know if this is possible in PsychoPy?