Show only part of a video stimulus in PsychoPy

Is it possible to have PsychoPy’s MovieStim play only a certain spatial portion of a movie file?

I would like to be able to show 0-25% of the x-extent of a stimulus, then 25-41%, and so on. Manually cropping the videos is not feasible for me, because I need to control the crop coordinates at runtime.

I am currently using this code, and I was hoping to get the effect I want with the MovieStim.size attribute somehow, but that invariably controls the size of the entire movie.

No, this isn’t a property of MovieStim, but it should be possible using an Aperture of varying size, I think. Worst-case scenario, you could cover your movie with squares that are the same color as the background :wink:
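For the worst-case occluder approach, something along these lines should work; the filename and window settings here are just placeholders, and the occluder’s fill simply needs to match your window’s background color:

from psychopy import visual
from psychopy.constants import FINISHED

win = visual.Window(size=(800, 600), units='norm', color='grey')
mov = visual.MovieStim(win, 'clip.mp4', size=(2, 2))  # placeholder filename

# Cover everything to the right of x = -0.5 (norm units), so only the
# leftmost 25% of the movie remains visible.
occluder = visual.Rect(win, width=1.5, height=2, pos=(0.25, 0),
                       fillColor='grey', lineColor='grey')

while mov.status != FINISHED:
    mov.draw()
    occluder.draw()  # drawn after the movie, so it sits on top
    win.flip()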

Could you show me an example of how to use the Aperture object?

I tried a few things, including dropping this

visual.Aperture(win, size=0.2, pos=(0, 0), ori=0, nVert=120, shape='square', inverted=False, units="norm", name=None, autoLog=None)

at various points in my code, but it seems to do nothing.
Do I need to somehow pass it to the MovieStim object?

There’s an example in the demos>stimuli menu.

No, you don’t pass it to anything. When enabled (which it is by default), it is applied to any other drawing being performed.
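Roughly like this, for example (the filename is just a placeholder; the key point is that the Aperture object merely needs to exist and be enabled while you draw):

from psychopy import visual
from psychopy.constants import FINISHED

win = visual.Window(size=(800, 600), units='norm')
mov = visual.MovieStim(win, 'clip.mp4', size=(2, 2))  # placeholder filename

# Keep a reference to the aperture; while it is enabled, every subsequent
# draw() call is clipped to its shape.
aperture = visual.Aperture(win, size=0.5, pos=(0, 0), shape='square', units='norm')
aperture.enabled = True  # this is the default anyway

while mov.status != FINISHED:
    mov.draw()  # only the part of the movie inside the aperture is visible
    win.flip()

aperture.enabled = False  # stop clipping once you are done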

Strange that it hasn’t worked so far, then. But I hope the usage example will clear things up.
I can’t, however, find the demo you mentioned (I have psychopy-1.84.0 installed):

PsychoPy2 Demos/
├── BART
│   ├── balloons.xlsx
│   ├── bang.wav
│   ├── bart.psyexp
│   ├── blueBalloon.png
│   ├── greenBalloon.png
│   ├── README.txt
│   ├── redBalloon.png
│   └── trialTypes.xlsx
├── branchedExp
│   ├── branching.psyexp
│   ├── README.txt
│   └── trialTypes.xlsx
├── __init__.py
├── __init__.pyc
├── __init__.pyo
├── iohub
│   ├── stroop_eyetracking
│   │   ├── LC_eyegaze_std.yaml
│   │   ├── README.txt
│   │   ├── SMI_iview_std.yaml
│   │   ├── SRR_eyelink_std.yaml
│   │   ├── stroop.psyexp
│   │   ├── tobii_std.yaml
│   │   └── trialTypes.xlsx
│   └── stroop_keyboard
│       ├── iohub_config.yaml
│       ├── README.txt
│       ├── stroop.psyexp
│       └── trialTypes.xlsx
├── mental_rotation
│   ├── F.png
│   ├── FR.png
│   ├── MentalRot.csv
│   ├── MentalRot.psyexp
│   └── README.txt
├── navon
│   ├── bigHsmallH.png
│   ├── bigHsmallS.png
│   ├── bigSsmallH.png
│   ├── bigSsmallS.png
│   ├── mask.png
│   ├── NavonTask.psyexp
│   ├── README.txt
│   └── trialTypes.xlsx
├── practical IAT
│   ├── All.csv
│   ├── All_rev.csv
│   ├── Creat_Prac.csv
│   ├── Creat_Prac_rev.csv
│   ├── Good_Bad.csv
│   └── IAT.psyexp
├── psychophysicsStaircase
│   ├── psychophysicsStaircase.psyexp
│   └── README.txt
├── psychophysicsStairsInterleaved
│   ├── interleaved_SF_contrast.psyexp
│   ├── README.txt
│   └── stairDefinitions.xlsx
├── ratingScales
│   └── ratingScaleBuilder.psyexp
├── README.txt
├── sternberg
│   ├── mainTrials.xlsx
│   ├── pracTrials.xlsx
│   ├── README.txt
│   └── sternberg.psyexp
├── stroop
│   ├── README.txt
│   ├── stroop.psyexp
│   └── trialTypes.xlsx
├── stroopExtended
│   ├── README.txt
│   ├── stroop.psyexp
│   ├── stroopReverse.psyexp
│   ├── trialTypesReverse.xlsx
│   └── trialTypes.xlsx
├── voiceCapture
│   └── voiceCapture.psyexp
└── word_naming
    ├── conditions.csv
    └── word_naming.psyexp

Those are the Builder demos, but you’re coding. Coder has its own demos menu. You can also see the Coder demos linked here if you aren’t using Coder.

OK, I was looking in the wrong window.
Perhaps unsurprisingly, I found the aforementioned demo under the Coder window.

The mistake with my example was that I did not assign the call to a variable; the correct usage would have been:

aperture = visual.Aperture(win, size=0.2, pos=(0, 0), ori=0, nVert=120, shape='square', inverted=False, units="norm", name=None, autoLog=None)

This does what I initially wanted, except that it requires some more roundabout calculations to define the position of the video region of interest.
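For reference, one way to do that calculation, assuming the movie spans the whole window in norm units (x running from -1 to 1), is to build the aperture from explicit vertices; x_band_aperture below is just a helper made up for illustration:

from psychopy import visual
from psychopy.constants import FINISHED

win = visual.Window(size=(800, 600), units='norm')
mov = visual.MovieStim(win, 'clip.mp4', size=(2, 2))  # placeholder filename

def x_band_aperture(win, lo, hi):
    # Expose the horizontal band from lo to hi (as fractions of the window
    # width, 0-1) at full height, assuming the movie fills the window.
    x_lo = -1 + 2 * lo
    x_hi = -1 + 2 * hi
    verts = [(x_lo, -1), (x_lo, 1), (x_hi, 1), (x_hi, -1)]
    return visual.Aperture(win, size=1, pos=(0, 0), shape=verts, units='norm')

aperture = x_band_aperture(win, 0.25, 0.41)  # show the 25-41% band

while mov.status != FINISHED:
    mov.draw()
    win.flip()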