Frames being dropped when updating the opacity of multiple stimuli on every frame

I’m coding an experiment where a single trial involves 32 stimuli fading in and out of the display in different locations and at different times. Trials last about 15 seconds, and each stimulus fades in over one second, remains at maximum opacity for one second, and fades out over one second. Multiple stimuli are visible at once during trials. I have done this by defining each stimulus as a separate image with a display location that remains constant during the trial. All that changes during the trial is the opacity of each stimulus, which is updated on every frame. Each stimulus has an opacity of 0 at the start of each trial, so they are invisible. Then, depending on the time in the trial, the opacity of each stimulus will either remain the same, increase by a single step (if it is fading in), or decrease by a single step (if it is fading out). A single step is defined as the maximum opacity of the stimuli (set to 0.8) divided by the number of frames per second (calculated from frameDur). The opacity of each stimulus is updated in the ‘Each Frame’ tab of a Python code component.
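To illustrate, the per-frame logic is roughly the following (a sketch with placeholder names; `stimuli` and `onsets` are stand-ins, while `t` and `frameDur` come from Builder):

```python
# Sketch of the 'Each Frame' logic (placeholder names)
max_opacity = 0.8
step = max_opacity * frameDur   # equivalent to max_opacity / frames-per-second

for stim, onset in zip(stimuli, onsets):   # onsets: fade-in start times
    elapsed = t - onset                    # time since this stimulus began fading in
    if 0 <= elapsed < 1:                   # fading in over 1 s
        stim.opacity = min(stim.opacity + step, max_opacity)
    elif 2 <= elapsed < 3:                 # fading out over 1 s
        stim.opacity = max(stim.opacity - step, 0)
    # between 1 s and 2 s the stimulus simply stays at max opacity
```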

I initially coded an online version of this task in JS to run on Pavlovia and it worked fine. I am now trying to run an offline version of the task and am having issues. The problem is that after the first few seconds of a trial have passed, the stimuli stop fading in to max opacity, instead remaining very faint before fading out. The issue is that frames are being dropped in the middle of trials. The time between successive frames goes from about 0.01 s at the start of trials to about 0.1 s for a period in the middle of trials, before going back down to around 0.01 s again towards the end of each trial. The point at which this slowdown begins differs on every trial, but it always starts to get particularly bad about 4 or 5 seconds after the trial begins. I can’t see that this time corresponds to anything in particular, but the middle of trials is when the most stimuli have an opacity above 0. I have tried running the task with fewer stimuli and I still get the same problem, just less extreme. I have also tried to make my ‘Each Frame’ code more efficient, storing the opacities in an array and updating them simultaneously using logical indexing rather than a separate if statement for each stimulus; however, this didn’t fix anything.
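For reference, the vectorised version looked roughly like this (a sketch with placeholder names, continuing from the logic above):

```python
import numpy as np

# Vectorised update of all opacities at once using logical indexing (sketch;
# 'opacities' is initialised to np.zeros(n) in 'Begin Routine')
elapsed = t - onsets                           # onsets: array of fade-in start times
fading_in = (elapsed >= 0) & (elapsed < 1)
fading_out = (elapsed >= 2) & (elapsed < 3)
opacities[fading_in] += step
opacities[fading_out] -= step
np.clip(opacities, 0, max_opacity, out=opacities)
for stim, op in zip(stimuli, opacities):       # still one attribute set per stimulus
    stim.opacity = op
```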

I am using macOS Catalina, but I still get the same problem when running the task on other Macs and on the 64-bit Windows PCs we use to test participants, so it seems to be an issue with the code rather than with the computer. I coded it using PsychoPy version 2021.2.3, but trying various older versions of PsychoPy didn’t solve the problem either.

I am confused about why I am dropping frames, as the task worked fine online, so perhaps it is a problem specific to my Python code or to PsychoPy offline. Any ideas for how I can avoid dropping frames, or alternative ways to update the opacity of multiple stimuli each frame without dropping frames, would be greatly appreciated! It would also be helpful to know what kinds of things tend to slow down the frame rate in general, and whether it is possible to force a constant frame rate throughout an experiment. Thank you!

Personally, I prefer to make Each Frame updates in code, and only when there is an actual change. Sometimes I also only update when frameN % 2 == 0 (or frameN % 3 == 0) to avoid incurring a resource overhead for imperceptible changes.
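As a sketch (frameN is provided by Builder; `stimuli` and `new_opacities` are placeholder names):

```python
# Only touch the stimuli on every second frame, and only when the value
# has actually changed (sketch; placeholder names)
if frameN % 2 == 0:
    for stim, new_op in zip(stimuli, new_opacities):
        if stim.opacity != new_op:   # skip redundant updates
            stim.opacity = new_op
```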

Thank you very much.

Unfortunately this approach didn’t work for me here: updating only every second or third frame still led to an inconsistent frame rate and just made the changes in opacity look jumpy.

How far did you get with my other suggestion?

i.e. calling object_x.setOpacity(opacity_x) only when the opacity changes.
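Something along these lines (a sketch; `last_opacity` is a hypothetical variable tracking the previous value):

```python
# Call setOpacity only when the value has actually changed (sketch)
if opacity_x != last_opacity:
    object_x.setOpacity(opacity_x)
    last_opacity = opacity_x
```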

Unfortunately that didn’t work in this case either. Using setOpacity actually seemed to result in more frames being dropped than when the opacities were defined as code variables whose values changed on certain frames.

Interesting. Perhaps I should consider going back to variables in components rather than updating in code.

The other thing that occurred to me was that frameDur might be different offline and online, so you may get dropped frames offline because you are trying to update more often.
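You could check what frame rate your window is actually achieving locally, e.g. with Window.getActualFrameRate:

```python
# Measure the frame rate the local window is actually achieving
actual_rate = win.getActualFrameRate(nIdentical=10, nMaxFrames=100)
print('measured frame rate: %s Hz' % actual_rate)
```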

Ah, that’s a good thought, thank you, but I can’t seem to fix it. The code is a totally separate file from my online version, so I don’t think the previous online frameDur would interfere. Manually setting the opacity of stimuli to update based on a frame rate of 30 Hz did make trials look a bit smoother visually, but I’m still dropping lots of frames, so the timing of my study will be off.

In fact, I made a totally new experiment where I just presented my background image for 15 seconds and I still dropped frames, which seems odd as nothing was being updated on each frame. Could this perhaps be a problem with my PsychoPy or Mac settings?

I’m still stuck with this problem. I’ve attached a stripped-down version of the code I’m using - if anyone is able to take a look and suggest what might be the problem, that would be great!

The ‘online’ video shows what the trials should look like, with all stimuli fading in to maximum opacity, as they did when I coded the study in JS. The ‘offline’ video shows what’s happening now with my code in Python, with the stimuli in the second half of trials being much fainter and slower to fade in and out.

I tried running the code on a much more powerful computer recently and the stimuli were not as faint; however, the problem persisted. I therefore think the problem lies in my code rather than the machine I’m using.

Hi there,

That is a fair few stimuli to be simultaneously varying the opacity of on each frame, so it isn’t too surprising that frames are being dropped - though it is surprising that the problem only occurs locally and not in the browser, so let’s talk through some possibilities.

A couple of things I might suggest:

  1. I notice the images are mostly simple shapes - have you tried using polygon components instead of images?
  2. For local use, might I suggest instead using ElementArrayStim? This creates one stimulus made of component parts that can be independently modulated, rather than many separate objects, which is a little more efficient. You can find a demo of how to use it in PsychoPy Coder view > demos > stimuli > elementarrays. That demo draws many copies of the same stimulus, though I believe you can use an array of shapes instead. It will probably feel a bit more complex to start with - though it might be more precise (note element arrays do not work online yet - I’m only suggesting this to resolve the local problem!). A rough sketch is below.
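Here is a rough sketch of the idea (the window settings, locations, and opacity schedule are placeholders for your own values):

```python
from psychopy import visual, core
import numpy as np

win = visual.Window(size=(1024, 768), units='pix')

n_stim = 32
xys = np.random.uniform(-300, 300, size=(n_stim, 2))  # fixed locations (placeholder)
onsets = np.random.uniform(0, 12, size=n_stim)        # per-stimulus fade-in start times
max_opacity = 0.8

stims = visual.ElementArrayStim(
    win, nElements=n_stim, xys=xys, sizes=50,
    elementTex=None, elementMask='circle',  # simple shapes rather than images
    opacities=np.zeros(n_stim))             # all invisible at trial start

clock = core.Clock()
while clock.getTime() < 15:                 # one 15 s trial
    elapsed = clock.getTime() - onsets      # time since each element's onset
    # 1 s fade in, 1 s at max, 1 s fade out, clipped to [0, max_opacity]
    stims.opacities = np.clip(np.minimum(elapsed, 3 - elapsed), 0, 1) * max_opacity
    stims.draw()                            # a single draw call for all 32 elements
    win.flip()

win.close()
```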

Hope this gives you some more suggestions to look into.

Becca

Hi! I was wondering if you were ever able to resolve this problem, and if so, could you kindly share how you went about it?