I’m coding an experiment where a single trial involves 32 stimuli fading in and out of the display in different locations and at different times. Trials last about 15 seconds, and each stimulus fades in over one second, remains at maximum opacity for one second, and fades out over one second. Multiple stimuli are visible at once during a trial. I have done this by defining each stimulus as a separate image with a display location that remains constant during the trial. All that changes during the trial is the opacity of each stimulus, which is updated on every frame. Each stimulus has an opacity of 0 at the start of each trial, so it is invisible. Then, depending on the time in the trial, the opacity of each stimulus will either remain the same, increase by a single step (if it is fading in), or decrease by a single step (if it is fading out). A single step is defined as the maximum opacity of the stimuli (set to 0.8) divided by the number of frames per second (calculated from frameDur). The opacity of each stimulus is updated in the ‘Each Frame’ tab of a Python code component.
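To make this concrete, here is a simplified sketch of the per-stimulus logic (the names stim_01 and onset_01 are placeholders for my actual variables; t and frameDur are provided by Builder):

```python
# 'Begin Routine' tab:
max_opacity = 0.8
fps = 1.0 / frameDur             # frameDur = frame duration measured by Builder
step = max_opacity / fps         # a one-second fade spans roughly fps frames
stim_01.opacity = 0              # all stimuli start invisible

# 'Each Frame' tab (t = time elapsed in the routine):
if onset_01 <= t < onset_01 + 1:            # fading in (1 s)
    stim_01.opacity = min(stim_01.opacity + step, max_opacity)
elif onset_01 + 2 <= t < onset_01 + 3:      # fading out (1 s)
    stim_01.opacity = max(stim_01.opacity - step, 0)
# between 1 s and 2 s after onset the opacity just stays at max
```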
I initially coded an online version of this task in JavaScript to run on Pavlovia and it worked fine. I am now trying to run an offline version of the task and having issues. The problem is that after the first few seconds of a trial have passed, the stimuli stop fading in to maximum opacity, instead remaining very faint before fading out. The cause is that frames are being dropped in the middle of trials: the time between successive frames goes from about 0.01 s at the start of a trial to about 0.1 s for a period in the middle, before dropping back to around 0.01 s towards the end. Exactly when this slowdown happens differs from trial to trial, but it always becomes particularly bad about 4 or 5 seconds after the trial begins. I can’t see that this time corresponds to anything in particular, although the middle of a trial is when the most stimuli have an opacity above 0. I have tried running the task with fewer stimuli and I still get the same problem, just less extreme. I have also tried to make my ‘Each Frame’ code more efficient, storing the opacities in an array and updating them all at once using logical indexing rather than a separate if statement per stimulus, but this didn’t fix anything (sketch below).
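Roughly, the vectorised version looks like this (stims and stim_onset_times are placeholders for my actual list of image stimuli and their onset times):

```python
import numpy as np

# 'Begin Routine' tab:
max_opacity = 0.8
step = max_opacity / (1.0 / frameDur)    # same step size as before
opacities = np.zeros(32)                 # one opacity per stimulus
onsets = np.array(stim_onset_times)      # one onset time per stimulus

# 'Each Frame' tab: update every stimulus at once via logical indexing
fading_in = (onsets <= t) & (t < onsets + 1)
fading_out = (onsets + 2 <= t) & (t < onsets + 3)
opacities[fading_in] = np.minimum(opacities[fading_in] + step, max_opacity)
opacities[fading_out] = np.maximum(opacities[fading_out] - step, 0)
for stim, op in zip(stims, opacities):
    stim.opacity = op
```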
I am using macOS Catalina, but I get the same problem when running the task on other Macs and on the 64-bit Windows PCs we use to test participants, so it seems to be an issue with the code rather than with the computer. I coded it using PsychoPy version 2021.2.3, but I have also tried various older versions of PsychoPy and this doesn’t solve the problem.
I am confused about why I am dropping frames, as the task worked fine online, so perhaps it is a problem specific to my Python code or to PsychoPy running offline. Any ideas for how I can avoid dropping frames, or alternative ways to update the opacity of multiple stimuli on each frame without dropping frames, would be greatly appreciated! It would also be helpful to know what kinds of things tend to slow down the frame rate in general, and whether it is possible to force a fixed frame rate throughout an experiment. Thank you!