Arduino sends trigger fast but visual stimulus is delayed by 3–4 frames


OS: macOS 15.4.1 and Windows 10
PsychoPy version: 2024.2.4
Standard Standalone Installation? (y/n) y
Do you want it to also run online? (y/n) n
What are you trying to achieve?:
My goal is to estimate the latency of my experimental setup with an Arduino. To test this, I have a simple setup: the Arduino sends a trigger (0/1) every 2000 ms over serial. A black rectangle sits in the middle of the screen. When PsychoPy receives the trigger, the rectangle changes color to white for 500 ms and then back to black. A photodiode connected to the Arduino detects the moment the rectangle turns white, which gives the latency between the trigger transmission and the color change.

What did you try to make it work?:
I use Builder. In the “Begin Experiment” tab, I set up serial communication with the Arduino in a background thread using Python’s threading module. The Arduino transmits a trigger flag to the serial port every 5 ms. The flag turns to 1 every 2000 ms, stays at 1 for 500 ms (all timings rely on the Arduino’s internal timer), and then goes back to 0. The background thread pushes each flag into a Python deque (from the collections module). In the “Each Frame” tab, which runs more slowly (once per screen refresh), I read the last value in the deque, and if that value is 1 I change the fill color of a rectangle (Polygon component) from black to white. This way the data stream is very fast (every 5 ms), and the graphical presentation relies on the most recently arrived value, plus some latency.
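
Roughly, the “Begin Experiment” code looks like this (simplified sketch; the real port name, baud rate, and message parsing are different in my experiment):

# "Begin Experiment" tab (simplified sketch; port name, baud rate and parsing
# are placeholders, not my exact code)
import threading
from collections import deque

import serial  # pyserial

ser = serial.Serial('/dev/cu.usbmodem1101', 115200, timeout=0.01)  # example port
data_queue = deque(maxlen=1000)  # the background thread appends the latest flag here

def read_serial():
    # runs in the background: the Arduino sends a flag (0 or 1) roughly every 5 ms
    while True:
        line = ser.readline().decode(errors='ignore').strip()
        if line:
            data_queue.append(int(line[0]))  # first character is the flag in this sketch

reader_thread = threading.Thread(target=read_serial, daemon=True)
reader_thread.start()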

I use this code in the “Each Frame” tab:

# read the most recent flag from the deque (fall back to 0 until the first value arrives):
flag = data_queue[-1] if data_queue else 0

# change the fill color of the rectangle based on the flag:
if flag == 1:
    flashRect.fillColor = 'white'
else:
    flashRect.fillColor = 'black'

What specifically went wrong when you tried that?:
My expectation was that the rectangle would change color on the next frame after the flag was received, i.e. a latency of <16.7 ms. However, the measured latency was larger, around ~66 ms, meaning that after the trigger is sent the rectangle changes color after 3–4 frames rather than 1. Naturally, my first thought was that the serial transmission was delayed and the flag was arriving late. To test that, in the “Each Frame” tab I also saved some other timing info transmitted from the Arduino. Reviewing those logs, I confirmed that the flag had already arrived at frame N but the rectangle only changed color at frame N+4. This behavior is very consistent on both the Windows and Mac setups I tested. So it seems PsychoPy takes 3–4 frames to change the color of a rectangle, but that sounds unreasonable, so maybe the color-change code I am using is not optimal, or I need to change some setting so the color change takes effect on the next frame.
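
For reference, the per-frame check I do is essentially this (simplified; the extra Arduino timing fields are left out):

# "Begin Routine" tab (simplified):
frame_log = []

# "Each Frame" tab (simplified): record the frame number, the flag just read,
# and the fill color drawn on this frame, so I can compare them afterwards
frame_log.append((frameN, flag, str(flashRect.fillColor)))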

Thanks for any insight and help you guys can provide!

Hello @alighavam

Does this excerpt from section 3.7.1 ‘Fast and slow functions’ of the manual help?

Modern graphics processors are extremely powerful; they can carry out a great deal of processing from a very small number of commands. Consider, for instance, the Coder demo elementArrayStim, in which several hundred Gabor patches are updated frame by frame. The graphics card has to blend a sinusoidal grating with a grey background, using a Gaussian profile, several hundred times, each at a different orientation and location, and it does this in less than one screen refresh on a good graphics card. There are three things that are relatively slow and should be avoided at critical points in time (e.g. when rendering a dynamic or brief stimulus). These are:
1. disk accesses
2. passing large amounts of data to the graphics card
3. making large numbers of python calls.
Functions that are very fast:
1. Calls that move, resize, rotate your stimuli are likely to carry almost no overhead
2. Calls that alter the color, contrast or opacity of your stimulus will also have no overhead IF your graphics card supports OpenGL Shaders
3. Updating of stimulus parameters for psychopy.visual.ElementArrayStim is also surprisingly fast, BUT you should try to update your stimuli using numpy arrays for the maths rather than for… loops
Notably slow functions in PsychoPy calls:
1. Calls to set the image or set the mask of a stimulus. This involves having to transfer large amounts of data between the computer’s main processor and the graphics card, which is a relatively time-consuming process.
2. Any of your own code that uses a Python for… loop is likely to be slow if …
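
Untested, but to check that the color change itself is not the slow step, you could time it outside of Builder with something like this (a minimal sketch; the window and stimulus settings are just placeholders):

# Minimal sketch (untested): measure how long the fillColor update itself takes,
# separately from the draw and the flip
from psychopy import visual, core

win = visual.Window(fullscr=True, units='pix')
rect = visual.Rect(win, width=200, height=200, fillColor='black')

set_times = []
for frame in range(120):
    t0 = core.getTime()
    rect.fillColor = 'white' if frame % 2 else 'black'  # the call being timed
    set_times.append(core.getTime() - t0)
    rect.draw()
    win.flip()  # blocks until the next screen refresh

print('mean fillColor update (ms):', 1000 * sum(set_times) / len(set_times))
win.close()
core.quit()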

Or this, from section 3.8.9 ‘Reducing dropped frames’?

1. run in full-screen mode (rather than simply filling the screen with your window). This way the OS doesn’t have to spend time working out what application is currently getting keyboard/mouse events.
2. don’t generate your stimuli when you need them. Generate them in advance and then just modify them later with methods like setContrast(), setOrientation(), etc.
3. calls to the following functions are comparatively slow; they require more CPU time than most other functions and then have to send a large amount of data to the graphics card. Try to use these methods in inter-trial intervals. This is especially true when you need to load an image from disk too as the texture.
• GratingStim.setTexture()
• RadialStim.setTexture()
• TextStim.setText()
4. if you don’t have OpenGL 2.0 then calls to setContrast, setRGB and setOpacity will also be slow, because they also make a call to setTexture(). If you have shader support then this call is not necessary and a large speed increase will result.
5. avoid loops in your python code (use numpy arrays to do maths with lots of elements). Note: numpy arrays will not work for online experiments, which use JavaScript
6. if you need to create a large number (e.g. greater than 10) similar stimuli, then try the ElementArrayStim (currently not supported for online experiments)
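
Untested, but if you want to see where the ~66 ms goes, you could also timestamp the flip that actually shows the color change and compare it with the moment the flag is read, e.g. in your code component (names such as flash_on, flag_seen_t and log_flash_flip are just placeholders):

# "Begin Routine" tab (sketch):
flash_on = False
flag_seen_t = 0.0

def log_flash_flip():
    # called by the window right after the flip that shows the white rectangle
    print('flag-to-flip (ms):', 1000 * (core.getTime() - flag_seen_t))

# "Each Frame" tab (sketch):
flag = data_queue[-1] if data_queue else 0

if flag == 1 and not flash_on:
    flash_on = True
    flag_seen_t = core.getTime()      # when the code component first saw the 1
    win.callOnFlip(log_flash_flip)    # runs right after the next win.flip()
    flashRect.fillColor = 'white'
elif flag == 0 and flash_on:
    flash_on = False
    flashRect.fillColor = 'black'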

Best wishes,
Jens