
Screen Update Latency

I have created a simple RSVP paradigm that flashes images at a given rate, but I am getting a latency of ~85 ms between calling win.flip() and the time the screen actually updates, with a jitter/variance of ~7 ms at best. Is there anything I can do to reduce this latency, or at least decrease the jitter? Are these values similar to what anyone else gets with their devices? How do you typically deal with latencies?

RSVP for Latency Code

from __future__ import division
from psychopy import visual, core
from psychopy.hardware import keyboard
from pylsl import StreamInfo, StreamOutlet
import time

### INPUTS
xpix=1600 #your screen width in pixels
ypix=900 #your screen height in pixels

#how many trials
npics = 100
#how long you want each image to show
t = .5  #seconds

#LSL Setup
info = StreamInfo('PyMarkerStream', 'Markers', 1, 0, 'string', 'myuidw43536')
outlet = StreamOutlet(info)
#initialize markers
markernames = ['s', 'b', 'w', 'k', 'e', 'E']
# start, black, white, keypress, end, escape
         
#give you time to start lab recorder
t2start=5
print("you have ",t2start," seconds to start lab recorder")
time.sleep(t2start)

###

#Initialize window
win = visual.Window(size=[xpix, ypix], units="pix", fullscr=True,
                    waitBlanking=True,  #after a flip, wait for the vertical blank before continuing
                    mouseVisible=False)

#Initialize image stimulus
#use raw strings (or forward slashes) so "\b" and "\w" are not treated as escape sequences
black_img = visual.ImageStim(win=win, units="pix", size=[1200, 800], image=r"*\black.png", name='black')
white_img = visual.ImageStim(win=win, units="pix", size=[1200, 800], image=r"*\white.jpg", name='white')
imgs=[black_img, white_img] #list of images
mkrs=[1,2]

#Initialize a text stimulus to display beginning, end, and breaks
txt= visual.TextStim(win=win, pos=(0,0))

#Initialize a Keyboard to check for escape key to end experiment
kbend = keyboard.Keyboard()
kbpress=keyboard.Keyboard()

#Initialize timer to track trials 
trialtimer=core.CountdownTimer()

###RSVP Experiment Begins
outlet.push_sample(markernames[0])#push start marker
print("Starting Experiment")
core.wait(1)

for j in range(npics): #one trial per image presentation
    trialtimer.reset()
    trialtimer.add(t) #adds time to the countdown
    imgs[j%2].draw()#if even draw black if odd draw white
    
    #send marker
    win.callOnFlip(outlet.push_sample, markernames[int(mkrs[j%2])])
    win.flip()
    
    while trialtimer.getTime()>0: #keep it on screen for said time
        #end experiment if escape was pressed
        if kbend.getKeys(keyList=['escape']):
            outlet.push_sample(markernames[-1])
            win.close()
            core.quit()
        #send marker for any other keypress 
        keys = kbpress.getKeys()
        if keys: #if any other key was pressed
            outlet.push_sample(markernames[3]) #'k' = keypress

#push end marker
outlet.push_sample(markernames[4]) #'e' = end

print("End of Experiment")
win.close()
core.quit()

Latency Calculation
To test the latency of the paradigm I changed the images to black and white, sent markers from PsychoPy to LSL after calling win.flip(), and used a photodetector (also sending markers to LSL via an Arduino serial port) to determine when the screen actually changed. I then subtracted the marker time from the time the screen flip was detected. Here’s an example of the results from a gaming computer with an NVIDIA GeForce GTX 1080 with Max-Q Design graphics card:
[image: distribution of latencies between the LSL marker and the photodetector-detected screen change]
As you can see, the marker is actually being sent ~85ms early.
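The subtraction itself is straightforward; here is a minimal sketch of the offline check in Python, where the timestamps are invented for illustration (not real measurements):

```python
import numpy as np

# Hypothetical per-trial latency check: pair each LSL marker timestamp
# with the photodetector-derived time of the actual screen change.
marker_t = np.array([1.000, 1.500, 2.000, 2.500])  # when the marker was pushed
photo_t = np.array([1.086, 1.581, 2.090, 2.584])   # when the screen changed

latency = photo_t - marker_t  # positive => marker was sent early
print(latency.mean(), latency.std())  # mean ~0.085 s, a few ms of jitter
```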

Things I have already tried (not exhaustive)

  • moving win.flip() before and after sending the marker
  • simplifying the code in the for loop and eliminating the keyboard component
  • testing the timing of the Arduino/photodetector (it consistently takes only 126 µs to read and send the value)
  • trying everything mentioned on the millisecond precision page (turning off triple buffering, 100% scaling, adjusting other graphics card settings, etc.)

The only thing that affected the latency was the machine itself: running the exact same experiment on another laptop with Intel UHD Graphics 630 and NVIDIA Quadro P620 graphics cards gave a worse average latency of ~110 ms with either card, and similar jitter of ~10 ms. Is there anything else I can try?

After reading through the PsychoPy timing paper and checking out their code, I realized one thing I have not tried is timing by frames instead of by clock time.
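In case it helps anyone, this is roughly what the frame-based version of the trial loop would look like. The FakeWindow class below is just a stand-in for visual.Window so the counting logic can run anywhere; in the real script win.flip() blocks until the vertical blank, so counting flips locks each trial to the monitor. The 60 Hz refresh rate is an assumption (measure yours with win.getActualFrameRate()).

```python
REFRESH_HZ = 60    # assumed refresh rate
T_IMAGE = 0.5      # seconds per image
NFRAMES = round(T_IMAGE * REFRESH_HZ)  # 30 frames at 60 Hz

class FakeWindow:
    """Stand-in for visual.Window: flip() just counts refreshes."""
    def __init__(self):
        self.flips = 0
        self._on_flip = []
    def callOnFlip(self, fn, *args):
        self._on_flip.append((fn, args))
    def flip(self):
        for fn, args in self._on_flip:
            fn(*args)
        self._on_flip.clear()
        self.flips += 1

win = FakeWindow()
markers_sent = []
npics = 4

for j in range(npics):
    # marker fires on the first flip of the trial, i.e. with the stimulus
    win.callOnFlip(markers_sent.append, 'b' if j % 2 == 0 else 'w')
    for frame in range(NFRAMES):
        # imgs[j % 2].draw() would go here in the real script
        win.flip()

print(win.flips, markers_sent)  # 120 flips total, one marker per trial
```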

However, since I already have the photodetector set up, I’ve decided to just use it during the experiments and correct the marker times with the photodetector data in MATLAB after the fact.
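The correction amounts to replacing each marker time with the first photodetector onset that follows it; a sketch of the idea in Python (I do it in MATLAB, and these timestamps are invented for illustration):

```python
import numpy as np

# Replace each marker timestamp with the first photodetector onset at or
# after it; the photodetector is treated as ground truth for stimulus onset.
marker_t = np.array([10.00, 10.50, 11.00])            # LSL marker times
photo_t = np.array([10.085, 10.582, 11.091, 12.300])  # detected screen changes

idx = np.searchsorted(photo_t, marker_t)  # first photo event >= each marker
corrected_t = photo_t[idx]
print(corrected_t)  # [10.085 10.582 11.091]
```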

I wouldn’t say this fully solves the underlying issue, so I’d still be happy to hear how anyone else deals with latency and jitter.