Recording infants' looking time

Hi all,

We want to write code that records the duration of infants' looking behaviour when they look at a fixed image on the left or right of the screen.

A trial starts with an image on the left or right of the screen, and we play a continuous stream of auditory samples until the infant looks away for 2 s.

We want to record the infant's attention to the stimulus; however, we want to move on to the next trial if the infant looks away for more than 2 s.

We use the 'up' key to record looking behaviour, but there seems to be a mismatch between the actual time and the recorded time; the values do not look right. Please see the message I got when I ran the study:

```
########### Running: C:\Users\Infant Lab\Desktop\HSs\HSsS_lastrun.py ###########
17014.4448 INFO Loaded monitor calibration from ['2021_05_18 12:31']
pygame 1.9.6
Hello from the pygame community. https://www.pygame.org/contribute.html
up 268626.6679573 0.7888032999471761
1623939166
up 268630.5078914 4.628737399994861
1623939170
up 268634.8439907 8.964836700004525
1623939174
up 268638.131816 12.252661999955308
1623939177
up 268641.507716 15.628561999998055
1623939181
up 268644.3395635 18.460409499995876
1623939183
0.8805
```

My supervisor has kindly helped me write the code, but I assume there is a mismatch between the actual time and the recorded time.

Please see the code below and help me to fix it.

Begin Experiment

```
from psychopy.hardware import keyboard

kb = keyboard.Keyboard()
```

Begin Routine

```
kb.clock.reset()
kb.getKeys(clear=True)
```

Each Frame

```
keys = kb.getKeys(['up'], waitRelease=False)

for thisKey in keys:
    if thisKey.name == 'up':
        print(thisKey.name, thisKey.tDown, thisKey.rt)
        print(clock.getAbsTime())
        if thisKey.tDown - clock.getAbsTime() > 2:
            continueRoutine = False
```

Many thanks in advance.
Best,
Zehra


Hi,

Please use triple backticks to format text as code. This makes the post much more readable.
```
print('Hello World')
```

… yields:

print('Hello World')

When you use waitRelease=False, can you not use thisKey.duration to find the duration of a keypress? Check docs.
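For example, something along these lines in the Each Frame tab could record each look's duration. This is only a minimal sketch, not your code above: it assumes `waitRelease=True`, so that every key returned has already been released and therefore has a valid `.duration` (it is `None` while a key is still held), and it uses `thisExp`, the ExperimentHandler that Builder creates, to save the value.

```
# Each Frame -- minimal sketch, assuming waitRelease=True so that returned keys
# have already been released; .duration is None while a key is still held.
keys = kb.getKeys(['up'], waitRelease=True)
for thisKey in keys:
    # .duration is how long (in seconds) the key was held, i.e. one look
    print(thisKey.name, thisKey.tDown, thisKey.duration)
    thisExp.addData('lookDuration', thisKey.duration)  # save each look to the data file
```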

Thanks for your reply @filiabel. I will try these suggestions and keep working on the code.


The next release of PsychoPy (which should be out in a matter of hours, if all goes well) will include eyetracker support in Builder and a Region Of Interest (ROI) component that does exactly what you describe: it progresses the experiment when the participant looks at, or away from, an area. As with any .0 release, approach with caution and keep an eye out for bug-fix releases over the next few weeks, but I think this will make what you're trying to do a lot simpler :slight_smile:

Hi @TParsons,

That's good to know, and I hope it won't take long :crossed_fingers:
In the meantime, I'll keep working on the code with the current version.

Best wishes,
Zehra


Hello all,

I have the same kind of design for my experiment. I have tried to do it with "look away", but a funny thing happens: the trial stops and moves on to the next stimulus when I look at the screen.
In other words, it does exactly the same thing as the "looking at" condition.
So when I close my eyes the audio stream continues, but whenever I look at the screen it stops (even though I selected the "look away" condition). I want the exact opposite: the audio stream should continue as long as the child is looking, and stop and move on to the next trial when the child is not looking.
Has anyone had the same issue?

Thanks a lot
Best regards

Thank you for reporting the issue. It seems like the 'min look time' is not being used when the ROI is set to end the routine on 'look away'; instead, it ends the routine as soon as the eye has left the ROI (which is also the case when the eye is missing / your eyes are closed).

I am not aware of any way to work around this issue, other than using custom code to create your own ROI-type logic.

Added this to the bug list and will aim to have it fixed for the winter release.

Thanks for the quick reply.
Unfortunately, I cannot wait till the winter release to build this experiment.
Any thoughts on what could go in the custom code component to make this 'end routine on look away' behaviour work?
I have kept the ROI in place and set 'end routine on' to 'none'.

Thanks a lot !
Best regards

Could you create ROIs covering the screen areas that should not be looked at? Then use the working ROI 'looking at' logic to detect when the participant's gaze enters one of those ROIs and end the routine.

> Any thoughts on what could go in the custom code component to make this 'end routine on look away' behaviour work?

All of the ROI-type logic would need to be implemented as custom code. This post has some example custom code for checking the eye position on each frame of a routine.
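As a rough sketch, the per-frame check could look something like the code below. All of the names are placeholders: it assumes Builder has created an `eyetracker` device, that a `core.Clock()` called `lookAwayTimer` was reset in Begin Routine, that `roi` is a stimulus covering the area of interest, and that the gaze and stimulus coordinates are in the same units.

```
# Each Frame -- hedged sketch of custom 'end routine on look away' logic.
gazePos = eyetracker.getLastGazePosition()
# getLastGazePosition() only returns a coordinate pair while gaze is valid;
# treat missing gaze (eyes closed / off screen) as looking away.
gazeValid = isinstance(gazePos, (tuple, list))

if gazeValid and roi.contains(gazePos[0], gazePos[1]):
    # gaze is on the region of interest: restart the look-away timer
    lookAwayTimer.reset()
elif lookAwayTimer.getTime() > 2.0:
    # gaze has been away (or lost) for more than 2 s: move on to the next trial
    continueRoutine = False
```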

That could actually be a good solution, but the video being played is a spinning loop (a distractor) while the audio plays. My ROI is actually the full screen, and I wanted the experiment to go to the next stimulus as soon as the child stops looking at the screen itself.
Maybe I can try centring the video and making it smaller, so that the rest of the screen is just the grey PsychoPy background; whenever the child looks at the grey area, we move on to the next stimulus.
I'm not sure it will work, though, if the child simply stops looking at the spinning loop without looking at the grey area around the video.
Whatever I end up doing, I will let you know.
Thank you so much for your help!