Emotion detection software in online experiment

Hello, I am trying to integrate emotion detection software into an experiment my team created, so that we can possibly run it outside the lab. I have been pointed to demo_faceapi (Rebecca Hirst / demo_faceapi · GitLab), which I should be able to integrate into the experiment, but without a README I am not sure how to go about it. My other questions are:

  1. Is it possible to integrate the demo_faceapi within PsychoPy?
  2. If possible, how hard is it and how skilled do I need to be in programming?

I would appreciate any help!
  1. Yes. If you get the ‘face_api.js’ file from that repository, add it to your experiment folder, and add it to the “resources” list in the experiment settings, then it should be accessible to code components in your experiment.
  2. A bit. You can probably copy most of the code you need from the code components in the experiment you linked. Just grab the .psyexp file and look at what the code components do. Most of it looks pretty modular, so you should be able to copy the code into whatever trial you want to record face data from. If you want to do something more complex than just recording, however, that will be a different story.
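As a rough sketch of the kind of logic those code components contain: face-api.js's `detectSingleFace(...).withFaceExpressions()` resolves to a result whose `expressions` property maps emotion names to confidence scores. Picking the dominant emotion from such an object (the scores below are made up for illustration) might look like:

```javascript
// Hypothetical helper for a PsychoPy JS code component.
// `expressions` is assumed to be a plain object mapping emotion
// names (e.g. 'happy', 'neutral') to scores in [0, 1], as returned
// by face-api.js's withFaceExpressions().
function dominantExpression(expressions) {
  let best = null;
  let bestScore = -Infinity;
  for (const [emotion, score] of Object.entries(expressions)) {
    if (score > bestScore) {
      bestScore = score;
      best = emotion;
    }
  }
  return { emotion: best, score: bestScore };
}

// Example with made-up scores:
const result = dominantExpression({ neutral: 0.1, happy: 0.85, sad: 0.05 });
// result.emotion === 'happy', result.score === 0.85
```

This is only the frame-level bookkeeping; the actual webcam capture and model loading would come from the demo's own code components.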

Thank you for your answer!
I am also interested in whether it is possible to set it up so that the participant does not see the recording of themselves or the emotion being detected. Also, is it possible to create some kind of timestamps so I would know which part of the experiment the participant was in while a certain emotion was being detected?

I haven’t used the system myself, so I don’t know for sure. I think participants will know the webcam is active, but you can likely set it up so that nothing is displayed to them in real time.

According to the demo, the emotion data are saved on a frame-by-frame basis for each individual trial, and they should end up in your normal data file, marked with which trial they came from and when in that trial they occurred.
