Hello, I am trying to integrate emotion detection software into the experiment my team created, so that we may be able to run it outside the lab. I have been told about demo_faceapi (Rebecca Hirst / demo_faceapi · GitLab), which I should be able to integrate into the experiment, but I am not sure how to go about it without the readme. Other questions I have are:
- Is it possible to integrate the demo_faceapi within PsychoPy?
- If possible, how hard is it and how skilled do I need to be in programming?
I would appreciate any help!
Thank you for your answer!
I am also interested in whether it is possible to set it up so that the participant does not see the recording of themselves or the emotion being detected. Also, is it possible to create some kind of timestamps so that I would know which part of the experiment the participant was in while a certain emotion was being detected?
I haven’t used the system myself, so I don’t know for certain. I think participants will know the webcam is active, but you can likely set it up so that nothing is displayed to them in real time.
According to the demo, the emotion data are saved on a frame-by-frame basis for each individual trial, and they should end up in your normal data file, marked with which trial they came from and when in that trial they occurred.
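To illustrate the frame-by-frame logging described above, here is a minimal sketch of how per-frame emotion readings could be tagged with a trial label and a within-trial timestamp before being merged into the data file. This is not code from demo_faceapi itself: the logger and its field names are hypothetical, and the emotion label stands in for whatever the face-api.js detector actually returns.

```javascript
// Hypothetical sketch: collect one emotion reading per video frame,
// tagged with the current trial and the time within that trial.
// The detector call itself is omitted; "emotion" here is a placeholder
// for whatever label the face detection library reports.
function makeEmotionLogger() {
  const rows = [];
  return {
    // Call once per frame: trial label, time since trial start (s),
    // and the detected emotion label for that frame.
    log(trial, t, emotion) {
      rows.push({ trial, t, emotion });
    },
    // The collected rows can later be written alongside the
    // experiment's normal data output.
    rows,
  };
}

// Example: two frames from the first trial, one from the second.
const logger = makeEmotionLogger();
logger.log("trial_1", 0.033, "neutral");
logger.log("trial_1", 0.066, "happy");
logger.log("trial_2", 0.033, "surprised");
```

Because every row carries both the trial label and the within-trial time, you can later line up detected emotions with whatever part of the experiment was on screen at that moment.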