Tracking all keyboard and mouse inputs for an experiment with blind participants who use a screen reader

I am working on an experiment in which participants use the computer and receive interruptions throughout the session. Their performance on the computer-based tasks will be measured in terms of accuracy and time to task completion. I am running two sample groups: one group of visual computer interface users (sighted users) and one group of auditory computer interface users (blind users). I need to collect all mouse and keyboard input throughout the experimental trials with timestamps, as I am measuring time to task completion.

I am running into issues doing so with PsychoPy, and since I am new to using it, they could certainly be user error. I have tried integrating pynput, as well as using ioHub, to collect all mouse and keyboard input and log it with a timestamp. Frankly, I used ChatGPT to give me the code to do so, and I can't determine whether I am on the right track or not.
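For reference, the gist of what I have tried with pynput looks something like the sketch below. The CSV file name, the column layout, and the Esc-to-quit shortcut are just placeholders I picked, not anything final:

```python
# Minimal sketch of a global keyboard/mouse logger using pynput.
# The CSV file name, columns, and Esc-to-quit shortcut are placeholders.
import csv
import threading
import time

from pynput import keyboard, mouse

log_file = open("input_log.csv", "w", newline="")
writer = csv.writer(log_file)
writer.writerow(["timestamp", "device", "event", "detail"])
write_lock = threading.Lock()  # the two listeners run on separate threads


def log(device, event, detail=""):
    # time.time() gives a Unix timestamp in seconds.
    with write_lock:
        writer.writerow([time.time(), device, event, detail])
        log_file.flush()  # flush so a crash does not lose events


def on_press(key):
    log("keyboard", "press", str(key))
    if key == keyboard.Key.esc:
        return False  # returning False stops the keyboard listener


def on_release(key):
    log("keyboard", "release", str(key))


def on_click(x, y, button, pressed):
    log("mouse", "press" if pressed else "release", f"{button} ({x}, {y})")


def on_move(x, y):
    log("mouse", "move", f"({x}, {y})")


# The listeners capture input system-wide, so logging continues while
# the participant works in Excel or any other program.
kb = keyboard.Listener(on_press=on_press, on_release=on_release)
ms = mouse.Listener(on_click=on_click, on_move=on_move)
kb.start()
ms.start()

kb.join()  # blocks until Esc is pressed
ms.stop()
log_file.close()
```

Does something along these lines look like the right track for capturing input while the participant is working in Excel?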

During the experiment, I want participants to work outside of PsychoPy, ideally in Excel for the computer-based tasks. I will need to interrupt them during the trials, so I plan to use PsychoPy to present text or video/audio stimuli for the interruptions.

Any guidance is appreciated!

So are you building an experiment that exists both within PsychoPy (to create the interruptions) and outside of PsychoPy (Excel sheets)? This is going to be pretty difficult and lead to a lot of experimental error, as PsychoPy makes assumptions based on window and screen sizes, and not being the focused program makes it VERY unhappy (reaction times and overall processing suffer).

You are better off:

A: Recreating your “Excel” task in PsychoPy.

B: Creating a Python program that records mouse and keyboard inputs, or maybe using third-party software that can do the same? But then you’d still need to create a Python script to deliver the “interruptions” (see the sketch below).
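For option B, a rough sketch of what the interruption side could look like is below. It assumes Windows (winsound is part of the standard library there) and a pre-recorded audio prompt; the schedule and file names are just placeholders, not anything from this thread:

```python
# Rough sketch of a standalone interruption script.
# Assumes Windows (winsound) and a pre-recorded "interruption.wav" prompt;
# the schedule and file names are placeholders.
import csv
import time
import winsound

SCHEDULE = [120, 300, 480]        # seconds after session start (placeholder)
PROMPT_WAV = "interruption.wav"   # placeholder audio file

with open("interruption_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "event"])
    start = time.time()
    writer.writerow([start, "session_start"])
    f.flush()

    for offset in SCHEDULE:
        # Wait until the next scheduled interruption.
        time.sleep(max(0.0, start + offset - time.time()))
        # Log the onset with the same Unix-time clock as the input logger
        # so the two logs can be aligned afterwards.
        writer.writerow([time.time(), f"interruption_{offset}s"])
        f.flush()
        # SND_FILENAME plays the .wav file and returns when it finishes.
        winsound.PlaySound(PROMPT_WAV, winsound.SND_FILENAME)
```

Because both scripts log plain Unix timestamps, you can line the interruption onsets up against the keyboard/mouse log afterwards to get time to task completion around each interruption.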

Issac

Thanks for your response!

I think my backup plan is to create that Python program.

My concern with recreating the tasks in PsychoPy is accessibility with screen readers. I have read on another forum post that the stimuli are not always screen reader accessible, and I need them to be consistently accessible. Using Excel or another mainstream platform also cuts down on the training time with the participant during the setup of the experiment, prior to the trials.

I am not sure what you mean by screen reader accessible. But if you are referring to PsychoPy being used solely for participant input tracking outside of the PsychoPy window, then yes, PsychoPy is unhappy with that. Depending on what exactly needs to happen in your “Excel” task, you could just recreate Excel to some degree.

But in regard to third-party programs in tandem with Python for the interruptions, I don’t have any recommendations. Maybe use something like OBS; just make sure you have a screen recording of the participant’s actions and inputs during the task in case the mouse/keyboard recording program messes up?

Issac

By screen reader accessible, I mean that one of my sample groups is participants who are blind and use screen readers to access the computer. Screen reader users interact with the computer solely through the keyboard, and some sites/software require mouse use or are not well developed for screen reader access, so the participant would not be able to complete the tasks in that program. I had seen another post (Accessibility of online experiments) describing some barriers, which makes PsychoPy stimuli concerning to use for my experiment.

What is OBS? Given the information you provided about PsychoPy being unhappy with other programs, I do think I will need to consider not using it after all.

I appreciate your feedback!

OBS (Open Broadcaster Software) is a third-party screen recording/streaming program. It can record what is happening on the screen (as in physically seeing what is happening on the screen and where the mouse cursor is). I have used OBS to record portions of my experiments to showcase in presentations.

I know there are some plugins available that also let you include mouse and keyboard inputs as a VISUAL component of the recording. You would still need a program to grab the timestamps automatically unless you wanted to manually scrub through the videos yourself.
Input Overlay | OBS Forums is an example of such a plugin. I have never used these types of plugins with OBS and have no idea whether the one I linked is up to date with the current OBS. I would recommend doing some searches to find something more recent; there are guides for integrating them with OBS.

Another avenue is to maybe ask some visually impaired streamers about the best way to go about it, as they would have lived experience using these programs.

(P.S. Sorry, mods, if linking third-party software outside of PsychoPy is against the rules.)

Issac