Forceplate-driven Biofeedback

PsychoPy version: 2022.2.2
Standard Standalone? Y

I am working on a simple task that instructs participants to move by leaning, shifting their weight, or stepping (3 conditions). Within each condition, they will be instructed to move left, right, forward, or backward. They will be standing on an AMTI force plate embedded in the floor with a TV monitor in front of them displaying the visual instructions. The instructions will be triggered by the force data they generate as they stand or move. For example, to start the task, they will need to stand still for X seconds, which will trigger the screen to display “step to the left”, and a beep will signal the start of the step movement. Once the participant has stepped to the left and is no longer moving, the screen will display “step back to center”, and the loop then repeats 3 more times. This sequence then repeats for right, forward, and backward, and is NOT randomized.
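To make the “stand still for X seconds” trigger concrete, here is a minimal sketch of the detection logic I have in mind, in Python since that is what PsychoPy runs. The window length and force tolerance are placeholders, not calibrated values from my setup:

```python
from collections import deque

class StillnessDetector:
    """Fires once the force signal has stayed within a tolerance band
    for a full window of samples (e.g. 2 s at 1000 Hz = 2000 samples).
    Window length and tolerance below are illustrative placeholders."""

    def __init__(self, window_samples=2000, tolerance_n=5.0):
        self.window = deque(maxlen=window_samples)
        self.tolerance = tolerance_n  # allowed force fluctuation, in newtons

    def update(self, fz):
        """Feed one vertical-force sample; return True once the signal
        has been quiet for the whole window."""
        self.window.append(fz)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet
        return max(self.window) - min(self.window) <= self.tolerance
```

Each incoming force sample would be fed to `update()`, and when it returns True the routine would advance to the “step to the left” cue.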

So far, I have the task almost completed in PsychoPy Builder. There is one loop for each direction within leaning, weight shifting, and stepping (12 loops). I also have MATLAB code that pulls the force plate data streamed from Qualisys Track Manager (QTM) motion capture software, using Qualisys Connect for MATLAB (QCM).

I am trying to figure out how to get the force plate data into PsychoPy so that it triggers the events as needed. I am considering using Lab Streaming Layer (LSL) to receive the force plate data from QCM (MATLAB) and send it on to PsychoPy (Python).

Has anyone used a similar process to drive visual stimuli in PsychoPy? In short, I need help with:

1.) Connecting all data streams between QCM, LSL, and PsychoPy
2.) Using the force plate data inside PsychoPy as a trigger for the stimulus loops