I’d like participants to be able to tap a button on either side of the screen of a Windows Surface for an on-site IAT test. So far, I’ve run into a few issues.
The first is that, in order for the program to move forward, the screen has to be tapped twice, and I’m not sure why. I used Builder to create a click event so that the routine ends when the mouse is clicked (or, in this case, the equivalent event of the screen being touched). However, this doesn’t seem to translate well to the touchscreen: the screen always has to be tapped twice before the program moves forward. Does anyone have any experience with this or any ideas?
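For context, I believe the Builder click event is roughly equivalent to something like this in the Each Frame tab of a code component (a minimal sketch; ‘mouse’ stands in for whatever the Mouse component in the routine is called):

```python
# Each Frame tab of a code component (minimal sketch of what I think the
# Builder click event amounts to; 'mouse' is the name of my Mouse component)
if mouse.getPressed()[0]:      # the left button / a tap is currently down
    continueRoutine = False    # end the routine on the first tap
```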
The second, more pressing problem is that I’m not sure how to code the ‘touch’ for a specific button. Right now, double-tapping any area of the screen moves the program forward. I’d like to restrict the clickable area to the size of the button (which is composed of a polygon and text). I looked at the API and saw isPressedIn(), which seems like the most relevant function for what I’m trying to accomplish, and since the polygon has a .contains() method, I figured that would be the next step. Unfortunately, in testing, isPressedIn() never returns True. At this point, even dividing the screen in half and storing a different response for left and right would be fine. I think that would require a getPos() call, but I’m not sure where to go from there (a rough sketch of both ideas is below). I don’t have much experience with Python or PsychoPy, so any help here would be really appreciated!
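Here is roughly what I have in mind for the two approaches (a sketch only; ‘mouse’, ‘left_button’, and ‘right_button’ are placeholders for my actual component names, and the left/right split assumes the default window units centred on 0):

```python
# Begin Routine tab
response = None

# Each Frame tab (sketch only; 'mouse', 'left_button', and 'right_button'
# stand in for my actual component names)

# Approach 1: only accept taps that land inside one of the button shapes
if mouse.isPressedIn(left_button):
    response = 'left'
elif mouse.isPressedIn(right_button):
    response = 'right'

# Approach 2 (fallback): treat any tap on the left/right half as a response
# if mouse.getPressed()[0]:
#     x, y = mouse.getPos()
#     response = 'left' if x < 0 else 'right'   # assumes units centred on 0

if response is not None:
    thisExp.addData('touch_side', response)   # record which side was tapped
    continueRoutine = False                   # end the routine
```

Is something like this on the right track, and if so, why might isPressedIn() never be returning True on the touchscreen?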
If anyone can point me in the direction of helpful manuals or resources, or has similar code they’d be willing to share, I’d be very grateful.
Thank you so much for any assistance or ideas!