Windows 7 use of touchscreen monitor as mouse - unexpected behavior

Hi all,

I’m running the latest standalone version of PsychoPy on a Windows 7 machine. We have two mirrored monitors connected. The primary display (screen 0, a ViewSonic touchscreen) is the one participants use during the experiment. Outside of PsychoPy the touchscreen works as expected: finger presses are interpreted as single left-button mouse clicks.

I’ve pre-tested my experiment using an ordinary mouse - the script uses the functions mouse.getPressed() and mouse.isPressedIn() during two different routines - and everything works fine.

With the touchscreen, however, a single finger press/touch is not recognized as a single left-button click. I figured out that I need to touch the screen in one position, then move my finger and release it in a different position - more of a swiping motion than a click.

What makes this even stranger is that in the logfile the single finger presses are registered exactly as one would expect from a single mouse click.

Here are some excerpts from the beginning of the experiment’s log. The instructions are shown and the participant is asked to touch the screen to continue. The first few logged responses are single finger presses that fail to end the routine.

9.8566 	EXP 	instr: autoDraw = True
11.4736 	DATA 	Mouse: Left button down, pos=(1263,334)
11.4736 	DATA 	Mouse:  Left button up, pos=(1263,334)
16.7396 	DATA 	Mouse: Left button down, pos=(1437,308)
16.7397 	DATA 	Mouse:  Left button up, pos=(1437,308)
20.3412 	DATA 	Mouse: Left button down, pos=(1423,384)
20.3413 	DATA 	Mouse:  Left button up, pos=(1423,384)
20.4728 	DATA 	Mouse: Left button down, pos=(1423,384)
20.4729 	DATA 	Mouse:  Left button up, pos=(1423,384)
25.0736 	DATA 	Mouse: Left button down, pos=(1204,663)
25.0737 	DATA 	Mouse:  Left button up, pos=(1204,663)
27.8057 	DATA 	Mouse: Left button down, pos=(1218,371)
27.8057 	DATA 	Mouse:  Left button up, pos=(1218,371)
29.8221 	DATA 	Mouse: Left button down, pos=(984,287)
29.8346 	EXP 	Imported lists/test_list.csv as conditions, 16 conditions, 31 params
29.8351 	EXP 	Created sequence: sequential, trialTypes=16, nReps=1, seed=None
29.8362 	EXP 	New trial (rep=0, index=0): {'R_item': u'R_T_spec_4', 'br': u'Sperber', 'item': 4, 'bl': u'Geier', 'confusion': u'Falken', 'cond_code': 120, 'item_code': 4, 'rel_code': 160, 'cond': u'spec_sym', 'distractor1': u'Geier', 'distractor2': u'Sperber', 'L_item': u'L_T_spec_4', 'conf_loc': u'topright', 'R_sentence': u'Karin bestaunte morgens zw\xf6lf Falken', 'sem_cue': u'Raubv\xf6gel', 'correct_loc': u'topleft', 'stim': u'audio_training/T_spec_4.wav', 'probed_code': 181, 'fixdur1': 138, 'L_sentence': u'Thomas zeichnete mittags drei Adler', 'fixdur2': 95, 'tr': u'Falken', 'presented2': u'Falken', 'presented1': u'Adler', 'tl': u'Adler', 'L_final': u'Adler', 'rel_cue': u'pics/20_80.png', 'sem_code': 130, 'R_final': u'Falken', 'correct': u'Adler', 'probed': u'L'}
29.8551 	EXP 	instr: autoDraw = False
29.8551 	EXP 	fix1: autoDraw = True
30.1058 	DATA 	Mouse:  Left button up, pos=(1088,284)
32.1466 	EXP 	Set  sound=audio_training/T_spec_4.wav
32.1550 	EXP 	fix1: autoDraw = False

Using the mouse instead of the touchscreen:

9.5666 	EXP 	instr: autoDraw = True
10.3668 	DATA 	Mouse: Left button down, pos=(772,647)
10.3784 	EXP 	Imported lists/test_list.csv as conditions, 16 conditions, 31 params
10.3789 	EXP 	Created sequence: sequential, trialTypes=16, nReps=1, seed=None
10.3798 	EXP 	New trial (rep=0, index=0): {'R_item': u'R_T_spec_4', 'br': u'Sperber', 'item': 4, 'bl': u'Geier', 'confusion': u'Falken', 'cond_code': 120, 'item_code': 4, 'rel_code': 160, 'cond': u'spec_sym', 'distractor1': u'Geier', 'distractor2': u'Sperber', 'L_item': u'L_T_spec_4', 'conf_loc': u'topright', 'R_sentence': u'Karin bestaunte morgens zw\xf6lf Falken', 'sem_cue': u'Raubv\xf6gel', 'correct_loc': u'topleft', 'stim': u'audio_training/T_spec_4.wav', 'probed_code': 181, 'fixdur1': 138, 'L_sentence': u'Thomas zeichnete mittags drei Adler', 'fixdur2': 95, 'tr': u'Falken', 'presented2': u'Falken', 'presented1': u'Adler', 'tl': u'Adler', 'L_final': u'Adler', 'rel_cue': u'pics/20_80.png', 'sem_code': 130, 'R_final': u'Falken', 'correct': u'Adler', 'probed': u'L'}
10.3999 	EXP 	instr: autoDraw = False
10.3999 	EXP 	fix1: autoDraw = True
10.5001 	DATA 	Mouse:  Left button up, pos=(772,647)
12.6919 	EXP 	Set  sound=audio_training/T_spec_4.wav
12.6919 	EXP 	window1: recordFrameIntervals = False
12.6998 	EXP 	fix1: autoDraw = False

Is this a timing issue? It seems that the button-down and button-up events are logged almost at the same time for the touchscreen. I’ve already tried holding my finger down on the screen a bit longer, but it seems that moving it to a different location is the only workaround for now.

Any ideas on how to solve this problem? I’m working with older participants who sometimes have trouble using a mouse, so I would like to avoid asking them to swipe instead of touching/clicking the screen.

Thanks,
Sarah

Hi Sarah - this may be because the touch signal is subject to much more interference than a mouse click - you’ll see that the down and up events on the touchscreen are much closer together than with a mouse click. Check out this thread and see if it helps at all:

BW

Oliver

Hi Oliver,

Thank you for your response. I followed your instructions from the linked thread, but unfortunately this didn’t solve my problem. Using the additional while loop, “Pressed” is only set to “True” if I hold down my finger and move it slightly, so single finger presses are again not registered as single mouse button presses. I also suspect it has to do with the fact that the down and up events are much closer together than those of an actual mouse.
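
For reference, here is a minimal sketch of the kind of polling loop I’m using (simplified, not my exact code):

from psychopy import visual, event

win = visual.Window(fullscr=True)
mouse = event.Mouse(win=win)

pressed = False
while not pressed:
    # getPressed() reports which buttons are held *right now*, so a
    # press and release that both fall between two checks are missed
    if mouse.getPressed()[0]:  # left button (or finger) currently down
        pressed = True
    win.flip()  # one check per screen refresh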

Any other clues on how to solve this?

Thanks,
Sarah

Hi Sarah,

I don’t have the same set up as you so I can’t do any testing or checking. Have you tried it on a single monitor?

BW

Oli

Hi Sarah,
I suspect this depends on three facts:

1 - from the log you provided, it looks like a touch is translated into two almost instantaneous events, a PRESS and a RELEASE
2 - the mouse component stores (asynchronously) only the last event that occurred
3 - within a routine loop the mouse is checked at every flip(), i.e. roughly every 16.7 ms on a 60 Hz screen, depending on the vertical refresh rate of your monitor

If the mouse events are triggered at a higher frequency, you’re very likely going to miss some or all of them.
Of course normally everything works fine, since it’s hard to imagine someone clicking and releasing a mouse button that fast.

Anyway, I had a similar problem in the past. To work around it, I changed the callbacks _onPygletMousePress and _onPygletMouseRelease (in the event.py module) so that they would store the mouse history whenever I programmatically set a special flag. Then I added a couple of functions: one to set the flag, the other to be called within the PsychoPy routine loop to retrieve and empty the mouse history between one call and the next.
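
In outline, it looked something like this (a sketch, not my exact code - the names startMouseHistory and getMouseHistory are just illustrative):

# Additions to psychopy/event.py (sketch; names are illustrative)
_trackMouseHistory = False
_mouseHistory = []

def startMouseHistory():
    """Set the flag so the pyglet callbacks accumulate events."""
    global _trackMouseHistory, _mouseHistory
    _trackMouseHistory = True
    _mouseHistory = []

def getMouseHistory():
    """Return events accumulated since the last call, then clear."""
    global _mouseHistory
    events, _mouseHistory = _mouseHistory, []
    return events

# ...and inside _onPygletMousePress / _onPygletMouseRelease, append
# something like ('press' / 'release', (x, y), core.getTime()) to
# _mouseHistory whenever _trackMouseHistory is True.

Within the routine loop you then call getMouseHistory() once per flip and process every event it returns, so nothing is lost between frames.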

I should probably ask for this feature to be introduced in a future version of PsychoPy.

L.

Hi Luca,

Thanks for your reply. Your explanation makes a lot of sense and helped me understand the cause of the problem much better.

Would you mind sharing the code for the adaptations you made to the event.py module? I can follow your proposed changes logically, but I’m not sure I would be able to implement them myself.
I also think it would be great if such a feature were incorporated generally, since I’m probably not the only one facing these issues.

Thanks again,
Sarah

If you just need to register button presses anywhere, you could also use ioHub:

from __future__ import print_function
from psychopy import iohub

io = iohub.launchHubServer()
mouse = io.devices.mouse
mouse.clearEvents()

while True:
    e = mouse.getEvents(event_type=(iohub
                                    .EventConstants
                                    .MOUSE_BUTTON_PRESS))

    if e:
        break

print('Received button press!')

However, there is no equivalent to event.Mouse.isPressedIn() in ioHub, I’m afraid.

Thanks Richard, this would definitely help with the instructions routine, but unfortunately I’m relying on event.Mouse.isPressedIn() for the responses at the end of each trial.

If your stimuli inherit from ShapeStim, they will have a contains() method, which you could employ. The following example uses ioHub to check whether a button press occurred inside a circle.

NB: I’m not sure whether this works with units other than pix. It will certainly only work in fullscreen mode!

from __future__ import print_function

from psychopy import iohub
from psychopy.visual.window import Window
from psychopy.visual.circle import Circle

io = iohub.launchHubServer()
mouse = io.devices.mouse

win = Window(units='pix', fullscr=True)
circle = Circle(win, size=300)
circle.draw()
win.flip()

mouse.clearEvents()

while True:
    e = mouse.getEvents(event_type=(iohub
                                    .EventConstants
                                    .MOUSE_BUTTON_PRESS))

    if e:
        x, y = e[0].x_position, e[0].y_position

        if circle.contains((x, y), units='pix'):
            print('Received button press!')
            break

win.close()
io.quit()

Yes, I’m using visual.Rect to present four response boxes to choose from, so your suggestion should work in this case as well.
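
Roughly, I imagine adapting it like this (an untested sketch - the box positions and sizes are placeholders, not my actual layout):

from psychopy import iohub
from psychopy.visual import Window, Rect

io = iohub.launchHubServer()
mouse = io.devices.mouse

win = Window(units='pix', fullscr=True)
# four response boxes; positions/sizes are placeholders
boxes = [Rect(win, width=300, height=150, pos=pos)
         for pos in ((-300, 200), (300, 200), (-300, -200), (300, -200))]
for box in boxes:
    box.draw()
win.flip()

mouse.clearEvents()
response = None
while response is None:
    for e in mouse.getEvents(
            event_type=iohub.EventConstants.MOUSE_BUTTON_PRESS):
        for i, box in enumerate(boxes):
            if box.contains((e.x_position, e.y_position), units='pix'):
                response = i  # index of the touched box
                break

win.close()
io.quit()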

I’ll give this a try and report back!

Thanks everyone,
Sarah


Hi Sarah,
mine was an on-the-fly patch, not actually integrated with the Mouse component - I created a separate component. Since you’re using Mouse.isPressedIn(), it won’t work straight away; I’d need a bit of time to adapt it for your purposes. I see that Richard has provided an alternative solution. If that works, I’d prefer to wait before posting any code.

Cheers,
L.

Hi all,

just to provide some feedback - I’ve tried the mouse alternative Richard suggested and it works! So far I’ve only had time to test the minimal example Richard provided, but I don’t see why it shouldn’t work for my actual experiment as well.
I have one related question that I couldn’t figure out from the iohub reference manual: since I want to compute reaction times for the responses at the end of each trial, how do I access the time stamp of the events returned by io.devices.mouse.getEvents()?

Thanks,
Sarah


In the example above, you would access the time stamp of the first button press event via e[0].time. The time base is the same as the PsychoPy time base (monotonically increasing). So you would commonly do something like this (untested):

circle.draw()
mouse.clearEvents()

t0 = win.flip()

while True:
    e = mouse.getEvents(event_type=(iohub
                                    .EventConstants
                                    .MOUSE_BUTTON_PRESS))

    if e:
        break

rt = e[0].time - t0
print('RT was %.3f sec' % rt)

Good luck!


This thread is old, but if anyone is still on the “touchscreen train”, here is the current equivalent of the mouse testing example:

from __future__ import print_function
from psychopy import iohub

io = iohub.launchHubServer()
mouse = io.devices.mouse
mouse.clearEvents()

while True:
    e = mouse.getEvents(event_type=(iohub
                                    .constants
                                    .EventConstants
                                    .MOUSE_BUTTON_PRESS))

    if e:
        print(e)
        break

print('Received button press!')
io.quit()

will result in print(e) containing:
['P', 'C', '_', 'D', 'E', 'V', 'I', 'C', 'E', '_', 'R', 'U', 'N', 'T', 'I', 'M', 'E', '_', 'E', 'R', 'R', 'O', 'R']

(i.e. the string 'PC_DEVICE_RUNTIME_ERROR' split into characters)

PsychoPy: 1.9.2
Python: 3.5.6 Anaconda