Position of stimuli differs when running online vs locally

URL of experiment: Pavlovia

Description of the problem: Stimulus positions are correct when running locally but altered when running online.

I am trying to run a visual search task which presents red and blue letters L and T.
The task is to indicate whether a blue L is present on the screen. If a blue L is present, the L key should be pressed; if it is not, the K key should be pressed.

I am using the calibration procedure from Pavlovia to adjust the size and position of text stimuli on the screen to real cm measurements. The x_scale and y_scale variables are normally passed to this experiment from a previous experiment in a chain of experiments; however, this experiment can also be run in isolation if the values are entered manually at the beginning of the experiment.

I am specifying the x and y coordinates for each stimulus in the conditions file. These values are scaled using the x_scale and y_scale values at the beginning of each trial.
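
For reference, the scaling looks roughly like this in a code component (a minimal sketch; x_pos, y_pos and letter_stim are placeholder names for my conditions-file columns and text component, and the expInfo entries stand in for wherever x_scale and y_scale actually come from):

```python
# Begin Experiment tab: read the calibration factors
# (entered in the startup dialog, or passed in from the previous
# experiment in the chain)
x_scale = float(expInfo['x_scale'])
y_scale = float(expInfo['y_scale'])

# Begin Routine tab: scale the raw coordinates from the conditions file
letter_stim.pos = (x_pos * x_scale, y_pos * y_scale)
```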

Everything runs perfectly locally, both when running through Builder and when running in debug mode in Builder; however, when running on Pavlovia the experiment runs but the positions are not the same as when running locally. The coordinates and rotation of each letter stimulus were chosen to avoid any overlap, but when running online some trials have overlapping elements.

Another problem is that when a response key is pressed, it sometimes seems as if the key press is not detected, and the participant has to press it a second and sometimes a third time before the experiment moves on to the next trial.

Thank you for any help you can give me.

I found that the real issue was that rotations are counter-clockwise when running on Pavlovia; however, I still cannot find the cause of the experiment hanging or not detecting a button press.

What I do to solve the rotation issue is multiply the rotation by a variable that I set to 1 in Python and -1 in JavaScript.
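
Concretely, the workaround looks something like this in a code component with the code type set to Both, so that the Python and JavaScript sides can differ (a sketch; rot_dir and letter_stim are placeholder names, and rotation stands for the orientation value from the conditions file):

```python
# Begin Experiment (Python tab)
rot_dir = 1
# ...and in the JavaScript tab the same line is written as: rot_dir = -1;

# Begin Routine: apply the orientation with the sign flipped online
letter_stim.ori = rotation * rot_dir
```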

Not detecting button presses can happen if you are trying to detect them in code and also via a keyboard component. What keyboard-related code and components are in the offending routine?

Hello,

I am only detecting keyboard responses using a keyboard component which limits the response to two keys.

There are no code components that detect key presses in the entire experiment.

The key press is supposed to end the routine, but sometimes, for no visible reason, the routine will not end unless the key is pressed a second or third time.

Do you know of anything else that can cause this behaviour?

Thank you.

When I’ve had this, it has usually been due to a timing issue. Will it respond first time if you wait for a few seconds first? Does the keyboard component start at 0 seconds?

Hello,

Yes, the keyboard component is set to start at 0. I will have to test what happens if I consistently wait a second or two before answering and will get back to you with that information.

@wakecarter

Hello, Oddly, the issue with the undetected key presses seems to have sorted itself out after I changed the direction of the rotations to counter-clockwise to fix the overlapping elements.

However, some of the pilot participants are reporting that elements are falling off the screen.
I am using your credit card calibration code to scale the items and their distances to real-world measurements that should allow them to fit on a 13" monitor. The experiment is set to use height units for the calibration and the subsequent scaled stimulus presentation. On some computers with 13" monitors it works perfectly, as it does on larger monitors: the items are the desired size and spacing, with extra room around the edges if the screen is larger than 13". However, on other computers the items are being cut off at the edges of the screen. Have you had any other reports of this? Do you know what could be causing it?
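
To illustrate the constraint: with height units the visible window is 1.0 unit tall and (width / height) units wide, so a rough check like the following (a sketch with placeholder names; win is the PsychoPy window) shows whether a scaled coordinate can still fit on a given screen:

```python
# Half-extents of the visible area in height units: vertically +/- 0.5,
# horizontally +/- half the aspect ratio of this particular screen.
win_w, win_h = win.size           # window size in pixels
half_width = (win_w / win_h) / 2.0
half_height = 0.5

x = x_pos * x_scale               # scaled stimulus centre, height units
y = y_pos * y_scale
if abs(x) > half_width or abs(y) > half_height:
    print('Stimulus at (%.3f, %.3f) may be cut off on this screen' % (x, y))
```

On a screen with a narrower aspect ratio, or where the calibration yields a larger scale factor, the outermost positions can exceed these half-extents, which would look exactly like elements being cut off at the edges.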

Thank you.

Have you checked what devices (and monitor sizes) the participants who are seeing cut-off items are using? Might they be smaller than 13" or being used in portrait mode?

Might some participants be overestimating the size of their credit card, either through lack of care or through misunderstanding the instructions (are they placing the card on the screen or holding it in front of them at arm's length)?

Hello,

Yes, I have collected the physical dimensions of the screens as well as their resolutions, and they are exactly the screen size I designed for. I based the size and spacing of the elements on fitting a 13" laptop monitor.

The method that the participants are using to size the card is a good suggestion. I will check to make sure that they are holding the card against the screen.

In the meantime, I have adjusted the coordinates to move the outlying elements in towards the center, and this has fixed the problem for one of the people who was seeing elements fall off the screen; however, one person is still seeing the elements being cut off. A screenshot of this person's screen tells me that she is still seeing the old coordinates, in spite of having cleared her cache before running the experiment again. Is this something that you have encountered before?

Thank you.

If the same participant retries the experiment, they should either use Ctrl-Shift-R to flush the cache, use a different browser, use incognito mode, or possibly add ?a=b to the end of the URL (or equivalent) to make the browser think that the contents might be different.

Thank you.

I will have her try these solutions.

Hello,

Another strange behaviour that this individual is seeing is the following:
I have several experiments daisy-chained together. She is able to run the experiment when she runs the entire sequence of experiments, but she is unable to run it using the direct URL; this gives her the unspecified JavaScript error.

Here is the direct URL for your reference.

https://run.pavlovia.org/LabevRISUQ/trailmakingcs-5/html

Thank you.

Hello,

I just double-checked, and she is actually getting the 403 Forbidden error, not the unspecified JavaScript error, when trying to run the direct URL.

Thank you.

You get the 403 Forbidden error if you use https://run.pavlovia.org/LabevRISUQ/trailmakingcs-5/ instead of https://run.pavlovia.org/LabevRISUQ/trailmakingcs-5/html