Online experiment only runs a few routines

I am having an issue running my experiment online.

The experiment consists of several routines: a set of instructions routines (starting with instructions_1), a pause, and two key task routines (a typed response and a pair of rating scales).

However, when I try to pilot this online, it currently only runs instructions_1 followed by pause, then terminates. It seems really strange that even the other instructions routines don’t run (most of them are very straightforward, containing only some text, as in instructions_1).

I have tried using Chrome’s developer tools when running the study to find bugs in the script. While this flags some warnings, I lack the knowledge/skills to know whether they are related to the issue:

[screenshots of the browser console warnings]

Some initial bugs in my JavaScript code were stopping the program from running at all, but these have already been identified and fixed.

Here is the URL for the study: https://run.pavlovia.org/Gerard_cee/validation_study_5/?__pilotToken=d3d9446802a44259755d38e6d163e820&__oauthToken=65b9883bdd0a5accba19d32e18d12f87e59866a1f56a0c03697837bad07a0ee2

It is still in pilot mode though, so I am aware it will expire.

Any help anyone can give on this is very much appreciated.

Having probed the issue further, it seems it is being caused by the ‘click to continue’ buttons in my routines: removing the button and replacing it with a key press to end the routine works.

However, my study requires the ‘click to continue’ feature. I’ve seen some posts about the Button component not working online, but they are from a few years back. Does anyone know if this is still the case and, if so, what workarounds there are?

Do you have any callback code in your buttons? It seems code like a = 1 doesn’t currently work whereas a += 1 is fine.
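For example, something along these lines has been fine for me (n_presses is just a made-up name):

```python
# Begin Routine tab of a code component (made-up variable name)
n_presses = 0

# "Callback function" field of the Button component
n_presses += 1   # augmented assignment seems to survive the Python-to-JS translation
# n_presses = 1  # a plain assignment like this reportedly does not work online
```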

I do now use buttons online but avoid callback code.

Thanks for your response. Some of them had callback code, but most did not (and none of them worked). I’m still trying to find a solution that works online.

For the instructions, simply having the participant press enter will be fine to move through the script.

However, the two key routines I have need more than that. One involves participants making a typed response, so I don’t want them to be able to continue until they have typed something. Becca offers a solution for this here, but it does not seem to work online.

In the other key routine, I don’t want participants to progress until they have completed two rating scales.

You could use a conditional start to those buttons.
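For example (just a sketch — I’m assuming the button is called button_continue, the editable textbox textbox_response and the sliders slider_1 and slider_2, so swap in your own component names): change the button’s start type from time (s) to condition and give it an expression like one of these:

```python
# Typed-response routine: start the button only once something has been typed
# (assumes the textbox starts empty; if it has placeholder text, compare against that instead)
textbox_response.text != ""

# Rating routine: start the button only once both sliders have been responded to
# (a truthiness check behaves the same locally and online; note that a rating of
#  exactly 0 would still count as "no response" with this check)
slider_1.getRating() and slider_2.getRating()
```

A component with a condition start stays not-started (so it isn’t drawn or clickable) until the expression is true, which means participants can’t click through early.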

Thanks very much for this suggestion, it worked!