Stuck on Unspecified JavaScript Error

URL of experiment: Pavlovia

Hi,

I’ve created an experiment in Builder (PsychoPy 2020.2.3) which works fine locally in Python. I’m trying to port it to Pavlovia, and I’ve been able to fix a few bugs using Wakefield’s crib sheet, but I’m now stuck on an “Unspecified JavaScript error” that I haven’t been able to resolve.

The experiment is a fairly straightforward visual search task that gets repeated six times: first in a practice block and then in five testing blocks, which are looped with a break between each. The online version loads the associated resources and lets the participant ID be entered, but then immediately hangs and gives an “Unspecified JavaScript error” message. The following errors are visible in the browser console:

It seems that the error has to do with initializing the first loop, based on where the undefined attributes are listed, but I haven’t been able to figure out how to get it to proceed beyond this point. The conditions for each loop are read from an Excel file. I’ve tried the following:

  1. Creating new conditions files, making sure there were no empty cells
  2. Splitting the stimulus locations for each trial into separate x and y coordinate columns rather than using an array (a sketch of points 1 and 2 follows this list)
  3. Running the experiment in older versions of PsychoPy
  4. Creating versions of the experiment with individual loops vs. nested loops
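For concreteness, here is roughly what points 1 and 2 amount to outside of Builder. This is only a sketch: the file name, the x_pos/y_pos column names and the target stimulus are placeholders, not the actual experiment code.

```python
# Minimal standalone sketch (not Builder-generated code): load the conditions,
# refuse to run if any cell is empty, and set the target position from plain
# x/y columns instead of a single array column.
from psychopy import core, data, visual

conditions = data.importConditions('searchConditions.xlsx')  # placeholder name

# Point 1: fail loudly if any cell came through empty (None, '' or NaN).
for row in conditions:
    for column, value in row.items():
        if value is None or value == '' or value != value:
            raise ValueError(f'Empty cell in column {column!r}: {row}')

win = visual.Window(units='height')
target = visual.Circle(win, radius=0.02, fillColor='red')  # stand-in target

# Point 2: per-trial positions come from separate numeric columns.
trials = data.TrialHandler(trialList=conditions, nReps=1, method='random')
for trial in trials:
    target.pos = (trial['x_pos'], trial['y_pos'])
    target.draw()
    win.flip()
    core.wait(0.5)

win.close()
core.quit()
```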

None of these have changed the error message at all, and my Google searches so far haven’t revealed much. I’m new to Python/JS and unsure about where to go next. If anyone is able to help, I’d appreciate it!

A link to the code is here and I can give other information/logs if needed:

Thanks a ton!

After rebuilding the experiment one component at a time, I was able to solve my own issue. Apparently the error was caused by a component that displayed a text item for a 1-second interval between trials. Its opacity was set so the text wasn’t visible, which is why I hadn’t noticed it at first, but the size on this component was accidentally set to 0.1, which caused the online experiment to error (despite causing no problems in the Python version). Setting this to a more reasonable font size fixed the problem, and the online version is now working fine.
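For anyone who hits the same thing, the fix boils down to giving that text component a letter height that makes sense for its units. A rough sketch of what the corrected component amounts to (units, text, and values here are illustrative, not the actual experiment code):

```python
from psychopy import core, visual

win = visual.Window(units='pix')

# Inter-trial message shown for about 1 s. The letter height has to be
# sensible for the chosen units; in my broken component it ended up as 0.1,
# which made the online (PsychoJS) version throw the error even though the
# local Python run was fine. 30 px is just an illustrative reasonable value.
breakText = visual.TextStim(win, text='Get ready...', height=30)
breakText.draw()
win.flip()
core.wait(1.0)

win.close()
core.quit()
```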


Thanks, this solved my issue.