Pavlovia black screen with no error message after initialization on version 2020.2.5 but not 2020.2.4

My experiment works fine when I compile the JavaScript code and upload to Pavlovia with version 2020.2.4, but with 2020.2.5 the screen goes black after entering the participant ID, with no error message in the console.

Any ideas?

Hey again,
Could you share your GitLab repo with me? Then I can take a look. I’m tpronk.

Thanks!

I will share two versions of the experiment. One repo is clearly marked with the tag “NEW_PSYCHOPY”, denoting that it uses 2020.2.5; the other repo I share with you uses 2020.2.4.

I think the difference is in how the JavaScript is converted.

Tagging @sotiri to think along. Could you give him access as well?

Yes, added. Thanks! @sotiri @thomas_pronk

@rkempner OK thanks for giving me access, on it, s.


Sorry @rkempner, could you make it developer access? The code is hidden from me as a guest. Also, is this the 2020.2.4 repo? Thanks, x

Gave developer access, sorry about that.

The experiment is still in a testing phase, so there are some quirks I must tell you about in order for it to run.

When asked for the participant ID, you must enter the digit 1 or the program will not work.

I shared two repositories with you: “pilot2_testing” and “pilot_NEW_PSYCHOPY”.

The 2020.2.4 version, which works for me, is in the repo “pilot2_testing”.

The 2020.2.5 version, which does not work, is in the repo “pilot_NEW_PSYCHOPY”. Importantly, I deleted the line "const { round } = util" that 2020.2.5 generated, since it interferes with a function named round that I defined in a code snippet. Deleting that line is the only change I made between these two versions.
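
For context, here is roughly why that generated line interferes with my helper. This is a sketch, not my actual code: the helper body is illustrative, and the import path is just what Builder writes into a locally generated script.

```javascript
import * as util from './lib/util-2020.2.js';  // import as emitted by Builder

// line generated by PsychoPy 2020.2.5:
const { round } = util;  // binds util.round in the top-level scope

// a code-snippet helper with the same name then collides with that binding,
// and the script fails to parse, with an error along the lines of
// "SyntaxError: Identifier 'round' has already been declared"
function round(value, decimals = 0) {
  const factor = 10 ** decimals;
  return Math.round(value * factor) / factor;
}
```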

In “pilot2_testing” things work as intended, but in “pilot_NEW_PSYCHOPY” the screen stays black with no error messages in the console.

Alright, clear. I was able to reproduce the problem. Many thanks for your patience while I look for the culprit, x

Hey @rkempner, quite possibly there is an issue with the boilerplate 2020.2.5 generates for checking whether a component has requested a forced routine end.

I created a develop branch and commented out a few of those checks to demonstrate a temporary fix. However, it would be really helpful if you sent over your .psyexp so we can determine exactly what the problem is and have it addressed as part of our next release.
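
For reference, the checks I mean look roughly like this in the generated experiment.js. This is paraphrased boilerplate, with "trial" as a placeholder routine name:

```javascript
import { Scheduler } from './lib/util-2020.2.js';  // as in the generated script

let continueRoutine = true;  // components set this to false to force an early end

function trialRoutineEachFrame() {
  // ... per-frame component updates ...

  // check if the Routine should terminate
  if (!continueRoutine) {  // a component has requested a forced end of the routine
    return Scheduler.Event.NEXT;       // skip ahead to ending the routine
  }
  return Scheduler.Event.FLIP_REPEAT;  // otherwise run another frame
}
```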

Also, PsychoPy now relies on the latest PixiJS, but the post-upgrade PsychoJS tweaks have yet to reach Pavlovia’s servers. That should happen fairly soon, but I added them in anyway so you don’t have to wait.

If you would rather not post your .psyexp on the forum, my email is sotiri@opensciencetools.org

Please let me know if you need more details, thanks, s.

Hey @sotiri, thanks for checking this out.

I see now that, in the 2020.2.5 version, the RoutineBegin functions mistakenly contained that "if (!continueRoutine) {}" check. Some notes on that: the if statement was generated, not something I put in a code snippet. Also, the Builder may have been confused by the fact that every one of my Begin Routine tabs runs either "continueRoutine = True" or "continueRoutine = False". When you see my .psyexp you will have a clearer idea of what I mean, but that rough description may help you see why my code triggered this anomaly.

In every single RoutineEachFrame function, I comment out the line "let continueRoutine = true; // until we’re told otherwise" before uploading my JS code to Pavlovia, because I control continueRoutine in the Begin Routine tab.

Please let me know if you need more details, and look out for my .psyexp in your email.
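
To make that concrete, here is roughly what my setup looks like. The routine name and the skip condition are placeholders, not taken from my actual experiment:

```javascript
import { Scheduler } from './lib/util-2020.2.js';

let continueRoutine = true;
let skipThisTrial = false;  // stand-in for the experiment's real logic

function trialRoutineBegin() {
  // my Begin Routine code component decides up front whether the routine runs
  continueRoutine = !skipThisTrial;
  // 2020.2.5 additionally emitted an "if (!continueRoutine) {}" check here,
  // which is the generated code under discussion
  return Scheduler.Event.NEXT;
}

function trialRoutineEachFrame() {
  // the generated per-frame reset, commented out before upload so the
  // Begin Routine decision is not overwritten on every frame:
  // let continueRoutine = true; // until we're told otherwise

  if (!continueRoutine) {
    return Scheduler.Event.NEXT;       // end the routine immediately
  }
  return Scheduler.Event.FLIP_REPEAT;  // keep running frames
}
```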

Re: “That should happen fairly soon, but I added them in anyway so you don’t have to wait.” Is that what the change to the imports is all about? I was confused by the change of imports.

Thanks again for helping out!

-Ross

Hey Ross, no problem. OK, that explains it somewhat; on standby for the .psyexp. Yes about the imports: if you can live with those for the time being, I will make sure to let you know when a permanent fix is in place, thanks :blush:

Great, thanks!

Hi @sotiri

Did you get everything you need for this?

Hey @rkempner, yes, I brought it up again at one of our standups this week. One of our Python developers is on it, but to speed things up a little, if at all possible, would a .psyexp cut down to the essential bits be too much to ask? Many thanks for your patience, x

Hi @sotiri, could you clarify what is non-essential?

Yes @rkempner, I meant maybe having just a single trial routine that demonstrates the problem? Is that possible? Only asking, thank you, x

So the idea would be to eliminate parts of the program until I arrive at the smallest version in which the error remains?

I’m confused by the procedure you have in mind here, since it is unclear where the problem is. Could you elaborate a bit more, please?

No problem. Because the repository is gone, I can no longer link to the relevant bits, but I mean those checks around whether a component has requested a forced routine end, along the lines of "continueRoutine = false". x

I will re-add you to the repositories, sorry about that!

Maybe after I re-add you, you could point me to the relevant bits? I’m having a hard time following what I should delete.