
Experiment stops running after presenting two randomized stimuli or all sequential stimuli

OS (e.g. Win10): MacOS 12.0.1
PsychoPy version (e.g. 1.84.x): 2021.2.3
Standard Standalone? (y/n) If not then what?: Y
What are you trying to achieve?:
I created an experiment in Builder from one that I already had working and that had similar requirements (I just swapped in new Excel files for the loops), but now the experiment crashes after presenting two stimuli in a randomized loop, or after the full list of stimuli in a sequential loop. There's still about half the experiment left to go after this point.

Before the experiment crashes, I get a TypeError:
TypeError: must be str, not NoneType

I went into the Excel files and changed the data type from "general" to "text" in the hope that would fix it, but the same thing happened. Someone on the forum mentioned problems with randomization, which is why I tried running through the stimuli sequentially, but the experiment still crashes at the end of the sequential list with the same error.

This is exactly the same experiment, down to the settings, as a previous one that ran and still runs fine; only the Excel files/stimulus lists differ. Any ideas about what's going on?

ImpExp2FixNC.psyexp (103.1 KB)

Hi There,

Thanks for sharing your psyexp file. Unfortunately we can't run it without the corresponding spreadsheets/resources, so do make sure to share those too if you can. Did the error provide a line number where it was occurring? And if you click it, does it say which component is the source of the error?

Thanks,
Becca

Specifically, the issue might be due to a problem with TrainLoop_2fixNC.xlsx if trainLoops starts but doesn't manage to complete.

@Becca It says it happened at line 1142, in trainWugs.setText, which is where the loop pulls in information from the text in the TrainLoop columns.

@wakecarter Right, must be something in there, except it manages to complete when the stimuli in the loop aren’t randomized (i.e., if I set them to sequential). So given that, I don’t think it’s in the excel file but somewhere else in the psyexp file.

And again, this is extra puzzling because to generate the psyexp file for this experiment, I added the Excel files for the loops and then renamed it/saved over the previous file, which has been working fine both locally and on Pavlovia. I generated the Excel files in a similar way because they use the same image stimuli. Neither of these things seems like it should make a difference, but I generated a third psyexp file in a different way (duplicating on the hard drive rather than Save As in PsychoPy) and it's broken in a completely different way.

NC2fixGramLoop.xlsx (8.9 KB)
TestLoop_2fixNC.xlsx (11.8 KB)
TrainLoop_2fixNC.xlsx (19.4 KB)
InstLoop.xlsx (9.0 KB)

I opened TrainLoop_2fixNC.xlsx and noticed that you’ve added a comment to cell A1.

However, when I saved it in CSV format and opened that in a text editor I discovered 20 additional blank rows at the bottom.

Try this file (which still has the comment, because I don’t think that’s the issue).

TrainLoop_2fixNC.xlsx (18.8 KB)

The comment was there from when I generated the file using the button in Builder that creates a correctly formatted blank Excel file for you (in screenshot below), so hopefully that's not the issue!

Your file worked, so those empty rows must have been the issue. Do you happen to know enough about how Excel files are saved to explain why this happens? This is at least the second or third time I've hit this kind of error and solved it this way, even after trying to clear the rows below the loop contents and saving in Excel. I don't convert back and forth between CSV and XLSX; is that something I ought to do for troubleshooting in the future?

I don’t usually save as CSV to check, but I do sometimes highlight and delete rows (not cells) below the last row I want.

The issue is probably that you are reusing Excel files that already contain blank rows when creating new files.