RuntimeError: can't start new thread. Experiment crashes near the end

OS: Win10
PsychoPy version: 3.0.4
Standard Standalone? yes

I’m running an experiment which is basically the Alternative Uses Task: participants see a word on the screen (a common household object like “paperclip”). Within three minutes they have to come up with as many alternative uses for that object as possible. They type in their answers and press ENTER after each use. The uses they have previously typed appear as a list in the top right corner of the screen. When the three minutes are up, the experiment moves on automatically to an instruction screen; when they press space, it moves on to the next word, for which they again have three minutes.

When each word is shown, a short sound is played as well. It’s a long story, but these sounds are necessary because I later want to play those sounds to them in sleep.

There are three different parts to this experiment: part 1 (they do 4 items), part 2 (they do 8 items), and part 3 (8 items again).

It’s all working, but the problem is that the experiment crashes near the end. I’ve actually gotten two different errors from this crash; one was a ‘memory error’ (I stupidly didn’t save the exact error out of panic because it crashed during the actual experiment). The other one was the ‘RuntimeError’ in the title. I will paste the full error below.

As I said, the error occurred during the testing of a participant. The weird thing is that it never occurred before (obviously I ran some pilots before starting the experiment!).

Some other things:

  • Part 1 of the experiment (with only 4 items) never crashes.
  • Part 2 was when I first encountered the error. It crashed for two participants (who were testing at the same time) after 6 items.
  • I made some small changes hoping to remove the error: rather than giving the sound component a blank duration, I gave it a fixed duration slightly longer than the longest of the sounds. I also followed the slightly strange advice at the bottom of the post Experiment crashes when it runs for too long and unchecked the log file saving.
  • Then when I ran Part 3 (because I am determined to keep going with these participants!) with these slight changes, it crashed for one participant after 7 items (so it did one more item than before), and it didn’t crash for the other participant.

The full error:

Running: C:\Users\CUBRIC\Desktop\martyna\AUT\AUT_part2_lastrun.py

pyo version 0.9.0 (uses single precision)
8.6346 WARNING psychopy.sound.backend_pyo.init could not find microphone hardware; recording not available
Traceback (most recent call last):
  File "C:\Users\CUBRIC\Desktop\martyna\AUT\AUT_part2_lastrun.py", line 1267, in <module>
    win.flip()
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\visual\window.py", line 796, in flip
    callEntry['function'](*callEntry['args'], **callEntry['kwargs'])
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\sound\backend_pyo.py", line 392, in play
    self.terminator.start()
  File "C:\Program Files (x86)\PsychoPy3\lib\threading.py", line 846, in start
    _start_new_thread(self._bootstrap, ())
RuntimeError: can't start new thread

So, does anyone have any ideas? Note that I’m using pyo because sounddevice cuts off my sounds before they finish. Admittedly I haven’t tried pygame, and that’s going to be my next step, just to see if it makes a difference. However, going off the error message and the fact that my small changes seemed to help, I suspect it’s not so much a sound backend issue as something to do with memory or with too many parallel threads being started?
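In case it’s relevant, this is roughly how the backend gets chosen: as far as I know the preference has to be set before the sound module is imported, and the exact prefs section has moved between versions (prefs.general in older 3.x releases, prefs.hardware in newer ones), so treat this as a sketch rather than gospel:

from psychopy import prefs

# Ask for pyo first, falling back to the others if it fails to load
# (on newer versions this setting lives under prefs.hardware['audioLib'])
prefs.general['audioLib'] = ['pyo', 'sounddevice', 'pygame']

from psychopy import sound  # import sound only after setting the preference
print(sound.Sound)          # prints which backend class was actually picked up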

This probably won’t help with your issue, but just to fill you in on the cause of my own problem:

Stopping the logging made a very minor difference (minor enough that I considered putting the logging back in again). The main problem in my case turned out to be that I was (stupidly) reassigning a TextStim variable within a win.flip() loop, meaning that a new TextStim was being created on every frame… The TextStim class turns out (turned out? perhaps it’s since been fixed) to have a small memory leak, and that leak compounds quite fast when you create new TextStims on every frame.
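To make that concrete, the difference was roughly the following (a stripped-down sketch rather than my actual experiment code):

from psychopy import visual, core

win = visual.Window()

# What I was doing (leaks): a brand new TextStim is constructed on every frame
for frameN in range(300):
    msg = visual.TextStim(win, text="frame %d" % frameN)  # new object each frame
    msg.draw()
    win.flip()

# What fixed it: create the stimulus once, then only update its text attribute
msg = visual.TextStim(win, text="")
for frameN in range(300):
    msg.text = "frame %d" % frameN
    msg.draw()
    win.flip()

win.close()
core.quit()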

Just to keep you in the loop. I hope some of the others can provide you with some answers.


Thanks Sam!

Actually, your reply made me have another look at that thread, and I decided to check whether it could be a memory leak like in your case. It does seem like the experiment is taking up increasing amounts of memory as it runs, so I guess yes?
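In case it helps anyone else check the same thing: I simply watched the process in Task Manager, but something like the snippet below in a Code component should log it from inside the experiment. psutil isn’t something my experiment uses, it’s just for illustration (I believe it comes with the Standalone):

import os
import psutil

proc = psutil.Process(os.getpid())

# e.g. in the End Routine tab: save resident memory (in MB) alongside the
# data for each item, so you can see whether it keeps climbing item by item
memMB = proc.memory_info().rss / (1024.0 * 1024.0)
thisExp.addData('memory_MB', round(memMB, 1))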

It is difficult in my case because the experiment has to monitor each keystroke and display what people have written. So I guess that may be part of the problem; I think each frame it’s checking whether people have typed something and then displaying that. I don’t know how to get around that though… because it’s literally what the experiment is about.

So, I checked pygame, which crashed the experiment immediately.

I think I have narrowed down the main issue to the list of participant replies that I’m displaying. When I disable this feature, the memory leak seems to be minimal to zero. However, I’d still really like to have this feature in the experiment.

Currently how I’ve implemented it is by adding a Code component to my Builder experiment, based on the threads about user text input that I found on the forum and beyond. The main relevant bit is the following:

Begin experiment:

#Create a string in which to store the user's answers
#And create a list in which to store all replies for an item
inputText = ""
list_AUT  = []

Begin routine:

#Create empty theseKeys variable to track the keys
#Set the shift flag to False because we want lowercase letters for now
#Align the name of the item horizontally
theseKeys                   = ""
shift_flag                  = False
text_AUT.alignHoriz         = 'left'

Each frame:

# Poll the keyboard for this frame's key presses (the usual event.getKeys()
# step; without it theseKeys would stay empty and the loop would never run)
theseKeys = event.getKeys()

n = len(theseKeys)
i = 0

while i < n:

    # When a user presses return we save the input and clear the input
    if theseKeys[i] == 'return' and len(inputText) > 1:
        list_AUT.append(inputText)
        inputText = ""
        i = i + 1

    # When a user presses backspace we lose the final character
    elif theseKeys[i] == 'backspace':
        inputText = inputText[:-1]
        i = i + 1

    # When a user presses space we add a space
    elif theseKeys[i] == 'space':
        inputText += ' '
        i = i + 1

    # When a user presses shift we capitalise the next character
    elif theseKeys[i] in ['lshift', 'rshift']:
        shift_flag = True
        i = i + 1

    # When a user presses . (period) we display that also
    elif theseKeys[i] == 'period':
        inputText = inputText + "."
        i = i + 1

    # When a user presses , (comma) we display that also
    elif theseKeys[i] == 'comma':
        inputText = inputText + ","
        i = i + 1

    # When a user presses / (slash) we display that also
    elif theseKeys[i] == 'slash':
        inputText = inputText + "/"
        i = i + 1

    # When a user presses - (dash) we display that also
    elif theseKeys[i] == 'minus':
        inputText = inputText + "-"
        i = i + 1

    else:
        if len(theseKeys[i]) == 1:
            # we only have 1 char so it should be a normal key;
            # otherwise it might be 'ctrl' or similar, so ignore it
            if shift_flag:
                # shift was pressed before this key: convert it to uppercase
                inputText += chr(ord(theseKeys[i]) - ord(' '))
                shift_flag = False
            else:
                inputText += theseKeys[i]

        i = i + 1

End routine:

#Save the generated list of uses
thisExp.addData('list_AUT', list_AUT)

#Empty the list in anticipation of the next AUT item
list_AUT  = []

#Empty the string shown on the screen in case someone was
#halfway through typing a response and didn't press return
inputText = ""

Then there are two text objects: one to display the inputText (what users are writing at the moment) and one to display the list_AUT (which is the list of the previous answers they gave about this object). Both are set to “set every frame”, as we want them to be showing the current situation.

So my question, I guess, is whether there is a way that I could still show the list_AUT without it causing my experiment to crash. Perhaps I could have it update every second or every few seconds instead of every frame? I’m guessing I’d have to go into the code for that, because the Builder options are either ‘set every repeat’, which is not often enough, or ‘set every frame’, which (I think) causes the crash later on.
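Something like the following is what I have in mind for the code route (just a sketch: text_list stands for whatever the list text component is called, and its text field would be set to constant so that Builder isn’t also updating it every frame):

Begin routine:

# Clock used to throttle how often the on-screen list gets refreshed
updateClock = core.Clock()

Each frame:

# Only push a new string to the stimulus about once per second,
# instead of re-setting its text on every single flip
if updateClock.getTime() > 1.0:
    text_list.setText('\n'.join(list_AUT))
    updateClock.reset()

An alternative would be to call setText() only when list_AUT actually changes (i.e. right after a return press), which would be even less frequent.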

I just thought I’d provide an update in case anyone else ever stumbles onto this topic!

What I ended up doing was the following:

Because I needed my list of words to be nearly continuously updated, I had to find a way to present the stimuli while keeping the memory leak to a minimum.
In my text object, rather than just putting $list_AUT, I put the following:
', '.join($list_AUT)

This makes the list get passed to the text component as a single string, with the items joined together by a comma and a space (really you could put anything between the quotation marks: just a space ’ ’, a dash ‘-’, and so on).
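For example, with some made-up answers for the paperclip item, in a Python shell:

>>> ', '.join(['hold papers together', 'pick a lock', 'reset a router'])
'hold papers together, pick a lock, reset a router'

>>> ' - '.join(['hold papers together', 'pick a lock'])
'hold papers together - pick a lock'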

This has seemingly completely resolved any issues, and my experiment has not crashed since.
