MemoryError: intensity = numpy.array(im)

OS: Win10
PsychoPy v: 3.0.0b9
Coder/Builder: Builder

Error message:

Traceback (most recent call last):
  File "C:\Users\rahmanf2\Desktop\Magic_Cups_v2\PsychoPy_MagicCups_lastrun.py", line 859, in <module>
    Opener.setImage(image1)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\visual\image.py", line 301, in setImage
    setAttribute(self, 'image', value, log)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\tools\attributetools.py", line 141, in setAttribute
    setattr(self, attrib, value)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\tools\attributetools.py", line 32, in __set__
    newValue = self.func(obj, value)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\visual\image.py", line 288, in image
    forcePOW2=False)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\visual\basevisual.py", line 875, in _createTexture
    intensity = numpy.array(im)
MemoryError

Hi guys, I’m getting this error message when running my task. Interestingly, it runs about halfway through the first block (there are four blocks in total) and then stops. Each trial in the task consists of four hi-res (~25 MB each) .png image files, presented sequentially.

I am assuming this is to do with limitations on my machine’s memory? (I am running it on a 64-bit Windows 10 machine with a 7th-gen Intel i5 and 8 GB of RAM.)

Thank you.
Foyzul.

Hi, just an update: I’ve now tried the task on my desktop PC, which is a lot more powerful (i7, 16 GB RAM), and I’m still getting the same issue. So I’m assuming this is less to do with the actual computing capability of the machine and more to do with how PsychoPy is drawing the files?

Anyone know how to resolve this?

Thank you. Foyzul.

You should only be presenting images at the native resolution of your screen. Anything higher is simply pushing around pixels that will not be perceived by the subject, and this leads to needless memory consumption. So a good start would be to down-sample all images to no more than the native dimensions of your display. There are simple Python scripts around that can automate this for you, or it could be done manually in any image editing software.
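
For example, something along these lines with Pillow would do it (just a sketch: the “stimuli” folder name and the 1920 x 1080 target below are placeholders, so substitute your own stimulus folder and your screen’s native resolution):

from pathlib import Path
from PIL import Image

MAX_SIZE = (1920, 1080)  # native resolution of the presentation screen

# Shrink every PNG in the "stimuli" folder so neither dimension exceeds
# MAX_SIZE (aspect ratio is preserved), saving a "_small" copy alongside
# each original.
for png in Path("stimuli").glob("*.png"):
    with Image.open(png) as im:
        im.thumbnail(MAX_SIZE, Image.LANCZOS)
        im.save(png.with_name(png.stem + "_small.png"))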

Hi Michael, many thanks for your reply. Unfortunately, even after ‘lowering’ the quality of these images from ~25 MB to around 200 KB each, I still got the MemoryError halfway through the first block. Here’s the message:

Traceback (most recent call last):
  File "D:\Magic_Cups_v2\PsychoPy_MagicCups_lastrun.py", line 852, in <module>
    ResponseProbe.setImage(probe)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\visual\image.py", line 301, in setImage
    setAttribute(self, 'image', value, log)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\tools\attributetools.py", line 141, in setAttribute
    setattr(self, attrib, value)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\tools\attributetools.py", line 32, in __set__
    newValue = self.func(obj, value)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\visual\image.py", line 288, in image
    forcePOW2=False)
  File "C:\Program Files (x86)\PsychoPy3\lib\site-packages\psychopy\visual\basevisual.py", line 875, in _createTexture
    intensity = numpy.array(im)
MemoryError

Is there something else that could be causing this error?

Thank you. Foyzul.

The point is not about quality or file size, but pixel dimensions. It’s the number of pixels in your images that determines how much memory they consume. No matter how aggressive the compression applied to an image file, it’s irrelevant to the memory footprint, as all images must be decompressed when they are read in from disk.
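
You can check this yourself with Pillow if it helps (just a sketch, assuming a hypothetical file called stimulus.png): the size on disk reflects the compression, while the decompressed pixel data depends only on the pixel dimensions.

import os
from PIL import Image

path = "stimulus.png"
with Image.open(path) as im:
    width, height = im.size

print("compressed size on disk: %.1f KB" % (os.path.getsize(path) / 1024))
# Decompressed as 8-bit RGBA, each pixel takes 4 bytes, regardless of file size.
print("decompressed pixel data: %.1f MB" % (width * height * 4 / 1e6))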

But maybe we’re just not using the same terminology?

Hello,

Memory errors typically arise from the inability to allocate enough space to store pixel data.

Images are stored internally in a high-precision floating point format (32-bit float per channel). Image data is converted to this format and stored in RAM prior to uploading it to video memory. The amount of memory required is the same for all images sharing the same pixel dimensions (resolution), regardless of file size. You can use the following formula to estimate the amount of RAM consumed by your image:

reqMemory = 16 * (widthPixels * heightPixels)

The 16 is the number of bytes needed to store the data for a single pixel: each 32-bit float requires 4 bytes (8 bits per byte), multiplied by the number of channels, which is 4 (R, G, B, A). So a 1920 x 1080 image will require 33,177,600 bytes (1920 * 1080 * 16), or ~33 MB.

That doesn’t seem like much, but there is another thing to consider: 32-bit Python (and numpy) is only given about 2 GB of addressable memory by the OS (it doesn’t matter that you are using a 64-bit machine). This means that your application needs to allocate a contiguous array within that 2 GB space, which might not always be possible, since that memory is also being used to store other things related to PsychoPy.
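
For example, plugging some numbers into the formula above (just a rough sketch; the real footprint will be somewhat higher because of PsychoPy’s other overheads):

BYTES_PER_PIXEL = 16  # 4 channels (R, G, B, A) * 4 bytes per 32-bit float

def req_memory(width_px, height_px):
    """Approximate RAM needed to hold one image's texture data, in bytes."""
    return BYTES_PER_PIXEL * width_px * height_px

print(req_memory(1920, 1080) / 1e6)  # ~33 MB for a full-HD image
print(req_memory(3500, 2000) / 1e6)  # ~112 MB for a 3500 x 2000 image

A handful of ~100 MB allocations like that, each needing to be contiguous, can exhaust a 2 GB address space surprisingly quickly.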


Hi,

Thank you for your detailed explanation - this makes a lot of sense. I mistakenly thought that reducing the file size (by reducing image quality) would resolve the issue. Each image is currently about 3500 x 2000 pixels; I am now reducing each one to 1200 x 800 pixels and hoping this will resolve the problem.

Thank you.
Foyzul.

You’ll have to try it and see, but as @mdc explains above, this will reduce the amount of memory required per image by a factor of ~7.3. There should be no visible reduction in image quality, as none of those extra pixels were visible before anyway: the image still needed to be scaled down to fit your 1200 × 800 display, so they were consuming extra memory for no visible benefit (whereas the very strong disk compression you were using could quite likely have reduced image quality).

Hi Michael and @mdc, thank you for your input. Reducing the pixel dimensions to 1200 x 800 did indeed work, with (as you both rightly pointed out) no reduction in visible quality.

This is an area I do not know much about so thank you for explaining this to me.

Foyzul.