
Memory storage on Gitlab repository - Pavlovia

Dear all,

I was just wondering whether there is a maximum storage limit for datasets recorded through online experiments. I am saving results in .csv format. Based on the amount of data I will record, I expect to have ~2’300 MB of data by the end of the experiment.

Is there enough space for a dataset of this size, or is there a risk that the data will stop being saved on the server due to a maximum storage limit?

Sorry for the naive question, and many thanks for your help!

Dear All,

Does anyone have an idea about this? Many thanks for your help :slight_smile: @dvbridges @jon


We haven’t manually inserted any limits ourselves at this point. Do you mean 2.3GB (sorry I’m not sure about the ’ notation you’ve used)? I’m wondering what would cause such large files. What are you saving into these? Or are the files relatively small but you’re expecting thousands of participants?

Many thanks for your reply!

Yes, I mean 2.3 GB, sorry about that. You are right: we have a longitudinal study in which each participant (30 in total) is expected to complete 30 experimental sessions, for a total of 900 sessions. Each session produces ~2.7 MB of data (log + csv). That’s why the final dataset will be quite large.
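For anyone skimming the thread, the total follows directly from the per-session figures quoted above (a quick sanity check, nothing more):

```python
# Back-of-the-envelope check of the storage estimate from this thread.
participants = 30
sessions_per_participant = 30
mb_per_session = 2.7            # log + csv per session, in MB

total_sessions = participants * sessions_per_participant
total_mb = total_sessions * mb_per_session

print(total_sessions)           # 900 sessions in total
print(total_mb)                 # ~2430 MB, i.e. roughly the 2.3 GB quoted
```

So the estimate in the thread (~2’300 MB) is consistent with, if slightly below, the 900 × 2.7 MB ≈ 2.4 GB it implies.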

I guess there won’t be any issues with our study, then, since there are no data storage limits?

Many thanks again for your reply!


It’s something we’ll have to monitor for excessive usage, but for now there is no limit to prevent this.

2.7 MB is a huge data file for one session though, since these are raw text. What are you saving in the file?

Many thanks for your reply.

Each session produces a CSV that is quite small (457 KB) and a .log file that is 1.9 MB. The problem, I would say, is the log file.

Is there a way to avoid saving the .log file?


I can provide both if you wish to have a look at it, no problem at all.


Oh, interesting. That might be something we can make smaller at our end. If you could send one (or just a link), that would be handy.

Sure, with pleasure ;).

The log file can be downloaded from this link:


Speaking of the size of log files …
I’m currently trying to implement a task in pavlovia that used to run as a PsychoPy Coder experiment in the lab.
The stimuli are pretty complex. There are 409 elements (predominantly of the type polygon) on average on the screen in each trial. Since every single element is logged, the log file accumulates to a size of up to 300 MB (with approximately 2 million lines).

Is there an option to switch off the log file or change the logging level in an online experiment?
Might the accumulation of such a huge amount of plain text cause a visible increase of RAM usage over time as well?
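I don’t know what PsychoJS exposes for this online (in desktop PsychoPy I believe the `psychopy.logging` module lets you pass a higher level to the log file, and Builder’s Experiment Settings has a logging-level option, but check the docs for your version). The general remedy, though, is to raise the logging threshold so the per-element DEBUG records are discarded before they are ever written. Here is a sketch of that idea using Python’s standard-library `logging` module purely as an analogy — none of these names are PsychoPy/PsychoJS API:

```python
import logging

# Analogy using Python's stdlib logging (NOT the PsychoPy/PsychoJS API):
# raising the handler's level discards verbose per-element records before
# they reach the log file, which is what would shrink a 300 MB .log.
records = []

class ListHandler(logging.Handler):
    """Collects emitted messages so we can count how many survive the filter."""
    def emit(self, record):
        records.append(record.getMessage())

logger = logging.getLogger("experiment_demo")
logger.setLevel(logging.DEBUG)      # the logger itself passes everything...

handler = ListHandler()
handler.setLevel(logging.WARNING)   # ...but the handler keeps only WARNING+
logger.addHandler(handler)

for i in range(409):                # one DEBUG record per on-screen element
    logger.debug("polygon %d updated", i)
logger.warning("trial ended")

print(len(records))                 # 1 -- all 409 DEBUG lines were dropped
```

With 409 elements logged per screen refresh, filtering at WARNING rather than DEBUG is the difference between millions of lines and a handful, which would also answer the RAM question: text that is never generated never accumulates.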

Any help is appreciated and thanks in advance!


I would also like to know if there are ways to avoid saving the log files on Pavlovia! Thank you!