Difficulty in Posting Large CSV file to Pavlovia Database

URL of experiment: https://gitlab.pavlovia.org/lifelab/three-choice-shift-task

Description of the problem: We are hosting a jsPsych experiment on Pavlovia. While we had success getting initial pilot data to upload with a shortened trial set, once the full 400 trials are used the pavlovia-finish script gives a “413 Request Entity Too Large” error. After trying a series of trial lengths, it seems the error arises when the file is larger than about 600 KB, whereas files closer to 300 KB upload fine.

One idea we were thinking of testing was to split the ~600 KB of data in half and upload each CSV separately, but I wanted to ask for an expert opinion first, to get a sense of whether this would be liable to fail and cause problems for our data collection.

With thanks,
Nick

Hmm, yes, @alain and I hadn’t anticipated CSV files being 600 KB. That’s a lot of characters for a text file. What are you storing in it?

I don’t know if compression / inflation is supported, but compression can sometimes dramatically reduce file-transfer size. A .csv of trial types would be a good candidate for this (many repetitions of a few basic text elements).