Duplicate participant IDs

I used this solution to assign sequential participant numbers to my experiment.
However, I have a few duplicate numbers.
Under what circumstances does this occur? I have never received duplicate numbers using this method before and I am wondering if these numbers belong to the same person.
Thanks in advance!

The most likely way to get duplicate participant numbers when using my app is if the participant refreshes their browser window after launching the experiment on Pavlovia. Are you saving partial data? The refresh could have been a crash in your experiment or due to actions of the participant (swiping down on a mobile device can cause this). It is probably the same participant, unless they deliberately refreshed in order to pass their device to a friend.

Thank you very much for your quick reply. The strange thing is that I only save complete data and all records are indeed complete. Also, sometimes the demographic data entered differs. Nevertheless, presumably they are the same participants.

Could the participant be passing their device to another person?

What happens at the end of the experiment? You could add a link back to your recruitment advert to encourage a fresh click.

Do the times of the data files imply consecutive presentation? The start time is in the file and the end time is in the commit information.

Please could you PM me your recruitment link?

If your app somehow assigned the same number, would it then skip the next number?

I’m not entirely sure what the participants did, but I recruited them through Prolific, so I don’t think that’s what happened. At the end they are redirected to Prolific (with the completion code).

No - it does not skip the next number.

I’ll send you the link in a minute. Thank you very much!

I’ve sent you a spreadsheet which indicates 176 launches of your experiment, 88 of which were during the same minute.

The participant id is created by counting the number of rows in the database after the new participant has been added. I’ve done this with two separate database calls, so I can imagine a scenario where two participants launch simultaneously and both rows are added before either gets counted. However, this would still result in a gap (participants 22 and 23 are added, but both get counted as participant 23, so number 22 is never assigned).
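The two-step pattern described above (insert the row, then count the rows in a separate call) can be sketched like this. This is an illustrative Python mock with the race interleaved by hand, not the app’s actual code:

```python
# Illustrative mock of the "insert, then count" assignment described
# above -- NOT the app's actual code. The race is interleaved by hand.

rows = [f"participant_{i}" for i in range(1, 22)]  # 21 existing rows

def insert(row):
    """Database call 1: add the new participant's row."""
    rows.append(row)

def assign_number():
    """Database call 2: count the rows to derive the participant number."""
    return len(rows)

# Two participants launch at the same moment, and both inserts land
# before either count runs:
insert("A")                  # table now holds 22 rows
insert("B")                  # table now holds 23 rows
number_a = assign_number()   # 23 -- A should have been 22
number_b = assign_number()   # 23

# Both get 23 and number 22 is never assigned: a duplicate plus a gap.
```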

The VESPR Study Portal uses a different method for allocating participant numbers, but again it involves two consecutive calls to the database, so it is theoretically possible (I think) for two participants to receive the same number if the calls are processed ABAB instead of AABB. This wouldn’t result in a gap, but the participants would have different values for session, and that value can’t be duplicated by simultaneous launches.
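The ABAB-vs-AABB ordering can be mocked the same way. The read-then-write scheme below is an assumption for illustration, not the portal’s actual code: in AABB the second participant sees the first one’s write, but in ABAB both read the same maximum before either writes.

```python
# Assumed read-then-write allocation, for illustration only: each
# launch reads the current maximum number, then writes max + 1 back.

numbers = [21]   # highest participant number allocated so far is 21

def read_max():
    """Database call 1: read the current maximum participant number."""
    return max(numbers)

def write_number(n):
    """Database call 2: record the new participant's number."""
    numbers.append(n)

# ABAB ordering: both launches read before either writes.
max_a = read_max()        # A reads 21
max_b = read_max()        # B reads 21 too, before A has written
write_number(max_a + 1)   # A becomes 22
write_number(max_b + 1)   # B also becomes 22 -- duplicate, no gap

# With AABB (A reads and writes before B reads), B would see 22 and
# correctly become 23.
```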

Do your participants with identical participant numbers have identical start times? If not, then I suspect that they are pretending to be different people in order to earn more money and the data should be treated with caution.

Thank you for the spreadsheet and the explanation! That is very helpful.

Actually, I think both scenarios could have happened in the current study. I wanted to recruit 150 participants via Prolific (and according to Prolific I have 150 participants). When I downloaded the data, I had 153 participants. However, not three but four numbers are duplicated.

None of the duplicated numbers have identical start times, so I guess I have to exclude them.

Is there any way to prevent this for future studies? Specifically, to prevent participants from accessing the study more than once? (For now, I have relied on Prolific). Also, is it possible to prevent too many people from clicking on the link at the same time? (Which happens very often in Prolific).

Thank you very much again!

I would recommend that you try my new portal. Participants clicking on the link at the same time should have different values for session, even if they don’t have different values for participant.

If you use my portal to enable the link back to Prolific then each participant is only able to use the link once. You can see a demo of this function here:


If you click on Return to Sona and then reload the page, the link will disappear (until you reset it which is only possible for that demo participant). I also have a manual credit granting link which is emailed to the researcher when the participant starts.

I would recommend using Prolific credit granting via a Prolific id (which can be passed through the experiment to the debrief) rather than a single code that works for anyone.

I am currently trying to access the new study portal, but I always get an error message. Studies using the old method (https://moryscarter.com/vespr/pavlovia.php) are also inaccessible. (If I use the “pure” Pavlovia link, it works.) Does anyone else have this problem?

I haven’t noticed any issues recently, with either app. Perhaps you are sending people to the historical html folder.

What link are you using? What error does it give you?

I get this error message:
“Error: Network timeout.
The server on moryscarter.com is taking too long to send a response.”

I have noticed that I can’t reach any VESPR tools, which worked fine until yesterday.

I haven’t noticed a problem myself but the System Status page suggests that there might be connectivity issues.

Ok, thanks a lot for this information! So I guess I have no choice but to wait (some of my colleagues can’t access the website either).


Where are you based?

I am in Germany.
Five minutes ago I tried to reach the site again, and it seems to work now! 🙂

I hope it is safe to start collecting the data.