We bought a few EduLab test units (also sold under the name NeuLog?). One of them was the GSR sensor.
We stayed away from the bundled software, instead logging with Python (mostly), plus one attempt at logging and manipulating the data in MATLAB.
Using the relatively simple API - quite restrictive, but easy to use (we actually used the NeuLog API, though I don't know if there's any difference here?) - and Python's urllib2 in PsychoPy, we made a few Builder components. We're considering submitting them to PsychoPy's main repo once they're better tested and I've cleaned up the code (I wrote it in a rush), but in the meantime they're available locally on our PsychoPy installations for anyone using the sensors.
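The gist of the API is just building and fetching URLs. A minimal sketch (Python 3's urllib.request rather than the urllib2 we originally used; the command name and default port 22002 follow the NeuLog API docs, but check your own install):

```python
# Minimal sketch of talking to the NeuLog API over HTTP on the local
# machine. Assumes the API server is running on its default port.
import json
from urllib.request import urlopen

API_ROOT = "http://localhost:22002/NeuLogAPI"

def build_url(command, *params):
    # NeuLog commands take bracketed, comma-separated parameters,
    # e.g. .../NeuLogAPI?GetSensorValue:[GSR],[1]
    if params:
        command += ":" + ",".join("[{}]".format(p) for p in params)
    return "{}?{}".format(API_ROOT, command)

def get_sensor_value(sensor_type="GSR", sensor_id=1):
    # The server replies with JSON like {"GetSensorValue": [2.45]};
    # return just the reading.
    with urlopen(build_url("GetSensorValue", sensor_type, sensor_id)) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    return data["GetSensorValue"][0]
```

In PsychoPy this kind of call just goes in a code component; the Builder components we made are essentially wrappers around it.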
We do event marking within the experimental software itself, obviously, which makes sure the data not only lines up with the stimuli but comes in one nice neat package. We still ran the WiFi module alongside so people could monitor remotely or use the vendor software if they wanted, but we encouraged logging in PsychoPy as much as possible. Once you have a CSV output, you can do any analysis or graphing you want from the values.
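The "one neat package" part is nothing fancier than writing samples and event markers into the same CSV. A sketch (the column names and marker strings are just our convention, not anything the API dictates):

```python
# Sketch of logging sensor readings and stimulus event markers into a
# single CSV, so the trace and the events line up in one file.
import csv
import time

def open_log(path):
    # Returns the file handle and a csv writer with a header row written.
    f = open(path, "w", newline="")
    writer = csv.writer(f)
    writer.writerow(["timestamp", "gsr", "event"])
    return f, writer

def log_sample(writer, value, marker=""):
    # One row per sample: host-clock timestamp, sensor value, and an
    # optional event marker ("stim_on", "response", ...).
    writer.writerow([time.time(), value, marker])
```

Usage would be something like `log_sample(w, 2.41)` each frame, and `log_sample(w, 2.43, "stim_on")` when a stimulus appears.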
Using the NeuLog software (rather than the EduLab one), we ran it on both Windows and Mac PCs without issue, but these weren't used under testing conditions - just as-is.
Both pieces of software were a pain to get going on our centralised, virtualised IT network - through no fault of EduLab - but that's a different fight!
NeuLog software and API available at: Software and Application | NeuLog Sensors
We also linked it up to Qualtrics as a trial, to record during a question (using a bit of jQuery, the prototype.js framework that Qualtrics uses, and a simple window.open()), and got an experiment to run and save in the background. But this is clunky and never went beyond the "oh look, we can get it to run for fun" stage, and I'm not sure of the use for it: the person using it would have to be sat at a computer with the API installed, running, and set to the correct ports… you may as well use PsychoPy or MATLAB at that point!
I should add, also just for fun: since we got a Hand Dynamometer, we use it on open days with the BART demo in PsychoPy. We show live values from the sensor by sending repeated calls to it every frame and printing the value to the screen - we tell participants it's a task to test their strength while they play the BART, and we get them to repeatedly grip. I've also toyed with making a separate module showing a bar with distinct squares, like an LED-meter style, rather than just printing the values to the screen (completely unnecessary - just for fun).
With the API you can log in the background, but if anyone found a use for it, you could also show participants the live value of any sensor on the API list during an experiment (as well as logging it). So far we haven't found any use for it beyond a fun task to demo the sensors live (we do a variant with the GSR and the Hand Dynamometer while they do the BART as well).
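The per-frame loop behind the live display is nothing clever. A sketch with the sensor read and screen update injected as functions, so it isn't tied to PsychoPy or a running API server (in Builder this would sit in an "Each Frame" code component, with the draw function replaced by setting a TextStim's text):

```python
# Sketch of the 'live value on screen' trick: once per frame, fetch the
# current reading and update the on-screen text. read_value is injected
# (e.g. a call to the NeuLog API); draw_text stands in for updating a
# PsychoPy TextStim.
def run_live_display(read_value, draw_text, n_frames):
    history = []
    for _ in range(n_frames):
        value = read_value()
        history.append(value)                       # keep for logging
        draw_text("Grip strength: {:.2f}".format(value))
    return history
```

The same loop works for the GSR variant; only the sensor type in the API call changes.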
And because you just need to make URL calls to it on the local machine, you can control it remotely if needed - albeit in a relatively restrictive environment - using a bit of JavaScript on a page the participant visits.
We're working on a separate JS library/web builder, based on a new localised timing method and stacked requests I was digging into, and I'll probably incorporate this function into it, with the ability to set the sensor type, logging interval, etc. That's a pipe dream currently: it sits on a LEMP server on the internal network with a very limited GUI and a sparse library - especially given our workload - but it's entirely possible. Again, as with Qualtrics, I see very little use for it, but the possibility is definitely there.