
"X Window System error" when using Monitor Calibrated values

I used a Konica Minolta photometer for monitor calibration, but because I did not have the cable at hand to connect it to the PC, I ran the calibration in semi mode instead (i.e. manually capturing and recording the values by hand).

I’m assuming that the table generated at the end of the calibration (populated with values of 0.0) is where I manually key in the values obtained from the photometer, which is what I did.

After saving the relevant settings, running an experiment with units=‘deg’, and selecting the new monitor that has the saved calibration values, I get the following error message:

The program '' received an X Window System error.
This probably reflects a bug in the program.
The error was 'BadValue (integer parameter out of range for operation)'.
  (Details: serial 141 error_code 2 request_code 151 minor_code 17)
  (Note to programmers: normally, X errors are reported asynchronously;
   that is, you will receive the error a while after causing it.
   To debug your program, run it with the --sync command line
   option to change this behavior. You can then get a meaningful
   backtrace from your debugger if you break on the gdk_x_error() function.)

I am able to run the script when I set monitor=‘testMonitor’, however, which is not really what I’d like to use because it falls back on the default values of 1.0.

I’m assuming this error is the result of the calibrated values. To investigate, I tried playing around with the values and noticed that it’s the Gamma values specifically that are causing this error (anything other than a value of 1.0). Changing the ‘Min’ and ‘Max’ values did not trigger it. So my questions are:

  1. Am I doing the calibration correctly by manually keying in the values obtained? If not, what is causing this problem?

  2. After filling in the table with the respective calibrated values, the ‘Min’, ‘Max’ and ‘Gamma’ values were automatically generated for lum, r, g and b, but the remaining values such as a, b and k were ‘nan’, and the two other tables below (whose purpose I don’t know) remained at 1.0. Do I have to be concerned about these unchanged values?

This is happening on both Ubuntu 14.04 (KDE 4.14.3) and Ubuntu 16.04 (Unity), with PsychoPy version 1.83.04.
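For what it’s worth, my working assumption (not confirmed against PsychoPy’s source) is that the gamma value is turned into a hardware gamma ramp that gets handed to the X server, whose entries must each be an unsigned 16-bit integer, and a BadValue error means some parameter of the request fell outside its allowed range. A minimal sketch of a standard inverse-gamma ramp, just to illustrate what the values look like:

```python
# Sketch of how a gamma value becomes a gamma ramp (an assumption about
# the mechanism, NOT PsychoPy's actual code). The X server expects each
# ramp entry to be an unsigned 16-bit integer in 0..65535.

def gamma_ramp(gamma, n=256):
    """Standard inverse-gamma lookup table with n 16-bit entries."""
    return [round(((i / (n - 1)) ** (1.0 / gamma)) * 65535) for i in range(n)]

ramp = gamma_ramp(2.0)
print(ramp[0], ramp[-1])  # endpoints: 0 and 65535
assert all(0 <= v <= 65535 for v in ramp)
```

If an entry ever ended up outside 0..65535 (or the request carried some other out-of-range parameter), that would match the BadValue symptom.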

UPDATE: I still cannot resolve this issue, although I have noticed that it seems to happen specifically on HP desktops; I tried a non-lab Dell desktop and the calibration worked fine. It also appears to be Linux-specific, as calibration on Windows worked fine on all desktops.

My temporary workaround (though not the best) is to write a shell script containing the R, G and B gamma values calculated by the Monitor Centre, and to run it from the terminal before starting the experiment to set the gamma for each colour channel.

(i.e. typing xgamma -rgamma 2.0 at the terminal sets the red gamma to 2.0; -ggamma and -bgamma do the same for green and blue.)
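The same workaround could be wrapped in a few lines of Python instead of a separate shell script (the xgamma flags are the ones above; the gamma values here are placeholders, not my actual calibration results):

```python
import shutil
import subprocess

# Hypothetical wrapper for the xgamma workaround described above. The
# default gamma values are placeholders; substitute the per-channel
# values produced by your Monitor Centre calibration.
def apply_gamma(r=2.0, g=2.0, b=2.0):
    """Set per-channel gamma via xgamma; returns True on success."""
    if shutil.which("xgamma") is None:
        return False  # xgamma not installed, or not an X session
    cmd = ["xgamma", "-rgamma", str(r), "-ggamma", str(g), "-bgamma", str(b)]
    return subprocess.run(cmd, capture_output=True).returncode == 0

apply_gamma()
```

Calling this at the top of the experiment script saves having to remember to run the shell script first.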

To know why this isn’t working, I think we’d need to see the values you’re trying to use when setting your gamma. The error message suggests that the values aren’t in the right range.

The values are as follows:

This problem doesn’t replicate on Windows, though, and running on a Dell desktop with KDE installed worked fine as well.