I have seen many reports of errors in the voltage and current readings of these charge controllers, so I want to share my experience calibrating them. Unfortunately there is not much documentation available about them, and even less about the "hidden" menu in the MyGreenSolar software provided by the manufacturer.
The menu has "ratio" and "offset" parameters for each voltage or current reading. It is reasonable to think these correspond to the slope and intercept of a calibration line. If so, a change in the "offset" parameter should shift the measurement by the same amount across the whole range (within the 0.1 unit resolution of the devices), while a change in the "ratio" parameter should have a larger effect as the value increases. That is what I found when testing with battery voltage, with one twist: "offset" works opposite to the value, so increasing "offset" decreases the reading. The device also accepts negative values for "offset", so the reading can still be raised even when "offset" is already at 0.
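To make that concrete, here is a minimal Python sketch of the measurement model this behavior suggests. The formula, the raw-reading input, and the function name are my assumptions for illustration, not documented behavior of the device or the MyGreenSolar software:

```python
# A minimal sketch of the measurement model the behavior above suggests.
# The formula is an assumption: the "offset" appears to be subtracted,
# which matches "increasing offset decreases the value".

def displayed(raw: float, ratio: float, offset: float) -> float:
    """Hypothetical mapping from raw internal reading to displayed value."""
    return round(ratio * raw - offset, 1)  # devices show 0.1 unit resolution

# Changing "offset" shifts every reading by the same amount:
for raw in (12.0, 36.0, 60.0):
    print(raw, displayed(raw, 1.0, 0.0), "->", displayed(raw, 1.0, 0.3))

# Changing "ratio" has a larger effect as the value increases:
for raw in (12.0, 36.0, 60.0):
    print(raw, displayed(raw, 1.0, 0.0), "->", displayed(raw, 1.01, 0.0))
```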
Note that I have only tested battery voltage, which in my case is the only value that matters, and only with the 60 amp model.
I have worked out the following calibration procedure; a sketch of the arithmetic follows the list:
Warning: don't use the "hidden" menu to do this; it does not seem to work.
1. Apply a low voltage (about 10 to 12 volts) to the battery inputs. Note the difference between the displayed voltage and the voltage actually applied, and write it down; this will be the TARGET.
2. Apply a high voltage (about 50 to 60 volts) to the battery inputs, then change the "ratio" parameter until the voltage difference is at or below the TARGET.
3. Go back to the low voltage and adjust the "offset" if the remaining difference is more than 0.1 volt.
4. Repeat steps 2 and 3 if needed.
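Here is the two-point arithmetic behind steps 2 and 3, under the linear model assumed above. The function and its formula are my own illustration, not anything built into the software; in practice you nudge the parameters and re-check against your reference meter:

```python
# Two-point calibration arithmetic under the assumed model
# displayed = ratio * raw - offset.

def suggest_parameters(ratio, offset,
                       low_ref, low_shown,
                       high_ref, high_shown):
    """Given the current parameters and reference vs. displayed readings
    at a low and a high test voltage, return (new_ratio, new_offset)
    that would zero both errors under the assumed model."""
    # The slope correction comes from the spread between the two points.
    new_ratio = ratio * (high_ref - low_ref) / (high_shown - low_shown)
    # What the low point would display after only the ratio change:
    low_after = (low_shown + offset) * new_ratio / ratio - offset
    # "offset" is subtracted from the reading, so add the excess.
    new_offset = offset + (low_after - low_ref)
    return new_ratio, new_offset

# Example: device reads 12.1 V at a true 12.0 V and 55.3 V at a true 55.0 V.
print(suggest_parameters(1.0, 0.0, 12.0, 12.1, 55.0, 55.3))
# -> (0.99537..., 0.0439...)
```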
I tried this with four devices and only had to change the "ratio".