SoC% false accuracy? State of Charge

GeneralDJ

New member
Joined
Mar 5, 2019
Messages
6
I just read this on Battery University
Source 1: https://batteryuniversity.com/learn/article/how_to_monitor_a_battery
Source 2: https://www.mpoweruk.com/soc.htm

" The EBM (electronic battery monitor) works well when the battery is new but most sensors do not adjust correctly to aging. The SoC accuracy of a new battery is about +/10 percent. With aging, the EBM begins to drift and the accuracy can drop to 20 percent and higher. This is in part connected to capacity fade, a value most BMS cannot estimate effectively. It is not an oversight by engineers; they fully understand the complexities and shortcomings involved. "

This brings up the question:

Why use more expensive BMS systems or hardware like shunts to measure SoC% on battery packs built from second-life Li-ion cells?

Concluding from the quoted text, it seems that measuring SoC% on second-life batteries gives false accuracy at a high cost.

thoughts and ideas??

edit 1: added another source with similar statements around SoC inaccuracy for aged cells.
 
Coulomb counting with proper calibration works down to 1% accuracy or something.
 
GeneralDJ said:
The EBM (electronic battery monitor) works well when the battery is new but most sensors do not adjust correctly to aging.

"Most sensors" is the key term here. Most are usually cheap and are just based on voltage curves. More advanced and expensive ones will use a coulomb counter. The key to the coulomb counter is that a full charge and discharge are usually required to calibrate them every once in a while. Most phones have a coulomb counter as well.
 
The issue with the typical joule / coulomb approach is that your battery cycle efficiency limits the accuracy unless you reset the counter each charge cycle.

You may put 5600Wh of energy in, but your lithium cells may only return say 5500Wh; after that one cycle your count position is out by 100Wh. This efficiency loss is dependent upon the cell loading, so it will vary cycle to cycle. Coulomb counting always drifts unless the counter can correct for losses at a given loading or charge rate.

Cycle that 10 times without calibration and your accuracy is possibly close to 20% out.
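The drift described above compounds linearly if the counter is never recalibrated. A minimal sketch (using the 5600Wh-in / 5500Wh-out numbers from the post above; the function name and structure are my own illustration):

```python
def drift_after_cycles(cycles, capacity_wh=5600.0, efficiency=5500.0 / 5600.0):
    """Return the counter's error (Wh and % of capacity) after `cycles` full cycles.

    Each cycle the counter credits capacity_wh in, but the cells only
    return capacity_wh * efficiency, so the count position drifts by
    the difference every cycle, with no correction applied.
    """
    error_wh = cycles * capacity_wh * (1.0 - efficiency)
    return error_wh, 100.0 * error_wh / capacity_wh

wh, pct = drift_after_cycles(10)
print(f"{wh:.0f} Wh off, {pct:.1f}%")  # -> "1000 Wh off, 17.9%"
```

Ten uncorrected cycles at that round-trip efficiency put the counter nearly 18% out, which matches the "close to 20%" figure above.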

There is a different much simpler way... voltage to Wh lookup table.

The relationship between charge and volts is not linear and is different for different battery chemistries. But a single controlled cycle can give you an accurate lookup table. Much simpler. Cheap as a piece of A4 paper stuck on the wall next to a volt meter.

That's what I use as a means of calibration for my Wh meter...
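The lookup-table idea above is easy to sketch in code: calibrate once with a controlled discharge, record (resting voltage, Wh remaining) pairs, then interpolate. The table values below are made up for illustration; only the method comes from the post:

```python
from bisect import bisect_left

# (resting pack voltage, Wh remaining) pairs from one controlled cycle,
# sorted by voltage. Hypothetical numbers for a ~10kWh pack.
TABLE = [(44.0, 0), (48.0, 1500), (50.0, 4000), (52.0, 7500), (54.0, 9500), (56.0, 10000)]

def wh_remaining(volts):
    """Linearly interpolate remaining Wh from resting voltage."""
    vs = [v for v, _ in TABLE]
    if volts <= vs[0]:
        return TABLE[0][1]
    if volts >= vs[-1]:
        return TABLE[-1][1]
    i = bisect_left(vs, volts)
    (v0, wh0), (v1, wh1) = TABLE[i - 1], TABLE[i]
    return wh0 + (wh1 - wh0) * (volts - v0) / (v1 - v0)

print(wh_remaining(51.0))  # -> 5750.0
```

The paper-on-the-wall version is exactly this table, with the interpolation done by eye.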
 
completelycharged said:
There is a different much simpler way... voltage to Wh lookup table.

The relationship between charge and volts is not linear and is different for different battery chemistries. But a single controlled cycle can give you an accurate lookup table. Much simpler. Cheap as a piece of A4 paper stuck on the wall next to a volt meter.

Isn't battery voltage also related to temperature & current at the time? And with LiFePO4, the "curve" is very flat.

My 100% SoC readings are set up to get reset most charge cycles, when the cells have been at "100%" for a while & the current has dropped away.
For longer-term SoC / battery total capacity loss with years & use, etc., I guess this would require maybe a once-a-year measured full cycle to empty (by cell voltage), then recalculating the slightly reduced battery total Ah & resetting SoC devices accordingly?

I have both Batrium & a Victron BMV-712 tracking SoC (both use the coulomb count method) & they don't seem to be drifting in SoC, and they usually match pretty well.
Eg, right now they are reading 85.9% (BMV) & 87.0% (Batrium)
 
Voltage won't work on LiFePO4, as an example. They are flat in the middle and drift perhaps 20mV per 10% SoC change. And that's in a resting state after being left for hours... So it's impossible to do it properly.
LA is the same thing. If you look at it during a discharge you will see the SoC jump all over the place on systems using voltage. Voltage is just fine during resting, and on the chemistries that have a pretty linear curve, but not LiFe :)


I use coulomb counting as well and had Victron before; now I use 3x Batrium. They for sure do not drift enough for it to be noticeable at all.

The Victron ran on LA, which easily has 10-30% loss during charge, and it coped with that pretty well because it "resets" during charge-up, keeping track of where it was.
 
If there is a reason to use the state of charge, then I would, but my PIP doesn't care about the SoC. All it cares about is the voltage, therefore I really don't see the real need to calculate the SoC.

It is quite a problem, because during heavy loads, like >1500W, my voltage will drop by 1V. So if I set the cut-off voltage at 25V, my battery would bounce back to 26V. At rest 25V should be 0% SoC, but 26V is about 28% SoC. So it is a bummer that the PIP doesn't use SoC.

In any case, with laptop chemistries, the SoC can be determined with a fair accuracy. The graph is quite linear between 80% and 20% SoC, going from 4V to 3.5V.

So I just worked up some math to see if it's possible:

Battery Max = 28.5V
Battery Min = 25.0V
Voltage Drop @ 1000W = 0.5V

Current Status
Current Voltage = 27.5V
AC Usage = 200W
PV Output = 1200W

Voltage Compensation = (PV Output - AC Usage) / 1000 * Voltage Drop@1000W = (1200 - 200) / 1000 * 0.5 = 0.5V

SoC = ((Current Voltage - Voltage Compensation) - Battery Min) / (Battery Max - Battery Min) = ((27.5 - 0.5) - 25) / (28.5 - 25) = 57.14%
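The same calculation as a small function, using the constants from the post (the measured 0.5V drop per 1000W of net load is the only calibrated value; everything else is linear scaling):

```python
BATT_MAX_V = 28.5   # pack voltage at 100% SoC
BATT_MIN_V = 25.0   # pack voltage at 0% SoC (at rest)
DROP_PER_KW = 0.5   # measured voltage sag per 1000W of net load

def soc_percent(volts, ac_usage_w, pv_output_w):
    """Estimate SoC% from pack voltage, compensated for net load.

    Positive net power (PV > AC usage) means charging, which lifts the
    voltage reading, so we subtract the compensation to approximate the
    resting voltage.
    """
    compensation = (pv_output_w - ac_usage_w) / 1000.0 * DROP_PER_KW
    resting_v = volts - compensation
    return 100.0 * (resting_v - BATT_MIN_V) / (BATT_MAX_V - BATT_MIN_V)

print(round(soc_percent(27.5, 200, 1200), 2))  # -> 57.14
```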
 
Voltage is doable as long as you have resting voltage, yes. Or close to it. That's how you do it in RC. During load it's something else. Then you need a counter, with voltage as a dead-end check and for over-current too.

My MPP inverter does SoC based on voltage but it jumps all over from 20-80% when in use, because it goes by voltage :)

In RC you do the same. Voltage at rest is OK, but while running the system you generally use a counter instead for better accuracy. But as you know, the counter needs to take the current into account for it to be close enough.
 
Uh, if you bothered to read the rest of my post, that's why I have a voltage compensation during load in my formula. All you have to do is figure out the voltage drop at a 1000W load and it should extrapolate the rest.
 
Vd varies with SoC - which is why good inverters will have a variable (current-dependent) LVD
 
not2bme, and if you read mine you'll see I confirmed it ;) Though the voltage drop varies depending on SoC too ;)
 
The below chart is of a Panasonic CGR18650E. You can see between 80% and 20% it's quite linear. It's not exactly linear but it's good enough. The voltage drop doesn't vary much over the entire SoC range, as you mentioned.

So using my formula, I put it into my Grafana and you can see the SoC chart at the bottom is fairly accurate and smooth, while the battery voltage on top swings as the load and PV load changes during the day.


image_olxpcr.jpg

image_oxsais.jpg
 
For sure, yes the curve on Li-ion (typical 18650 cells) is steeper & easier to read SoC from.
So it seems the question is "can SoC be measured by other than coulomb count methods?".
And it seems that if
- you have Li-ion (not LiFePO4)
- you characterize/measure your battery curve over the range of SoC,
- know the load/charge current influence on voltage
- know the temperature effects on voltage
you can calculate reasonable SoC values like not2bme is doing.
Which is easier/better/more accurate?
Seems like the coulomb method is more direct to say SoC = xyz% right now.
But each method requires calculation for any accuracy.
I guess it's up to each person's preferences & how they build their system :)
 
My philosophy is that with SoC you're looking for a rough indication, not really worrying about a 220Wh difference when you have "about" 10kWh left. For me it's the rough lookup on the sheet and seeing if there are any more clouds...

I wonder if most SoC coulomb counters on a 15kWh battery pack have an end-of-discharge SoC accuracy at 10%, the equivalent of boiling the kettle for a cup of tea... Isn't that what really counts, having enough energy for a fresh brew? :)


All cells decrease in voltage through discharge; it just depends whether your meter detects it. Do many people really know their real powerwall voltage profile?...


The lookup + load/charge offsets might get more complicated, as the voltage drop in the cells under load is not linear through the discharge process, so full accuracy needs a matrix to get reasonably accurate... The rest of the wiring, etc. will see the basic IxR voltage drop.

This is my pack; old data though, based on a controlled test of one cell and extrapolated for the pack size. I have some separate test data for the whole pack, but it's a little rough (manual sampling); it follows the curve well though.


image_bxkwoi.jpg



This is a different view of the same pack, with actual figures; the load was varying a bit, hence the noise.


image_laedtn.jpg


This then shows how many kWh you have per incremental volt drop in the pack. This is where what you think is linear is really maybe 10% out in the middle, or more...
 
Fiddled around with the SoC today and it looks like it is working, although I don't have a Victron unit to benchmark it against. But at least now I know that even though my voltage is at 28.4V (my max) it is still only 76% charged.

I didn't think knowing my SoC was important but it sure is nice to be able to tell how charged my batteries are!

The SingleStat number does fluctuate by 5% sometimes because my readings are not captured in realtime but at 15-second intervals, so the outputs aren't actually all captured at the same moment. On the graph I used a mean over 1 minute and that smooths out the blips. Also it was easy to figure out the voltage drop: as you bump from 0.2V to 0.6V in 0.1V increments you can see the graph just smooth out.


image_lgrwmz.jpg
 
I would say SoC is one of the most important numbers. I don't look at voltage any more. I look at SoC and the computed "time-left", and 3 years later it hasn't let me down a single time.
 
I have a 48V @ 780Ah 18650 Li-ion battery bank with a MidNite Solar WhizBang Jr shunt, and a Batrium system with a Batrium shunt.

Operationally I use voltage - e.g. inverter/auto-transfer-switch on at 52.0V, max/float at 56.4V (4.03V/pack), and inverter off at 47.5V (3.39V/pack). This allows my battery bank to supply up to 24kWh (60%) of my 40.5kWh battery in conjunction with the PV array and home wiring, so I can just barely consume 100% of the PV array on the most intense days in summer. The goal is to consume every watt of power that I can and not waste any to 'battery full'.

SOC is not important in the short run, but I'm looking to use SOC in the long run, as the battery degrades, to keep getting 24kWh from the battery as a higher and higher SOC is required due to battery degradation. For example, right now 60% SOC = 24kWh. I imagine that in a couple of years I'll need 70% SOC, then 80% SOC, etc... to maintain the 24kWh per day.

So, I'm recording an SOC metric because over time I hope to detect the battery degradation in terms of SOC and adjust the voltage range to maintain 24kWh per day.

So far, I'm using voltage difference - the highest voltage for the day (from PV array charging) minus the inverter-off voltage (47.5V). SOC = (HiVolt - LowVolt) / 0.1483 / 100.
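That daily voltage-difference metric is straightforward to compute. A sketch using the post's own numbers (0.1483 is the poster's empirical volts-per-percent factor for this particular 48V bank; the final /100 turns percent into a fraction):

```python
def soc_fraction(hi_volt, low_volt=47.5, volts_per_pct=0.1483):
    """Daily SOC fraction from highest charging voltage vs inverter-off voltage.

    (hi - low) / volts_per_pct gives SOC in percent; dividing by 100
    converts that to a fraction of total capacity.
    """
    return (hi_volt - low_volt) / volts_per_pct / 100.0

print(round(soc_fraction(56.4), 3))  # -> 0.6, i.e. about 60% of the 40.5kWh bank
```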

I also have Batrium "Discharge AH" info that I'm recording, but... the PV array can go up, down (discharge AH), and up again and down again (discharge AH) in the same day - i.e. multiple ups/downs - so it's not clear to me that Discharge AH is a good SOC metric.

I also have Batrium SOC info - but I've set it and it seems to reset, and I don't understand how it works or what it's based on. Maybe you guys know?

I also have Midnite Classic SOC - but it resets only if the system reaches the max voltage setting (e.g. float voltage) and does not seem that accurate if you don't hit max voltage quite often.
 
It resets on top and bottom voltage. That's how you calibrate it. Under the Shunt tab.
 
daromer said:
It resets on top and bottom voltage. That's how you calibrate it. Under the Shunt tab.

I don't see a direct 'voltage -> reset' on the Shunt page... but looking at the Batrium Shunt page and reading your comment... and since I know my low voltage = 47.5V / 28% SOC, then...

If I use "Re-Calibrate Low SoC" = ON and set the percent to 28%, then Batrium will reset to 28% at the lowest voltage point each day?
 