SoC% false accuracy? (State of Charge)

OffGrid, the Batrium's SoC should not drift much day to day, especially not in one direction.
Not sure how you have your system wired, but ALL loads, charging & other connections (incl. the Batrium controller) should be on one side of the shunt & ONLY the battery packs (& cellmons) on the other.
There is a known problem where some inverters are electrically noisy & this causes shunt inaccuracy. I think Batrium has a fix out for this.
Heavy balancing operation may also make the SoC drift.

Since you don't take your packs to 100%, getting a completely accurate SoC doesn't sound like it's in reach for you.
But since you like to operate your packs in the middle of the range, you could still use an "accurate enough" method, e.g. (rough sketch after this list):
- enter the total "full to empty" Ah capacity you understand your battery bank has (i.e. the full range, 4.2V down to 3.5V or whatever your low-end voltage is)
- manually set the Batrium SoC value based on your existing knowledge of your bank at that time (i.e. take an at-rest voltage reading & decide the SoC from that)
- occasionally (weekly?) manually recheck the at-rest voltage SoC & re-enter it
You would see the SoC cycling from say 85% down to 40%, or whatever the range is for you.
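In code terms the bookkeeping looks something like this (a toy Python sketch with made-up numbers and sample data, not Batrium's actual logic):

```python
# Toy coulomb counter with periodic manual recalibration.
# All numbers here are examples, not Batrium internals.

BANK_AH = 1560.0  # the full-range "full to empty" capacity you entered

def soc_after(soc_pct, shunt_samples_a, interval_s):
    """Integrate shunt current (A, + = charging, - = discharging) into SoC %."""
    ah = sum(a * interval_s for a in shunt_samples_a) / 3600.0
    return soc_pct + 100.0 * ah / BANK_AH

# Start from a manually set SoC (taken from an at-rest voltage reading):
soc = 65.0
# One hour of 10-second samples at a steady 30A discharge:
soc = soc_after(soc, [-30.0] * 360, 10.0)
print(round(soc, 1))  # ~63.1% - 30Ah out of 1560Ah is ~1.9% of the bank

# Weekly: take an at-rest voltage reading, look the SoC up in your own
# voltage-vs-SoC table, and overwrite `soc` with that value.
```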

Will this be perfect? No. But since you're operating in the middle of the range, this should let you relax a bit & start trusting the coulomb-counted SoC from the Batrium system.
Hopefully the SoC corrections will only be small, e.g. <5%, or better still, maybe <2%.
If you are getting big corrections at first, the values or pack connections might be wrong to start with.
If, later, you start getting increasing corrections being needed, this might indicate a developing pack problem.

Re DoD: this is basically just the low end of the SoC range.
 
Thanks for the discussion - I think I'm further along, appreciate it.

@Daromer + others: What I read is that once the SoC % is set, and things are hooked up properly, the precision should be good enough that Ah in/out via the shunt keeps the SoC 'true' unless... there is some self-discharge or interference or something going on - e.g. the expectation for a healthy shunt is that it's accurate enough that it would take a long time to drift much.

@Redpacket (thanks for the idea) but for the record - yes, the only thing on the battery side of the shunt is the battery bank. I do have a Midnite Classic WhizBang Jr shunt in series with, and upstream of, the Batrium shunt - but I wouldn't expect that to affect Batrium.

@gauss163: The documentation basically just echoes the dialogs and does not have detailed info on how things work.
- This talks about max charging - https://support.batrium.com/article/229-soc-calibration - which I don't do in my daily operations.
- This talks about the shunt settings - https://support.batrium.com/article...mon-hardware-configuration-shunt-soc-settings - and there isn't any 'low voltage reset' that I can see, or anything that would work for me. I don't know of any other docs - but of course any info is welcome :)
Note: A while back, I was fooling with shunt settings and I caused the Batrium logic to stop reading the shunt (V, A, and W) 2 weeks after I entered a setting! So I'm gun-shy about just trying things without knowing how they work, as I have a live shunt-trip in play and don't want the Batrium logic to seize.


Key issue for me: This discussion seems to confirm that there is no 'auto-reset on low voltage' feature. The only auto-reset feature is at the high end. So I'm not missing anything obvious :)

For now - I have reset my SoC to 65% (1014Ah out of 1560Ah total) at 54.25V and will see what happens over the coming days - see if it continues to show 65% SoC every time it gets to 1014Ah.
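For reference, the arithmetic behind that 65% is just the ratio of remaining to total capacity:

```python
remaining_ah, total_ah = 1014.0, 1560.0
print(100 * remaining_ah / total_ah)  # 65.0 - exactly the SoC entered
```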
 
Where "full" is, is up to you to set. It should be a mark that you define. Whether that is 3.95 or 4.0 or 4.2 doesn't matter. What's important is that you define the capacity between those 2 markers. The SoC should not drift away from the Ah. I'm not sure how Batrium have done it in the code, but it should be the same or close to the same, I guess.

If you rarely hit the high marker but always hit the low one, it sounds like you need to redesign, I would say. In my world you should hit full more often than empty :) But still. During the winter I never hit the high marker for weeks... and don't see any issues so far. It seems to be where it should be. I have never had any failure due to the SoC being wrong, nor a sudden voltage drop. (I base the automation on the SoC value, the voltage is just for criticals, and they haven't collided.)

Regarding the low part I cannot fully answer since I haven't touched it. I hit low perhaps 1 time per year at most. I always have 20+% left when switching.
 
OffGridInTheCity said:
.......
@Redpacket (thanks for the idea) but for the record - yes, the only thing on the battery side of the shunt is the battery bank. I do have a Midnite Classic WhizBang Jr shunt in series with, and upstream of, the Batrium shunt - but I wouldn't expect that to affect Batrium.
................
Key issue for me: This discussion seems to confirm that there is no 'auto-reset on low voltage' feature. The only auto-reset feature is at the high end. So I'm not missing anything obvious :)

Agree, Midnite shunt should have no effect.

I think the Batrium system does have a "reset at min", see here:

image_fjvkns.jpg


OffGridInTheCity said:
For now - I have reset my SoC to 65% (1014Ah out of 1560Ah total) at 54.25V and will see what happens over the coming days - see if it continues to show 65% SoC every time it gets to 1014Ah.
Great, be interested to hear how it goes.
 
>@Daromer said "...If you rarely hit high but always hit the low marker it sounds like you need to redesign i would say."
In your case you can send excess power to the grid, but in my case, being off-grid, I don't have that option.
When I hit float/high it means (for me) that PV input power is being wasted, as there's nothing consuming it and the battery is full so it can't be stored to burn at night. Through PV and battery bank sizing, my system is designed to *avoid* hitting high voltage 99% of the time - so I only lose PV power now and then - as close as I can manage it.


>@Redpacket said "...I think the Batrium system does have a "reset at min", see here:"
Yea - the docs (https://support.batrium.com/article...mon-hardware-configuration-shunt-soc-settings) say "Re-Calibrate Low SoC: ON/OFF to allow reset the State of Charge back to empty (0%) if it goes below this value. Only available in advanced mode."

But I don't get how to use this in a way that will help...
- If I set it to 0% - I never get to 0% (e.g. 3.0V/cell), as 3.54V/cell is the lowest I go.
- If I set it to 20% or something like that, which is closer to 3.54V/cell... then if it went below 20% it would reset to 0%?? That just misrepresents the SoC.

Maybe I'm missing something? :) but I don't see how Re-Calibrate Low SoC works other than by actually going down near 0% SoC - which for battery lifespan purposes I work to avoid!
 
Yeah, you're right, not much is going to help here.
Since you're working in the middle of the range, you don't really hit the calibrate points.

Seems like you would need to keep checking the Batrium SoC meter calibration via the resting-voltage measurement method you've got going.
Hopefully you can watch Batrium's SoC measurements & see how stable they are vs your resting-voltage SoC.
Maybe knowing how well the Batrium SoC works would let you ease off on the more tedious resting-voltage method?
 
Interesting.

My pack tested out at ~186Ah. Since I am only using it over a voltage range of ~3.5V to 4.05V, I took an educated guess and entered 140Ah into the Nominal Capacity field.
Is that a true representation of my actual capacity? No. So I believe that if you set your Nominal Capacity to what your pack will actually be able to produce in Ah, and you really never charge to that level, then yeah, I can see that the SoC would not be accurate.

The way I figured it, I wanted my SoC to reflect what I am actually using, not what my full capacity could be. If it were the latter, my SoC would toggle between 80% and 20%, so there would never be any adjustment or reset happening, as the criteria to re-calibrate the SoC would never be met.
Now maybe I am totally off on this, or it was just dumb luck that I got it to work for me. When my system hits 4.05V it considers that 100% SoC, and when it goes down to ~3.5V it considers that 0% SoC.
I also set my Empty SoC% cycle threshold to 35.0 and my Full SoC% cycle threshold to 85.0.
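In effect I'm treating the 3.5V-4.05V window as the whole 0-100% scale. Something like this back-of-envelope Python sketch (the scaling is my assumption about how the Nominal Capacity figure gets used, not confirmed Batrium behaviour):

```python
# My usable window treated as the full SoC scale (my assumption,
# not confirmed Batrium behaviour).
TESTED_AH  = 186.0  # full-range capacity from testing the pack
NOMINAL_AH = 140.0  # what I entered: rough 3.5V-4.05V usable Ah

def soc_from_ah_used(ah_used):
    """0% = ~3.5V 'empty', 100% = 4.05V 'full'."""
    return max(0.0, min(100.0, 100.0 * (NOMINAL_AH - ah_used) / NOMINAL_AH))

print(soc_from_ah_used(70.0))  # 50.0 - half the usable window consumed
```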

Now I may be totally off on that but it seems to work for me.
Wolf
 
Offgrid, ahh yeah, that is good design :)
 
Thank you all for the discussion. At least I don't think I'm missing anything obvious now :)

Also learned the shunt should be measuring things 'very accurately' - so the SoC should be stable with only minor drift. A wider drift can alert me to a problem. I'll incorporate this into my stat collections and see what happens :)
 
OffGridInTheCity said:
Thank you all for the discussion. At least I don't think I'm missing anything obvious now :)

Also learned the shunt should be measuring things 'very accurately' - so the SoC should be stable with only minor drift. A wider drift can alert me to a problem. I'll incorporate this into my stat collections and see what happens :)

Super, hopefully it all pans out for you. I also learned a bunch on this thread.
The neat thing about having all of Batrium's data recorded in influx is that the data can be replayed and looked at in a specific timeframe.
So this is a graph of 7 days of the C/D/C cycle of my pack.
The top graph shows Voltage and SoC in comparison. The small bumps in the SoC trace at 100% and 0% are Batrium doing a fine-tune adjustment whenever the -1.5% or 101.5% thresholds are reached.

The bottom graph shows shunt current and the cumulative Ah charge and discharge. On 8/26 I was completely discharged for the evening; fortunately 8/27 proved to be a great day and I was able to charge back up to 100%, which from a "0%" SoC is about 150Ah.

image_iwurmh.jpg
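If anyone wants to rebuild those cumulative Ah traces outside of influx, the integration is straightforward. A sketch, assuming you've exported the shunt samples as (epoch_seconds, amps) pairs (the sample data here is made up):

```python
# Rebuild cumulative charge/discharge Ah from exported shunt samples.
# (epoch_seconds, amps) pairs; positive amps = charging.
samples = [(0, 20.0), (600, 22.0), (1200, -15.0), (1800, -14.0), (2400, 0.0)]

charge_ah = discharge_ah = 0.0
for (t0, amps), (t1, _) in zip(samples, samples[1:]):
    ah = amps * (t1 - t0) / 3600.0  # hold each sample until the next one
    if ah >= 0:
        charge_ah += ah
    else:
        discharge_ah -= ah

print(round(charge_ah, 2), round(discharge_ah, 2))  # 7.0 4.83
```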


Here is a graph showing all the cycles since I started recording with influx (8/15 was a hardware glitch). I'm sure looking at the correlation between max Ah charged and max Ah discharged over time would show whether a pack is degrading.

image_brnftc.jpg


Here is a trace with a 7-day filter from the start of recording that shows the correlation between charging Ah and discharging Ah. If the trace stays relatively even over time, esp. over a year, then the pack should be OK in my opinion.

image_hfreoi.jpg


Wolf
 
>If the trace stays relatively even over time esp.
Which brings up charge/discharge efficiency for healthy 18650 Lithium-ion cells. What are the current/general expectations - e.g. is it like a 0.5% loss between power in and power out? Does this metric meaningfully widen as cells degrade?

*I know I shouldn't ask such a technical question so imprecisely - but trying to ask... in practical terms... is it large enough to be visible in daily SoC Ah in / Ah out - or is it such a small fraction that it's basically hidden in the data noise?
 
^^^ Li-ion charge efficiency is very high - typically over 99% for healthy cells. But energy efficiency is lower due to IR losses.
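In formula terms, with toy numbers (a sketch, not measured data): coulombic efficiency compares Ah in/out, while energy efficiency also weighs in the voltage gap between charging and discharging:

```python
# Coulombic (Ah) vs energy (Wh) efficiency over one full cycle.
# Toy numbers: nearly the same Ah in/out, but charging sits at a
# higher average voltage than discharging because of IR losses.
ah_in,  v_chg = 100.0, 3.85  # average charge voltage per cell
ah_out, v_dis =  99.5, 3.75  # average discharge voltage per cell

coulombic = ah_out / ah_in                      # ~0.995
energy    = (ah_out * v_dis) / (ah_in * v_chg)  # ~0.969

print(round(coulombic, 3), round(energy, 3))
```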
 
OffGridInTheCity said:
>If the trace stays relatively even over time esp.
Which brings up charge/discharge efficiency for healthy 18650 Lithium-ion cells. What are the current/general expectations - e.g. is it like a 0.5% loss between power in and power out? Does this metric meaningfully widen as cells degrade?

*I know I shouldn't ask such a technical question so imprecisely - but trying to ask... in practical terms... is it large enough to be visible in daily SoC Ah in / Ah out - or is it such a small fraction that it's basically hidden in the data noise?

That is a good question.
I do not believe that you can get an accurate reading on a pack's degradation on a daily basis; the variables are just too (well) variable.
Over time I think it is possible though.

Looking at my charts for the limited time I have been recording, and using the 7-day filter to average things out, these are the numbers I come up with.

image_ymtnnd.jpg


A loss of about 23% between Charge and Discharge.
If I do a daily comparison and average it out, it comes to the same 23% loss.

image_vsypmk.jpg


I guess if my math is correct then I am running at about 77% efficiency.
Now if I run these numbers again in, let's say, 3 months and my efficiency stays the same, I would say I am OK; but if it drops to say 70% then I may be seeing a drop in the SOH of the pack. Unfortunately those numbers may be compromised by installing another pack to supplement my system, as I am building another 14s80p pack.
So how to tell if one pack is performing worse than another, I don't know.
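For reference, the math I'm doing is nothing fancier than the ratio of the cumulative totals (illustrative numbers, roughly matching my ~77% figure):

```python
# Round-trip Ah efficiency from cumulative charge/discharge totals.
# Illustrative numbers only, roughly matching the charts above.
ah_charged, ah_discharged = 130.0, 100.0

efficiency = ah_discharged / ah_charged
print(f"{efficiency:.0%} efficient, {1 - efficiency:.0%} loss")  # 77%, 23%
```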

Wolf
 
>I guess if my math is correct then I am running at about 77% efficiency.
You're saying your batteries have a 23% loss of efficiency between charge/discharge? Or what do you think makes up the 23% loss?

I track PV input kWh (the Midnite Classic Controller provides this number) and inverter output (cheap Amazon 120V meters on the output for this number), and based on that I get 86% efficiency (14% loss) when the inverter is reasonably loaded (e.g. no idle watts). AIMS' documented peak is 88% efficiency - so at least 12% of the 14% is AIMS.

That leaves 2%... and it could be that AIMS is not at peak, and other stuff like wiring / component heat accounts for 1%? That's what leads me to guess a <=1%-ish charge/discharge loss battery-wise in a single cycle.
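In other words, a loss budget something like this (my rough numbers; splitting the losses multiplicatively rather than by simple subtraction, so it comes out slightly over 2%):

```python
# Rough loss budget: where my measured 14% system loss might go.
pv_to_ac_eff = 0.86  # measured: PV kWh in vs inverter kWh out
inverter_eff = 0.88  # AIMS documented peak efficiency

unexplained = pv_to_ac_eff / inverter_eff  # what's left after the inverter
print(f"{1 - unexplained:.1%}")            # ~2.3% for wiring/heat/battery
```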
 
The batteries themselves should not have more than like 1% loss at most. Lithium is darn efficient :)

I'm at around 82-88% efficiency on my inverter. During low idle loads it's more like 60% efficient due to its self-consumption :D
 
It's not clear what the Batrium numbers mean (if anything), and probably the only way to infer that, given the lack of any docs, is to reverse engineer it. As for some well-specified Li-ion battery energy efficiency results, below are a couple of examples from here (2016) and here (2017) showing energy efficiencies of about 99% to 90%, and 98% to 85% (at C/10 to 2C rates). In the final graph note the close correspondence between energy losses calculated electrically vs. by heat (isothermal calorimetry).


image_cswnjx.jpg


image_ldyghp.jpg




image_nfhaoh.jpg
 
OffGridInTheCity said:
>I guess if my math is correct then I am running at about 77% efficiency.
You're saying your batteries have a 23% loss of efficiency between charge/discharge? Or what do you think makes up the 23% loss?

IDK, I am just looking at the average of the numbers, as in Ah in and Ah out. Am I correct with that calculation? Again, IDK.

I kinda think that (the loss) is pretty high also. Now my system is grid-tied, so my system charges my battery when the grid tie inverters are satisfied and the excess goes to the batteries. Some days I get a full charge (4.05V), others I don't.
At this point I do not have reliable array wattage, as I only record V and A. I'm sure I can spreadsheet the data and calculate the wattage, also comparing it to the grid tie output, for which I do get the wattage from my IoTaWatt. I suppose that is where I can calculate my final loss. Further investigation is required someday when I have more time and energy.

But as far as SOC% I think we got that part covered. :D

Wolf
 
System efficiency is different to battery efficiency.
The inverter losses are typically the biggest. For systems with AC-coupled batteries this is worse again due to the double conversion.

I'd suggest that yes, like Offgrid said, the different efficiency curves, e.g. the inverter's in particular, would introduce "noise" in the data, likely larger than the small amounts we're looking for in the battery/SoC drift measurements.
For SoC tracking aren't we looking a bit closer in, i.e. just tracking in/out of the batteries?
I.e. trying to confirm the SoC tracking & sources of drift for just the battery section of our systems?
 
daromer said:
The batteries themselves should not have more than like 1% loss at most. Lithium is darn efficient

But that's their charge (Ah) efficiency. It may be much lower for energy (Wh) efficiency, e.g. see the graphs in post #56.
 
There's this law:
https://en.wikipedia.org/wiki/Peukert's_law
But it's more complex for lithium - apparently mostly at higher-rate charge/discharge, e.g. power tool use (agreeing with the above & the wiki on that application).
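For reference, the classic form of the law, as a quick sketch (the exponent k here is purely illustrative; for lithium it's close to 1, which is why the effect is small at our rates):

```python
# Peukert's law: effective runtime shrinks at higher discharge current.
# t = H * (C / (I * H)) ** k  -- see the wiki link above.
def runtime_hours(capacity_ah, rated_hours, current_a, k):
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

C, H = 480.0, 20.0                      # 480Ah bank rated at the 20h rate
print(runtime_hours(C, H, 24.0, 1.05))  # 20.0h at exactly the rated current
print(runtime_hours(C, H, 96.0, 1.05))  # ~4.7h at 4x current (ideal: 5h)
```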

Those graphs are focussed on higher rates, e.g. if you take ~1C rates (way higher than most DIYers use) the "round trip" in the graphs for energy looks like about 0.97 (3%) charging & 0.98 (2%) discharging, & figure 6 suggests a whopping 9% @ 1C!
Since most of us run the batteries at low C charge/discharge rates, we would be operating in the lower % numbers. And some version of Peukert's law is still valid, as the heating effects would be small.
In practice it seems most of us will be getting round trips at well under 0.5C, so losses will be at the low end of the graphs. E.g. using ~500mA/cell with 2500mAh cells is 0.2C.
My system (LiFePO4) typically charges at 0.1C (~50A into a 480Ah system) & discharge varies a lot, but say average approx 700W, high approx 3kW, at 53V nominal, so avg ~0.03C, high ~0.13C.
That'd put my round trip around the 1%-ish mark.
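The C-rate arithmetic behind those numbers, for anyone checking (a quick sketch):

```python
# C-rate arithmetic for a 480Ah bank at 53V nominal.
BANK_AH, V_NOM = 480.0, 53.0

def c_rate_from_watts(watts):
    return (watts / V_NOM) / BANK_AH

print(round(c_rate_from_watts(700), 3))   # ~0.028C average discharge
print(round(c_rate_from_watts(3000), 3))  # ~0.118C peak discharge
print(round(50.0 / BANK_AH, 3))           # ~0.104C charging at 50A
```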
I know my Batrium shunt tracks the SoC pretty well & doesn't seem to drift much even when it's cloudy & not hitting the full/reset point.
I have not had to manually reset the SoC after the initial build "settle down" in >2 years of operation - however it does reset automatically on good solar days.
 