SoC% false accuracy? State of Charge

Yeah, the efficiency for charging/discharging does depend on how hard you push it. For most of us, though, the losses are very low unless you have a very small battery bank, small solar, and then a heavy load...
 
Redpacket said:
[...] Those graphs are focussed on higher rates, eg if you take ~1C rates (way higher than most DIYers use) the "round trip" in the graphs for energy looks like about .97 (2%) charging & .98 (3%) discharging & figure 6 suggests a whopping 9% @1C!
Since most of us run the batteries at low C charge/discharge rates, we would be operating in the lower % numbers.

The graphs are not "focused on higher rates" since they show both low and "high" (2C) rates - which was the point of posting them, i.e. to emphasize how the energy efficiency drops off at higher rates. Many folks confuse energy and charge efficiency and so end up overlooking that Li-ion energy losses are nontrivial except at low rates.

As the examples show, for healthy cells, at a low C/10 rate we can get round trip efficiency in the range 98% - 99%, but at a higher C/2 rate it drops to 95% - 97%, and even lower at 1C: 91% to 95%. Of course these energy losses will increase considerably as the cell ages and its IR rises (a point which needs to be considered during thermal design of any Li-ion powered device).
 
So I did a data dump from Batrium from its inception on 5/11/2020 to now, and am relieved to see that my previous calculation of battery efficiency, as far as Ah in and out goes, was way off.

My TotalCumulAhCharge was 10656.519
My TotalCumulAhDischg was 10572.424
which gives me a 0.79% loss
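(For anyone who wants to check, the figure follows from a one-line calculation; the sketch below just reproduces the numbers quoted above.)

```python
# Round-trip Ah loss from the cumulative Batrium counters quoted above
ah_charge = 10656.519     # TotalCumulAhCharge
ah_discharge = 10572.424  # TotalCumulAhDischg

loss_pct = (ah_charge - ah_discharge) / ah_charge * 100
print(f"Round-trip Ah loss: {loss_pct:.2f}%")  # -> Round-trip Ah loss: 0.79%
```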

Much better :D

Now to figure out the efficiency of the rest of the system.

Wolf
 
Sounds better :) I think I'm roughly in that range as well. I'm a tad higher than you due to my slightly older LiFePO4 bank that has some years of use.
 
Wolf said:
So I did a data dump from Batrium from its inception on 5/11/2020 to now, and am relieved to see that my previous calculation of battery efficiency, as far as Ah in and out goes, was way off.

My TotalCumulAhCharge was 10656.519
My TotalCumulAhDischg was 10572.424
which gives me a 0.79% loss

Round trip charge (Ah) losses will normally be less than 1% for any healthy cells, so that's not unexpected.

However, I highly doubt that the numbers reported by Batrium have much to do with actual charge loss, because to measure that requires very precise equipment, and also very careful control over the charge and discharge process - all of which is lacking in typical DIY BMS units like the Batrium.

If anyone ever figures out precisely what the Batrium numbers really mean then please do elaborate.
 
gauss163 said:
Redpacket said:
[...] Those graphs are focussed on higher rates, eg if you take ~1C rates (way higher than most DIYers use) the "round trip" in the graphs for energy looks like about .97 (2%) charging & .98 (3%) discharging & figure 6 suggests a whopping 9% @1C!
Since most of us run the batteries at low C charge/discharge rates, we would be operating in the lower % numbers.

The graphs are not "focused on higher rates" since they show both low and "high" (2C) rates - which was the point of posting them, i.e. to emphasize how the energy efficiency drops off at higher rates. Many folks confuse energy and charge efficiency and so end up overlooking that Li-ion energy losses are nontrivial except at low rates.

As the examples show, for healthy cells, at a low C/10 rate we can get round trip efficiency in the range 98% - 99%, but at a higher C/2 rate it drops to 95% - 97%, and even lower at 1C: 91% to 95%. Of course these energy losses will increase considerably as the cell ages and its IR rises (a point which needs to be considered during thermal design of any Li-ion powered device).

As clearly confirmed by several people's posts here, we're operating our systems in the 1% area.
The graphs are clearly covering a wide range of charge/discharge rates & we're operating at the low end of them.
Don't see what the problem is.
Good news is losses are small.

Also not sure why you keep posting disparaging comments about Batrium gear.
Batrium gear does seem to do a good job as designed & is well regarded here.
I understand that if you don't run a system yourself it might be harder to see all the angles, and I get that some documentation could be improved.
But overall, IMHO, it's pretty much the best gear of its type for practical live systems in the field vs laboratory level gear (don't get me wrong, lab gear is great - but not practical in the field & costs $$$$).
 
Redpacket said:
As clearly confirmed by several peoples posts here, we're operating our systems in the 1% area.
[...] Good news is losses are small.

That's merely a guess without any clear definitions of what the Batrium numbers mean and how they are computed, how accurate they are, etc. (none of which is documented).

As I emphasized above, precisely measuring charge/energy losses requires precise equipment and careful control of (dis)charge processes. This is not something a consumer-level BMS is designed for (the cumulative errors in such a BMS typically already exceed 1%).
 
gauss163 said:
Redpacket said:
As clearly confirmed by several peoples posts here, we're operating our systems in the 1% area.
[...] Good news is losses are small.

That's merely a guess without any clear definitions of what the Batrium numbers mean and how they are computed, how accurate they are, etc. (none of which is documented).

As I emphasized above, precisely measuring charge/energy losses requires precise equipment and careful control of (dis)charge processes. This is not something a consumer-level BMS is designed for (the cumulative errors in such a BMS typically already exceed 1%).
gauss163 said:
^^^ Li-ion charge efficiency is very high - typically over 99% for healthy cells. But energy efficiency is lower due to IR losses.

On one hand you're agreeing efficiency is high & then you're telling us we're guessing when our measurements confirm that?
As Daromer, myself, and others have said, typical charge & discharge rates in our systems are in the high-efficiency corner of the graphs you posted.
The important point is the battery efficiency is good.

Sorry you don't seem to understand some of the Batrium functionality.
It is designed to reset cumulative errors when full charge is detected.
We're using it & it works.
It does not need to be super accurate, just functionally accurate enough for the purpose designed.
No one is saying it will be as accurate as lab gear. But then it costs way less than lab gear.
It's the practical vs academic thing again.
Enough already?
 
Redpacket said:
On one hand you're agreeing efficiency is high & then you're telling us we're guessing when our measurements confirm that?

You can't possibly confirm anything with numbers whose meaning is not properly specified. Since you don't seem to be able to comprehend this crucial point I don't think it is constructive to continue.

For readers interested in learning more about how charge (coulombic) efficiency (CE) is measured I have appended below an excerpt from a 2017 paper.

Wilhelm et al. said:
The changes in CE due to degradation processes through parasitic reactions are very small. Thus, CE studies of these processes require precisely set currents, high precision voltage measurements as well as strictly controlled sample environments [14].

The high sensitivity of these measurements under controlled sample conditions led to the discovery of a surprising reversible capacity effect. Studies found anomalous transient CEs with CE > 1 in high precision cycling experiments after storage [15,16]. This behavior was linked to anode overhang, acting as a lithium-ion source or sink depending on the charging or discharging scenario [16]. Anode overhang means that areas of the anode active material do not have a cathode counterpart. This results from the negative electrode in a lithium-ion cell being designed slightly larger than the positive electrode. This is a common design feature to assure 100% cathode-anode overlap and to avoid lithium plating at the border area of the graphite anode [17].
[...]
Cycling was performed with a BaSyTec CTS battery test system. All tests were conducted at a controlled temperature of 25 ± 0.1 °C using a BINDER KT170 climate chamber. For the coulombic efficiency measurements, additional 20 mΩ shunts were placed in the current path between the battery test system and the battery cells. The voltage drop across each shunt was measured with an ultra-low noise, 24-bit sigma-delta analog-digital converter from Analog Devices (AD7193). This enabled the measurement of the currents flowing into and out of each battery cell with a higher resolution and precision than the battery test system.

Recall that the crucial safety role of the anode overhang area was also discussed in the video I linked recently on the Samsung Galaxy Note 7 battery failures.
 
Key point: Practical vs academic. Real world vs laboratory.
We're building & operating systems & reading the numbers from the gear & it's working.
Of course it's laboratory work > testing > manufacturing > real world use of products.

gauss163 said:
Redpacket said:
On one hand you're agreeing efficiency is high & then you're telling us we're guessing when our measurements confirm that?

You can't possibly confirm anything with numbers whose meaning is not properly specified. Since you don't seem to be able to comprehend this crucial point I don't think it is constructive to continue.

Comprehension issues? Really? We're busy using our systems & the numbers make sense here just fine thanks.
You keep saying you don't understand Batrium. Build a system & use it.

gauss163 said:
For readers interested in learning more about how charge (coulombic) efficiency (CE) is measured I have appended below an excerpt from a 2017 paper.

Wilhelm et al. said:
...... research stuff ....

Not achievable in a practical environment.
In real world use, currents & temperatures are moving around all the time, eg weather night to day, clouds during solar charging, loads on/off & plenty of other "normal" occurrences.

gauss163 said:
Recall that the crucial safety role of the anode overhang area was also discussed in the video I linked recently on the Samsung Galaxy Note 7 battery failures.

"...anode overhang..."? What? Don't even know why you posted this - we have no control over cell construction?
Even if there is some link to coulomb counting, chargers pretty universally use voltage level (& current tail off in better ones).
Coulomb counting isn't typically (ever?) used for charger control.
 
This has been a very interesting thread. I hope it won't derail in a discussion over error bars on measurements.

I have learned that a Batrium BMS tracks the charge of a battery by measuring the current through a shunt over time while charging or discharging.
It is not known how accurate a single measurement of the charge is, because the tolerances on the shunt and the measurement circuitry on the BMS are unknown. The effect of environment temperature on the measurement is also unknown.
However, it seems like the achieved accuracy is good enough to estimate if a battery at a given time has enough charge left to boil water for a cup of tea.

My estimation is that when numbers of many charge cycles and discharge cycles are added and then compared, the inaccuracies in the measurements are averaged out to the point where we can draw conclusions about how efficient a certain DIY battery is.
If the errors in the measurements were biased, then it would be noticeable to the point where people would say the SoC measurement is drifting or recalibrating too often.
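For readers curious what shunt-based coulomb counting with a full-charge reset looks like in principle, here is a minimal sketch. The class and method names are hypothetical, and Batrium's actual implementation is not documented; this is only an illustration of the scheme described above.

```python
class CoulombCounter:
    """Toy model of shunt-based SoC tracking with full-charge recalibration."""

    def __init__(self, capacity_ah: float, soc_init: float = 0.5):
        self.capacity_ah = capacity_ah
        self.charge_ah = soc_init * capacity_ah

    def update(self, current_a: float, dt_s: float) -> None:
        """Integrate shunt current over one sample: positive = charging."""
        self.charge_ah += current_a * dt_s / 3600.0
        # Clamp to physical limits; a real tracker would also apply
        # temperature and efficiency corrections here.
        self.charge_ah = min(max(self.charge_ah, 0.0), self.capacity_ah)

    def reset_full(self) -> None:
        """Recalibrate when a full-charge condition (e.g. CV tail current) is seen."""
        self.charge_ah = self.capacity_ah

    @property
    def soc_pct(self) -> float:
        return 100.0 * self.charge_ah / self.capacity_ah


cc = CoulombCounter(capacity_ah=1560.0, soc_init=0.3)
cc.update(current_a=100.0, dt_s=3600.0)  # one hour at 100 A adds 100 Ah
print(f"SoC: {cc.soc_pct:.1f}%")  # -> SoC: 36.4%
```

Any fixed bias in the current measurement accumulates between resets, which is exactly why the full-charge recalibration matters and why cumulative Ah totals should be read with that in mind.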
 
Redpacket said:
Key point: Practical vs academic. Real world vs laboratory. [...] Not achievable in a practical environment. In real world use, currents & temperatures are moving around all the time

Charge/energy efficiency is most certainly a very practical SOH metric, and useful results may be obtainable even with minor variations in temperature.

Since you seem to have once again missed the point I will elaborate. Let's start with an analogy that is likely more accessible to most readers. Anyone who has used the Opus BT-C3100 analyzing charger likely knows well that its capacity numbers are often way off the mark.

In the early (firmware) days some users had no interest in getting "under the hood" to understand how accurate these (capacity) numbers were. They put blind faith in these meaningless numbers and built packs from them, etc. Later we discovered that the errors can be quite large (exceeding 30%) which means that the Opus often misclassified bad cells as good, meaning that many packs were built with bad cells, and balancing was often far from what Opus wrongly implied it should be. As a result many users lost much work, packs had far less life, and possibly it even led to safety issues.

Some of these problems were fixed in later versions (e.g. by reducing pulse amplitude so it no longer overloaded the power supply), etc. But even with these patches the Opus is still less accurate than many other chargers because its design (PWM CC/CV) makes it more difficult to accurately measure various parameters. Even if you are so "practical" that you have no interest in under-the-hood understanding of the technical matters behind these limitations, it is still important to understand the ramifications they have when using the reported numbers when building packs etc.

There are analogous issues when measuring charge/energy efficiency, and the challenges are even greater since they require much higher precision. For example consider the following graph from a Nature Energy 2020 paper


[Graph: cycles to 80% capacity retention vs. coulombic efficiency]


Note that at 80% capacity retention (20% wear) a CE of 99.00, 99.70, 99.90, 99.98% yields 22, 74, 223, 1115 cycles resp. [99.80 is a typo for 99.70 in the graph]. In particular a tiny 0.08% decrease in CE yields a huge decrease in cycles, from 1115 to 223. So CE needs to be measured very precisely in order to deduce accurate SOH information. Without such one may suffer the same problems as in the Opus analogy mentioned above. In fact very high precision CE measurements are used by Tesla's battery guru Jeff Dahn in order to empirically deduce chemistry improvements without having to do expensive lengthy cycling tests.
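Those cycle counts can be checked against the simple compounding model, where capacity after n cycles is CE^n, so cycles to 80% retention ≈ ln(0.8)/ln(CE). Assuming the graph uses exactly that idealization (the numbers suggest it does), a quick sketch:

```python
import math

def cycles_to_80_percent(ce: float) -> int:
    """Cycles until capacity falls to 80%, if each cycle retains fraction `ce`."""
    return int(math.log(0.8) / math.log(ce))

for ce in (0.9900, 0.9970, 0.9990, 0.9998):
    print(f"CE = {ce:.2%} -> {cycles_to_80_percent(ce)} cycles")
# -> 22, 74, 223, and 1115 cycles respectively
```

This also makes the sensitivity obvious: near CE = 1 the cycle count goes like 1/(1 - CE), so tiny CE changes swing the result enormously.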

Redpacket said:
"...anode overhang..."? What? Don't even know why you posted this - we have no control over cell construction?

The point of posting that excerpt was to give readers some hint of the high precision needed for such measurements. Note that they had to augment the high-end BaSyTec battery tester system with an ultra low noise 24b ADC in order to get the needed resolution. So one shouldn't just assume that some BMS already has sufficient resolution for such, along with sufficiently precise control over the (dis)charge process, etc.

Further, the point of including the remarks about the anode overhang leading to CE > 1 is that you will likely encounter this if you perform CE tests, and knowing about that strangeness means you won't waste much time attempting to troubleshoot your methods when there is in fact no error (this puzzled many folks for a long time until the real reason was eventually discovered). This is very useful to know for anyone who is interested in practical CE measurements.

If you stopped your frequent ridiculous science bashing for only a moment you might be able to learn that there are methods of practical use here - just as there is for IR SOH metrics.
 
gauss163 said:
Redpacket said:
Key point: Practical vs academic. Real world vs laboratory. [...] Not achievable in a practical environment. In real world use, currents & temperatures are moving around all the time

Charge/energy efficiency is most certainly a very practical SOH metric, and useful results may be obtainable even with minor variations in temperature.

Since you seem to have once again missed the point I will elaborate. Let's start with an analogy that is likely more accessible to most readers. Anyone who has used the Opus BT-C3100 analyzing charger likely knows well that its capacity numbers are often way off the mark.

.......
Note that at 80% capacity retention (20% wear) a CE of 99.00, 99.70, 99.90, 99.98% yields 22, 74, 223, 1115 cycles resp. [99.80 is a typo for 99.70 in the graph]. In particular a tiny 0.08% decrease in CE yields a huge decrease in cycles, from 1115 to 223. So CE needs to be measured very precisely in order to deduce accurate SOH information. Without such one may suffer the same problems as in the Opus case mentioned above. In fact very high precision CE measurements are used by Tesla's battery guru Jeff Dahn in order to empirically deduce chemistry improvements without having to do expensive lengthy cycling tests.

The point of posting that excerpt was to give readers some hint of the high precision needed for such measurements. Note that they had to augment the BaSyTec battery tester system with an ultra low noise 24b ADC in order to get the needed resolution. So one shouldn't just assume that some BMS already has sufficient resolution for such, along with sufficiently precise control over the (dis)charge process, etc.

Further, the point of including the remarks about the anode overhang leading to CE > 1 is that you will likely encounter this if you perform CE tests, and knowing about that strangeness means you won't waste much time attempting to troubleshoot your methods when there is in fact no error (this puzzled many folks for a long time until the real reason was eventually discovered). This is very useful to know for anyone who is interested in practical CE measurements.

If you stopped your frequent ridiculous science bashing for only a moment you might be able to learn that there are methods of practical use here - just as there is for IR SOH metrics.

To be clear, I am all for science; it delivers many of the things we use & know today. We know many things as a result of detailed, careful, research-level academic work.
Like I said above (which you seem to have missed):
Redpacket said:
Of course it's laboratory work > testing > manufacturing > real world use of products.

As you have been told bluntly before, this is a DIY forum.
You need to keep it practical & concise; instead you just spout more academic stuff, some of which is irrelevant and not achievable in practical use.
DIYers are not going to get laboratory gear or have stable test environments to measure to that precision.
What we need to know is what works in the field.
And also what are the limitations of a given device or method, etc.

gauss163 said:
... Anyone who has used the Opus BT-C3100 analyzing charger likely knows well that its capacity numbers are often way off the mark.
....
That Opus charger is known to have issues. We all know it's an inexpensive device.
Yes by detailed work, its limitations have been found and we know what to watch for.
Market feedback has led to product improvements, so that's a good thing!

We were discussing the Batrium system's accuracy.
What we were trying to flush out of the noise is "are there actually problems with its coulomb counting SoC measurements or not?"
Apparently there are not.

At the levels of measurement & science we have in the field:
- Batrium's SoC metering with full charge reset calibration seems to be working well.
- We appear to be operating in the best corner of the cell efficiency & longevity curves.

Any points you'd like to make about "anode overhang" need to be seriously more concise.
Perhaps you could summarize into one sentence what the influence might be for us in real world symptoms?
What you've posted so far is buried & useless to most readers.

Can you please make a serious effort to get practical & say how something can be done & concisely?

Here's a question which you might be able to dig into for us:
How is the SoC measurement done in EV's eg Tesla's, Leafs, etc in comparison and is there something usefully different there?
Similarly, how is the SoC measurement done in large commercial battery packs eg Tesla Powerwall's, LG Chem systems & others and is there something usefully different there?
 
^^^ Since you are clearly not carefully reading what was written, there will be no further replies from me to your strange comments - which have very little to do with what I said. Best of luck learning more about battery electrochemistry.
 
After 24 edits, not sure what I'm going to read when coming back to your posts.

I asked you for a concise answer. Pretty simple.
What effects might we see from "anode overhang"?

I suggested two related questions which you might have some knowledge of (not having a shot here, just trying to get answers & related info):
How is the SoC measurement done in EV's eg Tesla's, Leafs, etc in comparison and is there something usefully different there?
Similarly, how is the SoC measurement done in large commercial battery packs eg Tesla Powerwall's, LG Chem systems & others and is there something usefully different there?
Having some discussion about how these EVs & ESS systems do SoC would be interesting.
 
Gauss, this is a DIY forum. Perhaps it's time to write your replies so they work with that, instead of writing in such a way that only people who work with batteries professionally understand?
 
^^^ That "this is a DIY forum" does not imply that no one here is interested in getting under-the-hood and learning more about battery electrochemistry. Nor does it imply that doing so has no practical implications.
 
Didn't say that. Please read my post again.
 
As you'll recall, I operate in the middle voltage range of a 14s battery bank and Batrium doesn't have a way to 'reset' SoC when one does this. This makes SoC creep up each day... and so it's not accurate / useful for me - but I decided to record things explicitly and see what's happening.

I simply recorded the SoC at the same (resting) voltage at 6:00am each morning. This is a 1560 Ah battery bank @ 14s lithium-ion.

Started at 8/28 @ 6:00am, 50.1 V, and set SoC = 29.3% explicitly.
On 9/1 @ 6:00am, 50.1 V, the SoC had risen to 33.4%.

SoC went up by 4.1% (64 Ah higher out of the 1560 Ah battery bank). During this period, 125 kWh were drawn from the battery bank.

That's 64 Ah / 125 kWh = 0.512 Ah/kWh 'gain in SoC'.

**I presume the gain is due to more Ah going in (charge) than Ah going out (discharge) - so this delta may be a measure of charge/discharge efficiency. Not quite sure how to express the efficiency. How about this logic...

A kWh at 52 V is 19.2 Ah. I got 0.5 Ah above, so 0.5 Ah / 19.2 Ah = 2.6%. So I have a 2.6% efficiency loss?

Seems high, but maybe that's because it reflects the overall system instead of just the 18650s in isolation.
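Spelling that arithmetic out (figures from the post above; whether the SoC creep really measures charge/discharge efficiency is, as noted, a presumption):

```python
bank_ah = 1560.0                              # 14s bank capacity
surplus_ah = bank_ah * (33.4 - 29.3) / 100.0  # ~64 Ah of SoC creep
kwh_drawn = 125.0                             # energy drawn over the period

gain_per_kwh = surplus_ah / kwh_drawn         # ~0.512 Ah/kWh
ah_per_kwh_at_52v = 1000.0 / 52.0             # ~19.2 Ah in one kWh at 52 V
loss_pct = gain_per_kwh / ah_per_kwh_at_52v * 100.0

print(f"{surplus_ah:.0f} Ah creep -> {loss_pct:.1f}% apparent loss")
# Gives ~2.7%; the 2.6% above comes from rounding 0.512 down to 0.5 Ah/kWh.
```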
 
Why are you mixing Ah and Wh? Ah needs voltage to know the actual energy. Calculate losses with Wh only instead: just compare Wh charged vs discharged at a known testing voltage. Preferably above 3.9 V or below 3.3 V per cell when calculating, to have a smaller error margin ;)
 