Passive BMS packs balance variance at lowest voltage
Hi again folks,

I have a question for those of you who have large off-grid installations and passive BMS units such as the Batrium. In real-world experience, what is the lowest voltage your packs actually get discharged to, and how well balanced do you find the packs to be at that voltage? What variance in voltage do you see between the series cell groups? I'm thinking of packs that are well designed, with matched resistance etc. Do they really stay close in voltage at that low voltage? If not, what is considered an acceptable variance?

Thanks as always ...

I maintain a 3.5v/cell bottom cut-off (49v for 14s). I do this primarily for long life, but I think 3.4v (47.6v for 14s) would still be pretty OK in terms of the discharge curve knee. Below 3.4v-3.5v there isn't a lot of energy left, as the discharge curve drops sharply.
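The arithmetic behind those pack-level numbers is just the per-cell setpoint times the series count; a minimal sketch (the function name is mine, the values are from this post):

```python
# Hypothetical helper: convert a per-cell cutoff voltage to a pack-level
# cutoff for an Ns configuration (e.g. 14s as discussed above).
def pack_cutoff(cell_voltage: float, series_count: int) -> float:
    """Pack-level voltage for a per-cell setpoint in a series string."""
    return round(cell_voltage * series_count, 2)

print(pack_cutoff(3.5, 14))  # 49.0
print(pack_cutoff(3.4, 14))  # 47.6
```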

I use Batrium, and while I could probably try for <0.02v max difference when I balance, I typically just go for <0.05v max difference because it's just not operationally important. But then I turn balancing off, e.g. no balancing for months at a time. My charge/discharge range is mild: limited to 4.0v/cell high and 3.5v/cell low, with an average high of 3.90v. In this range the max difference varies from 0.03v to 0.06v through the day's charge/discharge. I believe this up/down variance is because I have 5 different types of cells making up the packs, but they stay absolutely steady overall.
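The balance check described here boils down to comparing the max spread across cell groups against a tolerance; a hedged sketch, assuming voltages have been read out of the BMS (the function name and sample readings are illustrative, the 0.05 V tolerance is from the post):

```python
# Sketch of a max-difference balance check: report the spread between the
# highest and lowest cell-group voltage and whether it is within tolerance.
def balance_spread(voltages, tolerance=0.05):
    """Return (max spread in volts, True if within tolerance)."""
    spread = max(voltages) - min(voltages)
    return spread, spread <= tolerance

# Illustrative readings, not real data from this thread:
volts = [3.91, 3.89, 3.93, 3.90]
spread, ok = balance_spread(volts)
print(f"spread={spread:.3f} V, within tolerance: {ok}")  # spread=0.040 V, within tolerance: True
```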

Here's a typical picture of my 84 packs with no balancing since May 25, when I added a new battery and balanced to <0.05v max difference.

>Do they really stay close in voltage at that low voltage?
Healthy packs with very similar capacity will stay in balance.   I still don't understand why (as deeply as I'd like to know) - but they do!
I have my bottom at 2.9V for critical shutoff. At that voltage the spread is way off from the others, of course. My normal disconnect is at around 3.3V. Note: I don't cycle every day; only 3-4 times a year do I even get close to 3.3V.
I normally don't get below 3.75V.

The difference totally depends on how well balanced your packs are and how you measure: at load or at idle. At load I can easily see up to 0.3V difference at the bottom. At idle the normal difference is close to 0, nowhere near being a problem, as Offgridinthecity said.
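The at-load vs idle gap mentioned here is mostly I*R sag: each group's terminal voltage drops by current times its internal resistance, so groups with different effective resistance read apart under load even when their open-circuit voltages match. An illustrative sketch with made-up numbers (not measurements from this thread):

```python
# Terminal voltage under load: open-circuit voltage minus the I*R drop.
def terminal_voltage(ocv: float, current_a: float, internal_r_ohm: float) -> float:
    """Voltage seen at the terminals while discharging at current_a amps."""
    return ocv - current_a * internal_r_ohm

# Two groups at the same 3.40 V open-circuit voltage, 50 A discharge,
# but different effective resistance (hypothetical values):
print(round(terminal_voltage(3.40, 50, 0.001), 2))  # 3.35
print(round(terminal_voltage(3.40, 50, 0.004), 2))  # 3.2  -> 0.15 V apparent "imbalance"
```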

Current: 10kW Mpp Hybrid | 4kW PIP4048 | 2x PCM60x | 100kWh LiFePo4 | 20kWh 14s 18650 |  66*260W Poly | ABB S3 and S5 Trip breakers
Upcoming: 14S 18650~30kWh
This is really helpful. It confirms what I find as well ... I look forward to other folks responding to corroborate this.
I charge to about 4.05V and discharge to an average of 3.40V. Just peace of mind that even if several cells should fail, the voltages will stay well within safe limits, and the cells should last for significantly more cycles. Besides, there's not much energy to be gained below that anyway.

Out of my 6x 14s100p batteries, the best currently shows 0.076V variance at avg 3.40V, the worst pack 0.139V.
Whenever I do maintenance, I look at that variance to guide me to the problem packs. Usually, there's a bad cell or two, and replacing those improves the issue.
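The triage described above can be sketched as sorting batteries by their measured spread and flagging the outliers; the pack names and the 0.1 V flag threshold are illustrative (the 0.076 V / 0.139 V figures are from this post):

```python
# Sort batteries worst-first by cell-voltage spread to guide maintenance.
# Hypothetical pack labels; spreads for "A" and "B" are from the post.
packs = {"A": 0.076, "B": 0.139, "C": 0.052, "D": 0.101}

worst_first = sorted(packs.items(), key=lambda kv: kv[1], reverse=True)
flagged = [name for name, spread in worst_first if spread > 0.1]

print(worst_first[0])  # ('B', 0.139)
print(flagged)         # ['B', 'D']
```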

There's no point in obsessing over balancing second-life DIY powerwall batteries. There are plenty of better ways to spend my time, such as adding more capacity.
Modular PowerShelf using 3D printed packs.  60kWh and growing.
It also depends on your chemistry. For lithium there are two common families: LiFePO4 (LFP) and the regular lithium-ion types (NMC, LCO, etc., used in laptops, drills, etc.). There are others, like LTO, that are less common. They all have slightly different minimum and maximum voltages.

I use the regular types, so the operating range is typically 2.8v to 4.2v. My working range is from 4.05v (4.08v in winter) down to 3.57v. I try to keep a 10% reserve, and I see a steep drop in voltage past 3.5v.

My packs are made out of only two cell models, Samsungs and Panasonics. They came from the same source packs, and I was fortunate to get enough to make 20kWh worth. I weeded out any that had less than 90% of their original rated capacity.

I don't bottom- or top-balance, so my balance point is more or less in the middle. My deviations stay under 0.040v at the top and bottom. I use an active balancer, but it never gets triggered since my packs always stay below 0.040v.

Below is one winter with stretches of cloudy and snowy days where my reserves were depleted and dropped to 3.45v, at which point my PIP would automatically charge the batteries from AC until they reached the minimum voltage.

Regarding the voltage (or SOC) range you should use: it depends on how much you wish to optimize. If your pack is large enough that you can get away with shallow cycles, you can get up to 20 times the life (in equivalent full cycles) by centering those shallow cycles at 50% SOC; for example, see the study I cite here. This is how NASA manages to get extremely long battery life in space missions. This optimization is possible for many types of standby batteries in particular.
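Centering a shallow cycle at 50% SOC just means splitting the cycle depth evenly around the midpoint. Mapping SOC bounds back to voltage setpoints needs a chemistry-specific curve, so this sketch only computes the SOC window itself (the 20% depth is an example value, not from the study):

```python
# Compute the (low, high) SOC bounds for a shallow cycle of a given depth
# centered at a given state of charge (defaults: 50% center, 20% depth).
def soc_window(center: float = 0.50, depth: float = 0.20):
    """Return (low, high) SOC bounds for a cycle of the given depth."""
    return center - depth / 2, center + depth / 2

low, high = soc_window(0.50, 0.20)
print(round(low, 2), round(high, 2))  # 0.4 0.6
```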

Generally, a rule of thumb for maximizing Li-ion battery lifetime is to minimize the time the cells spend at extreme voltages and temperatures, since these (exponentially) accelerate internal degradation processes.
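A back-of-envelope for the "exponential" part: an Arrhenius-style factor where the degradation rate roughly doubles for every 10 °C above a reference temperature is a common rule of thumb. The doubling interval and reference temperature here are assumptions for illustration, not figures from this thread:

```python
# Rough Arrhenius-style rule of thumb: degradation rate relative to a
# reference temperature, doubling every `doubling_c` degrees (assumed 10).
def relative_degradation_rate(temp_c: float, ref_c: float = 25.0,
                              doubling_c: float = 10.0) -> float:
    """Degradation rate as a multiple of the rate at ref_c."""
    return 2 ** ((temp_c - ref_c) / doubling_c)

print(relative_degradation_rate(35))  # 2.0  (10 C hotter -> ~2x faster aging)
print(relative_degradation_rate(45))  # 4.0
```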
