When things go wrong... it can go really wrong!

daromer

Moderator
Staff member
Joined
Oct 8, 2016
Messages
5,463
So this now gets tricky as you see too :p You either need high-accuracy temp sensors or you need to move the air slowly, so it's a decision between slow and fast response. We don't know which one is better here either, since there aren't many studies on it. But doing 0.1°C is not an issue, and if you add some logic you can vary the flow as well.

When one cell goes bad you may have 0.5W or even 2W dissipated from that cell, or even less. Now it's time to do some calculation to see how much that 0.5W raises the air temperature :) As long as the packs are properly arranged, it's easily noticeable at 0.1°C resolution over a given time and volume of air :)
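A quick back-of-the-envelope sketch of that calculation in Python. The flow rates are made-up examples, and the formula ΔT = P / (ṁ·cp) assumes all the waste heat ends up in the airstream:

```python
# Rough sanity check: steady-state temperature rise of an airstream
# absorbing a cell's waste heat. Flow values are illustrative only.

RHO_AIR = 1.2    # kg/m^3, air density at ~20 C
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def air_delta_t(power_w: float, flow_m3_per_h: float) -> float:
    """Temperature rise of air carrying away power_w watts at the given flow."""
    mass_flow = RHO_AIR * flow_m3_per_h / 3600.0  # kg/s
    return power_w / (mass_flow * CP_AIR)

if __name__ == "__main__":
    for flow in (1.0, 5.0, 20.0):  # m^3/h, slow to fast
        print(f"0.5 W into {flow:4.1f} m^3/h -> dT = {air_delta_t(0.5, flow):.3f} C")
```

At a slow 1 m³/h the rise is around 1.5°C, easily seen at 0.1°C resolution; at 20 m³/h it drops below 0.1°C, which is the slow-vs-fast trade-off described above.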

Enough rambling, but it's doable. The above is generally used in high-current applications where you need to monitor many cells while pushing them hard. It doesn't replace sensors in the cells, but the combination does help.
 

LithiumSolar

Administrator
Staff member
Joined
Oct 8, 2016
Messages
2,385
This gets my brain going :)
After some time in operation you would get a baseline of the temperature delta of a normal day.
In - Out vs ambient. Throw humidity in there too?

But depending on volume being measured the rise from 1 bad cell may not be measurable at bank level.
Maybe at pack level?

You would need very accurate and well-calibrated temperature sensors for this to work correctly, I think.
 

Korishan

Moderator
Staff member
Joined
Jan 7, 2017
Messages
6,352
Not really, because what you are looking for is a deviation from normal. If under charge a particular sensor only rises 2°C above ambient, and under heavy discharge rises to 4°C, you can graph that over time.
Then one week the scale shifts to 3°C during charge and 8°C under discharge, and that sensor location becomes suspect.

So basically you would monitor the sensor change over time, aka DeltaTemp.

The baseline for a "good" run could be calculated fairly easily, as the only time there'd be a huge difference is when there's an issue going on.
The graph could be tied to current load, charge or discharge, so both stats could be monitored closely. If I were to do it, I would also monitor the coulomb difference as well. That would help narrow down a self-discharger too.
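The monitoring idea above could be sketched roughly like this. This is my own illustrative code, not any real BMS API; the sensor names, the 2°C flag margin and the EWMA baseline are all assumptions:

```python
# Sketch: keep a running baseline of each sensor's rise above ambient per
# operating mode (charge/discharge), and flag a sensor whose current rise
# deviates from its own history.

from collections import defaultdict

class DeltaTempWatch:
    def __init__(self, alpha=0.05, flag_margin=2.0):
        self.alpha = alpha                 # EWMA smoothing factor
        self.flag_margin = flag_margin     # degrees C above baseline before flagging
        self.baseline = defaultdict(dict)  # sensor -> mode -> EWMA delta-T

    def update(self, sensor: str, mode: str, temp: float, ambient: float) -> bool:
        """Record a reading; return True if the sensor looks suspect."""
        delta = temp - ambient
        modes = self.baseline[sensor]
        if mode not in modes:
            modes[mode] = delta            # first sample seeds the baseline
            return False
        suspect = delta > modes[mode] + self.flag_margin
        # Only fold normal readings into the baseline, so a slowly
        # developing heater can't drag its own baseline upward with it.
        if not suspect:
            modes[mode] = (1 - self.alpha) * modes[mode] + self.alpha * delta
        return suspect
```

For example, a sensor that normally sits 2°C above ambient during charge would be flagged the first time it reads more than ~4°C above ambient in that mode.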
 

daromer

Moderator
Staff member
Joined
Oct 8, 2016
Messages
5,463
There is no issue at all finding temp sensors that will work for this. (Shouldn't say no issue; it's a challenge, but highly doable even for DIY projects.)

Korishan, the problem is not deviation within one sensor; the challenge is that you need to compare between sensors, and they drift relative to each other. There are ways to handle this as well. Also note that we are not talking about huge differences. We are talking about very small ones, and there is a threshold below which it's jitter and noise rather than actual data :)

But just start with a decent digital sensor with 0.1°C resolution and a life of 10+ years and you are set. They aren't costly. The sensor is not the tricky part here; it's how and when to measure. You need some decent logic :)
 

completelycharged

Well-known member
Joined
Mar 7, 2018
Messages
1,046
Aren't temperature sensors a more complex approach to this, when you all have BMS systems plugged in?

Say an individual cell has early (heater) losses of 0.05A (0.18W nominal); over two days that equates to losing a full cell's worth of energy from the system (8.6Wh). Within the overall pack this may not stand out on any one day, but over a week it would be a glaring indicator that something is up. How quickly does cell failure really occur: suddenly, in the space of minutes, or over hours and days of progressive deterioration?
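The arithmetic behind those numbers, assuming a 3.6V nominal cell voltage:

```python
# Sanity-check the figures above (3.6 V nominal is an assumption).
leak_current = 0.05                 # A, hypothetical early heater loss
nominal_v = 3.6                     # V
power = leak_current * nominal_v    # W
wh_two_days = power * 48            # Wh lost over 48 hours
print(f"{power:.2f} W, {wh_two_days:.2f} Wh over two days")
```

0.05A × 3.6V = 0.18W, and 0.18W × 48h = 8.64Wh, roughly one full 18650 cell's worth of energy.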

Heating needs a voltage/energy loss to occur and show up.

Having automated (hidden) balancing is great but the loss of information is absolute as there are no readings as to real Wh needed to balance and over time how much energy a given parallel set of cells has lost (Joules self discharge).

Also, as the cell gets closer to thermal runaway, the losses would increase significantly and show up in a pack imbalance even more quickly. The whole issue is how accurately the BMS can actually measure and detect voltage. High-accuracy measurement is what Jeff Dahn's and Tesla's cell development is all about.

Cascade failures tend to need either a slow heater, allowing heating of surrounding cells and progressive degradation over time, OR a tightly packed set of cells: google "NSWCCD-63-TR-2020/01" Lithium battery thermal runaway propagation

BMS systems today seem very dumb when it comes to real monitoring and analysis of the underlying battery pack's health and actual status.
 

nz_lifer

Member
Joined
Aug 26, 2020
Messages
67
Cascade failures tend to need either a slow heater, allowing heating of surrounding cells and progressive degradation over time, OR a tightly packed set of cells: google "NSWCCD-63-TR-2020/01" Lithium battery thermal runaway propagation

A good read :)
Bet they had fun doing those experiments and gained a newfound respect for all the lithium-powered devices everyone has nearby.

Solution right here folks ;)
"Trigger cell was ejected from test pack preventing cell-to-cell propagation"

 

daromer

Moderator
Staff member
Joined
Oct 8, 2016
Messages
5,463
completelycharged, good thoughts. No, temp sensing isn't that complex. But it's important to look at:
1. what today's systems are built for
2. and the use case

You can't compare a cheap ebike BMS with 2 temp sensors for a small 50-cell pack to a high-end system meant for new cells. We are dealing with something outside the normal data range, which makes it a tad more complex, but still manageable.

How fast this occurs, no one knows right now. There isn't enough data about it, since not many use second-hand cells like we do. Do note that the tests out there are often on only 20 or 100 cells, and on new cells at that.

A good BMS monitors temperature, but most of them are not good enough for what I'm talking about, nor do they have the logic :)
A BMS today does what it is told to do:
* Trigger on voltage failures
* Trigger on current failures
* Trigger on temperature failures

Then it's up to you to set the criteria. What I'm talking about is nothing that exists on the market we deal with here. As far as I'm aware, it has only been used on high-current setups, and on some setups where it's not feasible to measure with normal temperature probes due to how the pack is designed :)
 

Redpacket

Well-known member
Joined
Feb 28, 2018
Messages
1,247
It would also be good to understand the cell sorting process the builder used here.
Did they test for low volts &/or high IR > bin before building the packs?
Pete, did the guy share any of that?
 

Wolf

Well-known member
Joined
Sep 25, 2018
Messages
1,320
Did they test for low volts &/or high IR > bin before building the packs?

Personally I think Redpacket's post should be nominated for the "Post of the Year Award"
Sometimes it takes a wakeup call like the one being discussed to re-evaluate how we build our batteries.
A Li-Ion cell, no matter the form factor, is an energy storage device. Think of it as a small gas tank (petrol tank for the non-US).
A gas tank stores gasoline which is a form of liquid energy. In its liquid state it is dangerous but also quite manageable as long as you follow certain guidelines for containment and safety.
The same goes for a Li-Ion battery. In its pristine state it is a very efficient energy storage device, happily accepting electrons to store and releasing them on demand. It will do this give and take safely if certain rules and guidelines are followed. No question, we see it everywhere in all kinds of appliances. The watches on our wrists have a mini gas tank (a Li-Ion button cell) in them, and I certainly don't carry a fire extinguisher in my pocket in case my watch catches fire. I know this is an extreme example, but it proves the point: Li-Ion battery building is not a "just throw it together and it will work" exercise. Careful planning and, most importantly, proper cell analysis are paramount.
I am not saying everyone should do this but I keep a record of every cell that goes into my battery. I know the initial voltage, the initial AC IR, the mAh results from a capacity test, the voltage and IR after the capacity test, the voltage and IR of the cell after a >25 day SD check and the voltage and IR of the cell as it goes into the final build sometimes a couple of months later. Any cell in the final check that doesn't pass these tests does not make it into my powerwall battery.
Overkill you say?
Maybe, but I can tell you one thing: I sleep well knowing the condition of my cells. Also, if I ever have a catastrophic failure, I should be able to determine which cell(s) caused the problem.

I also agree with monitoring your battery and its packs very closely. I think Batrium does a very good job at this, but as far as visualization over time goes, the reporting is lacking. Not that the data isn't stored; it just takes a bit of finagling to get it into a visual format for easy analysis.
Weather and sun have been somewhat dismal in the US NE, but here is a chart of my battery and its packs for the last 7 days. Can you pull that up in a Batrium interface? No, but it can be collected via UDP, Node-RED, InfluxDB and Grafana. If need be, I can go back to the beginning of the battery installation and extract that data in visual format. If there were a pack within this battery with an SD/heater in it, I think it would be obvious, especially at night when the battery gets drawn down to cutoff voltage: I would see a large voltage drop over the evening stretch rather than a nice even correlation.
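The night-time check described here could be sketched like this (pack names and millivolt figures are invented for illustration, not my actual data):

```python
# Sketch: compare each pack's overnight voltage drop against the group.
# A pack with an SD/heater cell should sag noticeably more than its siblings.

from statistics import median

def find_suspect_packs(overnight_drop_mv: dict, margin_mv: float = 30.0) -> list:
    """Return packs whose overnight drop exceeds the group median by margin_mv."""
    med = median(overnight_drop_mv.values())
    return [p for p, drop in overnight_drop_mv.items() if drop - med > margin_mv]

drops = {"pack01": 42, "pack02": 45, "pack03": 44, "pack04": 95, "pack05": 43}
print(find_suspect_packs(drops))  # -> ['pack04']
```

Using the median rather than the mean keeps one badly sagging pack from shifting the baseline it is being compared against.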
Wolf
[attached: 7-day charts of battery and pack voltages]
 

OffGridInTheCity

Well-known member
Joined
Dec 15, 2018
Messages
1,348
@Wolf - I'm with you and so far, I'm sleeping fine.

I don't track each cell's IR the way you do, but I do store Batrium + charge controller inputs + inverter output info in a database every 5 minutes, since nearly the beginning. This includes the min/max voltage of each pack each cycle (e.g. DOD / voltage range), the min/max differences between packs (drift/self-discharge?), the max/average load on the cells, the balance mAh when I turn on balancing, and ....

From this, when I eventually do have a problem with pack(s) no longer balancing - I'll be able to visualize the history of when and how fast a suspect pack started deviating from the others. IF they all deviate at the same time - I'll still be able to compare their performance against earlier baselines.
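A rough sketch of how that deviation-from-history check might look in code (the schema and numbers here are invented, not my actual database):

```python
# Sketch: given a history of per-cycle minimum pack voltages, find the first
# cycle where one pack started drifting below the rest of the group.

def first_deviation_cycle(history, pack, threshold_mv=50):
    """history: list of {pack_name: min_mV} dicts, oldest cycle first.
    Returns the first cycle index where `pack` sits more than threshold_mv
    below the average of the other packs, or None if it never does."""
    for i, cycle in enumerate(history):
        others = [v for name, v in cycle.items() if name != pack]
        if sum(others) / len(others) - cycle[pack] > threshold_mv:
            return i
    return None
```

With per-cycle data stored, you can run this retroactively over the whole history to pinpoint when a suspect pack first broke from the baseline.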

My oldest 14 packs are at 862 cycles as of yesterday and no deviations yet - it might be a while :)
 

CrankyCoder

Member
Joined
Mar 11, 2017
Messages
88
From experience, if the busbar touched the cell, fuses vanish instantly.. but it's a good point to add a spacer. I'm going to 'rebuild' the battery today to try to make some more assumptions. At this point, my official opinion is that poor design & maintenance is the issue; more on this later.
How would you recommend adding some space between cells and bus bar? (still making my way through all the replies)
 

CrankyCoder

Member
Joined
Mar 11, 2017
Messages
88
Wolf said: "Personally I think Redpacket's post should be nominated for the "Post of the Year Award" [...] I keep a record of every cell that goes into my battery."
I do this too. I haven't built my pack yet, but I have a few thousand cells here. Every one of them has a barcode. Every test tracks starting voltage, resistance, capacity and temperature, and the tester immediately rejects a cell if the resistance or the temperature is too high. Every time I test, the records are stored in the DB.
 

Dr. Dickie

Member
Joined
Sep 23, 2020
Messages
80
Wolf said: "Personally I think Redpacket's post should be nominated for the "Post of the Year Award" [...] I keep a record of every cell that goes into my battery."
Due to my exceptional ignorance, I have not done this.
I have written the capacity and IR (when fully charged) on each cell. And thanks to this place, when I build my pack I am going to make a spreadsheet that notes the location of each cell (pack, and specific position within the pack) along with its capacity and IR. That way, if I get a heater, or even if a cell burns to the point that I cannot read what was written on it, I can determine which cell went bad.
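As a sketch, that spreadsheet idea might look like this in code (the cell IDs, positions and values are all invented for illustration):

```python
# Sketch: map each physical position in a pack to the cell's recorded stats,
# so a failed position can be traced back to its test history.

import csv
import io

rows = [
    {"pack": 1, "position": "r1c1", "cell_id": "C0001", "capacity_mAh": 2450, "ir_mohm": 35.2},
    {"pack": 1, "position": "r1c2", "cell_id": "C0002", "capacity_mAh": 2390, "ir_mohm": 36.1},
]

# Write the records out as CSV, the same shape a spreadsheet would hold.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)

# Later: which cell sat at a burned position?
by_position = {(r["pack"], r["position"]): r for r in rows}
print(by_position[(1, "r1c2")]["cell_id"])  # -> C0002
```

The lookup works even when the writing on the cell itself is no longer readable, which is the whole point of keeping the record outside the battery.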
And you can bet that the next time I process cells, I will be doing exactly this. I would say it is just common sense, but since I did not do it myself, it would reflect poorly on me to say so.
 

Redpacket

Well-known member
Joined
Feb 28, 2018
Messages
1,247
To be clear, the "low volts" test is when you first get the cell: what is its initial voltage, before any charging.
I.e. open the pack and test voltage as the very first step.
If low, then bin/recycle.
Next is the IR test, again before any charging - although testing this later is OK, just maybe not as time-efficient for cell processing.
If high, then bin/recycle.
 

CrankyCoder

Member
Joined
Mar 11, 2017
Messages
88
Dr. Dickie said: "Due to my exceptional ignorance, I have not done this. [...] I am going to make a spread sheet that notes the location of each cell."
Check out https://www.vortexit.co.nz/ - I have been working with Brett (the guy who runs that site) for almost 2 years on that database portal. It has the repackr stuff built in as well. No cost, and we have even worked on Arduino-based network-attached testers to streamline tracking the data.
 

Dr. Dickie

Member
Joined
Sep 23, 2020
Messages
80
Redpacket said: "To be clear, the "low volts" is when you first get the cell, what is it's initial voltage, not after any charging. [...] If high, then bin/recycle."
That is exactly what scares me. I THOUGHT (assumed, always a mistake) that the charger I was using would tell me when a cell was too low (below 2 volts), as I thought it had done that to me before. Turns out it does not: it slow-charges the cell to 2.5 volts and then proceeds as normal. So I have an unknown number of cells that started below 2 volts and got processed (based on what I saw after I began being much more careful, probably less than 2%, but does it take more than just one to be a problem?).
I pulled all the ones that were below 1V (as far as I know - I just wasn't that careful at the beginning; ignorance again, my cup runneth over). So I now have an unknown number of cells that started off between 1 and 2 volts. I really, really wish I knew which ones they were, but at this point I can't go back and determine that. So I have to hope that IR, capacity testing, and self-discharge checks will weed out any bad cells. Not nearly as good as tracking each cell start to finish.
 

Wolf

Well-known member
Joined
Sep 25, 2018
Messages
1,320
So, there are unkown number of cells that were less than 2 volts that got processed (based on what I saw after I began being much more careful, probably less than 1%, but does it need more than just one?)
I do not want to give you a false sense of security, but there are some factors to consider with low-voltage cells (LVCs). Any voltage below the manufacturer's cutoff voltage, which can be as low as 2V (US18650FT as an example), is certainly not optimal for the cell's chemistry. But as with all chemical reactions, time and temperature are factors. Just because a cell has been recovered from a battery with a bad BMS that sucked the life out of it, and is now sitting at 1 volt or so, does not necessarily mean it is damaged beyond repair. If I find a cell at <2V, I look for a date code.
4 years is my limit. Why 4 years? It comes down to how often we replace our battery-powered items, or better yet, the batteries that power them. Laptop life in this throwaway society is generally 4 to 5 years. Also, who knows what manufacturers program into their BMS chips to cause batteries to lose capacity (iPhone). So my theory is that if a cell is older than 4 years and is an LVC, it usually also has high IR, and I probably won't use it except maybe in a low-risk single-cell application. Most likely it will go to the recycler, though.
So what if the cell is <4 years old and is an LVC? Most likely it has not been in this state for very long, and the chemical damage that can occur hasn't completely taken hold. The worst thing that can happen with a prolonged LVC, especially below ~0.7V, is the dissolution of copper into the electrolyte. When the cell is recharged, the copper re-plates, causing the high IR, low performance and SD conditions for which we would reject the cell anyway. One way I can tell an LVC hasn't been severely damaged is that after a C/D/C cycle the IR will drop considerably, usually 5mΩ to 7mΩ. This is of course if the initial IR of the cell falls within my "IR cheat sheet".
If after a 30-day rest the IR climbs by more than 0.5mΩ to 1mΩ, I will be very suspicious of that cell, and it will most likely also exhibit signs of SD behaviour.
@Dr. Dickie IR is your friend. If the cell has good IR you should be OK.
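Those rules of thumb could be captured as a couple of helper functions. The thresholds (a ~5mΩ drop after a C/D/C cycle, a 0.5mΩ rise limit over a 30-day rest) come from the post above; the functions themselves are just an illustrative sketch:

```python
# Sketch of the IR rules of thumb described above. All values in milliohms.

def rest_check(ir_after_cdc_mohm: float, ir_after_rest_mohm: float,
               rise_limit_mohm: float = 0.5) -> str:
    """Flag a cell whose IR creeps up over a ~30-day rest (SD/heater sign)."""
    rise = ir_after_rest_mohm - ir_after_cdc_mohm
    return "suspect" if rise > rise_limit_mohm else "ok"

def looks_recovered(ir_initial_mohm: float, ir_after_cdc_mohm: float,
                    min_drop_mohm: float = 5.0) -> bool:
    """A healthy recovered LVC typically drops ~5-7 mOhm after a C/D/C cycle."""
    return ir_initial_mohm - ir_after_cdc_mohm >= min_drop_mohm
```

For example, a cell at 35mΩ after its C/D/C cycle that measures 36.2mΩ after a month's rest would be flagged suspect, while 35.3mΩ would pass.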

Wolf
 

daromer

Moderator
Staff member
Joined
Oct 8, 2016
Messages
5,463
The problem with low voltage is that the cells start to form dendrites. They are not visible, but they are there, and they can cause a fire in the end....
And as far as I know, they are not visible via IR either; I can't find any datasheet that states that, at least.
 

Dr. Dickie

Member
Joined
Sep 23, 2020
Messages
80
Wolf said: "So what if the cell is <4 years old and is a LVC? [...] IR is your friend. If the cell has good IR you should be OK."
Thanks, all my cells are 2018 (LE M25 with an R code).

I am PRETTY sure (as embarrassing as it is to admit, I'm not positive) that I pulled all cells below 1V.
The cells whose initial IR I have tested (after I learned about doing that here a few weeks ago - thank you) have been about 39 mΩ. After charging, capacity testing, and re-charging, they have all been about 34-36 mΩ, so I haven't seen a 5 mΩ drop in those.
I do now have the IR written on my cells, and all are under 37 mΩ - below the limit you gave me - though of course you were talking about initial IR, and I didn't understand that at the time. Again, I did not get an initial IR on most (the first 1.5k or so) of my cells, but I did test all of them once I got the IR tester. Some were tested a month, maybe two, after capacity testing, and the IR of them all was, as I say, 34 to 36 mΩ.
I have ordered labels, and plan to number each cell and record its capacity and IR (not initial, but IR after capacity testing). I will test them again when I build, and will pull any that show an IR increase as you suggest. Of course I pull any over 37.5 mΩ, although I haven't found any yet.
My only wish is that I had found you folks before I started all this. I had the hubris of ignorance, which had me sally forth into the great unknown, blind to what I was doing.
As I have said most of my life, that great lyric: I wish, that I knew what I know now, when I was younger!
Ooh La La, I am going to be watching this pack like a hawk.
Thanks for all the help and info.
 