Cell Testing

I am about to embark on cell testing, and would like to hear others' thoughts on appropriate ways to test, and indeed, what to test.

I was about to purchase an Opus, but have seen other testing devices and even a few Arduino projects. While the Opus is amongst the easiest (buy it, plug it in, and press some buttons), I would rather make something, even if it turns out more expensive. So long as I learn something and/or the results are more accurate, that's great.

As this is a powerwall community I figure I should be testing my cells in anticipation of use in a powerwall, and not in a portable arc welder (or some such crazy, but potentially fun, project). My cells will be subject to low stress: incomplete charge, incomplete discharge, low current draw, low current charge etc.

First, is there a consensus on some numbers related to powerwall use? E.g. a max charge voltage of 4.0, or 4.1? Min (or is it max) discharge voltage?

Second, based on these limits, what is a REAL capacity going to be like? A 3200mAh cell only gets that under ideal and stressful conditions. At the reduced voltage range, what should I be expecting? I don't even want to test the maximum capacity - just interested in what each cell will do in my powerwall.

My GUESS is that if I can test under a set of conditions that more closely match the use environment of a powerwall, I should probably be happy with any cell that gets 1500+ mAh.

So - what are your thoughts on the use environment, and what is an appropriate method/device with which to test this?

Thanks in advance, Dave
 
Take a look at MrConstantin's charger. It's a custom-built charger based on the Arduino Mega and gives consistent capacity results.

1) Consensus is to top-charge to 4.1V max; better if it's 4.05V. This increases cell longevity and cycle life. Discharge cutoff of 3.4 - 3.2V, for the same reasons. Most stop at 3.4V as there isn't a whole lot of mAh left between 3.4 and 3.2V.

2) Basically, compare the cell's mAh rating against the database. That tells you what it was supposed to deliver when it left the manufacturing plant and gives you a guideline of what to expect. Then I would say expect about 75% - 80% of that in return. If a cell is rated at 2200mAh and you only get 1500mAh, use it for something else and not the powerwall; that much loss means the cell's capacity has degraded a lot.

3) Most will say not to put anything below 2000mAh into the packs, or maybe go as low as 1800mAh. This is solely up to you and what you have available. If you have 3000 1700mAh cells, you can build a decent setup. If you have 3000 1400mAh cells, you can still do a decent setup. Just remember: the lower the average capacity in the pack, the more cells you will need in parallel to get any meaningful run time.

4) The discharging method for determining capacity is up to you. Some discharge at the maximum current the cells will have to handle, others at the rate they will normally see in use. Max is usually 1A, and some go for 0.5A. Up to you. The key point is to be consistent with your testing: don't test some batches of cells at 1A and others at 0.5A and expect an apples-to-apples comparison. (A rough sketch of such a tester follows below.)
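
For anyone rolling their own Arduino-based tester, a minimal sketch of the discharge-counting part might look like the following. This is only an illustration of the idea, not MrConstantin's design: the pin assignments, the 1-ohm shunt, and the ADC scaling are assumptions, and the 3.40V cutoff is the consensus figure from point 1.

Code:
// Hypothetical single-cell discharge tester: counts mAh delivered down to a
// 3.40V cutoff. Pin numbers, shunt value, and scaling are assumptions.
#include <Arduino.h>

const int   CELL_PIN   = A0;    // cell voltage, measured directly (4.2V max fits the 5V ADC range)
const int   SHUNT_PIN  = A1;    // voltage across a known shunt resistor in the load path
const int   LOAD_PIN   = 7;     // MOSFET/relay switching the discharge load
const float SHUNT_OHMS = 1.0;   // assumed 1 ohm shunt
const float CUTOFF_V   = 3.40;  // discharge cutoff from the thread's consensus

float mAh = 0;
unsigned long lastMs = 0;
bool running = true;

float readVolts(int pin) {
  return analogRead(pin) * (5.0 / 1023.0);  // 10-bit ADC, 5V reference assumed
}

void setup() {
  Serial.begin(9600);
  pinMode(LOAD_PIN, OUTPUT);
  digitalWrite(LOAD_PIN, HIGH);  // connect the load
  lastMs = millis();
}

void loop() {
  if (!running) return;
  float cellV    = readVolts(CELL_PIN);
  float currentA = readVolts(SHUNT_PIN) / SHUNT_OHMS;
  unsigned long now = millis();
  mAh += currentA * 1000.0 * (now - lastMs) / 3600000.0;  // coulomb counting in mAh
  lastMs = now;

  if (cellV <= CUTOFF_V) {       // stop at the "low stress" cutoff
    digitalWrite(LOAD_PIN, LOW);
    running = false;
    Serial.print("Capacity (mAh): ");
    Serial.println(mAh);
  }
  delay(1000);
}

Whatever load and cutoffs you settle on, the point from item 4 stands: hard-code them once and use the same values for every cell.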
 
You are asking all the right questions and Korishan has given the right answers. In the end it is up to you. Sometimes there is a consensus, sometimes there isn't. But even if there is, if your use case differs, then it might not be suitable for you. While sometimes it is absolutely the right thing to do what others are doing, sometimes it is not that beneficial, or even wrong. What's important is to know what you want or need, or simply what you have to work with. And you have to know what the implications are when doing certain things in a certain way.
 
Korishan said:
Take a look at MrConstantin's charger. It's a custom-built charger based on the Arduino Mega and gives consistent capacity results.

1) Consensus is to top-charge to 4.1V max; better if it's 4.05V. This increases cell longevity and cycle life. Discharge cutoff of 3.4 - 3.2V, for the same reasons. Most stop at 3.4V as there isn't a whole lot of mAh left between 3.4 and 3.2V.
Okay - that makes sense.... but the specs from the manufacturer will be based on a full 4.27 or whatever charge, down to 2.5, at a certain current that maximizes the results.
Let's say a cell in these ideal testing conditions can produce 3200mAh. When we take the same cell and test it with a charge to only 4.05 and a discharge to 3.4 at, say, 1A, then I expect I'll see 2500mAh, or something basically unrelated to the 3200. I also believe that in a random sample of new 3200mAh cells, the results of my 'low-stress' testing will vary widely depending on whether the cell is a high-discharge-rate cell or a low-rate one.
My thoughts are that I should be looking at the 'low-stress' results in order to make each pack. Is this correct?
I have seen MrConstantin's charger thread but will re-read it.
You also mentioned degraded cells. Does the amount of degradation actually matter? By this I mean, is a nominally rated 2200mAh cell that I test my way and get 2000mAh any 'worse' than a 3000mAh cell that also tests at 2000? The reason I ask (apart from the fact I don't know) is that I really don't care what a cell could do when new - I only care what it will do in my 'wall.
As always, thank you for the help and guidance.
 
but the specs from the manufacturer will be based on a full 4.27 or whatever charge, down to 2.5, at a certain current that maximizes the results.
That all depends; each datasheet will have different specs, so it's best to refer to the datasheet when determining your baseline.
For example, some show capacity by discharging at 1000mA, while others base it on 500mA. Both datasheets may list the same starting and ending voltages, yet the two cells will have different capacities. Or they may show exactly the same capacity. It depends on the chemistry. You will get a different reading from a 1000mA load versus a 500mA load versus a 250mA load. Same cell, different ratings. It all depends on the discharge current. Also, some datasheets may give the capacity based on "charging" from 2.5V up to 4.2V, whereas others may show "discharge" from 4.2V to 2.5V. That's why ya gotta read the datasheet ;)

I'm not sure what you mean by "low-stress". The only way to get an accurate reading of capacity is to test at the current you decide on and stay consistent across all cells. Don't charge some at 1000mA and others at 500mA or 250mA; they will vary widely.

Degradation: yes, a cell rated at 2200mAh that is now 2000mAh will be in better condition than one rated at 3000mAh that is now 2000mAh. That does not mean the 3000mAh cell will not perform equally well alongside the other cells. However, it may be closer to the end of its cycle life than the others, as it has lost 30% of its rated capacity whereas the other lost only about 10%. So it is possible that it could fail sooner than the others. That's why we monitor the cells and keep track of any anomalies that occur, to narrow any problem cells down.

You could also think of the cells as an engine with mileage on it. If you pit two rather identical engines against each other, but one has 100,000 miles whereas the other has 50,000 miles on it, the chances of the 100,000-mile engine failing are a lot higher, even if they are run under the exact same tests. It's just that the parts have worn more from the higher mileage. However, it's also possible that if both engines are run no harder than 2000rpm for the rest of their lives, they may last just as long as each other under the light load.

I hope this has cleared some of it up, and not muddied it any :p
 
Clear as mud - thank you!!

Seriously though - nice and clear.

Low stress means charge only to 4.05, discharge to 3.4 at 500mA, of course using consistent values for all cells all the time.

I would rather have a wall rated at 6kWh using this method than 7kWh calculated by wringing every possible electron out of each cell. I am sure that the capacity of a wall is more about marketing than reality.
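
To put rough numbers on the 6kWh vs 7kWh point, here is a back-of-the-envelope calculation. The cell count and per-cell figures are made-up examples, not anyone's actual wall:

Code:
// Rough pack-energy estimate: tested ("low stress") capacity vs rated capacity.
// All figures below are illustrative assumptions.
#include <iostream>

int main() {
    const int    cells     = 700;    // hypothetical cell count
    const double nominalV  = 3.7;    // typical Li-ion nominal voltage
    const double ratedmAh  = 2700.0; // full 4.2V -> 2.5V datasheet figure
    const double testedmAh = 2300.0; // 4.05V -> 3.4V "low stress" test result

    double ratedkWh  = cells * nominalV * ratedmAh  / 1000.0 / 1000.0;
    double testedkWh = cells * nominalV * testedmAh / 1000.0 / 1000.0;

    std::cout << "Rated:  " << ratedkWh  << " kWh\n";   // ~6.99 kWh
    std::cout << "Tested: " << testedkWh << " kWh\n";   // ~5.96 kWh
    return 0;
}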

Okay - time to start building a capacity tester of some description.
 
:thumbs up: (if we had one in the emoticons)
 
Kori and others have provided MANY good pieces of advice regarding testing cells.

I keep cells with low capacity, high internal resistance and/or elevated self-discharge OUT OF my powerwall packs. They may be good to be used in other things (maybe)... but these characteristics all need to be tested for.

A lot of people forget about another metric which is relatively simple to measure (whether with an infrared thermometer, or in a more automated way using an Arduino, temp sensors, etc.)... and that is HEAT!

Remember, we are working with USED cells, and if they heat up excessively during the first discharge/charge cycle, then there is something seriously wrong with them internally. This may show up as high internal resistance, but not always. It might come back with lower capacity during charge/discharge than the manufacturer states, but not always. Remember, we have NO WAY of knowing how these cells were treated before we got our hands on them. I don't use heaters in ANYTHING! In fact, I terminate my discharge/charge cycle if they start to heat up beyond my own threshold (you can decide how hot you want - I use 45 Celsius). They go straight to the recycle bin and OUT of my garage asap.
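
For anyone automating this, a guard like the following could be bolted onto an Arduino-based tester. The 45 Celsius threshold is the one mentioned above; the LM35-style sensor, pin choices, and scaling are assumptions for illustration.

Code:
// Hypothetical over-temperature guard for a charge/discharge cycle.
// Assumes an LM35-style analog sensor (10 mV/degC) taped to the cell.
#include <Arduino.h>

const int   TEMP_PIN   = A2;
const int   CYCLE_PIN  = 8;      // MOSFET/relay enabling the charge or load current
const float MAX_TEMP_C = 45.0;   // threshold from the post above

float readTempC() {
  return analogRead(TEMP_PIN) * (5.0 / 1023.0) * 100.0;  // 10 mV per degree C
}

void setup() {
  Serial.begin(9600);
  pinMode(CYCLE_PIN, OUTPUT);
  digitalWrite(CYCLE_PIN, HIGH);   // cycle running
}

void loop() {
  float t = readTempC();
  if (t > MAX_TEMP_C) {
    digitalWrite(CYCLE_PIN, LOW);  // terminate the cycle immediately
    Serial.print("Cell over-temperature, cycle aborted at ");
    Serial.print(t);
    Serial.println(" C - recycle bin.");
    while (true) {}                // halt; cell gets pulled for recycling
  }
  delay(2000);
}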

The health of a battery is based on these fundamental attributes:

* Capacity, the ability to store energy. Capacity is the leading health indicator of a battery
* Internal resistance, the ability to deliver current
* Self-discharge, an indicator of mechanical integrity
* Temperature of the cell during charge/discharge and under load
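
One way to keep those four attributes together is a single per-cell record with go/no-go thresholds. A small sketch follows; the 2000mAh floor and 45C limit come from this thread, while the internal-resistance and self-discharge cutoffs are placeholder assumptions you would set yourself.

Code:
// Sketch of a per-cell health record covering the four attributes above.
// Thresholds are illustrative, not a standard.
#include <iostream>
#include <string>

struct CellHealth {
    std::string id;
    double capacity_mAh;            // measured under your own consistent test
    double internalResistance_mOhm;
    double selfDischarge_mV_per_week;
    double maxTemp_C;               // hottest reading seen during charge/discharge
};

bool keepForPowerwall(const CellHealth& c) {
    return c.capacity_mAh              >= 2000.0
        && c.internalResistance_mOhm   <= 150.0
        && c.selfDischarge_mV_per_week <= 20.0
        && c.maxTemp_C                 <  45.0;
}

int main() {
    CellHealth c{"CELL-0001", 2150.0, 90.0, 5.0, 31.0};  // hypothetical test result
    std::cout << c.id << (keepForPowerwall(c) ? ": keep" : ": reject") << "\n";
    return 0;
}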

"Battery diagnostics has not advanced as quickly as other technologies and still appears to dwell in medieval times. No instrument is capable of estimating the state-of-health of a battery in a single measurement. Similar to a doctor examining a patient, or the weatherman forecasting the weather, battery testing entails looking at multiple attributes to get a clear health assessment. Although capacity is the leading health indicator, internal resistance and self-discharge also play a role. Suitable test equipment, understanding batteries and intuition are essential."

I came across the above quote in my travels online sometime in the past and it has stuck with me.

Whilst I'm not in the refurbishing business (I am not reselling anything), nor am I a doctor (Smirk), it feels like it sometimes as I have hundreds of pounds of recycled laptop batteries in my garage just waiting to be broken open and tested. The fact is that almost all of us are working with reclaimed second hand batteries for our walls.

There are many differences between the new and aged Li-ion cells we get from these battery packs.

You need to be able to test all the above before intuitively knowing whether they are "good enough" to use in your wall.

There are many things to learn about lithium-ion batteries and testing them. Be safe and smart - learn about thermal runaway and all the other safety issues. Trust your intuition ;) - and I have to say, don't believe everything you read. Just because it is in a forum or online somewhere like a "university" doesn't mean it is 100% accurate.

I'm a scientist, a long-time DIYer, and a professionally trained engineer... I think I fully understand the risks... that said, my build is moving right along - BUT, for me, the journey is the most fun! I'm learning as I go too. Read as much as you can about batteries, what makes them work, and what the industry does to keep them safe, and definitely remember to use your head.

Smiles. Howie
 
There is one way to gauge heat without measuring it directly, and that is time.

Because, as we all know, lithium cells are rather efficient - they are generally at or above 99% efficiency. "Heaters" basically aren't as efficient, and that's why they get hot (due to numerous reasons).

So let's say a normal cell that doesn't show any signs of heat takes 2h to charge. Then you take the next cell and that one takes 2.5 hours (the cells need to have the same capacity reading). Then you know that the second one spent more time in the charging state, which also tells you that the cell in question needed 25% more energy to charge! That one will definitely be warm.

Of course, just taking the temperature of the cell is the most accurate way, but I, for instance, don't have time to test the cells that way (yeah, I know, I do have modified temp sensors so I can rely on that... and I do... but not everyone else can). Then it's just a matter of learning how long, for instance, a 1900mAh cell generally takes to charge. Then you have your value for where the limit is.

Just a tip. And it actually works out pretty well. (I have compared this to proper temp readings)
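
For what it's worth, this rule of thumb is easy to turn into a few lines of code. The 20% tolerance below is an illustrative guess, not a tested figure:

Code:
// Rough version of the "charge time as a heat proxy" rule of thumb.
// Expected time assumes a constant-current charge; the tolerance is an assumption.
#include <iostream>

double expectedHours(double capacity_mAh, double chargeCurrent_mA) {
    return capacity_mAh / chargeCurrent_mA;   // e.g. 2200 mAh / 500 mA = 4.4 h
}

bool suspiciouslySlow(double actualHours, double capacity_mAh,
                      double chargeCurrent_mA, double tolerance = 0.20) {
    return actualHours > expectedHours(capacity_mAh, chargeCurrent_mA) * (1.0 + tolerance);
}

int main() {
    // Cell A charges in 2.0 h, cell B in 2.5 h, both reading ~1000 mAh at 500 mA.
    std::cout << "Cell A slow? " << suspiciouslySlow(2.0, 1000, 500) << "\n";  // 0
    std::cout << "Cell B slow? " << suspiciouslySlow(2.5, 1000, 500) << "\n";  // 1
    return 0;
}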
 
daromer said:
So let's say a normal cell that doesn't show any signs of heat takes 2h to charge. Then you take the next cell and that one takes 2.5 hours (the cells need to have the same capacity reading). Then you know that the second one spent more time in the charging state, which also tells you that the cell in question needed 25% more energy to charge! That one will definitely be warm.

Doesn't that also depend on the charging characteristics of the cell and how much time it spends in CC and CV mode? I think you can easily find two cells with the same capacity that don't have the same charging time.
 
Dark: Yes, it's true that it does change slightly with those types. This should be applied within a given cell type and capacity. I have easily seen a 25-50% difference in time on the ones getting hot, so that gives a strong indication of which one was hot during a charge.
 
If a cell is rated at 2200mAh and is charged at 500mA, then it "should" take only 4.4hrs to charge, regardless of what brand and chemistry the cell is. This will vary slightly, yes, but not by half an hour, I'm sure. So, let's say cell A takes 4.4hrs to charge, right on point, and cell B takes 5hrs to charge. We know there's a problem, as it took 2500mAh of charge to reach its rated capacity. During this time, it's possible the temp reading was only a few degrees warmer than cell A; nothing noticeable to the touch. Perhaps a temp probe would see it.

I actually didn't think of this type of examination of the cells, but it does make sense. I will incorporate it in my testing of the cells when I get to that point (soon, I hope sooon). Thanks daromer
 
If you do proper tests (I didn't), like actually logging temperature on each cell and comparing it to time and capacity, it would be nice to graph it. I only used it as a very rough test here, and I kept a bit quiet about it since it's not 100% bulletproof either, as I never did those tests.
 
If cell A takes 4.4hr to charge at 500mA and cell B takes 5hr to charge, it means cell B either did not draw the set 500mA the whole time before reaching the cut-off voltage, or it lost energy by heating up while charging.
But is it not possible that a difference in charging time is caused only by a difference in internal resistance?

Maybe also something to measure / monitor...?
 
I'm collecting the time to charge and the temperature (and also tracking starting/ending voltage and capacity) and a few other assorted metrics in my cell database. I'll graph it at some point and compare what I see for times vs heat. All good suggestions there. Pictures will tell a good story. Great idea.
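
For anyone starting a similar database, appending one CSV row per tested cell is enough to graph from later. A minimal sketch; the file name and column choices are my own, just mirroring the metrics mentioned in this thread:

Code:
// Minimal per-cell logger: appends one CSV row per tested cell so the data
// can be graphed later. File name and columns are assumptions.
#include <fstream>
#include <iostream>
#include <string>

void logCell(const std::string& id, double startV, double endV,
             double capacity_mAh, double chargeHours, double maxTempC) {
    std::ofstream db("cell_database.csv", std::ios::app);
    db << id << ',' << startV << ',' << endV << ','
       << capacity_mAh << ',' << chargeHours << ',' << maxTempC << '\n';
}

int main() {
    // header row only needs writing once in practice
    logCell("CELL-0042", 4.05, 3.40, 2150.0, 4.5, 32.5);  // hypothetical values
    std::cout << "row appended to cell_database.csv\n";
    return 0;
}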
 
Wim: Yes, the IR does affect it to some extent for sure, but if it affects it a lot, the cell either:
* Gets hot due to self-discharge or other issues
* Doesn't get charged in time due to a high IR limiting the charge current = will have very bad values during discharge and not be usable anyway.

So even though the IR affects the charging time, I would say that in the end it's still a good factor to use.

How it's related, I hope someone can graph :)
 
I'd like to be able to monitor the mAh to recharge a cell vs the mAh to discharge it, but as far as I can work out, the Opus doesn't do it.

If that information were available, it would be the second attribute to build packs with, as it's the main thing that drives a pack out of balance over repeated shallow cycles.

If packs have different mAh ratings and you start off with them fully charged, discharge only to, say, 50% of the capacity of the smallest-capacity pack, and then recharge them, they will still be balanced at the top.

If the charge/discharge efficiency differs between packs, they will quickly move out of balance.
That's the main concern for me re the Sanyos that get warm (not hot): that energy is lost from the system and drives imbalance.
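
If a tester did report both numbers, the efficiency in question is simply discharged mAh divided by charged mAh, and the drift per shallow cycle follows from the difference between packs. A rough sketch, where the efficiencies, cycle depth, and the series-string assumption are all invented for illustration:

Code:
// Charge/discharge (coulombic) efficiency and how a difference between two
// series packs accumulates into imbalance over shallow cycles.
// All numbers are invented for illustration.
#include <iostream>

double coulombicEfficiency(double discharge_mAh, double charge_mAh) {
    return discharge_mAh / charge_mAh;
}

int main() {
    double effA = coulombicEfficiency(9900.0, 10000.0);   // healthy pack, ~99.0%
    double effB = coulombicEfficiency(9900.0, 10312.5);   // runs warm, ~96.0%

    double cycle_mAh  = 5000.0;            // shallow cycle: 5000 mAh drawn from the string
    double charge_mAh = cycle_mAh / effA;  // charge pushed back until pack A is full again

    // In a series string the same charge passes through both packs, so pack B
    // only stores charge_mAh * effB of it and falls behind pack A every cycle.
    double drift_mAh = charge_mAh * (effA - effB);
    std::cout << "Pack B drifts ~" << drift_mAh << " mAh lower each cycle\n";  // ~151 mAh
    return 0;
}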
 
I think FERCSA's FCDS or a BatLab would assist greatly with this analysis ... Any possibility of a concerted effort to build these units? Maybe backhaul all of the data into a central database somewhere? Just thinkin' out loud being a data geek and all (DB developer during the day). Ooo and graphs, gotta have graphs ... lol
 
neurocis said:
I think FERCSA's FCDS or a BatLab would assist greatly with this analysis ... Any possibility of a concerted effort to build these units? Maybe backhaul all of the data into a central database somewhere? Just thinkin' out loud being a data geek and all (DB developer during the day). Ooo and graphs, gotta have graphs ... lol

Yep - gotta have graphs... interactive preferably :)

For those interested, here are links to the above mentioned threads:

http://secondlifestorage.com/t-Constantin-PowerWall

http://secondlifestorage.com/t-64-128-cells-DIY-Multi-Battery-Tester-Charger-Discharger

http://secondlifestorage.com/t-FCDS-FERCSA-s-charger-discharger-station
 
Thanks for those links, I had been looking for @mrconstantin 's build but it eluded me.

So would there be real interest in such an initiative? Building a reference "DIY Powerwalls" charger/discharger/logger with a backhaul to a shared (but private) database? I am now thinking @mrconstantin 's design may be a better reference build.

Thoughts?

Cheers!
 