18650 for 48v powerwall - charging, discharging and currents?

wattwatt

Member
Joined
May 21, 2018
Messages
55
The project is a 48V, 24-ish kWh, 14s245p powerwall using harvested 18650 cells of around 2000 mAh each. Based on my house's average energy consumption, I'm looking at roughly 215 mA of continuous current per cell. The 48V inverter I plan to get has shut-off voltages of 42V (~3.0V per series cell) and 59V (~4.2V per series cell). With this in mind, when testing the capacity of each 18650 cell, what voltage should I charge to, and what cutoff voltage and discharge current should I use?
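For reference, a rough sketch of how that ~215 mA per-cell figure follows from the pack numbers above; the average household draw is back-calculated from the 215 mA figure and is an assumption, not something stated in the post:

```python
# Rough sizing sketch for the numbers in the question. AVG_HOUSE_POWER_W is
# back-calculated from the quoted ~215 mA per cell and is an assumption,
# not a measured figure.

NOMINAL_CELL_V = 3.7        # nominal Li-ion cell voltage
SERIES = 14                 # 14s -> ~51.8 V nominal, suits a 48 V inverter
PARALLEL = 245              # 245 cells per parallel group
CELL_CAPACITY_AH = 2.0      # ~2000 mAh harvested 18650s
AVG_HOUSE_POWER_W = 2700    # assumed average household draw (~215 mA/cell)

pack_nominal_v = SERIES * NOMINAL_CELL_V                         # ~51.8 V
pack_capacity_kwh = pack_nominal_v * PARALLEL * CELL_CAPACITY_AH / 1000
pack_current_a = AVG_HOUSE_POWER_W / pack_nominal_v              # whole-pack current
per_cell_ma = pack_current_a / PARALLEL * 1000                   # shared across 245 cells

print(f"Pack energy:      ~{pack_capacity_kwh:.1f} kWh nominal")
print(f"Pack current:     {pack_current_a:.1f} A at {AVG_HOUSE_POWER_W} W")
print(f"Per-cell current: {per_cell_ma:.0f} mA")
```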
 
When testing capacities, you should always use the full voltage range. This means charging to 4.20V and discharging to either 2.80V or 3.00V, depending on the charger you have. The preferred test current is 1000mA, even though you calculated the typical use to be 215mA. You want to weed out any faulty or near-failure cells before they make it into your packs :)
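For context, a minimal sketch of what that 1000 mA test is actually measuring: capacity in mAh is just the test current times the hours the cell spends between the charge voltage and the cutoff. The 112-minute run time in the example is illustrative only, not from the thread:

```python
TEST_CURRENT_MA = 1000      # recommended test current
CHARGE_V = 4.20             # full-charge voltage
CUTOFF_V = 2.80             # or 3.00 V, depending on the charger

def measured_capacity_mah(discharge_minutes, current_ma=TEST_CURRENT_MA):
    """Capacity delivered during a constant-current discharge."""
    return current_ma * (discharge_minutes / 60.0)

# A harvested cell that ran 112 minutes at 1 A before hitting cutoff:
print(f"{measured_capacity_mah(112):.0f} mAh")   # ~1867 mAh, near the ~2000 mAh rating
```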
 
Discharging at 1000mA does 3 things.
1) Shows pretty much the real capacity under standard conditions for a single cell
2) Shows what the cell "can" do under severe worst-case situations
3) Will help weed out the defective/sub-optimal cells

Then, in a powerwall application, you'll typically only pull 250-500mA per cell (usually under the heaviest of loads), and you know the cell can handle it.
This is kinda like a vehicle engine that "can" run at 6000 RPM, but you don't run it that hard unless you absolutely need to. It'll be short-lived if you run it that high all the time, but it will last 1000's of hours if run at around 2000 RPM. Cells are very similar, with the added bonus of not having to change the oil :p
 
Completely agree with Korishan; the 1A test can filter pretty much anything between good and junk. You can even detect high-drain power tool cells that are no longer in good shape. That's a special case, since those cells are rated to handle discharges of 30A or more, but if a power tool cell tests below its original capacity, it definitely won't handle a high discharge anymore and should no longer be used that way - just use it for something standard.
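A rough sketch of that sorting rule: compare the measured 1 A capacity against the cell's original rating and downgrade or reject faded cells. The 95% and 80% thresholds are assumptions for illustration, not figures from this thread:

```python
def classify(measured_mah, rated_mah):
    soh = measured_mah / rated_mah          # state of health as a fraction of rating
    if soh >= 0.95:
        return "still fine for high-drain use"
    if soh >= 0.80:
        return "low-current (powerwall) use only"
    return "reject/recycle"

# A power tool cell rated 1500 mAh that now tests at 1350 mAh:
print(classify(1350, 1500))   # 90% of rating -> keep it away from 30 A loads
```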
 
Korishan said:
Discharging at 1000mA does 3 things.
1) Shows pretty much the real capacity under standard conditions for a single cell
2) Shows what the cell "can" do under severe worst-case situations
3) Will help weed out the defective/sub-optimal cells
Absolutely agree on 1, 2, and 3.

Regarding number 3 ("Will help weed out the defective/sub-optimal cells"), there's one more thing you can do: check the IR (internal resistance) of the cell first. No need to go through all that rigorous testing just to find out a cell won't perform up to standard. I find a lot of people poo-poo the IR test and say it takes too much time, but it takes me 5 seconds to check the IR of a cell, compared to 8 hrs of charge/discharge/charge testing to determine whether a cell will perform up to its standard.
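As a rough sketch of that 5-second check: apply a known load, read the voltage sag, and compute the DC internal resistance. The 1 A load and 100 mOhm reject limit below are illustrative assumptions, not figures from this post:

```python
def internal_resistance_mohm(resting_v, loaded_v, load_current_a):
    """DC IR estimate: (V_rest - V_under_load) / I, returned in milliohms."""
    return (resting_v - loaded_v) / load_current_a * 1000

# Cell rests at 3.98 V and sags to 3.90 V under a 1 A load:
ir = internal_resistance_mohm(3.98, 3.90, 1.0)
print(f"{ir:.0f} mOhm")
print("reject" if ir > 100 else "worth a full capacity test")
```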

In Korishan's analogy, it's like doing a compression test on the engine before you run it at 6000 RPM. If you've got a couple of cylinders not up to snuff, you wouldn't run it that hard, if at all.
Just saying.


Wolf
 