In an effort to eliminate bad cells, we usually charge each cell and let it rest for a few days or more.
Then we check to see if the voltage has dropped.
We reject cells that have dropped below 4.1 V (as an example; you can reject at whatever voltage you want).
The problem lies in the TP4056 chargers.
Sometimes the TP4056 stops charging, and when I pull the cell off the charger it reads 4.1 V.
Other times, it will stop charging at 4.18 V.
And sometimes, if you put in a cell that is at, say, 4.05 V, the TP4056 will not even start to charge.
I assume this is due to varying internal resistance of the cells.
So let's say I pull two cells off the charger and put them in the "charged" box.
I come back a week later to discharge test.
I put the cells in the tester, and both read 4.09 V.
BUT one came off the charger at 4.10 V and the other at 4.19 V.
So one has dropped by 0.01 V and the other by 0.10 V, ten times as much voltage drop.
Which is the healthier cell?
So if I didn't write the fully charged voltage on the cell (I don't), I have no way of knowing how much of a drop really occurred.
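One low-effort fix is to keep a simple log instead of writing on the cells. A minimal sketch of the bookkeeping (cell labels, voltages, and the 0.05 V reject threshold are all made up for illustration; use whatever threshold you trust):

```python
# Hypothetical log: off-charger voltage per cell, so the real
# self-discharge drop can be computed after the rest period.
MAX_DROP = 0.05  # volts of drop allowed over the rest week (assumption)

# voltage when each cell came off the charger, keyed by a label on the cell
off_charger = {"cell-A": 4.10, "cell-B": 4.19}

# voltage measured after a week of rest
after_rest = {"cell-A": 4.09, "cell-B": 4.09}

for cell, v_start in off_charger.items():
    drop = v_start - after_rest[cell]
    verdict = "keep" if drop <= MAX_DROP else "reject"
    print(f"{cell}: dropped {drop:.2f} V -> {verdict}")
```

Run against the two cells from the example above, this flags cell-B (0.10 V drop) and keeps cell-A (0.01 V drop), even though both sit at 4.09 V on the tester.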
Also, I can often take cells that only charge to 4.1 V on the TP4056, put them on my Nitecore charger, and take them up to 4.19 V.
Now I know that a tenth of a volt means little in terms of cell capacity, but it does mean cells at different voltages will draw current from each other when assembled in a pack.
How do you guys handle this?
When assembled into my powerwall, I won't ever charge above 4.1 V (for longevity), so does it even matter?
Also, what is your "reject voltage"?