Charging Profile / Method

farmerjohn

Hey All,

I am trying to determine the best charging profile for my cells

I have 42 Leaf modules in a 14s6p configuration and I keep my cell voltages between 3.5 and 4.10 volts. Each cell is rated for 60Ah.

My charger supports both bulk (constant current) and absorption (constant voltage) as well as float

I've read that you're supposed to charge lithium-ion at constant current until the voltage reaches the absorb voltage, then switch to constant voltage until the current falls off.

Problem is, everything I can find covers charging them to 100% full at 4.2 volts - not much info on charging to a point less than that.

What I am doing now is constant current until it reaches 4.10 volts per cell, then switching over to absorb.. but the absorb cycle never seems to end! They just keep sucking it back. I'm afraid I'm basically just floating them at that point.. but maybe I'm just not letting it run long enough?

I get out close to the same energy that I put in (around 18 kWh).. so I'm pretty sure I'm doing things right - maybe just not optimally..

What do you guys think - how do you charge your packs?

Thanks

John
 
We are talking about a powerwall and you will be cycling the pack often, so it's not a big issue.
Just make sure you set thresholds on max voltage and max current and let it be, I would say.

If your charger doesn't end charging when the voltage is reached and the current is close to 0, it's not the end of the world with a max voltage of around 4.1V or so. Most likely it will only be a short while before the sun is gone and they start discharging again.

Does your pack self-discharge?
 
I don't think this is a big issue ...

But what is very important is the voltage you charge to .... do you really need to go to 4.2V ... it will massively reduce the life of your cells ...

Charging to 4.2V, new cells last 400 cycles .... charging to 3.93V they last 3,200 cycles (with 2/3 of the capacity, so equivalent to 2,000 cycles at 4.2V)
http://batteryuniversity.com/learn/article/how_to_prolong_lithium_based_batteries

You get 5 times more useful life by charging to a maximum of 3.93V

4.2V does not mean the cell is "full" .... it is an ARBITRARY number chosen by manufacturers; they defined 4.2V as 100% capacity .... If you like you can charge to 4.3V, then you will get 114% of the capacity at 4.2V .... but only 200 cycles !!!


[attached chart: cycle life vs. charge voltage]
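To put those cycle-life numbers side by side, here is a rough back-of-the-envelope sketch in Python. The cycle counts and capacity fractions are the approximate figures quoted in this thread; the pack size is an assumption purely for illustration.

```python
# Rough lifetime-throughput comparison using the cycle counts quoted above.
# The pack size is assumed; cycle counts and capacity fractions are the
# approximate figures from this thread, not measured data.

PACK_KWH_AT_4V2 = 18.0  # assumed usable pack energy when charged to 4.20V

scenarios = {
    # charge voltage: (cycles to end of life, capacity relative to 4.20V)
    "4.20V": (400, 1.00),
    "4.10V": (1000, 0.90),
    "3.93V": (3200, 0.66),
}

for voltage, (cycles, capacity_fraction) in scenarios.items():
    per_cycle_kwh = PACK_KWH_AT_4V2 * capacity_fraction
    lifetime_kwh = cycles * per_cycle_kwh
    print(f"{voltage}: {per_cycle_kwh:.1f} kWh/cycle x {cycles} cycles "
          f"= {lifetime_kwh:,.0f} kWh lifetime throughput")
```

With these figures the 3.93V pack delivers roughly five times the lifetime energy of the 4.2V pack, which is the point being made above.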
 
3.93V vs 4.1V
~3000 cycles vs ~1000 cycles
65% capacity vs 90% capacity

Battery University Chart:

[attached image: Battery University cycle life chart]



So you choose how much capacity you want available. Most builders here run closer to 4.0 - 4.1V. Pretty much, if you are only getting about 60%, you might as well stick with lead acid, as going that low negates the energy density advantage.
Also, charging to a lower voltage means you have less storage available in case something happens and you can't charge them back up (a solar setup with a storm lasting several days, for example).

Also note, the cycle count in those charts assumes a full discharge down to the cell's rated cut-off voltage (most are around 2.8V). Not discharging that low also increases cycle count. A cycle means full to empty, so if you only use 15% of the cell's capacity, that's not a full cycle and you are getting even better life out of it.
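A minimal sketch of that bookkeeping, assuming cycle wear scales roughly with depth of discharge (real cells are more complicated, so treat the numbers as illustrative):

```python
# Convert shallow daily cycling into equivalent full cycles.
# Assumes wear scales roughly with depth of discharge (a simplification).

def equivalent_full_cycles(daily_dod_fraction: float, days: int) -> float:
    """Accumulate depth of discharge over time as equivalent full cycles."""
    return daily_dod_fraction * days

# Using only 15% of the pack per day (the example above) for a year:
print(equivalent_full_cycles(0.15, 365))  # ~55 equivalent full cycles per year
```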
 
It seems to me this is the most important issue for powerwallers to consider...

Do you want your powerwall to last 2 years, or 10 years ....

Charging to 4.2 V you may have to build it all again in 2 years time ....

Charging to 3.93V you will need 50% more cells in your wall to get the same capacity, but it will last 5 times longer ...

These are the hard facts.

If you charge to over 4V but never fully discharge, this is bad management ... going low does not reduce life ... it's high voltage (over 3.93V) that kills off the battery.

It seems to me it's essential that a powerwaller monitors the voltage of his packs. If it's all automatic he's losing out ...

Suppose someone has a wall and keeps it automatically charged to 4.1V from PV panels .... the cell voltage may dip to 3.6V overnight, but no lower, and then it's recharged ... he thinks it's all working fine ... but in reality this is gross mismanagement: by operating at higher voltages than necessary his wall will have a much shorter life ...

The cycling voltage range should be kept as low as possible; perhaps increase the upper voltage in winter if there isn't enough capacity.
 
You all make some good points about voltage vs life. I think I may lower my voltages once I increase the size of my pack

But my main concern and question that I have is HOW to best charge them

Say my target voltage is 4.10V and, for easy numbers, say it charges in the bulk phase at 100 amps. It can reach 4.10 volts fairly quickly and switch over to the constant voltage (absorb) phase.

But just because it reached my target voltage does not mean it has reached full capacity at that voltage - correct? I still need to wait until the amperage drops off to 10% of the bulk current.. in this example 10 amps.

I think I am correct - but not sure.
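That logic, written out as a minimal Python sketch. The charger object and its read_*/set_* calls are hypothetical placeholders for whatever interface your charger or charge controller actually exposes; the 100A bulk current and 4.10V/cell target are just the example numbers above.

```python
import time

# Sketch of the CC/CV (bulk/absorb) hand-off described above.
# The `charger` object and its methods are hypothetical placeholders.

BULK_CURRENT_A = 100.0                   # constant-current (bulk) setpoint from the example
ABSORB_VOLTAGE_V = 4.10 * 14             # 14s pack target: 57.4V
TAIL_CURRENT_A = 0.10 * BULK_CURRENT_A   # end absorb when current falls to 10% of bulk

def charge(charger):
    # Bulk / constant current: hold the current, let the voltage rise to the target.
    charger.set_current(BULK_CURRENT_A)
    while charger.read_voltage() < ABSORB_VOLTAGE_V:
        time.sleep(1)

    # Absorb / constant voltage: hold the voltage, let the current taper off.
    charger.set_voltage(ABSORB_VOLTAGE_V)
    while charger.read_current() > TAIL_CURRENT_A:
        time.sleep(1)

    charger.stop()  # or drop to float if loads still need to be supported
```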
 
That's correct. I've done this on several "common" 18650 cells; the charging profile is the same as with 4.2V, just with a lower absorption voltage.
 
If you are only charging to 4.10V, you don't need to switch from CC to CV. Most lithium cells switch from CC to CV at about 4.10 - 4.12V. So if you don't go over 4.10V, then you don't need to worry about the current curve ramping down over that last 0.1V. And if you only go to 4.0V, then you really don't need to worry about it.
This does make charging a lot simpler as you don't need to add extra algorithms to the mix (especially if you are making your own charger).
 
You are absolutely correct, farmerjohn. You have to switch to CV, no matter what your end-of-charge voltage is. If you don't, then you haven't fully charged the cell to whatever your voltage is. The voltage will drop if you stop charging right after CC reaches your end-of-charge voltage.

Korishan, what you wrote there makes little to no sense ;) And I have a strong feeling of deja vu, as if we have gone through this already in the past. Switching from CC to CV always occurs at your end-of-charge voltage, or at least close to it if the charger is a bit inaccurate. It doesn't occur at any predetermined voltage (as in, through some kind of inherent property) and it also doesn't have anything to do with the cell.
If you are charging to 4.20V then it occurs at 4.20V. If you are charging to 3.90V then it occurs at 3.90V. The switch from one to the other is always tied to the end-of-charge voltage you have set.
 
My understanding of CC/CV agrees with DarkRaven's. Also:
ozz93666 said:
4.2V does not mean the cell is "full" .... it is an ARBITRARY number chosen by manufacturers; they defined 4.2V as 100% capacity .... If you like you can charge to 4.3V, then you will get 114% of the capacity at 4.2V .... but only 200 cycles !!!
I have never heard it expressed like this, it is very helpful.
 
The charge voltage limit has to do with the chemical reactions, the mix on the anode/cathode and the additives used in the cells: above the "recommended" upper charge limit, additional electrochemical reactions occur which are irreversible, and this is why your cycle life decreases.

Above 4.2V the actual amount of energy you get back out of a cell will be very low due to the additional one-way reactions. Granted, you may push an extra 5Wh into a cell and think this is charge energy, but you may only get 0.2Wh out, with 4.8Wh of one-way electrochemical damage to your cells.

What Korishan said does make some sense, as the switch to CV is only due to the internal resistance of the cells and the internal volt drop, which prolongs the charge at your target charge voltage in CV. With other battery chemistries, if the internal resistance is low, the CV interval can be very short - less than a minute - and during that time only a very small amount of energy is actually charged into the cell, making the CV time of little use. The switch to CV is only there to overcome various inefficiencies.

With 18650 cells, the chemistry and the way a lot of powerwalls are designed (long strings with variable resistance from the charge point to every cell) mean the resulting charge is never uniform, so a CV period is required to overcome all these imbalances, charge all of the cells and allow the BMS to perform passive balancing by dumping power.

Switching to CV at high currents with a 4V average cut-off can still potentially overcharge cells, because the charge-to-voltage profile is no longer linear, especially if the pack is energy-imbalanced across the parallel groups of cells. The BMS should ideally have control of the charge current and voltage, and decrease the total charge current based on the highest cell voltage rather than an average pack voltage.
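A small worked example of that internal-resistance effect. The resistance and current values are made-up round numbers just to show the shape of the problem; they aren't specific to Leaf cells or 18650s.

```python
# Why the CV phase matters: terminal voltage = cell voltage + I * R_internal,
# so under heavy charge current the cell looks more charged than it really is.
# The 50 mOhm and the currents below are illustrative round numbers only.

R_INTERNAL_OHM = 0.050  # assumed effective internal resistance of one cell
TARGET_V = 4.10         # per-cell end-of-charge voltage

for current_a in (2.0, 1.0, 0.5, 0.1):
    ir_drop_v = current_a * R_INTERNAL_OHM
    cell_v = TARGET_V - ir_drop_v  # what the cell actually sits at while charging
    print(f"{current_a:4.1f}A: terminal reads {TARGET_V:.2f}V, "
          f"cell is roughly {cell_v:.2f}V ({ir_drop_v * 1000:.0f}mV of IR drop)")
```

As the current tapers during CV, the IR drop shrinks and the cell itself gradually reaches the target voltage - which is exactly the "overcoming inefficiencies" described above.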
 
Practically all the images of the charge curves for Lithiums look something like this:
http://batteryuniversity.com/_img/content/ion1.jpg


[attached image: typical Li-ion CC/CV charge curve from Battery University]



As you can see, just before the current starts to drop, the voltage isn't quite 4.2V. It starts at about 4.1V. Almost every charge curve looks like this. So if this isn't correct, then why are they showing these curves?
 
I still have not understood why you should terminate charging when current drops below a certain amount. Afaik, that set amount is around 100mA for most modern cells. What's the purpose of terminating at that point? What would happen if you just kept a CV applied to the cell down to a few mA? I mean, when you have 100s of cells in parallel on a powerwall application and you top up with under 10A of CV, that's what happens, each cell is getting a below 100mA charge.
 
thanar said:
I still have not understood why you should terminate charging when current drops below a certain amount. Afaik, that set amount is around 100mA for most modern cells. What's the purpose of terminating at that point? What would happen if you just kept a CV applied to the cell down to a few mA? I mean, when you have 100s of cells in parallel on a powerwall application and you top up with under 10A of CV, that's what happens, each cell is getting a below 100mA charge.

Let's think this one through ....

If current is going into a powerwall indefinitely, the voltage the cells are charged to will continue to rise .... unless it's lost to self-discharge.
If a constant voltage is applied and there is no self-discharge, the current must fall to almost zero.

It seems to me there's no need for us to bother with fancy charging routines .... just bulk charge it, then switch off ... the voltage of the cells will drop a little from the max .... so what, this means a longer life span.

If you want to charge to the optimum 3.93V, I guess you would bulk charge to about 3.98V, cut off charging, and then the cell voltage would drop and stay constant at 3.93V.

The chart above is misleading ... charge time is 4 hrs, then terminated at 100mA, but stage 3 must be years .... it will take years to self-discharge to 3.7V ....

But to answer your question, at constant voltage it looks as if the current will be about 50mA @ 5hrs ..... 25mA @ 6hrs ..... 10mA @ 8hrs ... 1mA @ 20hrs ... so it doesn't make too much difference if you cut off at 100mA.
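One way to see why the tail current hardly matters: if you assume the CV-phase current decays roughly exponentially (an assumption - the starting current and time constant below are just eyeballed to resemble the figures above), the charge still to come after a 100mA cut-off is tiny compared to a 60Ah cell.

```python
import math

# Rough model of the CV tail: assume current decays exponentially once CV starts.
# I0 and tau are eyeballed assumptions, purely to show how little charge
# arrives after a 100mA cut-off.

I0_A = 1.0    # current at the start of the CV phase (assumed)
TAU_H = 1.5   # decay time constant in hours (assumed)

def charge_remaining_ah(t_hours: float) -> float:
    """Ah still to be delivered from t_hours onward, for I(t) = I0 * exp(-t/tau)."""
    return I0_A * TAU_H * math.exp(-t_hours / TAU_H)

t_cutoff_h = -TAU_H * math.log(0.100 / I0_A)  # time at which current reaches 100mA
print(f"100mA reached after about {t_cutoff_h:.1f} h of CV")
print(f"Charge still to come after cut-off: {charge_remaining_ah(t_cutoff_h) * 1000:.0f} mAh")
```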
 
What I am asking is WHY we should terminate charge when the current drops to around 100-150mA. Every 18650 data sheet has a charging current cut-off at around 100-150mA. Why not leave the pack bulk charging to 4.1V (or whatever other chosen value per cell) FOREVER?
 
thanar said:
What I am asking is WHY we should terminate charge when the current drops to around 100-150mA. Every 18650 data sheet has a charging current cut-off at around 100-150mA. Why not leave the pack bulk charging to 4.1V (or whatever other chosen value per cell) FOREVER?

If I had to guess, it's because this system was all developed for consumer chargers like the Opus ... the consumer or test facility needs to know when charging has effectively finished and they can use the cell ... it would also require the charger to keep drawing power, usually from the mains, and it takes many watts to produce that constant voltage supply even though it's delivering less than 1mA after a day - wasteful ....
 
Korishan said:
So if this isn't correct, then why are they showing these curves?

I'm not saying that this chart is wrong. But it clearly shows that what you are suggesting won't work, and you are probably misunderstanding what you are seeing. You can see that there is no switch to CV before 4.2V. Dropping the current before 4.2V can be done, maybe because it makes it easier to adjust the charging current to prevent an overshoot, but it doesn't have anything to do with CV; the voltage is still going up.
If you want to charge a cell to 4.1V nothing changes; the chart would look the same, just scaled down by 0.1V. And you still need the switch to CV at 4.1V or whatever your end-of-charge voltage is.

thanar said:
I still have not understood why you should terminate charging when current drops below a certain amount. Afaik, that set amount is around 100mA for most modern cells. What's the purpose of terminating at that point? What would happen if you just kept a CV applied to the cell down to a few mA? I mean, when you have 100s of cells in parallel on a powerwall application and you top up with under 10A of CV, that's what happens, each cell is getting a below 100mA charge.

It varies from cell to cell; the manufacturer specifies this. It's not always 100mA or around 100mA. It's usually 1/10 of the initial charging current. The purpose is to keep the charging process as short as possible. You could wait until it reaches 0mA, but that would possibly take a very long time and in that time only very little additional energy is stored in the cell.
 
You can skip the CV process, but you will lose some capacity. A full Li-ion charge requires CV regardless of the final voltage. Here's a charge curve I did to figure out capacity vs. charging voltage, for example:


[attached image: measured charge curve, capacity vs. charging voltage]


This is kind of an extreme case because the battery has high IR and I'm charging at a high rate (0.8C), but you get the idea; if I had stopped at the start of the CV phase I would only have gotten about 720mAh(!) compared to 2124mAh for a full charge to 4.11V with a 20mA termination current. Basically, two factors determine how important the CV phase is: internal resistance (high IR means the "real" cell isn't at the same voltage as you are applying at the terminals, but lower by IR * charging current), and charge rate (by Ohm's law, a low current means that regardless of IR the voltage drop between the terminals and the inside of the cell is small, so the cell is closer to the target voltage).
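Restating the numbers from that curve as a quick calculation (the 720mAh and 2124mAh figures are taken straight from the measurement described above):

```python
# Share of the total charge delivered during CC vs CV for the high-IR, 0.8C example.

cc_only_mah = 720   # capacity banked when the CC phase ended (from the chart above)
full_mah = 2124     # capacity at 4.11V with a 20mA termination current

cv_share = (full_mah - cc_only_mah) / full_mah
print(f"CV phase delivered {cv_share:.0%} of the total charge")  # roughly 66%
```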

Here is a more relevant example of a top-up charge curve for the Nissan Leaf using the on-board 3.3kW charger; it still spends 17 minutes in the CV phase even though the charge rate is low to start with and the battery has low internal resistance:

[attached image: Nissan Leaf top-up charge curve, on-board 3.3kW charger]
 
Today I let it absorb longer, until the current dropped to around 10% of the bulk current - then I let it switch to float so my charge controllers could continue to support my loads.

Here is what my charge curve looked like today for my Leaf powerwall


[attached image: today's charge curve from the Leaf powerwall]


It continued to absorb energy till it basically just stopped absorbing anymore

If I were to just stop at bulk I would be missing out on quite a bit of stored energy

I appreciate all the feedback in this thread - it certainly has been helpful!
 