What's the difference between a charge controller and a buck converter? For microgeneration

harrisonpatm

Member
Joined
Jan 5, 2022
Messages
401
Short question is in the title. Allow me to give details about what I am looking for. I have ideas for building microgeneration systems, plural, and I am looking for individual components. In addition, I am looking to fully understand what every component of the system does, so I've got plenty of learning to do. I could go on Amazon and buy a charge controller, a crappy wind turbine, and a car battery, but that won't help me understand what I am doing.

So here's my hypothetical situation. Let's say I have a small 3s battery pack using 18650s, BMS-protected from undervoltage. 3s, so a nominal voltage of 11.1V and a top-off charge of 12.6V. I want to charge it with a small solar panel or wind turbine (rectified to DC). I could find an appropriate charge controller... but I keep seeing DC-DC CC/CV variable buck converters. They can take 3-30V input, regulate the output to a set level using a potentiometer, and as long as you select one that can handle the amps of your solar/wind, you could charge your bank. So I set the output to 12.6V to top off my 3s battery... but no, if there's any variation in the output of the buck converter, I risk overcharging the battery; even 12.65V could damage the bank. OK, let's set the output to 12V even, which gives me wiggle room for error. Some models I have looked at are the OSKJ, or the XL4016.
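The per-cell arithmetic behind those pack numbers can be sanity-checked in a few lines (nothing here comes from a datasheet; the helper names are just for illustration):

```python
# Per-cell arithmetic for a 3s li-ion pack (values from the post above).
CELLS_IN_SERIES = 3

def pack_voltage(per_cell: float) -> float:
    """Pack voltage implied by a per-cell voltage."""
    return CELLS_IN_SERIES * per_cell

def per_cell_voltage(pack: float) -> float:
    """Per-cell voltage implied by a pack-level set point."""
    return pack / CELLS_IN_SERIES

print(round(pack_voltage(3.7), 1))   # 11.1 -- nominal 3s pack
print(round(pack_voltage(4.2), 1))   # 12.6 -- top-off
print(per_cell_voltage(12.0))        # 4.0  -- what a 12.0 V set point means per cell
```

Note that a 12.0V buck set point is still 4.0V per cell, which matters later in the thread.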

There are two questions I have with this setup, regarding what could go wrong, as my main concern is battery safety.

#1 What happens to the buck converter when the input from my wind/solar stops, at night or when there is no wind? I am concerned that the battery voltage will feed back into the buck converter and damage it, as well as unnecessarily drain the battery.

#2 What happens when the battery is fully charged and the wind/solar keeps producing? Are there some buck converter models that open the circuit after reaching the set voltage, to prevent overcharging? I don't want to have to rely on the BMS to prevent that. I know that some solar controllers just open the circuit, while most wind controllers divert to a dump load when the battery is full.

Why am I even exploring this route? My concept is small, numerous, cheap, and reproducible power generation systems working in tandem. Either 10-20 small turbines, each with their own battery, or 10-20 of them each with their own buck converter feeding into and charging the same large battery bank. I know that MPPT or even PWM controllers do a better job of maximizing the power going into the batteries; I'm not concerned with that. I also know that the buck converter wouldn't be the only element needed between the generator and the battery; rectifiers, fuses, extra wiring, etc., all add up, and in the end it may be the same cost or even more than just outright buying a charge controller.

My sole concern right now is battery damage and safety. Given that, and regarding my two questions above, how feasible is this method of charging a battery? Thanks in advance!
 
1. The converter simply has no power to function while the source is down. It might leak a little current from the battery, but not much, and you should use a BMS anyway, so it cuts the output if the cells get too low.

2. Solar - nothing; the sun will shine onto the solar cell and the voltage will still be present, just next to no current will flow. Keep the open-circuit voltage lower than what the DC-DC module can handle, or it will get destroyed.

Wind - your wind turbine will spin much faster due to having no load, possibly even rotating so fast that it disassembles itself. Hence why most proper wind turbines have brakes or a big dump load like a very large resistor.

Solar panels (and generators, to a degree), however, are much more effective with an MPPT controller. It searches the voltage/current curve for the point of maximum wattage, while a simple DC-DC module doesn't care about that and just pulls as much current as it can, lowering the voltage substantially and reducing the wattage.

Going by the voltage cut-off you gave for your 3s setup, I think you haven't read up on how much li-ion cells dislike voltages above 3.95V, especially above 4.15V.

Personally, I'd rather have one unified system that gets fed by various sources of production. It's just simpler to manage in the long run, in my opinion.
 
Excellent reply, thank you. So #1, no issue; #2, no issue with solar, but an issue with wind at high speeds. A separate dump load or mechanical slowdown is needed in that case.

Agreed on MPPT being more effective. Currently I'm not concerned with "efficient," I am just looking for "sufficient". I know it's not efficient or optimal.

Can you please explain what you mean about my cutoff for 3s? If cells don't like voltages above 3.95V, then the 3s cutoff can simply be 11.85V instead of 12V. What I was going by was the charging profile I have read for li-ion: constant current, then constant voltage to 4.2V, then cutoff, with no trickle or float charge that would damage the cells. This is what I have read; I would welcome correction.

Also, when you say one unified system fed with various sources of production, this is also what I am imagining. Different and more complicated, yes, but this is just a concept I am working on. Theory and practice are two very different things.
 
Can you please explain what you mean about my cutoff for 3s? If cells don't like voltages above 3.95V, then the 3s cutoff can simply be 11.85V instead of 12V. What I was going by was the charging profile I have read for li-ion: constant current, then constant voltage to 4.2V, then cutoff, with no trickle or float charge that would damage the cells. This is what I have read; I would welcome correction.
This is how they are normally treated, resulting in a lifespan of about 300 cycles until 20% of capacity is lost.

But if you start reading deeper into the matter, you'll learn pretty quickly how much the lifespan increases when you use a lower voltage as the charge cut-off.
Here is my thread with some sources from Tesla and how they treat their batteries:
Here is another great source, especially the "How to prolong battery life" section.

When keeping the cells between 3.4V and 3.95V, you can expect them to withstand more than 1000 cycles until they've lost 20% of their capacity, but you'll also have about 35% less usable capacity.
 
This is how they are normally treated, resulting in a lifespan of about 300 cycles until 20% of capacity is lost.

But if you start reading deeper into the matter, you'll learn pretty quickly how much the lifespan increases when you use a lower voltage as the charge cut-off.
Here is my thread with some sources from Tesla and how they treat their batteries:
Here is another great source, especially the "How to prolong battery life" section.

When keeping the cells between 3.4V and 3.95V, you can expect them to withstand more than 1000 cycles until they've lost 20% of their capacity, but you'll also have about 35% less usable capacity.
Ooo, excellent, thank you! I have read the page at your second link, about charge state and its relationship to battery life. Understood. I guess I'll clarify, since I haven't built any of this yet: my value of 12V for a 3s bank was a concept value, chosen to demonstrate that choosing the right buck converter can allow me to manage my battery pack more accurately, albeit manually. For example, if I wanted to keep my cells between your values of 3.4 and 3.95, I'd get longer battery life but less available capacity. How to compensate? More batteries, of course (within reason). Alternatively, if I became familiar with my system and its charge/drain capacities, perhaps I'd know that even if I set the buck converter to output 4.15V per cell, I will use it enough to continuously drain it to 3.4-3.95, and the 4.15V output set value is essentially there as a sort of "safety" to make sure that if my average load decreases, I won't risk an overcharge. I'd still use a BMS, of course.

Anyway. The question was for concept and safety. I will probably build a small system with a buck converter in place and take readings over a period of time. Just wanted to check on feasibility. I welcome any more suggestions.
 
if I set the buck converter to output 4.15V per cell, I will use it enough to continuously drain it to 3.4-3.95
You can't have 4.15V at one point but only 3.95V at another; the two points have a copper wire joining them, and they would (and should) be the same!

Re buck converters vs. MPPT charge controllers, it's not only the efficiency and voltages; it's about the load on the input side. I.e., if the voltage going into a buck converter drops, it will draw more current and try to maintain the same output volts. An MPPT unit will back off the input current so as not to "overload" the source (and give the output less current too).
Connecting a solar panel to a buck converter, it can "stall" at a low voltage into the buck and deliver low watts in full sun! An MPPT or PWM charge controller would let the volts on the solar panel rise more and deliver more watts (MPPT more watts than PWM).
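That stall-vs-track difference can be illustrated with a toy perturb-and-observe loop. The panel curve, numbers, and step size below are made-up assumptions purely to show the mechanism, not real panel data:

```python
def panel_current(v: float) -> float:
    """Toy solar panel I-V curve: roughly constant current that collapses
    near an assumed ~20 V open-circuit voltage. Illustrative only."""
    if v >= 20.0:
        return 0.0
    return 5.0 * (1.0 - (v / 20.0) ** 8)

def perturb_and_observe(v=12.0, step=0.1, iters=200):
    """Minimal P&O MPPT: nudge the operating voltage and keep the
    direction that increased harvested power."""
    direction = 1.0
    power = v * panel_current(v)
    for _ in range(iters):
        v_new = v + direction * step
        p_new = v_new * panel_current(v_new)
        if p_new < power:
            direction = -direction  # got worse, reverse direction
        v, power = v_new, p_new
    return v, power

v_mpp, p_mpp = perturb_and_observe()
# a converter that "stalls" the panel at 6 V harvests far less in full sun:
p_stalled = 6.0 * panel_current(6.0)
print(f"tracked: {p_mpp:.1f} W at {v_mpp:.1f} V vs stalled: {p_stalled:.1f} W")
```

With this made-up curve the tracker settles near 15V and roughly doubles the harvested watts compared to the stalled case, which is the effect being described.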
 
You can't have 4.15V but only 3.95V, the two points have a copper wire joining them & they would & should be the same!

re buck converters vs MPPT charge controllers, it's not only the efficiency & voltages, it's about the load on the input side. Ie if the voltage going into a buck converter drops, it will draw more current & try & maintain the same output volts. An MPPT unit will back off the input current so as not to "overload" the source (& give the output less current too).
Connecting a solar panel to a buck converter it can "stall" at a low voltage into the buck & deliver low watts in full sun! An MPPT or PWM charge controller would let the volts on the solar panel rise more & deliver more watts (mppt better watts than PWM).
Let me clarify what I meant: if I set the buck output to 4.15V to charge a single cell, the cell will go up to 4.15V and the buck will equalize and cease charging it. Yet if I set it to 4.15V while continually running a load off the same battery, the cell may never actually reach 4.15V, depending on the load. Can you explain what you mean by "the two points have a copper wire"? Which two points?

I understand your comment regarding MPPTs. I would need to test what happens to my generation source in a real-world setting when voltage drops, i.e., when the wind and sun go down. However, one thing I do know would happen: if I have a buck converter that accepts a range of voltages, say 6-40V, then when the generation input drops below 6V, the buck converter does nothing - no output. That's to be expected. I also expect that if I have a 12V bank, for example, and I set the buck output to 12.5V (just for example), an input below 12V will also give the buck nothing to output. I need to test this. I would probably need a combination buck/boost to deal with voltages lower than my bank's voltage, and at that point, I would just go with a dedicated solar or wind charge controller.
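That expectation can be written down as a rule of thumb. The 6V minimum matches the hypothetical 6-40V module above, and the 1.5V headroom is an assumed margin; real modules vary and their datasheets govern:

```python
def buck_can_regulate(v_in: float, v_out: float,
                      v_in_min: float = 6.0, headroom: float = 1.5) -> bool:
    """True if a (hypothetical) buck module can hold its set output.
    It needs input above its minimum operating voltage AND above the
    output plus some headroom, because a buck can only step DOWN."""
    return v_in >= v_in_min and v_in >= v_out + headroom

print(buck_can_regulate(18.0, 12.5))  # True: healthy input
print(buck_can_regulate(12.0, 12.5))  # False: input at/below output
print(buck_can_regulate(5.0, 3.7))    # False: below the module minimum
```

A buck/boost topology relaxes the second condition, which is exactly why one would be needed for inputs below the bank voltage.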

Again, I want to explore this concept as a bare-bones controller setup, not for efficiency (MPPT already does that), just for sufficiency. Lack of efficiency of power generation in this theory is a given.
 
while using the cell and continually using a load off the same battery, the cell may never actually reach 4.15
This is called float charging, which is bad for lithium cells. It's best to charge them up, then deplete down to a trigger point, and then charge them back up, and repeat. Don't keep pulling a steady load on the cell while also giving it a steady input.
Current doesn't always flow through the wire from charger to load without going through the cell first. Just because the wires are connected in parallel doesn't mean current won't go in/out of the cell. Input fluctuations will cause current to flow in/out of the cell on a continual basis, which is not good for lithium cells.
This can "kinda" be mitigated by putting a larger smoothing capacitor in parallel with the cell (the one on the buck converter is not adequate). But this is kinda like putting a bandage on a wound that requires stitches. It'll help in the short term, but in the long term, there could be damage regardless.

The other thing to note about buck converters: they are highly inefficient across most of their conversion curve. Some bucks only reach 95+% efficiency near 95+% load; the rest of the time they burn the extra energy off as waste heat.
This is where a PWM converter comes in handy, as it doesn't waste so much energy as heat. These units are not the cheap $5 ones you can get on eBay/AliExpress. They cost a little more, as they usually have some sort of MCU built in, not just a 555 timer or something similar.
 
This is called float charging, which is bad for lithium cells. It's best to charge them up, then deplete down to a trigger point, and then charge them back up, and repeat. Don't keep pulling a steady load on the cell while also giving it a steady input.
Current doesn't always flow through the wire from charger to load without going through the cell first. Just because the wires are connected in parallel doesn't mean current won't go in/out of the cell. Input fluctuations will cause current to flow in/out of the cell on a continual basis, which is not good for lithium cells.
This can "kinda" be mitigated by putting a larger smoothing capacitor in parallel with the cell (the one on the buck converter is not adequate). But this is kinda like putting a bandage on a wound that requires stitches. It'll help in the short term, but in the long term, there could be damage regardless.
Can you please explain how this is different in a setup with an MPPT or PWM controller? I don't mean this as a sarcastic comment, I really want to know.

From my understanding, in a standard setup, your battery bank is receiving current from a charge controller and delivering a load, both of those in parallel. If the charge controller is not delivering any power, due to a cloudy day or no wind, the load takes its current from the battery. If it's sunny and windy, the load can come directly from the charge controller and the battery doesn't charge or discharge (this is an oversimplification). While I understand that using correct components is crucial, the user's selection of generator output and battery capacity to match their average load is what's actually putting the battery through its paces. If I had 5000W of solar feeding my 5000W average load, but my battery bank were a single 50Ah car battery, then I'm the idiot who can't do math, and of course the battery is going to stress out and die quickly.

Re your comment on input fluctuations, what does an MPPT or PWM controller do to mitigate that? I guess this goes back to my original question: how does replacing a charge controller with a buck converter cause more input fluctuations? Also, you mentioned that it can kinda be mitigated by using a bigger capacitor. Couldn't it also be mitigated by using a battery bank with a much larger capacity than you actually need? Input fluctuations in that case would be more akin to ripples in a pool, rather than waves in a puddle.

The other thing to note about Buck converters, they are highly inefficient for the majority of their conversion curve.
This is granted. I know it hurts people's brains to say "I'm not concerned about efficiency," but in this case it's true. At least for now. I'll be concerned about efficiency later.
 
Can you please explain how this is different in a setup with an MPPT or PWM controller? I don't mean this as a sarcastic comment, I really want to know.
An MPPT/PWM is smart enough to stop charging once a charge is completed, and will only turn back on once a threshold has been reached. A buck converter won't be able to supply enough amps to overcome the load, in most cases. And if you are paying for a buck that can do 10A, you might as well pay the money to get an MPPT or PWM controller.
For instance, if the threshold is 4.0V, then the MPPT/PWM will push as many amps as needed to reach full charge, 4.2V. Then it will back off and wait for the voltage to fall to 4.0V before starting back up again. A buck will not do this. It will start outputting charge current as soon as the voltage drops below the buck's set point.
So if you set it to 4.1V, then it will continue to output its max/set current whenever the voltage drops below 4.1V, and will stop as soon as the voltage reaches 4.1V. There is no wiggle room, per se. This is considered trickle charging.
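The restart-threshold behaviour described here can be sketched as a tiny hysteresis state machine. Thresholds are from the 4.0/4.2V example; this is a logic illustration, not any controller's firmware:

```python
class HysteresisCharger:
    """Charge-enable logic with a deadband: charge up to v_full,
    then stay off until the cell sags below v_restart."""
    def __init__(self, v_restart=4.0, v_full=4.2):
        self.v_restart = v_restart
        self.v_full = v_full
        self.charging = False

    def update(self, cell_voltage: float) -> bool:
        if self.charging and cell_voltage >= self.v_full:
            self.charging = False      # full: stop, wait for the sag
        elif not self.charging and cell_voltage <= self.v_restart:
            self.charging = True       # sagged below the restart point
        return self.charging

ctrl = HysteresisCharger()
for v in (3.9, 4.1, 4.2, 4.15, 4.05, 4.0):
    print(v, ctrl.update(v))  # stays off between 4.0 and 4.2 on the way down
```

A buck with a plain 4.1V set point has no such deadband; it resumes pushing current the moment the voltage dips below the set point, which is the trickle-charging behaviour described above.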
From my understanding, in a standard setup, your battery bank is receiving current from a charge controller and delivering load, both of those in parallel. If the charge controller is not delivering any power, due to a cloudy day or no wind, the load is taking its current from the battery.
Yes, correct.

If it's sunny and windy, the load can come directly from the charge controller and the battery doesn't charge or discharge (this is an oversimplification)
Sort of. When the charge controller can maintain a balance, it's working under full charge load. Not a bad thing for the unit to work like that.
The charge controller will "adjust the output current" to suit the needs of the load. A buck converter can not do this. It is static. So you will get ripples on the battery side. The more expensive the charge controller, the more accurate this function is, and the faster it happens.

Re your comment on input fluctuations, what does a MPPT or PWM controller do to mitigate that?
The MPPT/PWM controllers will change the charging output current to suit the requirements of the load/charge, depending on its configuration. For instance, if it detects that the battery voltage is 4.1V, its cut-off voltage is 4.1V, and there is a high-amp load, it will ramp up amp output to compensate, which lowers the ripple effect on the battery. PWM does this too, but not quite as effectively as an MPPT does.

A buck converter cannot change its output current on the fly - at least not the cheap units. You "can" hack some of them with an Arduino or other MCU to add this feature, but why bother when you can spend the same money on a proper charge controller?
 
In all of my reading so far, you have explained this the best, in a way that is actually understandable. Thank you, incredibly helpful. Hopefully you don't mind my followup questions!
For instance, if the threshold is 4.0V, then the MPPT/PWM will push as many amps as needed to reach full charge, 4.2V. Then it will back off and wait for the voltage to fall to 4.0V before starting back up again. A buck will not do this. It will start outputting charge current as soon as the voltage drops below the buck's set point.
So if you set it to 4.1V, then it will continue to output its max/set current whenever the voltage drops below 4.1V, and will stop as soon as the voltage reaches 4.1V. There is no wiggle room, per se. This is considered trickle charging.
#1. What if you had said MPPT/PWM with a threshold of 4.0V, but your load is such that your batteries are always at 3.8, 3.9, 3.95, back down to 3.8... isn't this also a form of user-error trickle charging? Wouldn't you need to upgrade either your generation output or your battery capacity? Or both? Or reset your controller's threshold? Because this is essentially what I would have the buck converters set to do, with different values.

#2. You said that the MPPT will "ramp up Amp output to compensate, which lowers the ripple effect on the battery." How much does it lower it? Presumably it never goes away; it simply gets lessened, mitigated. No battery lasts forever, after all.
A buck converter cannot change its output current on the fly - at least not the cheap units. You "can" hack some of them with an Arduino or other MCU to add this feature, but why bother when you can spend the same money on a proper charge controller?
This is understood. Thank you.

#3. What do you think about my comment regarding a larger battery bank capacity? If I'm essentially using my buck converters to create trickle charging, creating a ripple effect on the battery, will having a huge Ah rating lower that ripple effect across the whole bank, lessening the negative health effects on the batteries?

I do think I understand that an MPPT uses deadband threshold values: bringing it up to 4.1V in your example, waiting until it goes down to 4.0V, then applying charge to bring it up to 4.1V again. This is one charge cycle, as opposed to trickle charging being numerous cycles. However, what happens if, during that deadband between 4.0 and 4.1, load and generation vary? You would potentially be trickle charging within that window. Load applied at 4.1, 4.09, 4.07, 4.05, sun came out, 4.06, 4.07, nighttime, back down to 4.05, 4.04... Isn't this also trickle charging?... though as I type, I realize you answered this already: the controller won't apply charge current until the voltage falls below the threshold. In that case, if this happens while the sun is out and the wind is blowing, wouldn't that be inefficient in wasting potential power generation?

Oh no, I mentioned efficiency! I said I wasn't going to do that!
 
What if you had said MPPT/PWM with a threshold of 4.0, but your load is such that your batteries are always at 3.8, 3.9, 3.95, back down to 3.8
Then you need a charge controller that can output more amps.
The reason why the battery would stay at 3.8-ish volts is that there isn't enough current to charge it fully.
However, the issue really isn't about partial charging/discharging at around 80% capacity. It becomes an issue when the cells are closer to 90% or higher capacity. This is why I used 4.1V as the example ;)
Lower than that, they aren't really being trickle charged, per se. I know it sounds like they would be, but not entirely. Cells at that voltage have more wiggle room inside, if you will. It's harder for dendrites to grow in that voltage range, and the cells are relatively happy.
But get to either end of their charge voltage, <3.2V or >4.15V, and the cells start to get angry under certain conditions.
The object is to extend the life of the cells as long as possible, so being kind to them at the extremes is one way to help in this regard.

You said that the MPPT "ramp up Amp output to compensate, which lowers the ripple effect on the battery." How much does it lower? presumably it never goes away
For example, if you have a charge of 10A going to the battery and a load applied that pulls 15A, the MPPT will see a voltage drop on the battery, and it'll try (not always successfully, depending on the quality of the controller, the conditions of the solar/wind supply, etc.) to go higher than 15A to start bringing the voltage back up. As long as the charger is supplying more amps than the battery+load requires, the ripple will essentially go away, unless there are surges. But that's a different topic.

What do you think about my comment regarding a larger battery bank capacity? If I'm essentially using my buck converters to create trickle charging, creating a ripple effect on the battery, will having a huge Ah rating lower that ripple effect across the whole bank, lessening the negative health effects on the batteries?
Increasing the "battery" capacity won't reduce the ripple in the way you think. The ripple will still be there; it'll just be minimized by the greater number of cells it propagates through. So instead of, for example, a ripple of +/-50mA per cell on a 10p, you would have +/-25mA on a 20p. The ripple is smaller per cell, but it's still there.
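The parallel-count scaling in that example is just division. A quick sketch (the 500mA total is implied by the +/-50mA @ 10p figure above, and the even split assumes identical cell impedances, which real packs only approximate):

```python
def ripple_per_cell(total_ripple_ma: float, parallel_count: int) -> float:
    """Ideal even split of charge/discharge ripple across parallel cells."""
    return total_ripple_ma / parallel_count

total = 500.0  # mA, implied by the +/-50 mA @ 10p example above
for p in (10, 20, 40):
    print(f"{p}p: +/-{ripple_per_cell(total, p):.1f} mA per cell")
```

Doubling the parallel count halves the per-cell ripple, but never eliminates it, which is the point being made.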
You really want to either have your load "lower" than your charge controller's max amp output, or have your load "higher" than your controller output. This will minimize the ripple effect the most. In my opinion.
Having the load "lower" will always allow for the controller to keep the battery charged up, and ready for when the controller can't charge; like when the sun goes down and no solar input.
Having the load "higher" will deplete the battery while in use. Just make sure that the load is disconnected/stopped before the battery is fully discharged (around 3.2V/cell) to allow the controller to recharge when available.

waiting till it goes down to 4.0, then applies charge to bring it up to 4.1 again. This is one charge cycle
A charge cycle is a full discharge/recharge. So, taking the cells down to about 3.2V and then recharging back to 4.1V would be 1 cycle. Anything less is a partial cycle. And to be honest, I'm not sure if they're cumulative or not. Meaning, if you have two half cycles (4.1->3.6->4.1 twice), does this equate to a full cycle when counting how many the cell will last? I have no idea. But reducing the "full" cycle will drastically increase the lifespan of the cells. Daromer has cells in service, I think for over 5 years now, because he doesn't fully cycle them on discharge/charge. And these are reclaimed cells.
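For what it's worth, one common bookkeeping convention (not the only one, and not necessarily what any given datasheet uses) treats partial cycles as cumulative by charge throughput, so two half cycles count as one "equivalent full cycle":

```python
def equivalent_full_cycles(depths):
    """Sum partial-cycle depths (as fractions of full capacity) into
    equivalent full cycles -- the throughput-based convention."""
    return sum(depths)

# two half-discharges ~= one full cycle under this convention
print(equivalent_full_cycles([0.5, 0.5]))       # 1.0
print(equivalent_full_cycles([0.2, 0.3, 0.5]))  # 1.0
```

This convention only counts throughput; it deliberately ignores the depth-dependent wear effects discussed above, which is exactly why shallow cycling still comes out ahead in practice.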
what happens if during that deadband between 4.0 and 4.1, load and generation varies? You would potentially be trickle charging within that value. Load applied at 4.1, 4.09, 4.07, 4.05, sun came out, 4.06, 4.07, nighttime, back down to 4.05, 4.04... Isn't this also trickle charging?
To be honest, I don't think this would really be an issue, as the voltage would either pass through that range quickly (because of a small parallel pack) or take a long while to pass through (because of a high parallel count). It would be tough to sit in that dead zone long enough to cause issues with a proper controller.

if this happens while the sun is out and the wind is blowing, wouldn't that be inefficient in wasting potential power generation?

Oh no, I mentioned efficiency! I said I wasn't going to do that!
lol :p It comes back around.

It's only wasted power if it's collected and then lost to heat. So an MPPT and a PWM (a proper one) will not try to charge the battery until the threshold is reached. Because it detects the battery voltage, the part of the electronics that does the conversion is never activated.
However, with a buck converter, this would waste energy, because it would try to maintain the output voltage, and any delta between input and output it'll burn off as heat.
Well, the cheap buck converters at least. There are some that are smart enough to *not* engage the circuitry until needed, as in you can set a lower threshold, for example. But these are not the usual cheap buck regulators either. They are driven with FETs and other ICs, and are more expensive as there are more components. And because of this, the inexpensive PWM controllers are getting within spitting distance on price of those types of buck converters. So you might as well spend the extra bucks to get at least a PWM.

Oooorrrr, you could go the complete route of learning the electronics yourself and building your own circuit :p Adam Welch has done this with his MUPPET project. Worth a good watch to learn more about how MPPT works, if nothing else.
 
So, taking the cells down to about 3.2V and then recharging back to 4.1V would be 1 cycle. Anything less is a partial cycle. And to be honest, I'm not sure if they're cumulative or not. Meaning, if you have two half cycles (4.1->3.6->4.1 twice), does this equate to a full cycle when counting how many the cell will last? I have no idea. But reducing the "full" cycle will drastically increase the lifespan of the cells. Daromer has cells in service, I think for over 5 years now, because he doesn't fully cycle them on discharge/charge. And these are reclaimed cells.
That's kinda what I'm getting at, in a way. There does seem to be this window of battery life in which there is no consensus as to what a full charge cycle is, what constitutes one, etc...
Oooorrrr, you could go the complete route of learning the electronics yourself and building your own circuit :p Adam Welch has done this with his MUPPET project. Worth a good watch to learn more about how MPPT works, if nothing else.
I'm probably not going that full route, but this is kinda the point, actually! I want to see what's happening with just a buck converter, and I like to understand all the elements of the system.
 
That's kinda what I'm getting at, in a way. There does seem to be this window of battery life in which there is no consensus as to what a full charge cycle is, what constitutes one, etc...
I think cycles are a rather poor statistic for comparison. Total charged and total discharged electricity would be more useful.
If I take the minimum cycle count and minimum capacity per given voltage from that site, with a 3.6V 2.2Ah cell, and do some simple (yet wrong, but good enough) math (3.6V * 2.2Ah * available stored capacity * discharge cycles), I get the following data:

2,376 Wh @ 4.20V
2,851 Wh @ 4.15V
4,039 Wh @ 4.10V
5,385 Wh @ 4.05V
6,652 Wh @ 4.00V
11,404 Wh @ 3.90V

So a fully charged cell will deliver roughly 2,376 Wh of total discharge until it reaches 80% capacity left.
And one charged to 3.90V will cycle roughly 4.8 times as much capacity until it hits 80% capacity.

That should make it pretty obvious what you should choose under which circumstances.

And I don't want to imagine how long a cell lasts if you limit the voltage to 3.7V during the summer, when you have enough excess solar energy that you don't need the full capacity.

As a comparison, an LFP 18650 has 3.2V and a good 1800mAh, withstanding 2000 cycles at 100% DOD. That works out to roughly 11,520 Wh until it has 80% capacity left. So NMC/LCO batteries are pretty comparable to LFP, when treated well.
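The arithmetic behind those figures can be reproduced. The (capacity fraction, cycle count) pairs below are the assumptions implied by the posted Wh numbers, so treat them as illustrative rather than authoritative cell data:

```python
NOMINAL_V, CAPACITY_AH = 3.6, 2.2  # the cell from the post

# (charge cut-off, available capacity fraction, cycles to 80% health)
# -- assumed pairs that reproduce the posted Wh figures
PROFILES = [
    (4.20, 1.00, 300),
    (4.15, 0.90, 400),
    (4.10, 0.85, 600),
    (4.05, 0.80, 850),
    (4.00, 0.70, 1200),
    (3.90, 0.60, 2400),
]

for cutoff, frac, cycles in PROFILES:
    wh = NOMINAL_V * CAPACITY_AH * frac * cycles
    print(f"{int(wh):,} Wh @ {cutoff:.2f}V")  # truncated, matching the post

# LFP comparison from the post: 3.2 V * 1.8 Ah * 2000 full cycles
print(f"LFP: {int(3.2 * 1.8 * 2000):,} Wh")
```

The takeaway survives the rough math: lowering the cut-off multiplies total lifetime throughput even though each individual cycle stores less.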
 
And that's just from lowering the upper end of the charge cycle. Raise the discharge cut-off slightly, say from 2.8 to 3.2V, and the life can drastically increase. Just a few more cells in the groups, adjust by a few tenths of a volt, and the lifespan of the cells can be extended literally 5+ years, or even beyond 10 years, as long as temps and other external factors are controlled.
 
While the site is a good resource for the most part, the thing that gets me is they can't even get the abbreviations right. Case in point (see the attached screenshot):
AFAIK there is no "Lithium phosphate" chemistry, yet right above this they mention lithium iron phosphate. From the looks of things, they haven't updated the page in many years.
Later, floyd
 
And that's just from lowering the upper end of the charge cycle. Raise the discharge cut-off slightly, say from 2.8 to 3.2V, and the life can drastically increase. Just a few more cells in the groups, adjust by a few tenths of a volt, and the lifespan of the cells can be extended literally 5+ years, or even beyond 10 years, as long as temps and other external factors are controlled.
Lots of great information. I like how while I have read all these resources before, it takes different viewpoints and responses to put it all into context. Thanks
 
While the site is a good resource for the most part, the thing that gets me is they can't even get the abbreviations right. Case in point (see the attached screenshot):
AFAIK there is no "Lithium phosphate" chemistry, yet right above this they mention lithium iron phosphate. From the looks of things, they haven't updated the page in many years.
Later, floyd
Yeah, they have many issues on their site, but at least they are trying to put everything in the correct order in one place, and not scattered around the internet.
 