I have been meaning to post my findings on SSG Charge battery performance for some time now, but life events proved to be overly distracting. I have been running SSG testing for about a year and a half, and during that time noted several experimenters on this forum complaining that their LAB charge battery performance had noticeably declined during their COP testing. I did not pay attention to these comments until I experienced similar results.
The following posts will present a general commentary and observations on this charge capacity degradation effect. All the SSG data is well documented over many pages of test notes. Please note that this is not a criticism of the SSG, but a close look at the impact of the spike phenomenon on the Charge battery by a curious experimenter.
My SSG testing started with four NAPA 8224 (CA 275 & CCA 230) spec garden tractor batteries, two brand new and two with ~50 mowing hours on them. The batteries were all conditioned and tweaked with 2A12 or 1AU Energenx chargers. The new batteries, after conditioning, showed a bit more than 19 hours of discharge time down to a 12.2 volt threshold at a C20 rate of ~0.5 amps using a 6 watt automotive bulb, whereas the two used batteries came in at 15-16 hours of discharge time at the same rate. So, this was the baseline established for future comparisons.
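For anyone wanting to put those discharge times into amp-hour terms: under a constant-current assumption, delivered capacity is just load current times hours to cutoff. This is a simplification, since the bulb current actually sags a little as the battery voltage drops toward 12.2 volts, but it gives a useful ballpark. A minimal sketch:

```python
# Rough Ah-capacity estimate from a constant-current discharge test.
# Assumes the ~0.5 A bulb load stays roughly constant down to the
# 12.2 V cutoff (a simplification; real bulb current drops with voltage).

def capacity_ah(load_amps, hours_to_cutoff):
    """Delivered capacity in amp-hours for a constant-current discharge."""
    return load_amps * hours_to_cutoff

print(capacity_ah(0.5, 19.0))   # new batteries, ~19 h   -> 9.5 Ah
print(capacity_ah(0.5, 15.5))   # used batteries, 15-16 h -> 7.75 Ah
print(capacity_ah(0.5, 6.5))    # degraded, 6-7 h         -> 3.25 Ah
```

So the degradation described later in this post amounts to roughly a two-thirds loss of delivered capacity versus the new-battery baseline.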
Each set of one used and one new battery was designated for either the Primary or the Charge position throughout all the testing over the past 15 months. This designation was constant, and the batteries were never switched between the Primary and Charge positions.
During this period the SSG testing used a single battery for the Charge side and twin batteries in parallel on the Primary side. So one Charge battery was typically being discharged, while the other was being pumped up by the SSG. The Primary batteries were always charged with either a 10A12, 2A12 or 1AU.
Diligent research on this forum in the early days of this SSG testing yielded the recommended maximum charge voltage of 15.3v and minimum discharge voltage of 12.3v. Initially, these limits were closely adhered to over many runs (more than 20 per Charge battery); however, over time it was noted that the Ah capacity of the Charge batteries was slowly degrading. Being a relative newbie to all this, the initial reaction was "what am I doing incorrectly here?" It was head-scratching time as the performance of both Charge batteries slowly diminished to 6 or 7 hours or less of discharge time. Other experimenters had noted this effect and claimed that continual use of the SSG in Mode 1 operation was destroying their batteries over time.
The SSG in Mode 1 was thought to improve battery performance over multiple runs, so this was a puzzle that needed thorough investigation. What evolved out of this was a parallel test program to determine what the maximum charge and minimum discharge levels should really be on a well-used Charge battery in order to have useful and valid results. The essential questions here are: why does this happen? Does the Mode 1 pulse charging alter or modify the basic plate chemistry? Can the Ah capacity be brought back to a reasonable level?
More on this soon, or to be continued...
Yaro