didn't charge enough?

Icebike

Senior Member
Apr 28, 2011
1,523
186
Despite my natural reluctance to rely on a sole source, Isadore has been my go-to guy for years on this subject, and I find MANY other sources rely on him as well (paraphrasing and outright plagiarising). This is also a useful article, despite the fact it covers several chemistries and was written about a decade ago (many of his articles are, btw):
http://www.buchmann.ca/chap6-page1.asp
:)


Well, your go-to guy never said this:

New cells will benefit from several full charge/full discharge cycles as the chemistry/electrons establish their favorite routes (paths of least resistance, aka 'break-in') under real load (constant voltage, C rate), which indeed happens across most battery chemistries.

Further, SOC and SOH are measurements made by the charge controllers, and are initially set by software to estimated values, NOT actual battery measurements. The charge controllers couldn't possibly know the State of Health unless they ran a full discharge, which I guarantee Acer does not take the time to do.

(Some large laptop batteries have a charge-measuring circuit embedded so you can simply push a button to get a reading. But even these are best guesses.)
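
To make the "estimated values, NOT actual battery measurements" point concrete, here is a minimal coulomb-counting sketch in Python. Everything in it (the class name, the 4000 mAh capacity, the 50% boot assumption) is invented for illustration; this is not any vendor's firmware, just the general shape of what a fuel gauge does.

class FuelGauge:
    """Toy coulomb counter; real gauges add voltage/temperature correction."""

    def __init__(self, design_capacity_mah=4000.0):
        # Preloaded from the datasheet at first boot: an estimate, not a measurement.
        self.learned_capacity_mah = design_capacity_mah
        self.charge_mah = 0.5 * design_capacity_mah  # assumed, never measured

    def update(self, current_ma, dt_hours):
        # current_ma > 0 while charging, < 0 while discharging.
        self.charge_mah += current_ma * dt_hours
        self.charge_mah = max(0.0, min(self.charge_mah, self.learned_capacity_mah))

    def soc_percent(self):
        # SOC is charge relative to the *learned* capacity. SOH can't be
        # computed at all until a full discharge pins down the real capacity.
        return 100.0 * self.charge_mah / self.learned_capacity_mah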

Production batteries are charged to 40%. They are not charged to 100% and then discharged to 40%. You made that part up. Nobody has time to do this when you ship a million batteries a month. It would serve no purpose, because the max and min charge states are not stored in the battery; they are stored in the charge controller's non-volatile RAM. So your full charge and full discharge would have to be done IN DEVICE, after assembly. I guarantee Acer doesn't do that either. Nor Apple, nor Samsung. They simply preload the charge controller with conservative standard values and let it learn the actual ones in normal usage.
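
Roughly what "let it learn" could look like, sketched in Python with invented numbers (the 4000 mAh preload, the 0.25 smoothing factor, and the measured cycles are all hypothetical):

def learn_capacity(learned_mah, observed_full_cycle_mah, smoothing=0.25):
    # Blend a newly observed full-cycle capacity into the stored estimate.
    # observed_full_cycle_mah = charge integrated between charge termination
    # and the low-voltage cutoff, i.e. one complete cycle under real load.
    return (1 - smoothing) * learned_mah + smoothing * observed_full_cycle_mah

capacity = 4000.0  # conservative preload in non-volatile RAM
for measured in (3850.0, 3900.0, 3870.0):  # hypothetical observed full cycles
    capacity = learn_capacity(capacity, measured)
    print(f"learned capacity: {capacity:.0f} mAh")

Until the device actually sees a full cycle, the conservative preload is all the controller has, which is exactly why the gauge can read oddly on a brand-new machine.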

The battery has protection circuits (CIDs) that set the de-facto lowest/highest charge states (because they take the battery offline). This is what charging controllers LEARN from. Most charge controllers do not actually let the battery's internal circuit trip; they step in ahead of time when they sense rapid drops in low-end voltage or very slow increases in high-end voltage.
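
In caricature, that "step in ahead of time" logic might look like this in Python (the voltage and slope thresholds here are invented; real controllers use calibrated, chemistry-specific tables):

def should_stop_discharge(voltage_v, dv_dt_v_per_min):
    # Rapid low-end drop: the knee of the discharge curve, just before
    # the pack's own protection would take the battery offline.
    return voltage_v < 3.3 and dv_dt_v_per_min < -0.02

def should_stop_charge(voltage_v, dv_dt_v_per_min):
    # Very slow high-end rise: the cell is effectively full, so terminate
    # before the protection circuit ever has to trip.
    return voltage_v > 4.15 and 0.0 <= dv_dt_v_per_min < 0.001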

The other inaccuracy common at Battery University is the claim that you need to shut off a device to charge it because parasitic discharge confuses the charger. This hasn't been the case for the last 10 years; it is handled entirely by modern charge controllers, which are designed for always-on equipment.
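
The usual trick, crudely sketched with hypothetical numbers: the current-sense resistor sits in series with the battery itself, so the gauge integrates net battery current, and a running system load can't confuse the count.

def net_battery_current_ma(charger_output_ma, system_load_ma):
    # Positive = battery actually charging; negative = battery supplying the load.
    return charger_output_ma - system_load_ma

print(net_battery_current_ma(1500, 600))  # 900 mA into the cell, device on
print(net_battery_current_ma(0, 600))     # -600 mA, running on battery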
 

alphawave7

Member
May 1, 2011
93
3
Well, your go-to guy never said this:

Not that I know of. Google 'battery break-in' or 'Li-Po break-in' for further reading.

Further, SOC and SOH are measurements made by the charge controllers, and are initially set by software to estimated values, NOT actual battery measurements. The charge controllers couldn't possibly know the State of Health unless they ran a full discharge, which I guarantee Acer does not take the time to do.

(Some large laptop batteries have a charge-measuring circuit embedded so you can simply push a button to get a reading. But even these are best guesses.)

Production batteries are charged to 40%. They are not charged to 100% and then discharged to 40%. You made that part up. Nobody has time to do this when you ship a million batteries a month. It would serve no purpose, because the max and min charge states are not stored in the battery; they are stored in the charge controller's non-volatile RAM. So your full charge and full discharge would have to be done IN DEVICE, after assembly. I guarantee Acer doesn't do that either. Nor Apple, nor Samsung.

Sounds plausible, but 40% of what? Perhaps it's better to say 'a given voltage' or 'partially charged'. I've no doubt QC suffers at the battery manufacturer, and I agree Acer and others don't futz with the cells whatsoever.

They simply preload the charge controller with conservative standard values and let it learn the actual ones in normal usage.

Which brings us back to our/the OP's interest. Our disagreement seems to be over the merits of 'break-in' procedures, and I would argue that several full charges and discharges are only going to improve the 'let it learn the actual ones in normal usage', both for the detection circuit and for the user him/herself wrt perceived monitor accuracy. Using a statistical metaphor, increased sampling improves accuracy.
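
To push the statistical metaphor, here is a toy Python simulation with synthetic numbers (the 3900 mAh "true" capacity and the noise level are made up): each full cycle yields one noisy capacity measurement, and the running average steadies as cycles accumulate.

import random

random.seed(1)
true_mah = 3900.0
measurements = [true_mah + random.gauss(0, 80) for _ in range(8)]  # 8 full cycles

running = [sum(measurements[:n]) / n for n in range(1, len(measurements) + 1)]
print([f"{m:.0f}" for m in running])  # estimate converges toward 3900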


The battery has protection circuits (CIDs) that set the de-facto lowest/highest charge states (because they take the battery offline). This is what charging controllers LEARN from. Most charge controllers do not actually let the battery's internal circuit trip; they step in ahead of time when they sense rapid drops in low-end voltage or very slow increases in high-end voltage.

Yes... the FETs. I believe they also trip on excessive temperature and/or resistance.

The other inaccuracy common at Battery University is the claim that you need to shut off a device to charge it because parasitic discharge confuses the charger. This hasn't been the case for the last 10 years; it is handled entirely by modern charge controllers, which are designed for always-on equipment.

Agreed. Isadore also takes hits for his lack of formal education, and for selling books as well as battery widgets and testing equipment. He must also be getting on in years, since most of his current writing is simply updates to his existing work.
 