Li-Ion and Li-Polymer batteries have become the dominant power source in today's portable electronic equipment. Their popularity stems from the fact that they offer the highest energy density by both weight and volume. Other benefits of this battery technology include durability and simpler system design, thanks to the high nominal cell operating voltage (3.6V or 3.7V). Unless an application requires very high charge/discharge currents (power tools, for example), the choice of a rechargeable Lithium-based battery for a consumer device is a given.
The latest developments in this battery technology focus primarily on two areas: higher float voltages and higher battery capacities. Both of these industry trends address the growing need for longer system operation at a time when the convergence (integration of multiple functions on one device) and performance of everyday consumer gadgets are draining batteries faster than ever.
Advancements in battery technology over the last few years have allowed a 10% average annual increase in capacity for a given battery pack form-factor. At the same time, batteries with higher capacities in absolute terms have also entered the market to enable acceptable system run-times in the myriad new, high-complexity handheld devices that have entered our lives over the last decade. Having more energy available to the system is great news for the consumer, since it prolongs the device's usable operation. However, batteries that store more energy also require more energy to be fully charged. With traditional charging methods, this translates to significantly longer battery charging times, an undesirable trait in the consumer market. Let's examine where this limitation comes from.
The rate of charge or discharge of a battery is commonly expressed in relation to its actual capacity, using a term known in the industry as the C-rate. For example, for a 1000mAh battery, a charge current of 1C (1000mA) will theoretically charge the battery in one hour.
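As a quick illustration of the arithmetic, the short C sketch below converts a C-rate and a battery capacity into a charge current and a theoretical charge time. The capacity and rate values are example figures only, and the calculation ignores coulombic losses and the constant-voltage taper phase, so real charge times are longer.

    #include <stdio.h>

    /* Theoretical charge time from capacity and C-rate.
     * Example values only; ignores charge-acceptance losses
     * and the constant-voltage tail of a real charge cycle. */
    int main(void)
    {
        double capacity_mah = 1000.0;  /* example: 1000mAh pack  */
        double c_rate       = 1.0;     /* example: 1C fast charge */

        double charge_ma = c_rate * capacity_mah;     /* 1C -> 1000mA  */
        double hours     = capacity_mah / charge_ma;  /* equals 1/c_rate */

        printf("Charge current: %.0f mA\n", charge_ma);
        printf("Theoretical charge time: %.2f h\n", hours);
        return 0;
    }

The same relation shows why capacity growth stretches charge times: charging a 1500mAh pack with the same 1000mA current takes 1.5 hours instead of one.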
According to most battery manufacturers, a typical battery is safely charged at a rate of 0.7C to 1.2C during the fast-charge phase. This is the charging phase at which the battery voltage is high enough (usually higher than 2.8V or 3.0V) for the cell to accept a high charge current (see Figure 1). When the battery voltage is below this so-called pre-conditioning level, only a fraction of the charge current (typically 0.1C) should be provided to the cell so that the deeply-discharged cell is replenished safely. Hence, in theory, as soon as the battery voltage rises above approximately 3.0V, system engineers would want to apply the desired 0.7C to 1.2C rate, which safely charges the battery in the shortest time possible. In practice, achieving this "optimization" poses a great challenge, and, as described in the following sections, accomplishing it requires radical changes to traditional charging-circuit design.
Figure 1: Typical Li-Ion charging profile
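To make the phase boundaries in Figure 1 concrete, here is a minimal C sketch of the current-selection logic a charger might follow. The 3.0V pre-conditioning threshold, the 0.1C pre-charge rate, and the 1C fast-charge rate are illustrative values taken from the text, not a definitive implementation; an actual design must follow the cell manufacturer's datasheet and also handle constant-voltage termination, temperature limits, and safety timers.

    #include <stdio.h>

    #define PRECHARGE_THRESHOLD_V 3.0  /* below this, cell is deeply discharged */
    #define PRECHARGE_RATE_C      0.1  /* gentle pre-conditioning rate          */
    #define FASTCHARGE_RATE_C     1.0  /* example rate from the 0.7C-1.2C band  */

    /* Select the charge current (mA) from the measured cell voltage.
     * Illustrative only: termination, thermal management, and fault
     * handling of a real charger are omitted. */
    double select_charge_current_ma(double cell_v, double capacity_mah)
    {
        if (cell_v < PRECHARGE_THRESHOLD_V)
            return PRECHARGE_RATE_C * capacity_mah;   /* pre-conditioning */
        return FASTCHARGE_RATE_C * capacity_mah;      /* fast charge      */
    }

    int main(void)
    {
        double capacity_mah = 1000.0;  /* example pack */
        printf("At 2.5V: %.0f mA\n", select_charge_current_ma(2.5, capacity_mah));
        printf("At 3.4V: %.0f mA\n", select_charge_current_ma(3.4, capacity_mah));
        return 0;
    }

For the example 1000mAh pack, this logic applies only 100mA below the pre-conditioning threshold and steps up to 1000mA once the cell can safely accept it, mirroring the profile shown in Figure 1.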