I often hear reporters and commentators use "energy" and "power" interchangeably, as if they were two words for the same physical parameter. I can understand that confusion to some extent. You have to know what you are talking about to use them properly, and energy and power are vague concepts to most of these folks. It's somewhat similar to the voltage/current confusion we discussed last week.
But it also worries me when I hear engineers use "energy" and "power" almost interchangeably. I assume that, in most of these cases, the engineer knows what the words really mean, so it's just a matter of verbal casualness.
Let's be clear. Energy is the ability to do work. Power is the rate at which energy is transferred and work is done. Put another way, energy is the time integral of power. The SI unit of power is the watt; the SI unit of energy is the joule (one watt-second), though watt-hours and other dimensional scalings of those base units are common in practice.
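The integral relationship is easy to see numerically. Here's a minimal sketch (the function name and the 5 W profile are just illustrative) that sums a sampled power profile to recover energy in joules:

```python
# Energy is the time integral of power: E = integral of P(t) dt.
# Approximate it with a simple Riemann sum over uniformly
# sampled power readings.

def energy_joules(power_watts, dt_seconds=1.0):
    """Approximate E = ∫P dt for samples spaced dt_seconds apart."""
    return sum(p * dt_seconds for p in power_watts)

# A steady 5 W load sampled once per second for one minute:
profile = [5.0] * 60
e = energy_joules(profile)        # 300 joules
print(e, "J =", e / 3600, "Wh")
```

The same 300 J could instead be delivered as 300 W for one second: identical energy, wildly different power.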
A battery and a lightning bolt sit at opposite poles of the power/energy continuum: the battery stores considerable energy but delivers it at modest power, while the lightning bolt delivers enormous power yet carries only a modest amount of total energy.
My concern is that, by being a little careless in the use of these ubiquitous and very necessary engineering terms, we risk getting fuzzy in our thinking about them. In many designs, the energy storage element (a battery or supercapacitor, for example) is filled by available energy at one rate, yet it is usually called on to deliver power at a much higher rate. Of course, there are reverse situations, where a burst of energy fills the reservoir and then is drawn down slowly over time. Regardless, you have two processes that are linked together but somewhat independent, though they must add up in the aggregate.
Consider an energy-harvesting subsystem used to charge the battery for a data logger. The harvested energy trickles into the battery over time, but that battery must deliver bursts of power to the logger when it is acquiring or transmitting data. If the accumulated harvested energy doesn't at least equal the integral of those power bursts (the energy they consume), it's an unsustainable situation.
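That budget check is simple arithmetic once the terms are kept straight. A back-of-the-envelope sketch, with entirely hypothetical numbers for the harvester and logger:

```python
# Hypothetical data-logger energy budget: the energy harvested in
# must cover the energy consumed by the power bursts out.

HARVEST_POWER_W  = 0.005   # 5 mW trickling in continuously
BURST_POWER_W    = 0.5     # 500 mW while transmitting
BURST_DURATION_S = 2.0     # each transmission lasts 2 s
BURSTS_PER_HOUR  = 10

# Energy in: average harvested power integrated over one hour.
energy_in_per_hour = HARVEST_POWER_W * 3600.0

# Energy out: burst power integrated over total burst time per hour.
energy_out_per_hour = BURST_POWER_W * BURST_DURATION_S * BURSTS_PER_HOUR

sustainable = energy_in_per_hour >= energy_out_per_hour
print(f"in = {energy_in_per_hour:.1f} J, out = {energy_out_per_hour:.1f} J, "
      f"sustainable = {sustainable}")
```

With these numbers roughly 18 J comes in per hour against 10 J going out, so the design closes; double the transmission rate and it no longer does.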
Or look at an AC line charger for a smartphone or laptop. It has to supply only a modest amount of power to charge the device within a reasonable time when the device is not in use, but it needs considerably more capability if it is to charge the device while it is in use. The former situation is determined mostly by the energy the charger can deliver over a broad stretch of time; the latter is an operational scenario, and thus it is driven more by power.
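The sizing difference falls out directly. A rough sketch with hypothetical numbers (a 40 Wh battery, a two-hour charge target, 25 W of draw in use; conversion losses ignored):

```python
# Hypothetical charger sizing: charging an idle device is an
# energy problem; charging a device in use is a power problem.

BATTERY_WH   = 40.0   # battery capacity, watt-hours
CHARGE_HOURS = 2.0    # target time to full charge
USE_POWER_W  = 25.0   # load while the device is running

# Device off: charger rating set by energy delivered over time.
charge_power_w = BATTERY_WH / CHARGE_HOURS        # 20 W

# Device on: charger must carry the load AND still charge.
in_use_charger_w = charge_power_w + USE_POWER_W   # 45 W

print(f"idle charging: {charge_power_w} W, "
      f"charging in use: {in_use_charger_w} W")
```

Same battery, same energy to replace, yet the in-use case more than doubles the required power rating.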
That's why it is important to use these two closely related parameters carefully and appropriately. In most situations, we accumulate energy, but we spend it as power in order to get useful work done through a combination of mechanical, chemical, or electrical means. This blurring is also what worries me about many green energy plans, however well intentioned. When you go through the energy capture/power spend reckoning, there's a real imbalance that must be acknowledged and addressed.
Have you ever been in a situation where the energy and power numbers were way out of balance, or misunderstanding and miscommunication about these two critical parameters led to design problems?