There are lots of options for energy-harvesting sources, such as localized heat or vibration, but one of the most pervasive possibilities is to grab some of that stray RF field energy that is all around us, from low frequencies into the gigahertz range. Hey, if you don't use it, it truly will go to waste. It will be absorbed by any materials in its path (causing imperceptible but widespread heating) or dissipated into space (perhaps continuing forever on its journey at 3 × 10⁸ meters/second).
That's why a recent development at Duke University looks interesting. It involves the use of highly engineered metamaterials to build a 900 MHz-to-DC transducer, shown in Figure 1. The researchers say their device achieves 37% efficiency. That's very impressive, especially when you consider that good solar cells reach around 10% and (for obvious reasons) are not usable 24/7.
Duke engineering students Alexander Katko (left) and Allen Hawkes show a waveguide containing a single power-harvesting metamaterial cell, which provides enough energy to power the attached green LED.
I have a quibble with the Duke University press release about this development. In trying to make the concept tangible to the audience, the writer says that, by using the metamaterial cells in series (see Figure 2), the device was able to produce an output of 7.3 V, which is higher than a standard USB port.
This five-cell metamaterial array developed by Duke engineers converts stray microwave energy, as from a WiFi hub, into more than 7 V with an efficiency of 36.8% -- comparable to that of a solar cell.
Even though that is factually correct, there's the implication that this metamaterial panel can act as a USB charger or similar power source. That would be nice. But anyone reading this column knows that, even though the device could deliver that voltage, the current level would be low: there simply isn't that much RF energy passing through the capture field. The full technical paper in Applied Physics Letters shows that the researchers produced about 100 mA into a 70-Ω load, which is very impressive.
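The reported numbers hang together, and it's worth a quick Ohm's-law sanity check. A minimal sketch, using only the figures quoted above (7.3 V, ~100 mA, 70 Ω):

```python
# Sanity check of the reported figures using Ohm's law.
# Values from the text: ~7.3 V output into a 70-ohm load.
v_load = 7.3     # volts, reported output
r_load = 70.0    # ohms, reported load

i_load = v_load / r_load         # current into the load, I = V/R
p_load = v_load ** 2 / r_load    # power delivered, P = V^2/R

print(f"current: {i_load * 1e3:.0f} mA")  # ~104 mA, consistent with the ~100 mA quoted
print(f"power:   {p_load * 1e3:.0f} mW")  # ~761 mW delivered to the load
```

So 7.3 V across 70 Ω implies roughly 104 mA and about three-quarters of a watt delivered, which is exactly why the current figure, not just the voltage, is the number that matters.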
The 7.3 V tag made me think about the love/hate relationship engineers have with voltage and current and thus with energy and power (the rate at which energy is delivered). Sometimes the specific value needed is determined by the laws of physics. If you want to ionize a gas (such as a neon tube) or jump a spark gap, you'll need several thousand volts but low current. When you want to do real work such as driving a motor, you'll want more current to deliver the power -- at a higher voltage to reduce I²R losses and increase overall efficiency.
By contrast, the voltage and current needed for a smartphone is dictated by the ICs that were designed for very low voltage, due to the imperative of low power consumption. In general, low-single-digit voltages are tough to work with efficiently -- not because of resistive losses, but because unavoidable diode drops of 0.6 to 0.8 V can take a big bite out of your available source voltage.
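That "big bite" is easy to see in numbers. A quick sketch, assuming a representative 0.7 V drop (mid-range of the 0.6 to 0.8 V cited above) against a few common rail voltages of my own choosing:

```python
def drop_fraction(v_supply, v_diode=0.7):
    """Fraction of the supply voltage lost across one diode drop."""
    return v_diode / v_supply

# Common rails, highest to lowest; the lower the rail, the bigger the bite.
for rail in (12.0, 5.0, 3.3, 1.8):
    print(f"{rail:4.1f} V rail: {drop_fraction(rail):.0%} lost to a 0.7 V drop")
```

At 12 V a diode drop is a rounding error; at 1.8 V it eats nearly 40% of the rail, which is why low-voltage designs go to such lengths (Schottky diodes, synchronous rectification) to avoid it.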
How do you choose the voltage and current values to use? As in most engineering situations, the answer is clear: It depends. For some situations, such as gas ionization or the smartphone, you have little choice; the numbers are dictated by physics, available components, or industry standards. In other cases, the engineer has the flexibility to choose (within limits). It's a matter of finding the voltage/current combination that works best in terms of power delivery, system efficiency, available components, safety requirements (which kick in at different levels), and cost.
Have you had to analyze and select operating voltage and current levels? How did you come to a decision on balancing the unavoidable tradeoffs?