Don't forget to have some form of transient protection for the energy burst you'll get from a nearby lightning strike.
If you're near enough to a high-voltage hydro line you can light up as many fluorescent bulbs as you want without any circuitry whatsoever. With a little ingenuity you could power your whole house. But that would be stealing hydro power.
Actually, running the numbers shows that even being relatively close to a very high-power TV transmitter won't give you much energy-harvesting potential at all. Unless you can build a spherical antenna around the tower, I suppose.
UHF digital TV in the US is limited by the FCC to a maximum of 1 MW effective radiated power. So let's say you're in the beam of a 600 MHz transmitter, a mere 1 mile away, with a unity-gain antenna. How much power can you hope to harvest from that 1 MW signal, with a circuit tuned to 600 MHz?
At 1 mile, with direct line of sight, the channel loss is 92.15 dB. So the maximum theoretical power you can harvest with this unity-gain receive antenna is a mere 0.6 mW. Increase the distance to 1.5 miles and you're down to roughly 0.27 mW, not much use for powering devices.
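For anyone who wants to check that arithmetic, here's a minimal Python sketch of the link budget (plain Friis free-space model; antenna gains, polarization mismatch, and rectifier losses are all idealized away):

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Friis free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

TX_DBM = 90.0          # 1 MW ERP = 90 dBm
MILE_KM = 1.609344

for miles in (1.0, 1.5):
    loss = fspl_db(miles * MILE_KM, 600.0)   # 600 MHz UHF channel
    rx_dbm = TX_DBM - loss                   # unity-gain receive antenna
    print(f"{miles} mi: loss {loss:.2f} dB, harvestable {10 ** (rx_dbm / 10):.2f} mW")
```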
Probably the biggest problem when thinking about 'harvesting' wireless (or acoustic - it's the same problem) energy: there are a lot of frequencies present at once. To harvest you have to rectify at some point, and the mean amplitude of the superposed frequencies tends to be 0 :(
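A toy simulation makes the point; here's a quick sketch (tone count, frequencies, and amplitudes are arbitrary, just to illustrate):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100_000)

# Superpose 50 tones with random frequencies, amplitudes, and phases.
sig = sum(
    rng.uniform(0.1, 1.0)
    * np.sin(2 * np.pi * rng.uniform(50.0, 5000.0) * t + rng.uniform(0.0, 2 * np.pi))
    for _ in range(50)
)

print(f"mean of raw superposition: {sig.mean():+.4f}")          # ~0, nothing to harvest directly
print(f"mean after rectification:  {np.abs(sig).mean():+.4f}")  # > 0, usable DC after a rectifier
```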
Another thought: harvesting acoustic energy would be really great. Currently we've got noise-protection embankments here (Germany, Europe), but noise reduction by energy harvesting would be far more elegant. The problems remain, though: tuning and mean amplitude.
The thing I like about using RF is that there is so much around, from broadcast etc., that is truly "wasted". The thing I don't like is that it is a very weak source despite its pervasiveness, as you point out. And as another commenter pointed out, if you are not careful, the random field you absorb may be the energy that someone else's handset or similar needed to see, so they are unfortunately blocked.
The old story: there are no free lunches, thanks to the laws of physics, but some lunches are less expensive than others or cause less indigestion.
I did a cursory review of the APL paper. These guys surely have not revolutionized physics, and there is no real need to worry about WiFi network performance. To clarify things, here's what I found:
1. The 7.3 V figure is open-circuit (OC), so the power delivered at that point is 0.
2. They fed 24.25 dBm (ca. 266 mW) into a waveguide, thus losing no significant amount of the power fed.
3. Assuming that the optimum load resistance of ca. 70 Ω implies a 100 mA current (7.3 V / 70 Ω ≈ 100 mA) is not their fault. BTW: the picture shows a green LED shining not too brightly. That would be more like 10 mA @ 2.4 V -> 24 mW.
4. They state a conversion efficiency of ca. 14.2 % for a single cell and 36.8 % for a 5x1 array. That's surely not bad for coreless wireless transmission, but think about what a transformer design could have achieved.
5. Luckily they did an experiment: the simulation gave them an efficiency of twice the values they could confirm experimentally. That's how myths are born :)
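For what it's worth, the arithmetic above is easy to check; a quick sketch (the 10 mA @ 2.4 V LED operating point is my guess, as noted in point 3):

```python
fed_mw = 10 ** (24.25 / 10)                    # 24.25 dBm fed into the waveguide
print(f"input power:         {fed_mw:.0f} mW")  # ~266 mW

print(f"guessed LED power:   {0.010 * 2.4 * 1000:.0f} mW")  # 10 mA @ 2.4 V -> 24 mW

print(f"single cell @ 14.2%: {0.142 * fed_mw:.1f} mW")      # ~37.8 mW harvested
print(f"5x1 array @ 36.8%:   {0.368 * fed_mw:.1f} mW")      # ~97.9 mW harvested
```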
Anyway, I hope they had fun.
Dreaming of powering sensors that way? Stick to µW power consumption, as you would not want to have to supply a field of 2 mW/cm².
The place where this technology will shine is ubiquitous sensors. I've seen proof-of-concept sensor applications using such an approach. It puts remote sensors (strain gauges) on bridges to safeguard against events like the collapse of the I-35W bridge across the Mississippi River in Minneapolis a few years ago. This obviates the issue of how you distribute power to all the sensors, provides the wireless comms, and likely leads to SoCs for the Internet of Things that have inherent power supplies. Imagine smoke detectors that never need new batteries! Thermostats in every room of your house, and zone control by room rather than tied to where the LV wires were run when the house was built.
" It will likely produce a dropout of the field in the immediate area of the device. Widespread use of these absorbent resonators could make the wifi non functional for some users. "
That would be amazing, but I'm sure that's not true. In a typical communications link budget, 10 dBm is delivered to the transmitting antenna. The signal is radiated in all directions. A good antenna across the room will be lucky to capture -40 dBm (i.e. ten millionths of the transmitted power). It follows the free-space loss equation. You can get more if the receiving antenna is in the near field (less than 5 inches for WiFi). But the notion that a receiving antenna is going to suck up all the power and deliver it to the receiver is wrong. That would be an amazing thing for the communications industry.
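The same free-space arithmetic backs this up at WiFi frequencies; a rough sketch, assuming the 2.4 GHz band and a 5 m "across the room" distance:

```python
import math

C = 299_792_458.0          # speed of light, m/s
F_HZ = 2.4e9               # assumed 2.4 GHz WiFi band

wavelength_m = C / F_HZ    # ~0.125 m, i.e. about 5 inches
print(f"near-field boundary: ~{wavelength_m / 0.0254:.1f} in")

tx_dbm = 10.0              # power delivered to the transmit antenna
d_m = 5.0                  # assumed across-the-room distance

# Friis free-space loss (distance in km, frequency in MHz)
loss_db = 20 * math.log10(d_m / 1000) + 20 * math.log10(F_HZ / 1e6) + 32.45
print(f"received: {tx_dbm - loss_db:.1f} dBm")   # ~ -44 dBm, i.e. tens of nanowatts
```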
To me the application for this is directing a "safe" level of RF energy at the device that needs power. You direct maybe one watt at it and hope to recover a milliwatt. The rest of the power goes to heating up water molecules. Kooks currently complain that 100 mW transmitters cause health problems. Wait until we're intentionally coupling 1 W across people's dinner tables to power tablets and phones.