There seems to be some excitement growing around the subject of wireless charging, with the various industry associations, standards-setting bodies and companies jostling for position. The names and groups include the Wireless Power Consortium, Alliance for Wireless Power (A4WP), Qualcomm, Samsung and Intel.
I remember getting a little excited about the topic myself a few years ago when a company called Splashpower Ltd. emerged out of Cambridge, England. But Splashpower was a bit ahead of the market, and it has since disappeared.
However, the more I study the topic the more I am inclined to say forget it. Do the right thing and use a wired charger for reasons of ecology, because wireless charging is not as energy efficient as wired charging. Last I heard, typical wireless energy-transfer efficiency is about 70 percent, rising to 80 to 85 percent with careful design, more copper and better shielding. But it is hard to imagine it ever being as efficient – or as green – as wired charging.
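The gap is easy to put in round numbers. A minimal back-of-envelope sketch, using the 70 percent wireless figure above and assuming, purely for illustration, a 10 Wh phone battery and a 90 percent efficient wired charger:

```python
# Wall energy needed to put 10 Wh into a phone battery.
# All figures are illustrative assumptions, not measurements.
battery_wh = 10.0      # assumed usable battery capacity
wired_eff = 0.90       # assumed wired-charger efficiency
wireless_eff = 0.70    # typical wireless figure cited above

wired_draw = battery_wh / wired_eff        # energy drawn from the wall, wired
wireless_draw = battery_wh / wireless_eff  # energy drawn from the wall, wireless

extra_pct = (wireless_draw - wired_draw) / wired_draw * 100
print(f"wired: {wired_draw:.1f} Wh, wireless: {wireless_draw:.1f} Wh "
      f"(+{extra_pct:.0f}% wall energy)")
```

Under those assumptions the wireless route pulls roughly a quarter more energy from the wall per charge, which is the nub of the efficiency argument.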
There is a counter-argument that runs thus: if users of multiple separate chargers leave them plugged in 24 hours a day, seven days a week, each charger will consume power even when it is not charging an appliance. A single wireless charging platter that copes with multiple pieces of equipment, switched on and off appropriately, could represent a power saving.
I don't buy it. Instead I would say do the right thing and plug in a wired charger for the time it is needed and then disconnect it.
There are parallels with the invention of the standby button. Philips – which has since got out of consumer electronics – used to claim this invention and that it was a great boon to mankind because of the power it saved.
I reckon the opposite is true. Until the standby button was invented, people turned off appliances that were not in use and even unplugged them. Once the standby switch was invented, televisions were left on overnight and sometimes are not switched off for months or years at a time, all the while drawing some power.
Finding enough energy to do all the things a global population of 7 billion wants to do, at a cost it can afford, is a major challenge to humanity, so "wasting" energy should become one of the big sins of our era.
Of course, one person's waste is another person's convenience and yet another's necessity. But in general I would say do the right thing, and don't use an inherently inefficient technology when there is a more efficient established alternative.
P.S. This maxim also applies to writing inefficient software that makes unnecessary fetches from memory or carelessly uses processor cycles to achieve overblown functionality or graphics. But the authors of code for mobile applications at least have a commercial imperative to do the right thing.
The thing that you leave unmentioned is that the wireless ecosystem that was supposed to come with wireless charging, i.e., 60GHz fat pipes, is also dead in the water as a consequence. If you have to plug in your smartphone to charge, why wouldn't you use the same plug to stream video to your TV?
Many people believe the killer app for wireless charging is for the automakers to embed it in the center consoles of cars. From an energy use perspective, there is no waste there -- the electricity will be generated anyway as a byproduct of the engine running.
For home or office charging, the concept of a charging mat that plugs into the mains doesn't seem like a huge benefit to me. But if charging mats get embedded into office desktops or kitchen countertops -- with a permanent connection to the mains -- it starts to get more interesting.
As for efficiency, your comment about "do the right thing and plug in a wired charger for the time it is needed and then disconnect it" sounds great, but I suspect few people actually do that -- not because of lack of concern about wasting energy, but more out of concern that chargers have a habit of disappearing when not plugged in.
Standard wall wart chargers could indeed be made more efficient by becoming smarter, but that seems unlikely. Who is going to add intelligence to a $5 wall wart in the interest of energy conservation, and which consumers will pay extra for that?
I really don't understand the value proposition for wireless charging. I still need to bring my device into close proximity to the wireless charger. How is that different from connecting it to a wired charger? Are we really so lazy that we want to save ourselves the effort of pushing in the plug?
I think the argument goes something like this: I have four phones in the house, a tablet (or two), a cordless phone (or two), a laptop, a remote, several battery-operated kids' toys and a plethora of chargers to support them all. If we all played nice and built a one-size-fits-all charger – which is what wireless charging is trying to be – I think your argument applies.
Why do you believe that efficiencies won't get better with further development like they do everywhere else over time?
Think of the possibilities if every AA or AAA battery in your house were wirelessly rechargeable. That seems eco-friendly enough for all, compared with the billions of batteries that get dumped in landfills every year. Many more products out there simply don't come with chargers than those that do ...
The argument that the electricity generated in the car is somehow free is surely spurious.
If you put a load on -- to charge phones, or run the air conditioning -- you reduce the miles per gallon.
You're right, it's not completely free, but it should be negligible compared to the load placed on the alternator by the rest of the car's electrical systems -- headlights, heater, audio system and the various electronic control systems.
A/C is different – it places a substantial mechanical load on the engine when the compressor is on – and indeed running the A/C has a measurable effect on mpg.
It would be an interesting calculation -- the actual mpg cost of charging a cell phone -- but I suspect that driving habits have a far greater impact on mpg, for better or worse, than charging a phone.
Peter does make a good point, though. Negligible is true enough, perhaps, but alternators do place a load on the engine.
Many years ago now, one of our neighbors had a brand new car, and complained that the battery was running down. The charging light was coming on when the car was running.
Since the car was brand new, my first reaction was to tug a little on the fan belt. Yup. It wasn't very tight.
A simple adjustment solved the problem. Point being, that alternator wasn't turning properly because the belt wasn't tight. It was in place okay, but it was slipping. That's engine load.
I agree also with iniewski. Are we really that lazy?
A one-size-fits-all wired charger would be a good thing, although it goes contrary to the best interests of individual gadget-producing companies, who would much prefer to add the revenues generated by a proprietary charger, and possible future replacements, to their bottom lines.
I was astounded, for example, to discover that I couldn't find a replacement charger cord tip to fit an old appliance we had. Radio Shack sells a bunch of these in assorted sizes, and yet not a single one worked. I wonder why.
Anyway, that aside, I'd say wireless charging is bound to waste energy compared to wired – not just through radiation, but also in the extra circuits needed for transmission and reception.
You'd think that a load-sensing shutoff on a wired charger would be doable. It wouldn't be free, but in huge volumes it shouldn't amount to much of a price hike.
A quick search on the subject of mpg cost of running an alternator reveals calculations varying from 1 to 1.5 HP when the alternator is putting out 50 amps -- a substantial load for a passenger car (excluding EVs of course). Let's assume this means a properly working alternator, no belt slipping, etc.
On a modest 4-cylinder with 100 HP, we're talking a 1-1.5% HP loss to run the alternator at a 50A load. A 5 watt cell phone charger will pull less than 0.5A from that 12V regulator. Let's be generous and call it an amp – it's still only 2% of the alternator output, which translates to a 0.03% loss of engine HP due to running the alternator.
In the usual manner of engineering approximations, I'll go out on a limb and say that a 0.03% loss of HP translates to a 0.03% loss of mpg and that you will never notice that.
So when it comes to charging your phone or even your big tablet in the car, charge away! The cost in extra gasoline is almost immeasurable.
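Frank's arithmetic above can be sketched directly. The 100 HP engine, 1.5 HP alternator load at 50A, and 1A charger draw are the illustrative figures from his comment, not measured values:

```python
# Back-of-envelope: fraction of engine power spent charging a phone.
# All numbers are the illustrative assumptions from the comment above.
engine_hp = 100.0        # modest 4-cylinder
alt_hp_at_50a = 1.5      # HP needed to drive the alternator at a 50 A load
alt_load_a = 50.0        # alternator output at that load
charger_a = 1.0          # generous estimate for a 5 W phone charger at ~12 V

charger_share = charger_a / alt_load_a         # charger's share of alternator output
hp_for_charger = alt_hp_at_50a * charger_share # HP attributable to the charger
pct_of_engine = hp_for_charger / engine_hp * 100
print(f"{hp_for_charger:.2f} HP, i.e. {pct_of_engine:.2f}% of engine output")
```

On those assumptions the charger costs about 0.03 HP, which supports the "charge away" conclusion.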
Frank, an efficiently designed car, running at 50 mph, requires about 12 to 14 HP to maintain a steady speed. So if the alternator needs 1.5 HP to turn (thanks for the number, I didn't have it at hand), that's a substantial 10-12 percent of the power the engine is actually delivering at that speed.
Some time ago, Car and Driver used to post two figures with their road tests: the HP required at 50 and at 70 mph. Obviously, the number is higher at 70.
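The fraction in the comment above is quick to check, using the 12-14 HP cruise figures and the 1.5 HP alternator figure quoted in this thread:

```python
# Alternator load as a share of power delivered at a 50 mph cruise.
# 12-14 HP cruise demand and 1.5 HP alternator load are figures from the thread.
alt_hp = 1.5
for cruise_hp in (12.0, 14.0):
    share = alt_hp / cruise_hp * 100
    print(f"at {cruise_hp:.0f} HP cruise: alternator takes {share:.1f}% of output")
```

That gives roughly 11-13 percent at cruise, so both commenters are right: a tiny fraction of peak engine power, but a noticeable slice of what the engine is delivering at steady speed.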