There seems to be some excitement growing around the subject of wireless charging, with various industry associations, standards-setting bodies and companies jostling for position. The players include the Wireless Power Consortium, the Alliance for Wireless Power (A4WP), Qualcomm, Samsung and Intel.
I remember getting a little excited about the topic myself a few years ago, when a company called Splashpower Ltd. emerged out of Cambridge, England. But we were both a bit ahead of the market, and the company has since disappeared.
However, the more I study the topic the more I am inclined to say: forget it. Do the right thing and use a wired charger, for reasons of ecology. Wireless charging is simply not as energy efficient as wired charging. Last I heard, typical wireless energy-transfer efficiency is about 70 percent, rising to 80 to 85 percent with careful design, more copper and better shielding. But it is hard to imagine it ever being as efficient – or as green – as wired charging.
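As a rough back-of-envelope sketch of what that efficiency gap means, here is a small calculation. The 70 percent wireless figure is the one quoted above; the wired efficiency, battery size and charging frequency are my own illustrative assumptions:

```python
# Back-of-envelope: wall energy drawn per year to charge one phone,
# wired versus wireless. The 70% wireless figure is the one quoted above;
# the 85% wired figure, 10 Wh battery and once-a-day charging are assumptions.

BATTERY_WH = 10.0          # assumed phone battery capacity, watt-hours
CHARGES_PER_YEAR = 365     # assumed: one full charge per day

def wall_energy_kwh(efficiency):
    """Annual energy drawn from the wall at a given transfer efficiency."""
    return BATTERY_WH * CHARGES_PER_YEAR / efficiency / 1000

wired = wall_energy_kwh(0.85)      # assumed wired charger efficiency
wireless = wall_energy_kwh(0.70)   # typical wireless transfer, as quoted

print(f"Wired:    {wired:.2f} kWh/year")
print(f"Wireless: {wireless:.2f} kWh/year")
print(f"Extra:    {wireless - wired:.2f} kWh/year per phone")
```

On those assumptions the waste per handset is about a kilowatt-hour a year – small in isolation, but not when multiplied across hundreds of millions of phones.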
There is a counter-argument that runs thus: if users of multiple separate chargers leave them plugged in 24 hours a day, 7 days a week, each one consumes power even when it is not charging an appliance. A single wireless charging platter that serves multiple pieces of equipment, switched on and off appropriately, could therefore represent a power saving.
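To put rough numbers on that counter-argument, here is a minimal sketch; every draw figure in it is an assumption for illustration, not a measurement:

```python
# Sketch of the counter-argument: several wall chargers left plugged in
# all year versus one wireless platter powered only while charging.
# All figures below are assumptions for illustration.

HOURS_PER_YEAR = 24 * 365

n_chargers = 5            # assumed chargers left permanently plugged in
standby_w = 0.3           # assumed standby draw per charger, watts

platter_w = 5.0           # assumed platter draw while active, watts
platter_h = 3 * 365       # assumed: platter switched on 3 hours a day

idle_chargers_kwh = n_chargers * standby_w * HOURS_PER_YEAR / 1000
platter_kwh = platter_w * platter_h / 1000

print(f"Five idle wall chargers: {idle_chargers_kwh:.1f} kWh/year")
print(f"One switched platter:    {platter_kwh:.1f} kWh/year")
```

On those numbers the platter does come out ahead – but only because the wired chargers are assumed to stay plugged in around the clock, which is exactly the habit I would rather see change.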
I don't buy it. Instead, I would say do the right thing: plug in a wired charger for the time it is needed and then disconnect it.
There are parallels with the invention of the standby button. Philips – which has since got out of consumer electronics – used to claim credit for this invention, arguing that it was a great boon to mankind because of the power it saved.
I reckon the opposite is true. Until the standby button was invented, people turned off appliances that were not in use, and even unplugged them. Once the standby switch arrived, televisions were left on overnight, and some are not switched off for months or years at a time, drawing power all the while.
Finding enough energy to do all the things a global population of 7 billion wants to do, at a cost it can afford, is a major challenge for humanity, so "wasting" energy should become one of the big sins of our era.
Of course, one person's waste is another person's convenience and yet another's necessity. But in general I would say do the right thing, and don't use an inherently inefficient technology when there is a more efficient established alternative.
P.S. This maxim also applies to writing inefficient software that makes unnecessary fetches from memory or carelessly uses processor cycles to achieve overblown functionality or graphics. But the authors of code for mobile applications at least have a commercial imperative to do the right thing.
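To make that concrete, here is a toy example of the kind of waste I mean – an invariant value needlessly re-fetched and re-computed inside a loop. It is entirely made up, not from any real codebase:

```python
import math

def wasteful(samples, config):
    """Re-fetches the config and redoes the sqrt on every iteration."""
    out = []
    for s in samples:
        scale = math.sqrt(config["gain"])  # same result every time
        out.append(s * scale)
    return out

def frugal(samples, config):
    """Hoists the invariant work out of the loop; same result, less waste."""
    scale = math.sqrt(config["gain"])
    return [s * scale for s in samples]
```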
You're right, it's not completely free, but it should be negligible compared to the load placed on the alternator by the rest of the car's electrical systems -- headlights, heater, audio system and the various electronic control systems.
A/C is different -- it places a substantial mechanical load on the engine when the compressor is on -- and indeed running the A/C has a measurable effect on mpg.
It would be an interesting calculation -- the actual mpg cost of charging a cell phone -- but I suspect that driving habits have a far greater impact on mpg, for better or worse, than charging a phone.
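For what it's worth, here is a rough version of that calculation. Every constant in it is an assumption (charger draw, alternator and engine efficiencies, speed, baseline mpg) apart from the standard 33.7 kWh energy content of a US gallon of gasoline, so treat the answer as an order-of-magnitude guess:

```python
# Order-of-magnitude estimate: fuel cost of one hour of phone charging
# in a car. All constants are assumptions except the gallon energy content.

CHARGER_W = 5.0         # assumed draw of a phone charger, watts
ALTERNATOR_EFF = 0.55   # assumed belt-to-electrical efficiency
ENGINE_EFF = 0.30       # assumed fuel-to-shaft thermal efficiency
GALLON_KWH = 33.7       # energy content of a US gallon of gasoline
SPEED_MPH = 60.0        # assumed cruising speed
BASE_MPG = 30.0         # assumed baseline fuel economy

hours = 1.0
fuel_kwh = (CHARGER_W * hours / 1000) / (ALTERNATOR_EFF * ENGINE_EFF)
extra_gal = fuel_kwh / GALLON_KWH
base_gal = SPEED_MPH * hours / BASE_MPG

print(f"Extra fuel for 1 h of charging: {extra_gal:.5f} gal")
print(f"Share of fuel burned at 60 mph: {100 * extra_gal / base_gal:.3f}%")
```

On those assumptions the extra fuel is under a thousandth of a gallon per hour -- a few hundredths of a percent of consumption -- so driving habits really do swamp it.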
The argument that the electricity generated in the car is somehow free is surely spurious.
If you put a load on -- to charge phones, or run the air conditioning -- you reduce the miles per gallon.
I think the argument goes something like this: I have 4 phones in the house, a tablet (or two), a cordless phone (or two), a laptop, a remote, several battery-operated kids' toys and a plethora of chargers to support them all. If we all played nice and built a one-size-fits-all charger -- which is what wireless charging is trying to be -- I think your argument applies.
Why do you believe that efficiencies won't improve with further development, as they do everywhere else over time?
Think of the possibilities if every AA or AAA battery in your house was wirelessly rechargeable. I'd say that's eco-friendly enough for anyone, compared with the billions of batteries that get dumped in landfills every year. And many, many more products out there simply don't come with chargers than those that do ...
I really don't understand the value proposition for wireless charging. I still need to bring my device into close proximity to the wireless charger. How is that different from connecting it to a wired charger? Are we really so lazy that we want to save ourselves the effort of pushing in the plug?
Many people believe the killer app for wireless charging is for the automakers to embed it in the center consoles of cars. From an energy use perspective, there is no waste there -- the electricity will be generated anyway as a byproduct of the engine running.
For home or office charging, the concept of a charging mat that plugs into the mains doesn't seem like a huge benefit to me. But if charging mats get embedded into office desktops or kitchen countertops -- with a permanent connection to the mains -- it starts to get more interesting.
As for efficiency, your comment about "do the right thing and plug in a wired charger for the time it is needed and then disconnect it" sounds great, but I suspect few people actually do that -- not because of lack of concern about wasting energy, but more out of concern that chargers have a habit of disappearing when not plugged in.
Standard wall wart chargers could indeed be made more efficient by becoming smarter, but that seems unlikely. Who is going to add intelligence to a $5 wall wart in the interest of energy conservation, and which consumers will pay extra for that?
The thing that you leave unmentioned is that the wireless ecosystem that was supposed to come with wireless charging, i.e., 60GHz fat pipes, is also dead in the water as a consequence. If you have to plug in your smartphone to charge, why wouldn't you use the same plug to stream video to your TV?