There seems to be some excitement growing around the subject of wireless charging, with the various industry associations, standards-setting bodies and companies jostling for position. The names and groups include the Wireless Power Consortium, the Alliance for Wireless Power (A4WP), Qualcomm, Samsung and Intel.
I remember getting a little excited about the topic myself a few years ago when a company called Splashpower Ltd. emerged out of Cambridge, England. But both the company and I were a bit ahead of the market, and Splashpower has since disappeared.
However, the more I study the topic, the more I am inclined to say forget it. Do the right thing and use a wired charger, for reasons of ecology: wireless charging is simply not as energy efficient as wired charging. Last I heard, typical wireless energy transfer efficiencies are about 70 percent, rising to 80 to 85 percent with careful design, more copper and better shielding. But it is hard to imagine it ever being as efficient – or as green – as wired charging.
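To put those percentages in perspective, here is a back-of-the-envelope sketch of the extra energy drawn per charge. The battery size and the wired-charger efficiency are my own illustrative assumptions, not figures from the article; only the 70 percent wireless figure comes from the text above.

```python
# Back-of-the-envelope comparison of wired vs wireless charging losses.
# Assumed figures (not from the article): a ~12 Wh phone battery and
# ~90% wall-to-battery efficiency for a wired charger. The 70% wireless
# efficiency is the typical figure quoted above.
BATTERY_WH = 12.0
WIRED_EFF = 0.90      # assumption
WIRELESS_EFF = 0.70   # figure quoted in the article

wired_draw = BATTERY_WH / WIRED_EFF        # energy drawn from the wall
wireless_draw = BATTERY_WH / WIRELESS_EFF
extra_wh = wireless_draw - wired_draw

print(f"Wired draw:    {wired_draw:.1f} Wh per charge")
print(f"Wireless draw: {wireless_draw:.1f} Wh per charge")
print(f"Extra energy:  {extra_wh:.1f} Wh per charge "
      f"({extra_wh / wired_draw:.0%} more)")
```

Under these assumptions a wireless pad draws roughly a quarter more energy from the wall for every charge, before any standby losses are counted.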
There is a counter argument that runs thus: if users of multiple separate chargers leave them plugged in 24 hours a day, seven days a week, each one draws power even when it is not charging an appliance. A single wireless charging platter that serves multiple pieces of equipment, switched on and off appropriately, could represent a power saving.
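The counterargument rests on the no-load ("vampire") draw of idle wall warts. A rough sketch of the annual cost of leaving them plugged in, using an assumed per-charger no-load draw that is my own illustrative figure rather than anything from the article:

```python
# Rough annual no-load ("vampire") draw for chargers left plugged in
# year-round. The 0.3 W per-charger figure is an assumption for
# illustration, not a number from the article.
HOURS_PER_YEAR = 24 * 365
WART_STANDBY_W = 0.3  # assumed no-load draw per wall wart

for n in (1, 4, 8):
    kwh = n * WART_STANDBY_W * HOURS_PER_YEAR / 1000
    print(f"{n} charger(s) idle all year: {kwh:.1f} kWh")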
I don't buy it. Instead I would say do the right thing and plug in a wired charger for the time it is needed and then disconnect it.
There are parallels with the invention of the standby button. Philips – which has since got out of consumer electronics – used to claim this invention, and that it was a great boon to mankind because of the power it saved.
I reckon the opposite is true. Until the standby button was invented, people turned off appliances that were not in use, and even unplugged them. Once the standby switch existed, televisions were left on overnight, and some are not switched off for months or years at a time, drawing power all the while.
Finding enough energy to do all the things a global population of 7 billion wants to do, at a cost it can afford, is a major challenge to humanity, so "wasting" energy should become one of the big sins of our era.
Of course, one person's waste is another person's convenience and yet another's necessity. But in general I would say do the right thing, and don't use an inherently inefficient technology when there is a more efficient established alternative.
P.S. This maxim also applies to writing inefficient software that makes unnecessary fetches from memory or carelessly uses processor cycles to achieve overblown functionality or graphics. But the authors of code for mobile applications at least have a commercial imperative to do the right thing.
In fact, wireless charging of electric cars will also happen. (Developmental versions of buses are already being charged this way at bus stops.) And the charging coils will be in the streets; this will save tons of weight, as only minimal battery capacity will be needed.
(A small emergency gas generator can be added to the car in case the street coils are broken or absent in places.)
That is partially true, Denis, but only in cold places and only at cold times. I live in Canada (a cold place by any standard) and use heating for only four months of the year. On top of that, natural gas heating is three times less expensive than electricity here. So I have no plans to use wireless charging to heat my house while charging my iPhones anytime soon ;-)
How about the EM energy emitted from the charging pad, especially inside a car? Can anyone tell me that the EM waves running freely inside the car cabin will not have any negative health effect on the people inside the car? In particular, some of those people could be infants or babies. If multiple power hogs have to be charged at the same time, the energy of the EM waves running around is substantial. I feel that wireless charging is like repeating the "no new wires, wireless" mantra that the whole tech world was evangelizing 12 years ago!
This power-saving business is a big joke to people who are heated by electric heaters.
During winter, when energy is scarce (hydroelectric output is reduced by freezing, solar energy is low, and power requirements are high), people who heat their homes with electricity and unplug their chargers or turn off their lights will only increase the power consumption of their heaters. In the end, you need N kW of power to heat your home at all times, and whether that power comes from your electric heaters or from various plugged-in electronic devices does not change a thing: all energy ends up as heat.
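The commenter's point can be sketched numerically: with resistive electric heating, electricity "wasted" indoors displaces heater output one for one while the heat is on, so only waste outside the heating season is a net loss. All figures here are my own illustrative assumptions, not numbers from the comments.

```python
# Sketch of the "all energy ends up as heat" argument, assuming
# resistive electric heating. Figures are illustrative assumptions:
# a four-month heating season and 10 kWh/year of charger losses.
heating_fraction = 4 / 12        # fraction of the year the heat is on
charger_waste_kwh = 10.0         # assumed annual charger losses

# During the heating season the waste displaces heater output,
# so only the off-season share is genuinely lost.
net_waste_kwh = charger_waste_kwh * (1 - heating_fraction)
print(f"Net wasted energy: {net_waste_kwh:.1f} kWh/year")
```

This is only a sketch: it assumes resistive heating (a heat pump would change the arithmetic) and ignores the price difference between electricity and gas raised in the comment above.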
And in summer, energy is abundant enough (solar is more efficient, hydroelectric is on, and power demand is low) for you to be able to waste some of it.
This power-saving business only applies to people who heat their homes with something other than electricity, and in my country gas-based central heating has become more or less illegal in new buildings due to the explosion hazard, so all new buildings are electrically heated.
My point: as long as we don't look at the full picture, "saving energy" by turning off a light bulb has no meaning.
As I look at the morass of charging wires running around my office to various objects, there's no way I'm going to unplug all of them when they are charged up, because the net result will be too many ending up uncharged, reducing battery life and ultimately costing a whole bunch of energy to replace. And unplugging the chargers when a device is charged means adding a slight hazard (multiply this by hundreds of millions of people and it's not so slight): that of electrical shock, and that of fire from disturbing the outlet strip and its environment.
This whole obsession with energy efficiency has reached the level of a religion, with its tenets unquestioned. Energy generation is there to improve our lives, and the piddly energy savings from unplugging wall warts (or not using the less efficient near-field chargers) is just not enough to justify the effort, except for either the innumerate or the energy obsessed. If near-field charging reduces the clutter (not to mention the fire hazard), I'll use it even if it's 50% efficient.
Bad logic: wireless chargers consume at least as much as wall warts when plugged in, and if their value proposition is that it's easy to just drop your phone on a pad (i.e., the wireless charger is always plugged in), then it's even worse.