There seems to be some excitement growing around the subject of wireless charging, with the various industry associations, standards-setting bodies and companies jostling for position. The names and groups include the Wireless Power Consortium, Alliance for Wireless Power (A4WP), Qualcomm, Samsung and Intel.
I remember getting a little excited about the topic myself a few years ago when a company called Splashpower Ltd. emerged out of Cambridge, England. But we were both a bit ahead of the market, and the company has since disappeared.
However, the more I study the topic the more I am inclined to say forget it. Do the right thing and use a wired charger for reasons of ecology. That's because wireless charging is not as energy efficient as wired charging. Last I heard, typical wireless energy-transfer efficiency is about 70 percent, rising to 80 to 85 percent with careful design, more copper and better shielding. But it is hard to imagine it ever being as efficient -- or as green -- as wired charging.
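As a rough illustration of the gap, here is a back-of-envelope sketch. The battery capacity and the 90 percent wired / 70 percent wireless figures are my own illustrative assumptions, not measured numbers:

```python
# Back-of-envelope comparison of wall energy drawn per charge.
# Assumed figures (illustrative only): 10 Wh phone battery,
# 90% wired charger efficiency, 70% wireless transfer efficiency.
BATTERY_WH = 10.0
WIRED_EFF = 0.90
WIRELESS_EFF = 0.70

wired_wh = BATTERY_WH / WIRED_EFF        # energy drawn from the wall, wired
wireless_wh = BATTERY_WH / WIRELESS_EFF  # energy drawn from the wall, wireless
extra_per_year_kwh = (wireless_wh - wired_wh) * 365 / 1000  # one charge a day

print(f"wired:    {wired_wh:.1f} Wh per charge")
print(f"wireless: {wireless_wh:.1f} Wh per charge")
print(f"extra per year (daily charge): {extra_per_year_kwh:.2f} kWh")
```

A kilowatt-hour or so per phone per year is small for one user, but the argument in the article is about what it becomes multiplied across hundreds of millions of devices.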
There is a counter-argument that runs thus: if users of multiple separate chargers leave them plugged in 24 hours a day, seven days a week, they will each consume power even when not charging an appliance. A single wireless charging platter that copes with multiple pieces of equipment, switched on and off appropriately, could represent a power saving.
I don't buy it. Instead I would say do the right thing and plug in a wired charger for the time it is needed and then disconnect it.
There are parallels with the invention of the standby button. Philips, which has since exited consumer electronics, used to claim this invention and argue that it was a great boon to mankind because of the power it saved.
I reckon the opposite is true. Until the standby button was invented, people turned off appliances that were not in use; I even unplugged them. Once the standby switch existed, televisions were left on overnight, and some are not switched off for months or years at a time, all the while drawing power.
Finding enough energy to do all the things a global population of 7 billion wants to do at a cost it can afford is a major challenge to humanity so "wasting" energy should become one of the big sins of our era.
Of course, one person's waste is another person's convenience and yet another's necessity. But in general I would say do the right thing, and don't use an inherently inefficient technology when there is a more efficient established alternative.
P.S. This maxim also applies to writing inefficient software that makes unnecessary fetches from memory or carelessly uses processor cycles to achieve overblown functionality or graphics. But the authors of code for mobile applications at least have a commercial imperative to do the right thing.
The thing that you leave unmentioned is that the wireless ecosystem that was supposed to come with wireless charging, i.e., 60GHz fat pipes, is also dead in the water as a consequence. If you have to plug in your smartphone to charge, why wouldn't you use the same plug to stream video to your TV?
Many people believe the killer app for wireless charging is for the automakers to embed it in the center consoles of cars. From an energy use perspective, there is no waste there -- the electricity will be generated anyway as a byproduct of the engine running.
For home or office charging, the concept of a charging mat that plugs into the mains doesn't seem like a huge benefit to me. But if charging mats get embedded into office desktops or kitchen countertops -- with a permanent connection to the mains -- it starts to get more interesting.
As for efficiency, your comment about "do the right thing and plug in a wired charger for the time it is needed and then disconnect it" sounds great, but I suspect few people actually do that -- not because of lack of concern about wasting energy, but more out of concern that chargers have a habit of disappearing when not plugged in.
Standard wall wart chargers could indeed be made more efficient by becoming smarter, but that seems unlikely. Who is going to add intelligence to a $5 wall wart in the interest of energy conservation, and which consumers will pay extra for that?
The argument that the electricity generated in the car is somehow free is surely spurious.
If you put a load on -- to charge phones, or run the air conditioning -- you reduce the miles per gallon.
You're right, it's not completely free, but it should be negligible compared to the load placed on the alternator by the rest of the car's electrical systems -- headlights, heater, audio system and the various electronic control systems.
A/C is different -- a substantial mechanical load placed on the engine when the compressor is on -- and indeed running the A/C has a measurable effect on mpg.
It would be an interesting calculation -- the actual mpg cost of charging a cell phone -- but I suspect that driving habits have a far greater impact on mpg, for better or worse, than charging a phone.
Peter does make a good point, though. Negligible is true enough, perhaps, but alternators do place a load on the engine.
Many years ago now, one of our neighbors had a brand new car, and complained that the battery was running down. The charging light was coming on when the car was running.
Since the car was brand new, my first reaction was to tug a little on the fan belt. Yup. It wasn't very tight.
A simple adjustment solved the problem. The point being: that alternator wasn't turning simply because the belt wasn't tight. It was in place, but it was slipping. Driving it properly takes engine load.
I agree also with iniewski. Are we really that lazy?
A quick search on the subject of mpg cost of running an alternator reveals calculations varying from 1 to 1.5 HP when the alternator is putting out 50 amps -- a substantial load for a passenger car (excluding EVs of course). Let's assume this means a properly working alternator, no belt slipping, etc.
On a modest 4-cylinder with 100 HP, we're talking a 1-1.5% HP loss to run the alternator at a 50A load. A 5-watt cell phone charger will pull less than 0.5A from that 12V regulator. Let's be generous and call it an amp -- that's still only 2% of the alternator output, which translates to about 0.03 HP, or 0.03% of the engine's output.
In the usual manner of engineering approximations, I'll go out on a limb and say that a 0.03% loss of HP translates to a 0.03% loss of mpg and that you will never notice that.
So when it comes to charging your phone or even your big tablet in the car, charge away! The cost in extra gasoline is almost immeasurable.
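Frank's arithmetic above can be sketched end to end. All figures are taken from his comment; the 1 A charger draw is his deliberately generous estimate:

```python
# Reproduce the rough arithmetic from the comment above.
ALTERNATOR_HP = 1.5      # HP to drive the alternator at a 50 A load
ALTERNATOR_A = 50.0      # assumed alternator output
ENGINE_HP = 100.0        # modest 4-cylinder
CHARGER_A = 1.0          # generous estimate for a 5 W charger at 12 V

charger_share = CHARGER_A / ALTERNATOR_A          # fraction of alternator output
hp_for_charger = ALTERNATOR_HP * charger_share    # HP attributable to the charger
pct_of_engine = 100 * hp_for_charger / ENGINE_HP  # as a share of engine output

print(f"charger share of alternator: {charger_share:.0%}")
print(f"HP to charge the phone:      {hp_for_charger:.3f} HP")
print(f"as % of engine output:       {pct_of_engine:.2f}%")
```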
Frank, an efficiently designed car running at 50 mph requires about 12 to 14 HP to maintain a steady speed. So if the alternator needs 1.5 HP to turn (thanks for the number, I didn't have it at hand), that's a substantial 10-12 percent of the power the engine is delivering at that speed.
Car and Driver used to post two figures with their road tests, some time ago. The HP required at 50 and at 70. Obviously, the number is higher at 70.
I really don't understand the value proposition for wireless charging. I still need to bring my device into close proximity to the wireless charger. How is that different from connecting it to a wired charger? Are we really so lazy that we want to save ourselves the effort of pushing in the plug?
I think the argument goes something like this: I have four phones in the house, a tablet (or two), a cordless phone (or two), a laptop, a remote, several battery-operated kids' toys and a plethora of chargers to support them all. If we all played nice and built a one-size-fits-all charger (which is what wireless charging is trying to be), your argument would apply.
Why do you believe that efficiencies won't get better with further development like they do everywhere else over time?
Think of the possibilities if every AA or AAA battery in your house were wirelessly rechargeable. That seems eco-friendly enough, compared with the billions of batteries that get dumped in landfills every year. And many, many more products out there simply don't come with chargers than those that do ...
A one-size-fits-all wired charger would be a good thing, although it runs contrary to the best interests of individual gadget-producing companies, which would much prefer to add the revenues generated by a proprietary charger, and possible future replacements, to their bottom line.
I was astounded, for example, to discover that I couldn't find a replacement charger cord tip to fit an old appliance we had. Radio Shack sells a bunch of these in assorted sizes, and yet not a single one worked. I wonder why.
Anyway, that aside, I'd say wireless charging is bound to waste energy compared to wired -- not just through radiation, but also in the extra circuits needed for transmission and reception.
You'd think that a load-sensing shutoff on a wired charger would be doable. It wouldn't be free, but in huge volumes it shouldn't amount to much of a price hike.
Yes, we are lazy. Having a pad to throw my phone on at the end of the day is much easier than getting that little micro USB connected. On the other hand, I am very stingy with my money, so I will plug the little connector into the phone rather than pay extra in energy costs.
The universal charger already exists: the USB port. I have a smartphone, a tablet and an ordinary old mobile phone. All of them have USB charging ports. I have USB adaptors for the wall and the car, and since then I charge only through USB.
The efficiency question will never go away, but it will become less and less of a factor. Consider the required battery capacity of a phone. Years ago, very large batteries were required. The battery size dropped over time as the electronics became more efficient and the transmit power dropped.
Battery capacity took a big sawtooth back up with smartphones, but it will eventually work its way back down again. My pre-smartphone handsets could typically go about a week without being plugged into a charger. My smartphone needs to be charged every day.
Once we work back to that level, charging energy will be a much smaller factor. There are ways to deal with phantom power too: consider a system where the wireless charging plate draws only a few microamps when not charging anything. It may even be possible to disconnect it from the wall completely. I would venture a guess that most corded chargers are just left plugged into the wall.
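To put phantom power in perspective, here is a quick sketch. The standby figures are assumptions for illustration (a fraction of a watt for an idle wall wart, and the "few microamps" suggested above for a smart pad), not measurements of any real product:

```python
# Annual standby energy of an idle charger left plugged in, compared
# with a hypothetical charging pad that idles at a few microamps.
# Both standby figures are assumptions, not measured values.
MAINS_V = 230.0
HOURS_PER_YEAR = 24 * 365

idle_wall_wart_w = 0.3               # assumed no-load draw of a wall wart
smart_pad_idle_a = 5e-6              # "a few microamps" from the comment
smart_pad_idle_w = MAINS_V * smart_pad_idle_a

wart_kwh = idle_wall_wart_w * HOURS_PER_YEAR / 1000
pad_kwh = smart_pad_idle_w * HOURS_PER_YEAR / 1000

print(f"idle wall wart:  {wart_kwh:.2f} kWh/year")
print(f"micro-amp pad:   {pad_kwh:.5f} kWh/year")
```

On these assumptions, a micro-amp pad's idle draw is negligible next to a wall wart left plugged in, which is the point the comment is making.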
How about the EM energy emitted from the charging pad, especially inside a car? Can anyone assure me that the EM waves running freely inside the car cabin will have no negative health effect on the people inside -- some of whom could be infants or babies? If multiple power hogs have to be charged at the same time, the energy of the EM waves running around is substantial. Wireless charging feels like a repeat of the "no new wires, wireless" mantra the whole tech world was evangelizing 12 years ago!
If wireless chargers, through their convenience, catch on and become ubiquitous; and they are standardized to charge a multitude of devices; and if by eliminating the mechanical plug and socket they have much longer lifetimes than the wall warts and USB cables they replace: then might not wireless chargers reduce resource waste, and the energy wasted in the manufacture and distribution of the countless charging devices that currently come included with almost every commercial electronic device? Perhaps wireless chargers, though less efficient at their task, are potentially greener over their lifetimes when all factors are considered.
It IS possible that wireless chargers are greener than badly designed and badly deployed wired chargers that are left plugged in drawing a little power 24/7.
I agree there is the issue of a single piece of equipment made, packaged and shipped versus multiple chargers.
Bad logic: wireless chargers consume at least as much as wall warts when plugged in, and if their value proposition is that it's easy to just drop your phone on a pad (i.e., the wireless charger is always plugged in), then it's even worse.
As I look at the morass of charging wires running around my office to various objects, there's no way I'm going to unplug all of them when they are charged up, because the net result will be too many ending up uncharged, reducing battery life and ultimately costing a whole bunch of energy to replace. And unplugging the chargers when a device is charged adds a slight hazard (multiply this by hundreds of millions of people and it's not so slight): that of electrical shock, and of fire from disturbing the outlet strip and its environment.
This whole obsession with energy efficiency has reached the level of a religion, with its tenets unquestioned. Energy generation is there to improve our lives, and the piddly energy savings from unplugging wall warts (or not using the less efficient near field chargers) is just not enough to justify the effort except for either the innumerate or the energy obsessed. If the near field charging reduces the clutter (not to mention the fire hazard), I'll use it even if it's 50% efficient.
This power-saving business is a big joke to people who heat their homes with electric heaters.
During winter, when energy is scarce (hydro-electric is off due to water freezing, solar energy is low, and power requirements are high), people who use electricity to heat their homes and unplug their chargers or turn off their lights will only increase the power consumption of their heaters. In the end, you'll want N kW of power to heat up your home at all times, and whether that power comes from your electric-heaters or various plugged-in electronic devices will not change a thing: all energy ends up as heat.
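The heat-balance point can be sketched trivially. The 2 kW heat demand is an arbitrary illustrative figure:

```python
# During the heating season, every watt dissipated indoors by
# electronics is a watt the electric heater doesn't have to supply,
# so the total draw of the home is unchanged.
HEAT_DEMAND_W = 2000.0  # assumed steady heat loss of the home

def heater_power(other_indoor_loads_w: float) -> float:
    """Electric heater output needed to hold the indoor temperature."""
    return max(HEAT_DEMAND_W - other_indoor_loads_w, 0.0)

for gadgets_w in (0.0, 50.0):
    total = heater_power(gadgets_w) + gadgets_w
    print(f"gadgets {gadgets_w:4.0f} W -> total draw {total:.0f} W")
```

Whether 50 W goes through chargers or through the heater, the house draws the same 2 kW, which is exactly the commenter's point.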
And in summer, energy is abundant enough (solar is more efficient, hydroelectric is on, and power demand is low) for you to be able to waste some of it.
This power-saving business only applies to people who heat their homes with something other than electricity, and in my country, gas-based central heating has become more or less illegal in new buildings due to the explosion hazard, so all new buildings are electric-heated.
My point: as long as we don't look at the full picture, "saving energy" by turning off a light bulb has no meaning.
That is partially true, Denis, but only in cold places and only at cold times. I live in Canada (a cold place by any standard) and use heating for only four months. On top of that, natural gas heating is 3x less expensive than electricity here. So I have no plans to heat my house by wirelessly charging my iPhones anytime soon ;-)
In fact, wireless charging of electric cars will also happen. (Developmental buses are already charged this way at bus stops.) And the charging coils will be in the streets -- this will save tons of weight, as only minimal battery capacity will be needed.
(A small emergency gas generator can be added to the car in case the street coils are broken or absent in places.)
You can't violate the laws of physics. There is no way to wirelessly transfer energy that doesn't leak, and the larger the gap, the larger the leak. A charging mat will have a large gap by necessity. And the leak does not become heat; it is a radio wave that will leave the Earth, taking its energy with it. The idea of charging a car or bus from the road scares me: that is a six-inch, unshielded gap! People talk about the convenience of these kinds of things (undoubtedly true), but we already must find an alternative to fossil fuels, and are struggling to find safe energy sources to replace them. We are also struggling to upgrade our infrastructure to allow electricity to take over from fossil fuels. Now you want to add a 20% hit to that?
No 180 degree turn.
I was excited about Splashpower because it was a UK startup and something for me to write about in a novel area. Didn't mean I was going to use the product.
Similarly if a company emerged today with an innovative wireless charging offering I would be interested to learn about it and write about it.
I came across Splashpower right at the beginning, as I used to work with its first CEO. I agree that the potential of wireless charging seemed fantastic. They were talking about pads in coffee shops, hotels, bars, cars, bedrooms, kitchen worktops etc. The ability to walk into almost any room and drop all my devices on a pad was (and still is) very attractive from a convenience point of view.
I think it foundered, and possibly will continue to founder, on two things. One is the need for ubiquitous pads; the other is the need for universal and seamless interworking across products and manufacturers.
I suspect that the first will never happen due to the cost and the second because of competing interests. I am happy to be corrected by events on both counts!
I admit that I had not considered the "green" aspect before, but it is clear that any wireless charging solution is going to be less energy-efficient (for all sorts of reasons - transmission loss and being left on all the time, to name but two). Peter's aspiration to use a wired brick and always unplug it when not in use is admirable, but 99% of us (myself included) will _never_ do it!
All of you talk about the inefficiencies, but none of you have given any hard numbers. You are all going on gut feel, which is mostly fine, but you'd best get some hard facts from an actual pad or pad company for each of the competing technologies (inductive, resonant, etc.) before you start spouting a religion.
I have designed wireless systems, and wireless will never be as efficient as wired, all else being equal. Wired can more easily detect load and shut itself down. If the manufacturers can agree a standard for wireless, they could just as well create a standard for a couple of common plugs at different power levels. I believe parts of Europe already require common chargers for cell phones. Think of all the landfill saved, and the energy saved by not making extra chargers. Also, no EMI pollution.
Granted, in terms of energy efficiency wireless charging is not that good. But in situations where energy efficiency is not the major concern, it should not be written off:
1) In some situations, we cannot apply contact charging, for example, those electronics embedded in human bodies. To power those devices under these applications, wireless charging is, at least, one of the choices.
2) When a lot of devices need charging, the number of chargers can sometimes be reduced with wireless charging. We need different adaptors for devices of different voltages. A wireless charger provides only the primary-side coupling coil, while each device's secondary coil determines the voltage required. This caters for devices with different operating voltages.
3) For wireless charging there is no connector, and so no plugging/unplugging mechanical-reliability issue.
So a small niche for wireless charging may still remain.