The original purpose of PoE was to allow VoIP phone instruments to get all the power they need from the Ethernet Cat-5e cable, rather than having to connect a separate power cable. So this is just an update to provide more power over the same Ethernet cable.
The initial iteration was really simple: it used the spare conductors in the 4-twisted-pair Cat-5e to carry power, back when Ethernet only needed two of the pairs. But starting with 1000BASE-T, where all 4 pairs are needed for data, they had to get a little more clever. So power is injected as a common-mode DC voltage on the data pairs (via the transformer center taps), which the differential data signaling simply ignores.
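For reference, here's a rough summary of how much power each PoE generation can deliver. The numbers are quoted from memory of IEEE 802.3af/at/bt, so treat them as approximate and check the standards for exact values:

```python
# Rough PoE power budgets per IEEE 802.3af/at/bt (approximate figures,
# quoted from memory -- check the standards for exact values).
# PSE = power sourcing equipment (the switch); PD = powered device.
# The difference between the two is lost as heat in the cable run.
POE_TYPES = {
    "802.3af (Type 1)": {"pse_w": 15.4, "pd_w": 12.95, "pairs": 2},
    "802.3at (Type 2)": {"pse_w": 30.0, "pd_w": 25.5,  "pairs": 2},
    "802.3bt (Type 3)": {"pse_w": 60.0, "pd_w": 51.0,  "pairs": 4},
    "802.3bt (Type 4)": {"pse_w": 90.0, "pd_w": 71.3,  "pairs": 4},
}

for name, t in POE_TYPES.items():
    loss = t["pse_w"] - t["pd_w"]
    print(f"{name}: {t['pse_w']} W sourced, {t['pd_w']} W delivered "
          f"over {t['pairs']} pairs ({loss:.2f} W lost in a worst-case run)")
```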
No magic here, Some Guy. PoE only applies from a layer-2 switch port to the host attached at the other end of the cable. It's not like you're running power between switches and routers over the network.
So the only concern here would be how much aggregate power a single edge Ethernet switch can deliver to its attached hosts. That's all.
If you have a building full of equipment, presumably you have more than just one edge Ethernet switch connecting to these end systems.
From the energy research that I've done, the nominal power you need at a minimally configured "guest" workstation is 106 W for a monitor, a PBX-connected phone, a cell phone charger, and a laptop charger (163 W if you want lights included). You can get up to 100 W from a USB-C cable with USB Power Delivery and 15-25 W from PoE.
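To make that arithmetic concrete, here's a quick sketch. The per-device wattages are my own guesses, chosen only to sum to the 106 W / 163 W figures above; the USB and PoE limits are the commonly quoted maxima:

```python
# Illustrative breakdown of the ~106 W "guest workstation" figure.
# Per-device wattages are assumptions chosen to match the stated totals.
loads_w = {
    "monitor": 25,
    "PBX desk phone": 6,
    "cell phone charger": 10,
    "laptop charger": 65,
}
lights_w = 57  # assumed, to reach the 163 W "with lights" figure

workstation = sum(loads_w.values())  # 106 W
print(f"workstation: {workstation} W, with lights: {workstation + lights_w} W")

usb_pd_limit = 100   # USB Power Delivery over USB-C, maximum
poe_at_limit = 25.5  # 802.3at (PoE+) power delivered at the device
print(f"shortfall vs USB PD:      {workstation - usb_pd_limit} W")
print(f"shortfall vs one PoE+ port: {workstation - poe_at_limit} W")
```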
If I have to power the Ethernet hub at that workstation from AC mains just to get PoE to all those devices, where does PoE help? If I have to power EVERY cubicle with AC mains to run a hub, how does that scale? Even 100 W over USB isn't enough to cover a single hub's worth of devices. And if I'm going to use USB power anyway, why wouldn't I just use it for everything instead of PoE?
I can certainly see the benefit of having one cable running between a host (Internet gateway) and a device (e.g. an infotainment unit). However, given the confined space of a vehicle, I am not sure of the financial and production benefit. Another question that needs to be addressed is energy loss. It is too soon to say there is any benefit to improving the power rating, but it is definitely worth following.
I think PoE will be a point-to-point connection from the hub to the endpoint. The hub will distribute power over the Ethernet cable to the endpoints, and the hub may be powered by the PGE power port. If the endpoints are not power hungry, I believe the hub can also be powered over the Ethernet cable, like USB.
These are just my assumptions; I need to read the spec, though :)
This is a very late initiative for promoting PoE. Security cameras and authentication devices in offices, industry, and domestic environments are very much cluttered with network and power connections; at the very least, PoE would eliminate the power adapters and, in turn, the power-supply wiring. PoE is nothing new, but it has not been promoted to the extent it was required.
The thing I could never figure out about Power over Ethernet is how it scales beyond one device. Over the 26 AWG wire in a Cat6 cable I can power a device at 50 W, but how do I power a whole building's worth of devices to scale it to the kilowatts that an office building consumes? If I'm running 120 or 208 V out to a bunch of distributed power supplies at every cubicle, or in parallel with every Ethernet cable, I just don't see how it pencils out.
Can someone help us with how you get around the conclusion that PoE may be great for one device, but brain-dead for a whole building's worth?
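One way to put numbers on it is a back-of-the-envelope sketch like the one below. The 48-port switch is my assumption; the 12.5-ohm worst-case loop resistance and 600 mA current are the figures 802.3at budgets for a 100 m Cat5e channel:

```python
# Back-of-the-envelope building-scale PoE arithmetic.
# Assumption: one fully loaded 48-port 802.3at (PoE+) edge switch.
def cable_loss_w(current_a, loop_ohms=12.5):
    """I^2 * R loss in one PoE cable run (802.3at worst case: 12.5 ohms)."""
    return current_a ** 2 * loop_ohms

ports = 48           # assumed port count, fully populated
pd_power_w = 25.5    # 802.3at power delivered at each device
current_a = 0.6      # 802.3at maximum current

loss = cable_loss_w(current_a)      # 4.5 W lost per worst-case 100 m run
pse_per_port = pd_power_w + loss    # ~30 W sourced per port

total_kw = ports * pse_per_port / 1000
print(f"per port: {pse_per_port:.1f} W sourced, {loss:.1f} W lost in cable")
print(f"48-port switch PoE budget needed: {total_kw:.2f} kW")
```

So a single edge switch tops out around 1.4 kW of PoE, and a building's worth of load is really many such switches. PoE doesn't eliminate AC wiring so much as concentrate it: each switch is fed by an ordinary branch circuit in the wiring closet instead of an outlet at every cubicle.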