Comments
Programming shift
cristian.olar (User Rank: Rookie)   7/17/2014 11:26:14 AM
As computing shifts more and more to the devices closer to the sensor, this will also translate into a need for more and more research work to happen at the embedded level. Looking at the world today, most processors close to the sensors that interface with the external world implement relatively simple algorithms, maybe just a FIR filter or the like, and leave the decisions to other, stronger application processors or, in this case, the cloud.
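
As a rough sketch of the kind of near-sensor processing described above, here is a small fixed-point FIR filter in C of the sort a sensor-side MCU might run before handing results to an application processor or the cloud. The tap count, coefficients, and function name are purely illustrative, not taken from any particular design.

#include <stdint.h>

#define NUM_TAPS 8

/* Example low-pass coefficients in Q15 fixed point (illustrative values). */
static const int16_t coeffs[NUM_TAPS] = {
    1024, 2048, 4096, 8192, 8192, 4096, 2048, 1024
};

static int16_t history[NUM_TAPS]; /* circular buffer of recent samples */
static uint8_t head;              /* index of the newest sample */

/* Push one raw ADC sample through the filter and return the filtered value. */
int16_t fir_filter(int16_t sample)
{
    history[head] = sample;

    int32_t acc = 0;
    uint8_t idx = head;
    for (uint8_t i = 0; i < NUM_TAPS; i++) {
        acc += (int32_t)coeffs[i] * history[idx];
        idx = (idx == 0) ? (NUM_TAPS - 1) : (idx - 1);
    }

    head = (head + 1) % NUM_TAPS;

    return (int16_t)(acc >> 15); /* scale back from Q15 */
}

In this picture, everything beyond that kind of filtering happens on the application processor or in the cloud, which is exactly the division of labor the comment describes.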

What I see is that the main processors (or, these days, the cloud) are usually based on architectures that support programming in very abstract ways, while the processors closer to the sensors are still used in a very "bare-metal" way to squeeze out the maximum possible optimization in terms of power usage, speed, etc.

So I think this shift in where the processing power is located leads to one of two outcomes: either the researchers of today have to abandon part of the abstractions they are used to and get closer to the metal, or, alternatively, we may be entering an era of a new strand of development toward better embedded tools in general, which would enable research labs and universities to program embedded devices at the same abstraction levels they are used to.

The interesting problem I see with the latter is that while main processors tend to stay within one or two families, which allows the development of their tools to be spread across scores of teams around the world, the processors closer to the sensor are a lot more diverse. Many of them have dedicated instruction set architectures and therefore, in principle, require separate toolchains. In practice, though, we are seeing the rise of tools that try to abstract the architecture away to some extent, as LLVM-based compilers do, for example. So I think this new trend of moving processing closer to the sensor will open new doors for such portability tools.

In any case, the future will be interesting to witness.

 

Future is here!
vali_#1 (User Rank: Rookie)   7/17/2014 11:24:14 AM
Great Vision that connected a new range of dots!

It makes sense. The smart devices in our pockets are getting geared up with more and more sensors. In parallel, the Internet of Things heralds a new tsunami of data. All of that won't fit in the network bandwidth on the way up to the cloud, and it would not be intelligent to push it that way either. On the other hand, the natural push would be to make the smart devices really smarter.

What would make a device smarter if not extra intelligence at a reasonable cost? And a reasonable cost also means reasonable power consumption.

Is AI the way to go? What neural network architectures or solutions will make this happen? What would be the HW/SW breakdown that minimizes latency while still providing as much intelligence as possible to the cloud? Is this a new era for data mining too?

In the end, I like this parallel to the human brain: "Latency is also an issue for the human brain, where distributed processing powers our reflexes without involving the frontal cortex." Yet another dot connected! How many dots are left to connect until we replicate the human brain? How about an uber-cloud overseeing other brain replicas "socializing" with their data?


