With Moore's law coming to an end and exponential growth in chips slowing down, perhaps in the near future we will see only a handful of microchip and semiconductor manufacturers in the world. Only the biggest companies can afford the high cost of research, and the others might be forced out of the market altogether.
Simon - http://www.starrausten.com
I think it's more a realization of business economics. Today's entry-level system would have been considered a supercomputer just 10 years ago. With the explosion of online work and games and the current economic calamities, processor prowess is no longer the envy of all users like it used to be. Very few need cutting-edge CPUs, and the profits have dried up in this arena, so development will slow down, and not solely due to flaws in Moore's law.
For general semiconductor technology, new system and chip packaging must pave the way for silicon optimization. For processor specific implementations, novel ways to incorporate memory into the intrinsic architecture must be found.
I recently threw out an old IEEE Proceedings mag from the early 80s which had a couple of articles explaining, on sound theoretical grounds, why Moore's Law would soon cease. Apparently, reducing features below 100 nm was fundamentally impossible.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.