Yawn. We already know Intel's 64-bit chip was delayed to 2015, a year after 64-bit ARM chips start shipping. And by the time Intel launches its dual-core version, there will be quad-core ARM versions. Sorry, Intel. You missed the boat. Again. Better luck in 2018.
It doesn't matter what instruction set it has. What matters is whether it's built to be low power. Seeing DDR3 is a red flag. If you want to build the next-gen microserver, you need a wider, more power-efficient, short-wire interface from the CPU to the DRAM stack.
Server software is typically bottlenecked on memory. If you want more power-efficient servers, you have to lower the nanojoules per byte on those interfaces.
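To put a rough number on that, here's a back-of-envelope sketch in Python. The pJ/bit figures are illustrative assumptions for each interface class, not datasheet values, but they show why interface energy, not the instruction set, dominates the memory power budget at server bandwidths.

    # Back-of-envelope: memory interface power = bandwidth x energy per byte.
    # The pJ/bit numbers below are assumed orders of magnitude, not datasheet values.
    INTERFACES_PJ_PER_BIT = {
        "DDR3, long PCB traces": 60.0,   # assumed
        "LPDDR-class": 30.0,             # assumed
        "stacked / short-wire": 5.0,     # assumed wide-I/O-style interface
    }

    BANDWIDTH_GB_S = 12.8  # roughly one 64-bit DDR3-1600 channel

    for name, pj_per_bit in INTERFACES_PJ_PER_BIT.items():
        nj_per_byte = pj_per_bit * 8 / 1000                  # 8 bits/byte, 1000 pJ per nJ
        watts = BANDWIDTH_GB_S * 1e9 * nj_per_byte * 1e-9    # bytes/s * J/byte
        print(f"{name:24s} {nj_per_byte:.2f} nJ/byte -> {watts:.1f} W at {BANDWIDTH_GB_S} GB/s")

At these assumed numbers, the same 12.8 GB/s costs about 6 W over DDR3 but well under 1 W over a short-wire stacked interface, which is the whole microserver argument.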
Good point. Even TI didn't see a business in playing third fiddle to Qualcomm and Nvidia (and Exynos, and SE, and so on).
I imagine Intel must have been the first to knock on Amazon Kindle's door when TI ended OMAP for tablets.
Not so well. TSMC is on track for FinFET, and many companies hold ARM architecture licenses (it's like having multiple CPU design groups). Other than its process technology advantage, Intel has nothing. Too much focus on process kills innovation...
What new key enablers are coming out of INTC and MSFT R&D labs? When INTC moved development out of Santa Clara, innovation died...
Having said that, we want to make sure US companies stay at the top of the list... so it is vital for Intel manufacturing to be number 1.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, just with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.