Yawn. We already know Intel's 64-bit chip was delayed to 2015, a year after 64-bit ARM chips start shipping. And by the time Intel launches its dual-core version, there will be quad-core ARM versions. Sorry, Intel. You missed the boat. Again. Better luck in 2018.
It doesn't matter what instruction set it has. What matters is whether it is built to be low power. Seeing DDR3 is a red flag: if you want to build the next-gen microserver, you need a wider, more power-efficient, short-wire interface from the CPU to the DRAM stack.
Server software is typically bottlenecked on memory. If you want more power-efficient servers, you have to lower the nJ per byte on those interfaces.
Good point. Even TI didn't see a business in playing third to Qualcomm and Nvidia (and Exynos, and SE, and so on).
I imagine Intel must have been the first to knock on Amazon's Kindle door when TI ended OMAP for tablets.
Not so well. TSMC is on track for FinFET, and many companies hold ARM architecture licenses (which amounts to multiple CPU design groups). Other than its process technology advantage, Intel has nothing. Too much focus on process kills innovation...
What new key enablers are coming out of the INTC and MSFT R&D labs? When INTC moved development out of Santa Clara, innovation died...
Having said that, we want to make sure US companies are at the top of the list... so it is vital for Intel's manufacturing to be number 1.
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to stay within visual line of sight – as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.