It could be, Peter. Either way it points up the fact that in the current tablet / handset market ARM is the 800 lb gorilla and Intel is the newcomer. The fact that it is a very well-heeled newcomer means that it can get some accommodations from established support players like SanDisk. It will be fun to watch whether that translates into market success.
@resistion: Like you said, it could just be issues of learning and the flexibility needed. If so, that might mean that highly structured designs like FPGA/eASIC would have major benefits when moving to the next node. I wonder how far the eASIC structured ASIC is from ASIC efficiency, and how much benefit staying on a single node would get them.
Great question. Flash has used DP since the 3x nm node; they may have driven costs down little by little, layer by layer. DRAM also introduced it layer by layer. Non-memory is just getting started. I'm also surprised to hear complaints, but maybe it's all about flexibility. Memory designs are restricted anyway.
I am really wondering how much of this is actual engineering optimization, as opposed to synchronization of marketing plans. Intel gets to claim an advantage in flash access time (Show me the numbers!) while SanDisk gets de facto design wins. I am also leery of single-sourcing a component. That being said, maybe there is something here. I will be interested in seeing third-party tests of this claim.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.