TI got out of the smartphone market. I'm less sure about the ST/ST-Ericsson move, but I assume it was about that company's financial woes more than anything else.
Meanwhile, apparently most of the big SoC makers have decided they don't want 28 nm WideIO 1.0 memory at 12G; they want 20 nm WideIO 2.0 at 24G.
"Rise above Moore's Law"? I think not. The issue here is that numerous segments need 2.5D today, primarily because no one can afford the power and pins to drive enough memory. Since this is a power-based argument, actual 3D stacking is mostly irrelevant: the need is for many wide pins from CPU to memory, and no CPU (except perhaps in a phone) can afford to be stacked, dissipation-wise.
Memory wall, meet interposer. That's what's on the table. It's not a fab issue either: denser single chips don't eliminate the need for 2.5D integration; if anything, the need gets worse.
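The power argument above can be sketched with back-of-the-envelope numbers. The energy-per-bit figures here are illustrative assumptions (off-package DDR-style I/O is often quoted in the 10-20 pJ/bit range, short interposer links around 1 pJ/bit), not measurements from any particular part:

```python
# Why "many wide pins" wins: I/O power = bandwidth (bits/s) * energy per bit (J/bit).
# Energy-per-bit values are assumed for illustration.

def io_power_watts(bandwidth_gbps: float, energy_pj_per_bit: float) -> float:
    """Power to move data at the given bandwidth and energy cost per bit."""
    return bandwidth_gbps * 1e9 * energy_pj_per_bit * 1e-12

target_bw_gbps = 200 * 8  # 200 GB/s of memory bandwidth, expressed in Gbit/s

# Few fast off-package pins vs. many slow interposer pins, same total bandwidth:
narrow_fast = io_power_watts(target_bw_gbps, 15.0)  # ~15 pJ/bit assumed
wide_slow = io_power_watts(target_bw_gbps, 1.0)     # ~1 pJ/bit assumed

print(f"off-package I/O: {narrow_fast:.1f} W")  # 24.0 W
print(f"interposer I/O:  {wide_slow:.1f} W")    # 1.6 W
```

At the same delivered bandwidth, the interposer's cheap short wires cut I/O power by the ratio of the per-bit energies, which is the whole case for 2.5D here.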
A refreshingly honest report on the status of 3D stacking / 2.5D modules, one that exposes the reality of this overly mechanical attempt to rise above Moore's Law, or to circumvent a bank balance too lean to afford a 14 nm fab.
Though the partisans of 3D stacking have for quite a few years now been drooling over the bonanza of the billion-units-a-year smartphone industry adopting their technology "any time now", given the laws of physics and economics that's probably one of the last applications where it would ever happen. Per historical precedent, the normal order of adoption for a complex process like this would be: military, medical, supercomputers, servers, graphics-heavy consumer systems like game consoles, and perhaps only then tablets and smartphones.
And even after all the hardware manufacturing issues of 3D stacks are solved, the architecture and programming issues of hooking up the right chunk of memory to a specific processor core in zillion-core processors would remain just as much a challenge as in any other parallel computer.
2.5D modules with Si interposers would provide good enough interconnect density, and thus a significant improvement in bandwidth and power efficiency, but these interposers need to get cheaper by much more than 50% (the GloFo target?) to head off the time-tested strategy of integrating the whole shebang on a single chip.
Regarding 2.5D modules with cheaper organic substrates: the improvement in bandwidth will be severely restricted by the interconnect density achievable even at future geometries (8 µm line/space, 30 µm via diameter) and by the number of layers (cost). RC delay and signal skew in the resulting long lines between chips would be significant. So not much of an improvement in bandwidth or power efficiency there.
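The RC-delay point can be made concrete with the standard Elmore estimate for a distributed RC line, delay ≈ 0.38·R·C. The per-millimeter resistance and capacitance below are illustrative assumptions for a substrate trace, not figures for any specific material stack:

```python
# Rough distributed-RC delay for a chip-to-chip trace. Parameters are assumed
# for illustration; real values depend on the substrate's metal and dielectric.

def rc_delay_ps(r_ohm_per_mm: float, c_ff_per_mm: float, length_mm: float) -> float:
    """Elmore delay of a distributed RC line: ~0.38 * R_total * C_total."""
    r_total = r_ohm_per_mm * length_mm         # total resistance, ohms
    c_total = c_ff_per_mm * length_mm * 1e-15  # total capacitance, farads
    return 0.38 * r_total * c_total * 1e12     # seconds -> picoseconds

# Assumed 5 ohm/mm and 150 fF/mm for a narrow substrate trace:
print(f"20 mm line: {rc_delay_ps(5.0, 150.0, 20.0):.0f} ps")  # 114 ps
print(f"40 mm line: {rc_delay_ps(5.0, 150.0, 40.0):.0f} ps")  # 456 ps
```

Because both R and C scale with length, delay grows as the square of the line length, which is why the long routes forced by coarse organic-substrate geometries hurt so much.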
Elegant electrical methods to work around the architectural and physical limits of package-level integration are in development, and they would very likely precede the complex TSV-based 2.5D or 3D processes.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.