Your "it will happen in 5 to 10 years" reminds me of a telecom project I worked on, many years ago, with a German company. Qualifying the individual components, then the new system standalone, then the system in the context of our customer's network took longer than it took us to develop the system. And we were lucky to be first with our advanced system, because the customer had neither the interest nor the resources to qualify a second system vendor.
Why do I tell you this ancient story? Because interposers not only help you increase speed while reducing power, they also reduce your development time (compared to SoCs) ... and may make the difference between being first to market or being a very, very distant second ..... Herb
I agree with you: your FIRST interposer design will take longer than the SoC route, because you need to make yourself, your company, and key people at your customer(s) familiar with this new technology .... That's exactly why many companies are currently, quietly, developing interposer solutions and why IP vendors like Rambus and Invensas are publicizing their interposer capabilities now.
The benefits of deploying a new technology for the first time are typically not that great, but mastering a new technology and deploying it widely, ahead of the competition: that's how you can get ROIs others can only dream about....
In the mid-1990s the big IP-reuse wave in SoC design started and grew, against the voices of many skeptics. Now, 20+ years later, reuse of legacy IP, and even integration of third-party IP, is not only possible but a must to compete effectively.
The modularity that DIE-LEVEL IP brings to 2.5D and 3D-ICs will offer us even bigger advantages, especially in reducing development cost, time, and respin risk. Just look at an early example: Xilinx's rapidly growing family of products combining FPGA die with other functions on an interposer.
Seems likely all the recently minted experts and champions of advanced packaging technologies like 2.5D interposers and 3D die stacking, who had been making wild predictions that every iPhone SoC/GPU and DRAM would soon come in one of these modules, have finally wised up and shifted their sights to heterogeneous integration (e.g., electronic to optical), where the higher costs are more acceptable because there are no good alternatives.
Get ready for another hype cycle by paid shills posing as technologists.
@Rick: this 2.5D/3D hype nonsense has gone on for far too long. I am ready to take up your earlier challenge to write something up for EE Times. I should be able to cobble something together (with a dash of math, perhaps) during the break. If that still makes sense, then let us have your e-mail address at EE Times.
BTW, Xilinx is producing 2.5-D parts with plans for a widening variety of products, admittedly in low volumes and at high prices.
Huawei is working with Altera on offerings. Micron has its Hybrid Memory Cube sampling. SK Hynix has a similar design. IBM is working with Micron on its own designs, and several companies, including even the relatively conservative Broadcom, are talking about 2.5-D switches with silicon photonics in 2015-16.
May I invite you for an in-depth discussion during a nice dinner or an extended lunch? Of course, my treat! I would like to learn from you how I can better contribute to innovation and also understand your outlook for our industry.
Please give me a few days' notice when you tell me when and where we can meet ..... Herb
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.