I am not sure how the selection process would work; with today's technology and architecture, the selection itself would require processing. But since the idea has occurred to brilliant minds, some new technique will likely be developed for it.
Let's just say it will be 'more normal'. It's all about economics. If you are going to sell millions of chips, you will probably opt for an expensive NRE and low recurring costs. If you are going to sell hundreds of thousands of chips, you may opt for 3D and use an older node to reduce NRE, but with higher recurring costs. If you are only going to sell thousands of chips, you might as well figure out how to use FPGAs to accomplish your goals...
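The NRE-versus-volume trade-off above can be sketched as simple break-even arithmetic. All the dollar figures below are made-up assumptions for illustration, not industry data:

```python
# Hypothetical cost sketch: every dollar figure is an illustrative assumption.

def total_cost(nre, unit_cost, volume):
    """Total program cost = one-time NRE plus recurring per-unit cost."""
    return nre + unit_cost * volume

def breakeven_volume(nre_a, unit_a, nre_b, unit_b):
    """Volume at which option A (high NRE, cheap units) matches
    option B (low NRE, pricier units). Requires unit_a < unit_b."""
    return (nre_a - nre_b) / (unit_b - unit_a)

# Option A: leading-edge node  -- assumed $50M NRE, $4.00 per chip.
# Option B: older node with 3D -- assumed $15M NRE, $6.50 per chip.
v = breakeven_volume(50e6, 4.00, 15e6, 6.50)
print(f"Break-even at about {v:,.0f} units")  # -> Break-even at about 14,000,000 units
```

Below the break-even volume the cheap-NRE option wins; above it, the expensive mask set pays for itself, which is the "millions versus hundreds of thousands" logic in the comment above.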
In the case of DRAM, I think the game might be a bit different, because you can charge a premium for providing higher density. So I might expect a DRAM maker to go 3D on the latest greatest node for enterprise and HPC applications, while using that same die in 2D for PCs and cell phones...
Since wafer costs are going up faster than transistor size is going down, it will eventually make economic sense to do chip stacking. Sources say that the $/transistor goes up when going from 28nm to 20nm - that is the first time this has ever happened. So, going 3D may make sense despite the yield loss and the packaging costs. You can double your density without going to the next node...
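The stack-versus-shrink trade-off can be put into a toy model. Every number here (wafer costs, die yields, densities, the 90% stacking yield) is a made-up assumption for illustration, not industry data:

```python
# Toy model of the "stack vs. shrink" trade-off; all inputs are assumptions.

def cost_per_transistor(wafer_cost, dice_per_wafer, die_yield, transistors_per_die):
    """Recurring silicon cost per good transistor on one die."""
    return wafer_cost / (dice_per_wafer * die_yield * transistors_per_die)

# Option 1: stay at 28 nm, stack two dice for 2x density per package.
# Two dice cost twice as much and carry 2x the transistors, so the ratio
# is unchanged -- except stacking adds its own yield loss (assumed 90%),
# which raises per-transistor cost by a factor of 1/0.90.
c28 = cost_per_transistor(5_000, 500, 0.85, 1e9)
c_stacked = c28 / 0.90

# Option 2: move to 20 nm -- assumed higher wafer cost and lower yield
# for ~1.8x density on a single die.
c20 = cost_per_transistor(8_000, 500, 0.70, 1.8e9)

print(f"28 nm flat:    {c28:.2e} $/transistor")
print(f"28 nm stacked: {c_stacked:.2e} $/transistor")
print(f"20 nm flat:    {c20:.2e} $/transistor")
```

With these assumed inputs the stacked 28 nm part and the 20 nm part land in the same $/transistor ballpark, which is the point of the comment: once the new node no longer cuts cost per transistor, stacking on the old node becomes a credible alternative.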
The skeptics keep harping on the demise of Moore's Law. And their numbers are growing.
But the last fab/foundry standing will get all the consumer electronics business, as integration at the die level is still going to be cheaper than at the package level - at least for integrating chips of compatible fab technologies.
How many actual products are out there with 2.5D or 3D die-stacking technology (one for sure, maybe five tops)? IBM has been talking about it for over 7 years. Do they use it for their own products (other than building a few samples for SemTech or Micron)?
The real niche for integration at the package level by 2.5D or 3D die stacking is in heterogeneous systems using incompatible technologies. Haven't seen much action other than camera modules for mobile phones. Perhaps IoT will boost the numbers, maybe even include RF.
But for IBM it will have to be mostly opto-electronic I/O for high-speed servers, or integrating chips made at various foundries at finer nodes (to take advantage of FinFETs?).
IBM's volume, cost structure, and strategies (which ultimately decide the technologies they will push) are not always relevant to mid- to low-end systems, down to consumer electronics.
Also, the HKMG gate-first-or-gate-last episode at 32 nm is still too recent to forget!
Yes, it is rightly said that the information from all the different sources is placing an enormous processing burden on the silicon, and present-day architectures are not designed to ignore a single bit of information. Identifying and selectively ignoring data would be a beneficial approach; maybe this will open a new area of research.
Chip-stack packaging can help to reduce component costs, but it is not the same as shrinking technologies. The additional costs of complexity and defects do not give the same payback. Are there any other ways to reduce cost? Most packaging advancements going forward are for size, not cost. The biggest need for our customers is to shrink the footprint for portable electronics. Of course, cost is always an issue and needs to be addressed as best as possible. Otherwise your competition will own the socket.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.