
Monolithic 3D Ready to Give IoT its Own Scaling Path

Game-changing 2.0 @ IEEE S3S
mendicant98
User Rank
Rookie
Advantages of porous silicon
mendicant98   9/28/2015 8:30:07 PM
At Semicon West, CEA-Leti's CoolCube speaker showed an interesting graph of the performance advantage of 3D integration: for instance, a hypothetical two-layer 14 nm process would outperform a single-layer 10 nm process. CEA-Leti is working out what cost advantages may accrue.

Porous silicon as a layer-transfer technology offers substantial cost advantages over ion-cut layer-transfer methods: relatively high-cost ion implantation is no longer needed for high-quality, precise layer transfer, and the high-temperature steps required to anneal implantation-induced damage are reduced.

But porous silicon has additional advantages in light of emerging MEMS and IoT technologies. Porous silicon processing has been around for decades. The first interesting application came when physicists noted that porous silicon had useful light-emission properties. Shortly thereafter, porous silicon engineering was used to create Bragg reflection structures; some of these techniques have found their way into high-volume silicon photovoltaic device manufacture.

Independently, porous silicon processing techniques expanded the MEMS toolbox for fabricating mechanical cantilever and membrane structures. More recently, supercapacitor structures and high-energy-density batteries have been devised based on porous silicon.

Expect, therefore, a long and fast-paced run of innovation as these process technologies collide and, through their synthesis, yield useful, cost-effective devices.

Or_Bach
User Rank
Author
Re: 3DM - Wow
Or_Bach   9/28/2015 4:22:51 PM
Sure, power is the limiting factor these days, and that is true of all scaling technologies. Because interconnects dominate power, dimensional scaling is drastically affected; hence the emerging concept of "gray silicon."

3D scaling provides shorter, less capacitive interconnects, along with many other integration options, to help manage the power challenge.
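As a rough illustration of why shorter interconnects help (my own assumed numbers, not figures from this thread): dynamic switching power goes as P = α·C·V²·f, and folding a design into two 3D tiers shortens long wires, cutting their capacitance roughly in proportion.

```python
def dynamic_power(activity, cap_farads, vdd, freq_hz):
    """Dynamic switching power: P = activity * C * V^2 * f."""
    return activity * cap_farads * vdd**2 * freq_hz

# Hypothetical, order-of-magnitude numbers for one long global wire:
CAP_PER_MM = 0.2e-12          # ~0.2 pF/mm wire capacitance (assumed)
activity, vdd, freq = 0.1, 0.8, 2e9

p_2d = dynamic_power(activity, 5.0 * CAP_PER_MM, vdd, freq)  # 5 mm wire in 2D
p_3d = dynamic_power(activity, 2.5 * CAP_PER_MM, vdd, freq)  # ~half as long in 3D

print(f"2D wire: {p_2d*1e6:.0f} uW, 3D wire: {p_3d*1e6:.0f} uW")
# Halving the wire length halves C, and thus halves that wire's dynamic power.
```

The quadratic V² term is also why this comment and the next one tie 3D scaling to low-voltage operation.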

And while on the subject, S3S is the best place to learn about advanced technologies for managing device power, as it covers the fields of SOI and sub-Vt along with M3DI.

So I am looking forward to seeing you there next week.

AKH0
User Rank
Author
Re: 3DM - Wow
AKH0   9/28/2015 10:12:16 AM
That was the exact rationale when we discussed merging the three conferences back in 2012: the well-established IEEE SOI Conference (held for nearly 40 years) brings experts on layer transfer, as well as recent SOI device and circuit development, especially FDSOI (which is meant to deliver ultra-low-power operation) and RFSOI (which is an integral part of connected devices); subthreshold microelectronics brings experts on ultra-low-voltage operation; and 3D puts all these components together to deliver a unique solution.

savitale
User Rank
Rookie
Re: 3DM - Wow
savitale   9/28/2015 9:02:44 AM
Traditional scaling will probably be over in two or three nodes. 3D scaling may get you more transistors, but you won't be able to use them without improvements in power consumption. This is where energy-efficient / low-voltage computing comes in.
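A quick sketch of why low-voltage computing matters here (illustrative numbers of my own, not from this comment): the energy per switching event is E = C·V², so running logic near threshold at a reduced supply voltage cuts energy quadratically, trading off speed.

```python
def switching_energy(cap_farads, vdd):
    """Energy to charge/discharge a logic node: E = C * V^2."""
    return cap_farads * vdd**2

C_NODE = 1e-15  # 1 fF node capacitance (assumed, for illustration)

e_nominal = switching_energy(C_NODE, 0.8)  # nominal supply voltage
e_subvt   = switching_energy(C_NODE, 0.4)  # near-threshold supply voltage

print(f"Energy savings at half the supply: {e_nominal / e_subvt:.0f}x")
# Halving Vdd yields a 4x reduction in energy per switching event.
```

That quadratic payoff is what makes sub-Vt operation attractive for IoT devices that are mostly idle.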

Bruce Doris
User Rank
Rookie
3DM - Wow
Bruce Doris   9/26/2015 10:26:49 AM
What a great technology!

I thought scaling was coming to an end, but now I see there is another dimension to work on.

I am looking forward to the short course and presentations on 3DM at the S3S Conference.

If you don't have time to attend the entire conference, you can register for the short course only, learn a great deal about this exciting topic, and meet the experts in the field.
