I have a suspicion that only Intel and TSMC will be able to deliver high volumes of FinFET silicon in the coming years.
This Samsung move gives me the impression that both GloFo and Samsung are in a desperate state and unable to yield a good 14/20 nm process, in spite of their announcements. Samsung's current 20nm seems to be sitting idle.
I hope they will fix their problems, because FD-SOI does not scale down to 10nm, while FinFETs are a feasible solution at 10nm, 7nm and 5nm (with some stretch).
I'm eager to know what you mean by scaling to 10/7/5nm. No one in the industry is using these numbers as the gate length. 14nm FinFET parks the gate length at 30nm or more; 10nm FinFET will be ~25nm. The real challenge below the 20nm node is making contact to the transistors, routing the signals, and the dual/triple/... patterning that comes with it. As far as scaling is concerned, FDSOI actually gives a clearer path compared to FinFET. You need to drop the power dissipation from node to node; otherwise you can't squeeze the circuit. FinFET is HOT (literally), simply because it has more current and more capacitance per area. Despite all the claims out there promising 30% or more power reduction with FinFET, there is absolutely ZERO Si data to confirm this.
Sounds like a technology to look out for if you are developing ICs for wearable applications. Wearables have a great demand for low sleep-mode power consumption, and that is critical considering the huge market ahead.
My point was about foundry, not Intel. I agree Intel has been shipping products for a few years, and I admire that. But when I look at the data I don't see the FinFET claims fulfilled. Historically, each technology node is expected to give at least 30% power advantage at constant frequency. This is pure scaling and has nothing to do with the transistor being better. If you compare 45nm and 32nm products from Intel, there is 35% power reduction across the whole range. Now if you compare 22nm and 32nm, there is maybe 20% power saving at the high end and almost nothing at lower frequencies. That falls short even of the 30% gain you would get from merely scaling the node. So where is the additional gain -- the 30% or more that everybody claims for FinFET?
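A back-of-the-envelope check of the comparison above; the 35% and 20% figures are the comment's own estimates, not measured data, and the 30% baseline is the stated historical expectation per node:

```python
# Compare quoted node-to-node power reductions against the ~30% expected
# from pure scaling at constant frequency (figures taken from the comment).
EXPECTED_SCALING_GAIN = 0.30  # historical per-node expectation

observed = {
    "45nm -> 32nm (whole range)": 0.35,
    "32nm -> 22nm (high end)":    0.20,
}

for transition, reduction in observed.items():
    shortfall = EXPECTED_SCALING_GAIN - reduction
    if shortfall <= 0:
        verdict = "meets the pure-scaling expectation"
    else:
        verdict = f"falls {shortfall:.0%} short of scaling alone"
    print(f"{transition}: {reduction:.0%} power reduction -> {verdict}")
```

On these numbers, the 22nm FinFET transition not only lacks the extra 30% that FinFET marketing promises on top of scaling, it doesn't even reach the scaling baseline.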
Yes, FinFET is hot! Ask Qualcomm or any fabless that has gone through the full design cycle.
Self-heating, in the way everybody attributes it to FDSOI, was a problem with PDSOI, which had a thick BOX. The 20nm BOX in FDSOI does not change the heat transfer that much, and there is experimental evidence for that at the product level.
As for the cost, I agree the major players will choose the cheaper option that meets their performance target. However, I believe everybody knows which of Samsung and Intel is more cost sensitive. All the comments floating around comparing the cost of FinFET vs FDSOI go back to Intel's claim that FDSOI is 10% more expensive than bulk planar. If I accept that claim, my estimate of the finished wafer cost at Intel 22nm would be $3500 - assuming a $350 higher wafer cost for SOI. If so, why bother fabbing at TSMC or Samsung? Intel can do it cheaper!
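The estimate above is just an inversion of the quoted claim; both inputs (the 10% premium and the $350 delta) come from the comment itself, so this is arithmetic, not industry data:

```python
# If FDSOI costs ~10% more than bulk planar (Intel's claim) and that delta
# is assumed to be ~$350 per wafer, the implied bulk wafer cost follows.
SOI_PREMIUM_FRACTION = 0.10   # claimed FDSOI premium over bulk planar
SOI_PREMIUM_DOLLARS = 350.0   # assumed absolute premium per wafer

implied_bulk_wafer_cost = SOI_PREMIUM_DOLLARS / SOI_PREMIUM_FRACTION
print(f"Implied finished wafer cost: ${implied_bulk_wafer_cost:,.0f}")  # $3,500
```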