In the absence of published data, I would have to concur with Mr. Robert Maire's observations, including his point on using wafers with resist. But I wouldn't go so far as to say "All that happened was that wafers were moved from the input FOUP to the output FOUP!" I think more was accomplished here than that!
IBM did admit that the testing scope was limited to "power level and reliability" studies of the EUV source. However, I disagree with the statement "Putting resist on the wafers would have had no value whatsoever"; the EUV study would have gone much further using wafers with resist.
I am not sure if the announcement warranted the run up in stock prices!
After enough EUV-heated wafers (the heating comes from absorption), the wafer chuck warms up and expands by some ppm, which matters across 300 mm. In fact, this applies even without resist, but resist patterning brings overlay into play.
That is only about 3 joules per wafer, assuming illuminated features are 25% of the surface area. On a 100 micron thick wafer, that heat would expand the wafer by about 1 part per million. However, the chuck may be 100 times more massive, made of a low-expansion material, and probably on a temperature-controlled mount. It seems like this aspect could be routinely controlled? The wafer would need to be at a repeatable, predictable temperature within 0.02 C accuracy to hold expansion across a 2 cm field to 1 nm; any design for high-accuracy lithography, regardless of energy source, must aim for that level of control.
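A quick sanity check of those numbers, taking the 20 mJ/cm2 dose from the IBM report and assuming (my assumptions, not stated above) a 300 mm wafer, 25% illuminated area, and silicon's CTE of about 2.6 ppm/K:

```python
import math

# Energy deposited per wafer at the reported dose.
dose_J_per_cm2 = 20e-3                        # 20 mJ/cm^2 (IBM's reported dose)
wafer_diam_cm = 30.0                          # 300 mm wafer (assumption)
fill_factor = 0.25                            # 25% illuminated area (from the text)
area_cm2 = math.pi * (wafer_diam_cm / 2) ** 2 # ~707 cm^2
energy_J = dose_J_per_cm2 * area_cm2 * fill_factor
print(f"energy per wafer: {energy_J:.1f} J")  # ~3.5 J

# Temperature repeatability needed to hold expansion across a
# 2 cm field to 1 nm, i.e. an allowed strain of 1 nm / 2 cm = 5e-8.
cte_per_K = 2.6e-6                            # silicon CTE (assumption)
allowed_strain = 1e-9 / 2e-2
dT_K = allowed_strain / cte_per_K
print(f"allowed temperature error: {dT_K * 1000:.0f} mK")  # ~19 mK, i.e. ~0.02 C
```

The ~0.02 C figure above drops straight out of the strain budget divided by the CTE; the "about 3 joules" is the dose times the illuminated area.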
The companies in the semiconductor industry are trying to push the boundaries of nature to deliver the latest technologies for high-end IC manufacturing. IBM definitely wants to keep its place, without question.
IBM did indeed have a minor breakthrough, exposing 637 wafers at 20 mJ/cm2 in 24 hours with a machine running at 77% uptime. IBM admitted from the outset this was NOT enough to make EUV viable for the 10nm node. Everyone, IBM included, is still HOPING EUV will be ready for the 7nm node at the earliest.
The real breakthroughs will be getting 100+ wafers/HOUR out at something more like 50 mJ/cm2, with good, usable wafers patterned at the dimensions needed for 10-7nm node work.
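To put that gap in perspective, here is a rough scaling sketch under the simplifying assumption (mine) that throughput scales linearly with source power and inversely with dose, ignoring stage and handling overheads:

```python
# Demonstrated: 637 wafers in 24 hours at 20 mJ/cm^2.
demo_wph = 637 / 24          # ~26.5 wafers/hour
demo_dose = 20.0             # mJ/cm^2

# Target: 100+ wafers/hour at ~50 mJ/cm^2.
target_wph = 100.0
target_dose = 50.0           # mJ/cm^2

# Required power scales with (throughput ratio) x (dose ratio).
power_factor = (target_wph / demo_wph) * (target_dose / demo_dose)
print(f"source power must grow by roughly {power_factor:.1f}x")  # ~9.4x
```

So even this first-order estimate says the source needs nearly an order of magnitude more power to hit the target, before counting any overheads.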
Let's cross our fingers some brilliant engineers can make that happen in the next 12 months!
@Rick: You may have been to Nikon's Lithovision (held adjacent to SPIE's Lithography conference) in the past on this EUV topic. As far back as 5 years ago (I have not been there in the last two years), attendees were told EUV is almost here and migrating to the next tech node is guaranteed to work! I am sure you heard this proclamation from 28nm onward to what is now the 7/9nm node range. Yet EUV still remains a mirage!
Dose has to double at each successive node (e.g., from 20 to 40 mJ/cm2 going from 28 nm to 20 nm), so source power must double at the same time just to hold throughput. Unfortunately, ASML's roadmap shows flat power levels, and it's still behind. Note that the resist may also have to change at each node to accommodate the different dose levels.
@resistion: 6 sigma corresponds to about 2 expected faults per billion. Allowing for empty space, an advanced SOC (2 to 4 sq cm) will have about 1e11 features at 20 nm. So you would never make a correct chip at 6 sigma. 7 sigma is about 2.5 faults per trillion, so probably good enough for 20nm on a moderate to advanced chip. 8 sigma, at 1.25 faults per 10^15, I concede to be more than is needed.
Published claims that 6 sigma is good enough may be fuzzing the issue, because the threshold between 6 and 7 is fuzzy. If you assume that the rest of the system is also a little overspec'd, then the 2 parts per billion which fall outside 6 sigma stand a good chance of being rescued by other parts of the system being well within their spec limits.
Your point about doubling power density as you halve the feature area makes some sense, but only once you reach the statistical limit. At 5,000 photons per feature, the 7 sigma spread is about 10%. Not difficult to design for that. At 1,000 photons it is about 22%. Tricky, but not impossible. At 250, the range is about 44%, which is probably a practical limit. So, at 5,000 photons we are not yet bounded by shot noise.
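Those spreads come from assuming Poisson photon statistics, where the relative fluctuation for N photons is 1/sqrt(N); a quick check reproduces figures close to the ones quoted above:

```python
import math

# 7-sigma relative dose spread from photon shot noise, assuming
# Poisson statistics: relative sigma for N photons is 1/sqrt(N).
for n in (5000, 1000, 250):
    spread = 7 / math.sqrt(n)
    print(f"{n:5d} photons per feature: 7-sigma spread ~ {spread:.1%}")
# -> 9.9%, 22.1%, 44.3%
```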
However, there is another problem heading in the direction you point, and that is resist resolution. Just as in conventional photography, the finer details require a slower exposure. Going to 10nm is probably going to require a less sensitive resist in order to support the finer features. Not sure if this scales inverse linear, or better, or worse.
Also note that today's EUV wavelength is 13.5nm, and it seems unlikely resolution will go below 14nm with it. It might make better 14nm than 193nm can, with fewer restrictions on feasible geometry, but it is not clear if the optics can ever support sub-wavelength resolutions. So it will be back to the drawing board for a whole new generation of EUV at a more extreme wavelength (7?) if we want to keep going down with that technology. And that will double the photon energy, halving the number of photons for a given dose. Probably the resists will be less sensitive too.
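The photon-energy doubling follows from E = hc/lambda. Taking 6.7 nm as the shorter wavelength (my assumption for the "7?" above, since that figure has been discussed for beyond-EUV):

```python
# Photon energy scales inversely with wavelength: E = h*c / lambda.
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electron-volt

for wavelength_nm in (13.5, 6.7):   # today's EUV, and an assumed next generation
    E_eV = h * c / (wavelength_nm * 1e-9) / eV
    print(f"{wavelength_nm} nm -> {E_eV:.0f} eV per photon")
# 13.5 nm -> 92 eV per photon
# 6.7 nm -> 185 eV per photon
```

Since dose is specified in energy per area (mJ/cm2), doubling the energy per photon halves the photon count at fixed dose, which makes the shot-noise problem above worse.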
Got a citation for that? I'm curious why 10% would not be good (as the 6 or 7 sigma limit, not as sigma itself). Seems like it leaves plenty of margin to distinguish between exposed or non-exposed resist. If you have pointers to a discussion of the finer points and gotchas of a 10% spread, that would be much appreciated!
It's the tone of the announcement that could be misleading. They could have more simply announced reaching 40W, which is consistent with 30 WPH or 600 WPD. It seems the cost of running real resist-coated wafers was too high, which is actually another big concern.
Thanks for the reference! Not sure if slide 37 is the right one? Perhaps you meant 32/33? And compare to slide 43 where 40 vs. 20 mJ demonstrate that there is a fairly wide latitude in exposure. Admittedly, that assumes they crank up the power, which reduces thruput.
You are right this has not been a problem with current UV, where there are about 15x more photons. However, EUV should be able to expose in one step what current UV needs 2 or 4 overlaid steps to achieve, so in a sense the shot noise is going to eat into margins donated by not having the errors of multiple steps.
For sure, you are right this kind of problem is new and will get worse. It works counter to using more sensitive resists, it works counter to higher thruput, and it works counter to finer resolutions.
@TanjB, the exposure latitude is for 10% CD off target at best focus (by most references). Of course there is also a focus window, which normally accounts for another 10%. When not at best focus, the exposure latitude can go down significantly; some of the graphs already show that. Some features like lines or arrays can have enhanced exposure latitude with some type of off-axis illumination, but for random metal or poly layouts, there is always going to be some trouble spot with much reduced exposure latitude or depth of focus.
You're right about the trade-off of multiple steps with overlay, so there has been some move toward more self-aligned patterning, but this actually restricts the layout quite a bit.
So there are still many patterning options being studied, but I think a greater concern is the demand for smaller geometries, in terms of how many companies will actually design/manufacture sub-20 nm.
IBM had been trying to use the NXE3300B, and it might be true that they got it going; it is hard to judge whether the claim is genuine or not. Analysts, forecasters, astrologers: a similar category, and you cannot take their words seriously or accurately. Technical details are missing.
Congratulations to the analyst for a substantive, technical, "truth to power" analysis. It may be the first I've ever seen. Usually the analysts just take the buzz on the street and wrap it around the corporate press release. As for IBM's motivation for results that were "deliberately misleading", I'll be interested to see what unfolds. Was it negligence, or misdirection because they don't want competitors replicating their approach and building upon their early work? I can't see why IBM would have errors in their data when they understand better than anyone else what they're doing.