Companies in the semiconductor industry are pushing the boundaries of nature to develop the latest technologies for high-end IC manufacturing. IBM, without question, wants to keep its place among them.
That is only about 3 joules per wafer, assuming illuminated features are 25% of the surface area. On a 100 micron wafer, expansion will be about 1 part per million. However, the chuck may be 100 times more massive, made of a low-expansion material, and probably on a temperature-controlled mount, so it seems this aspect could be routinely controlled. The wafer would need to be held at a repeatable, predictable temperature to within 0.02 C to keep expansion across a 2 cm field to 1 nm - any design for high-accuracy lithography, regardless of energy source, must aim for that level of control.
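A quick back-of-envelope check of that 0.02 C figure, using a textbook value for silicon's thermal expansion coefficient (the exact numbers are my assumptions, not from the thread):

```python
# How tight must wafer temperature control be to hold thermal
# expansion across a 2 cm field to 1 nm?
ALPHA_SI = 2.6e-6   # thermal expansion coefficient of silicon, 1/K (approx.)
field = 0.02        # field width, m (2 cm)
budget = 1e-9       # allowed expansion across the field, m (1 nm)

# strain = ALPHA_SI * dT, and expansion = strain * field,
# so the allowed temperature drift is:
dT = budget / (ALPHA_SI * field)
print(f"allowed temperature drift: {dT:.3f} K")  # ~0.019 K
```

That comes out near 0.02 K, consistent with the figure above.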
After enough EUV-heated wafers (the heating comes from absorption), the chuck itself warms and expands by some ppm, which matters over 300 mm. In fact this applies even without resist, but resist patterning also involves overlay.
@TanjB, the exposure latitude is for 10% CD off target at best focus (by most references). Of course there is also a focus window, which normally accounts for another 10%. When not at best focus, the exposure latitude can go down significantly; some of the graphs already show that. Some features like lines or arrays can have enhanced exposure latitude with some type of off-axis illumination, but for random metal or poly layouts there is always going to be some trouble spot with much reduced exposure latitude or depth of focus.
You're right about the trade-off of multiple steps with overlay, so there has been some move toward more self-aligned patterning, but this actually restricts the layout quite a bit.
So there are still many patterning options being studied, but I think a greater concern is the demand for smaller geometries, in terms of how many companies will actually design/manufacture sub-20 nm.
Congratulations to the analyst for a substantive, technical, "truth to power" analysis. It may be the first I've ever seen. Usually the analysts just take the buzz on the street and wrap it around the corporate press release. As far as IBM's motivation for "results were deliberately misleading," I'll be interested to see what unfolds. Was it negligence, or misdirection because they don't want competitors replicating their approach and building upon their early work? I can't see why IBM would have errors in their data when they understand better than anyone else what they're doing.
Thanks for the reference! Not sure if slide 37 is the right one? Perhaps you meant 32/33? And compare to slide 43 where 40 vs. 20 mJ demonstrate that there is a fairly wide latitude in exposure. Admittedly, that assumes they crank up the power, which reduces thruput.
You are right this has not been a problem with current UV, where there are about 15x more photons. However, EUV should be able to expose in one step what current UV needs 2 or 4 overlaid steps to achieve, so in a sense the shot noise is going to eat into the margin gained by not having the errors of multiple steps.
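A rough sanity check of that "~15x more photons" figure (my own estimate, assuming 13.5 nm EUV vs. 193 nm ArF): at equal dose, photon count scales inversely with photon energy.

```python
# Photons per unit dose: EUV (13.5 nm) vs. ArF (193 nm).
HC_EV_NM = 1239.84            # h*c in eV*nm
e_euv = HC_EV_NM / 13.5       # EUV photon energy, ~92 eV
e_arf = HC_EV_NM / 193.0      # ArF photon energy, ~6.4 eV

# For the same mJ/cm^2 dose, photon count ~ 1/photon_energy:
ratio = e_euv / e_arf
print(f"{ratio:.1f}x more photons at 193 nm for equal dose")  # ~14.3x
```

Since the relative dose fluctuation per feature goes as 1/sqrt(N), equal-dose EUV is roughly sqrt(14) ~ 3.8x noisier, which is why the shot-noise margin shrinks unless dose (and thus thruput) gives way.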
For sure, you are right this kind of problem is new and will get worse. It works counter to using more sensitive resists, it works counter to higher thruput, and it works counter to finer resolutions.