EUV, especially for NVM, is a giant head fake, orchestrated principally by Samsung. Their stated corporate strategy has always been to use their overwhelming presence and financial strength to club their competition to death, financially. EUV is, in theory, a perfect manifestation of that strategy. Imagine an NVM fab with, say, 10 EUV tools, at $100M each, running at 50 WPH max. NVM is the most elastic of devices, chasing the Nirvana of Harddiskland, and cost is the barrier. The only technology with a chance of enabling NVM to threaten hard disks is imprint. And Toshiba has figured this out.
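To put a rough number on that hypothetical fab, here is a back-of-the-envelope depreciation cost per wafer. Only the 10 tools at $100M and 50 WPH come from the comment above; the 5-year straight-line depreciation and 80% uptime are my own illustrative assumptions.

```python
# Rough litho-tool depreciation cost per wafer for the hypothetical NVM fab:
# 10 EUV tools at $100M each, 50 wafers/hour per tool.
# 5-year depreciation and 80% uptime are assumptions, not from the comment.
n_tools = 10
tool_cost = 100e6   # $ per tool
wph = 50            # wafers per hour per tool
years = 5
uptime = 0.80

total_capex = n_tools * tool_cost
wafers = n_tools * wph * uptime * years * 365 * 24
cost_per_wafer = total_capex / wafers

print(f"${cost_per_wafer:.0f} per wafer in EUV tool depreciation alone")  # roughly $57
```

Even under these generous assumptions, the litho tools alone add tens of dollars per wafer, which is exactly the kind of cost barrier the comment is pointing at for commodity NVM.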
My understanding from talking to someone at the company about why they thought ReRAM needed EUV is that, in a high-density array, the ReRAM element would need a selector (an isolation device such as a diode) at every cross-line intersection. That is not so easy to do in the BiCS architecture. To make up the cost difference, the diode-based architecture needs to use the smallest allowed design rules, much narrower than those used for BiCS. In turn, this architecture would use fewer levels than BiCS.
But some groups are already exploring eliminating this isolation device. If one of those approaches works, the BiCS architecture may be applied to ReRAM.
At this year's SPIE EUV conference, Zeiss added a couple more concerns regarding the larger angles needed for higher resolution (~10 nm). Compensating with increased demagnification would require either increasing the mask substrate size or stitching together multiple exposures.
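The demagnification tradeoff Zeiss raised is simple geometry: the printed field is the mask's usable pattern area divided by the reduction ratio. A quick sketch, assuming a standard 6-inch mask with a 104 mm x 132 mm usable area (my assumption for illustration, not a figure from the conference):

```python
# Printed field size vs. demagnification, assuming a conventional 6" mask
# with a 104 mm x 132 mm usable pattern area (illustrative numbers).
MASK_AREA = (104.0, 132.0)  # mm, usable pattern area on the mask

def field_size(demag):
    """Wafer-side field dimensions in mm for a given reduction ratio."""
    return tuple(d / demag for d in MASK_AREA)

print(field_size(4))  # (26.0, 33.0) -- the conventional full scanner field
print(field_size(8))  # (13.0, 16.5) -- quarter the area: stitch exposures or grow the mask
```

Doubling the demagnification quarters the field area, which is exactly why the options come down to a bigger mask substrate or stitched exposures.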
The shot noise issue was already raised last year. The conventional diffraction limit rules out single patterning at longer wavelengths, while quantum shot noise rules out single patterning at shorter wavelengths (the minimum dose eventually cooks the resist). Hence DSA was especially popular at this year's SPIE as a possible cheap multi-patterning solution.
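The shot noise point follows from Poisson statistics: with N photons in a feature-sized pixel, the relative dose fluctuation goes as 1/sqrt(N). A minimal sketch, assuming a 20 mJ/cm^2 dose and a 10 nm x 10 nm pixel (my illustrative numbers; the absorbed-photon count is lower still, so the real noise is worse):

```python
# Back-of-the-envelope shot-noise estimate for EUV (13.5 nm).
# Assumed for illustration: 20 mJ/cm^2 incident dose, 10 nm x 10 nm pixel.
H = 6.626e-34          # Planck constant, J*s
C = 2.998e8            # speed of light, m/s
WAVELENGTH = 13.5e-9   # m

photon_energy = H * C / WAVELENGTH      # ~1.47e-17 J (~92 eV per photon)
dose = 20e-3 * 1e4                      # 20 mJ/cm^2 converted to J/m^2
pixel_area = (10e-9) ** 2               # (10 nm)^2 in m^2

n_photons = dose * pixel_area / photon_energy
rel_noise = n_photons ** -0.5           # Poisson: sigma/N = 1/sqrt(N)

print(f"photons per pixel: {n_photons:.0f}")    # ~1360
print(f"relative dose noise: {rel_noise:.1%}")  # ~2.7%
```

A few percent dose noise at the pixel level translates directly into line-edge roughness, and because each EUV photon carries ~14x the energy of a 193 nm photon, the photon count at a given dose is correspondingly smaller. That is why shrinking wavelength further only makes the statistics worse.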
The Wikipedia article on EUV (http://en.wikipedia.org/wiki/EUV) seems to imply that there are a lot of barriers to its succeeding. Not only is source power far too low for high throughput (and the efficiency of the light source is shockingly low), but the high energy of the photons induces effects that reduce the sharpness of focus (e.g., secondary electrons scatter into the resist, exposing regions that are meant to stay unexposed). On that basis, EUV does not sound viable to me. The article also implies that by the time EUV is deployed, even EUV might need double patterning, which reduces its appeal still further.
Does anyone know more about this?
I hear EUV is not viable for logic either, for at least the next 5 years.
Real wafer throughput per hour under manufacturing conditions is less than 10 for a ~€100M tool.
To overcome shot noise, EUV power needs to be significantly above 200 W for logic.
Curious that a technology story is fed by a CFO, a first for the industry. Whatever happened to SanDisk's promised introduction in 2H 2013? And isn't this a question for Toshiba, where the real development is being done?