In the absence of published data, I would have to concur with Mr. Robert Maire's observations, including his point on using wafers with resists. But I wouldn't go as far as saying "All that happened was that wafers were moved from the input FOUP to the output FOUP!" I think there was more accomplished here than that!
IBM did admit that the testing scope was limited to "power level and reliability" studies of the EUV source. However, I disagree with the statement "Putting resist on the wafers would have had no value whatsoever"; the EUV study would have gone much further using wafers with resist.
I am not sure the announcement warranted the run-up in stock prices!
IBM did indeed have a minor breakthrough, exposing 637 wafers at 20mJ/cm2 in 24 hours with a machine running at 77% uptime. IBM admitted from the outset this was NOT enough to make EUV viable for the 10nm node. Everyone, IBM included, is still HOPING EUV will be ready for the 7nm node at the earliest.
The real breakthroughs will be getting 100+ wafers/HOUR out at something more like 50mJ/cm2, with good, usable wafers patterned at the dimensions needed for 10-7nm node work.
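To put those numbers side by side, here is a rough back-of-the-envelope sketch. It assumes, as a simplification, that exposure time scales linearly with dose; real throughput also depends on stage overhead, field count, and source duty cycle, so treat the result as an order-of-magnitude gap, not a precise figure.

```python
# Demonstrated run (IBM's reported numbers)
demo_wafers = 637          # wafers exposed
demo_hours = 24            # over 24 hours
demo_dose = 20.0           # mJ/cm^2

# Production target discussed above
target_wph = 100           # wafers per hour
target_dose = 50.0         # mJ/cm^2

demo_wph = demo_wafers / demo_hours        # ~26.5 wafers/hour demonstrated
dose_penalty = target_dose / demo_dose     # 2.5x longer exposure per wafer

# Combined improvement needed if exposure time scales with dose
required_speedup = (target_wph / demo_wph) * dose_penalty

print(f"Demonstrated: {demo_wph:.1f} wafers/hour at {demo_dose:.0f} mJ/cm^2")
print(f"Rough gap to target: ~{required_speedup:.1f}x in effective source power/throughput")
```

Even under these generous assumptions, closing the gap takes nearly an order of magnitude of improvement, which is why source power remains the headline problem.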
Let's cross our fingers some brilliant engineers can make that happen in the next 12 months!
@Rick: You may have attended Nikon's Lithovision (held adjacent to SPIE's Lithography conference) in past years on this EUV topic. As far back as 5 years ago (I have not been there in the last two years), attendees were told EUV was almost here and that migrating to the next tech node was guaranteed to work! I am sure you heard this proclamation repeated from 28nm onward to what is now the 7/9nm nodes. Yet EUV still remains a mirage!
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.