Only this industry would spend billions on 200 mm fabs, then shut them down while spending billions more on fabs for larger wafers, all while selling the resulting ICs for a dollar or a few dollars apiece. In some ways, this seems to me like ongoing insanity.
My question: at what point does the merry-go-round have to stop?
I think that unless you have reasonably high margins (like Intel, for instance), it doesn't make sense. The business model for most semiconductor companies hasn't worked well for at least the last decade; the average return on investment is very low.
Like any other manufacturing industry, there comes a point where it makes more economic sense to shutter existing facilities and build new ones than to continue operating the old. Of course, that decision usually rests on economic projections, which may or may not prove accurate, and a manufacturer that waits too long will find the competition has already pulled ahead (and certainly isn't going to wait). With the pace of today's technological advances, we see this happening more and more often.
What are the engineering and design challenges in creating successful IoT devices? These devices are typically small, resource-constrained electronics designed to sense, collect, send, and/or interpret data, and some must be smart enough to act on that data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons learned from IoT deployments.