Remember, those early days had problems not just with lithography. They were learning a lot about materials, too, and scaling down produced multiple changes in the understanding and behavior of devices. In 1975 I'm not sure they had even discovered the problem of sodium traces in equipment poisoning the integrated circuits. They had to learn to grow uniform thin layers of many materials, change to copper wires (copper is also tricky if it gets in the wrong places), invent new kinds of insulators, investigate various doping regimes, etc. Every generation of shrink has been a broad learning problem; it isn't simply a matter of optics.
2. At the early stages (1975-1997), where research was cheaper than today, it might have been the economic tradeoff, or it might have been the pace-setting effect of Moore's law itself. We'd need to dig deeper to know which is which. But maybe one hint that we could've done better is the fact that back then there were 20 companies all keeping up with Moore's law, versus today's few companies struggling to keep up with the law, being late, etc.
3. Yes, I do believe we could have built useful stuff earlier. If you look at the applications people had at the research level in 1975 or earlier in basic form, you'll see lots of the stuff we use today: computer mouse/windows/printers/games/personal computers/3D CAD/simulation/high-level languages. I'm sure visionary people at that time saw the potential. Surely more transistors would have greatly helped.
As to the question of how: at the time they had the IBM System/360, so I guess they could've managed to design an interesting chip with a few million transistors (which fits in 250nm), even if it's mostly lots of memory (maybe with a basic cache), wider buses, fast transistors, and floating-point ALUs, all stuff known at the time. And reasonable characterization of transistors seems possible at that time.
1) It is definitely an observation, not a law (like Boyle's law) or a theory (like evolution).
2) It is a tradeoff between economics and technology, i.e. essentially balancing "how much better do we need to make something in order to sell it?" against "how much will it cost to make that change?", with the liberal complication of how long it takes to make the change versus where market expectations will be at that time.
3) So yes, we could have stepped processes at a faster pace, BUT the cost of the steps would have been too high to justify taking them. Who needed a microprocessor with a billion transistors in the 1970s, for instance? If no one needed it, then there would be no point in developing the technology necessary to produce it at that time. Alternatively, if you produced a 4004 on a modern process it would be so tiny that this would itself cause problems: for instance, it would not be able to drive any sensible load at the 15V it was supplied with.
4) Look at the costs of going to larger wafers today: we are delaying the move to 450mm because the incremental cost is too high. That change looks relatively simple to an outsider, but it means redesigning every machine that carries the wafers.
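To put point 3 in numbers, here's a minimal back-of-the-envelope sketch of the observed pace. It assumes the usual reference point (the Intel 4004, 1971, roughly 2,300 transistors) and an idealized doubling every two years; real cadences varied, so the dates are illustrative only.

```python
import math

# Reference point: Intel 4004 (1971), ~2,300 transistors (commonly cited figure)
BASE_YEAR = 1971
BASE_COUNT = 2300
DOUBLING_YEARS = 2.0  # idealized Moore's-law cadence; an assumption, not a constant

def transistors_in(year):
    """Projected transistor count in a given year, doubling every 2 years."""
    return BASE_COUNT * 2 ** ((year - BASE_YEAR) / DOUBLING_YEARS)

def year_to_reach(target_count):
    """Year at which the projection first reaches target_count transistors."""
    return BASE_YEAR + DOUBLING_YEARS * math.log2(target_count / BASE_COUNT)

# Under this pace, a billion-transistor chip arrives only in the late 2000s,
# i.e. roughly 37 doublings after 1971:
print(round(year_to_reach(1e9)))  # ~2008
```

So even at the historical pace, a billion transistors was about 37 doublings away from 1971, which is the economic point: there was no market to pay for leapfrogging that far ahead.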
Zvi, I want to ask you a historical question, as a semi-expert:
I read somewhere that in the '50s we could have had 250nm/180nm; it wasn't that far off technically. But Moore came along, set the pace of the industry with his law, and we got 250nm only in '97. Does that make sense?