I noted the point about chips being funded by non-US, non-VC sources in places like China and India. This is definitely a trend to watch out for, as the source of the money tells us a great deal about the intentions of the project in any situation.
I have been saying the same thing for a long time. I do not see the value in funding semiconductor companies. For all their trouble, no one values them: Instagram was worth $1B with five staff and no bank account, while Analog Devices, with thousands of staff, $2B+ in revenue, great products, and a history of innovation, cannot cross $15B. If people value web firms more, and VCs are in this to make money, the best roadmap is to fund web companies.
>> This is definitely a trend to watch out for, as the source of the money tells us a great deal about the intentions of the project in any situation.
The #1 reason is that in the last 10 years, no semiconductor company has had an exit of even $1B, while Dropbox is now worth $10B. So, if that trend continues, there is no reason to waste capital on semiconductors when web companies do better. China runs state capitalism, which means profit is not the only motive. It is government money, and I see them leading down that path in the years to come.
I disagree; profit certainly IS a motive for Chinese chip companies. If you do your homework and research which are the best-selling and fastest-growing chip companies in the PRC's domestic market, you will see that there are no state-funded companies at the top.
Roughly 20 of the top 50 mainland Chinese chip companies reached 3x tangible book value within less than 3 years of starting operations, which is quite fast compared to what we see here.
I can only agree with you on the point that there is no opportunity for conventional types of investment in China, as the PRC's legal environment implicitly disallows foreign VC operations.
Only government/publicly funded R&D can keep the fire burning in obscure areas and keep disruptive new science from being suppressed by vested interests.
Though T. J. Rodgers does keep rolling with the punches (PSoC), he seems to think too highly of his contributions. After all, he is still walking a path created by others like Bob Noyce, one first charted with government/taxpayer money.
"Disruptive new science" should have its share of attention, but down-to-earth start-ups achieve the bigger successes. The biggest progress in the SoC industry in the last decade was not in "disruptive" areas but in SoCs for low-end consumer electronics, and it went largely unnoticed.
Also, if Intel's "first product" was in 1969 and the Apollo missions ran from 1961 to 1975, then your assertion begs many questions. Which Apollo mission do you refer to? How many devices did those rockets use? Enough to sustain Intel's rapid early growth? That's not likely, even at the out-of-control prices the government pays. That leaves direct funding, then. Did NASA directly fund Intel's development of the products listed above? How much? Were there any other significant investors, or would Intel have failed to launch (pun intended) without NASA's (ah... I mean our fathers') money?
One of the main reasons for this situation is confusion between IMPROVEMENTS and true innovation. Improvements driven by Moore's Law have run their course: making CMOS smaller, albeit extremely successful, has attained its peak, and the FinFET/SOI paradigm is the exit point. Meanwhile, devices are clearly stuck on the need for better memories, as high-scale SoC integration is clearly the future. It is also a serious innovation bottleneck when we have FPGAs without nonvolatile memory at the advanced nodes, high-end FLASH with poor endurance and no real power savings, and embedded devices in need of simpler, lower-cost nonvolatile memories. Add the need for lower-cost solid-state drives and the last hurrah of DDR4 DRAM, and yes, we seem to have no place to run. In the last 30 years, as DRAM and FLASH had their run, the only two memories that were truly novel did NOT come out of Silicon Valley: FeRAM (FRAM) and, more recently, STT-RAM. Now we have a few possibilities with CBRAM, CeRAM, and a plethora of questionable electrochemical RRAMs. I believe all filament-based RRAMs are questionable, including CBRAM (which seems a bit more reasonable until you see their video on YouTube, showing a SILVER electrode injecting SILVER CLUSTERS!).
Since I had something to do with FeRAM, a technology that never had enough funding yet has already put over 1.5 billion chips on the market (Ramtron, Fujitsu, and Panasonic, now TI), I can say that today FeRAM is a good, mature answer for embedded. Some confuse FRAM with FeRAM, so here is the difference: FRAM is based on PZT and FeRAM is based on SBT. Symetrix developed SBT for Panasonic, and it is an extremely low-power, scalable device that can be made down to 25 nm thick and sports 10x less power and more speed than FRAM. See the next post; EETimes limits the size of comments.
Today you may hear about CeRAM, and as a biased promoter of this technology, I invite you to take a look at the symetriscorp web site. Why did I switch from FeRAM to CeRAM? FeRAM can be scaled further, but CeRAM is already here to be a serious game changer in nonvolatile memories. CeRAM is an RRAM that uses many-body physics: yes, that confusing world for those who studied only semiconductors and simple band theory and go around thinking that everything is a semiconductor. Such people hold the top positions in the Moore's Law-based semiconductor companies, and they do not understand that there has never really been a semiconductor-based NVM. Shocking? Yes: nonvolatility requires hysteresis of something inside the material. In FeRAM/FRAM it is the polarization of ions; in STT-RAM, the polarization of spins; in FLASH, the interface polarization of trapped charge in the floating gate; and in filament-type RRAMs, including CBRAM, the local polarization of charge due to structural deformations between the filament and the electrodes (it may be charge-trap dominated or just redox reactions). Let us not forget the other structural-change devices, the infamous PCM and the "memristors".
Since there are only three intrinsic material properties in nature (as visibly demonstrated in Maxwell's equations), conductivity, dielectric constant, and magnetic permeability (R, C, L), we have to look for HYSTERESIS (latent storage at zero field/voltage) in these properties. FeRAM/FRAM uses C; STT-RAM uses L; and RRAMs and CeRAM use sigma (conductivity). Now, sigma is one of those one-size-fits-all, grossly misunderstood properties, and when sigma goes from zero (insulator) to a finite value (metal-like), everyone has a pet theory; "filaments" have many poetic theoretical descriptions. But physicists have known for exactly 50 years (1963-2013) what I am talking about: metal-insulator transitions that modify conductivity can be tailored into essentially quantum switches with incredible speeds and low power. And being quantum in a very fundamental way (the control of one electron entering and leaving a single orbital, thus violating band theory drastically, as known since 1937), there is a chance that switches with speeds in the tens of femtoseconds and extremely low power exist. For now, we can use these 1-2 mask, 100% CMOS-friendly devices for NVMs. They are simple to make and have these bare (no advanced nodes done yet) characteristics: storage temperature of a memory state >400 C; current density at operation 3000 A/cm2 (can be adjusted higher); read endurance virtually endless (tested to 1E12 with no change); write endurance hard to measure, but over 1E11 is predicted (hard to measure because of the large areas of our devices: too low an impedance for pulse testing); and an operating temperature range from 4 K (-269 C) to 150 C or more. That is OPERATING, not just storing.
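The hysteresis requirement is easy to make concrete. Below is a toy two-state model of a resistive switch (my own illustration, not the CeRAM physics; the threshold voltages and resistance values are made up) showing the one property every NVM mechanism listed here shares: the state survives at zero bias.

```python
# Toy model of a hysteretic two-state resistive memory cell.
# Thresholds and resistances are illustrative, not real device numbers.

V_SET = 1.0              # bias that switches to the low-resistance state
V_RESET = -1.0           # bias that switches back to the high-resistance state
R_ON, R_OFF = 1e3, 1e6   # ohms, arbitrary

def step(state, v):
    """Update the cell state for one applied bias v."""
    if v >= V_SET:
        return "on"
    if v <= V_RESET:
        return "off"
    return state  # below threshold: hysteresis keeps the previous state

def read_resistance(state):
    return R_ON if state == "on" else R_OFF

# Write a '1' (SET pulse), then read at zero bias: the state is retained.
state = "off"
state = step(state, 1.2)       # SET pulse
state = step(state, 0.0)       # remove bias
print(read_resistance(state))  # low resistance: stored '1'

# A RESET pulse erases it; again the state persists at zero bias.
state = step(state, -1.2)
state = step(state, 0.0)
print(read_resistance(state))  # high resistance: stored '0'
```

The point of the sketch is the `return state` branch: between the thresholds the output depends on history, not on the present input, which is exactly the "latent storage at zero field/voltage" described above.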
So, just as the individuals on the panel missed ferroelectrics when I wrote the first proposal in 1983, they may miss this one too. Why? Because they are advised by old Detroit-style know-it-alls who have a hammer and see every problem as a nail. I spent six years developing the CeRAM (correlated-electron RAM) technology with my team and achieved patents issued worldwide. Now it is time to go into development, and now we will start to publish. Yes, it can do array-only and 3D embodiments, and it works on many materials platforms. It is contact agnostic (not just platinum) and can be made with 250 C ALD deposition. Never heard of it? Ask your CTO to explain many-body electron-electron interactions as the basis of switching and storage in transition metal oxides. If he can't, you are in Detroit in the '70s. Innovation is only true with science; anything else brings only one or two generations of devices.
The decline of VC in semiconductors is not new. VC firms look for the big, fast reward; whether dot-com or cloud, risk is OK as long as there is potential for big, fast returns. Semiconductors are too mature now: too much work for too slow a return. There is clearly room for innovation given the diminishing returns of improvements from shrinking geometries.
If the hypothesis is that semiconductor innovation requires VC investment in semiconductor manufacturing, I disagree. I believe that, besides a huge increase in R&D by the larger existing semiconductor providers, a new model of innovation is the proliferation of small IP developers with bright new ideas. Many won't make it, but some will offer large returns as their licensee portfolios grow or they get gobbled up by a large semiconductor market leader.
I know of one that has been in development for years and has a product launch on March 5th, but remains in stealth mode until then. I believe it clearly evidences a counterargument. If you can contact me directly, I can discuss it with you prior to launch.
Yes, this may be a different model. Since the pioneers of this industry have earned good money, they themselves may be investing in new innovation; they may not need VCs so much as new technology. The fabless model also makes it possible to innovate in any part of the world, like China, India, or Eastern European countries.
>> If the hypothesis is that semiconductor innovation requires VC investment in semiconductor manufacturing, I disagree.
I am not sure anyone is talking about fabs. We are discussing the fabless strategy, where innovation in circuits and systems rules. If you look closely, we are not attracting a lot of dollars in the design-innovation phase. No one is talking about production innovation; that is not a VC kind of game.
The wealth is now in the vertical system play, and the wealthy leaders in this space have an opportunity to capture innovation with a spin-in approach. It also makes sense with the trend towards specialisation: they choose the innovations that meet their very specific needs. There can be no Internet of Things without the Things. Start-ups need to look to the Apples and Microsofts as the new venture money, and these behemoths need to embrace the best of the start-ups as spin-ins.
Not sure I can agree with your statement about IoT. In fact, I believe the "Things" are mostly items consumers DON'T purchase but that are part of the infrastructure: vending machines, parking and utility meters, etc. And certainly the Apples of the world aren't going to be (very) involved in designing the occasional "connected" refrigerator that needs to ask for the icemaker's filter to be replaced. The real challenge with IoT will be REALLY good security for next to no cost. Averaged across the industry, I wonder how much was spent per PC before we arrived at the "kind of decent" security we have now? IoT security will have to be an order of magnitude better for a few percent of the cost; that will be the real challenge, and even funding for THAT will be darned hard to come by. If we don't solve it, there will be "Target-sized" security breaches of everyone's electric meters all over the media every week or so until it gets fixed correctly.
Interesting observations about how semiconductor companies are investing more and more in software development, which they have to give away for free in order to sell chips, while at the same time VCs have all moved on to web companies... whose only real product is software.
IMO, a silicon company that keeps the details of how to program its chips proprietary, so that it is the only software source, is not in a position to complain about how much it "must" spend to develop that software. IMO they should release the programming details so that the FaiF open-source community can work its magic. Cypress almost does this with PSoC: nearly all the register bits needed to program it are in public documents, though a key part is closed. Broadcom's documents are usually completely closed.
Actually, it is not true that innovation (or VC funding) is not happening in the chip industry. The innovation (and the funding) is shifting from one domain of the chip industry (digital SoCs) to others (power electronics, RF, MEMS, etc.).
There is little scope for innovation in digital or mixed-signal SoCs targeted at the communications and consumer markets. Those markets are saturated by big players, and the investment required for SoCs targeting them is very high, making them unattractive to VCs.
But there are new, upcoming markets like automotive and medical where the need for SoCs exists. The difference is that the technologies that dominate those markets are different: RF, MEMS, power electronics, etc. The investment for those SoCs (and even for their fabs) is low. The 21st century of the semiconductor industry will belong to RF and MEMS.
Just because the economics of venture investing in semiconductor startups (usually) no longer make sense, that doesn't mean IC technology, applications, and product ubiquity aren't still advancing (often into uncharted waters).
Especially when you consider the cost-vs-return potential offered by web/mobile app development, it becomes virtually impossible to justify investing in a semi venture that will take (a) $50M to develop a new product (minimum), and (b) another $100M to deploy it while getting the next product generation developed, (c) all before the opportunity window closes because one or more of the better-capitalized behemoths in the industry notices what you're doing and (if it seems successful) gets a competing product to the next technology node before you can.
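It's worth making that arithmetic explicit. A back-of-envelope sketch (the cost figures are the rough estimates from this comment, and the 10x return target is an illustrative assumption of mine, not anything a fund has stated):

```python
# Back-of-envelope venture math for a semiconductor startup.
# Cost figures are the rough estimates from the comment above;
# the 10x return target is an illustrative assumption.

develop_cost = 50e6   # (a) minimum to develop a new product
deploy_cost = 100e6   # (b) to deploy it and fund the next generation
capital_in = develop_cost + deploy_cost

target_multiple = 10  # assumed early-stage VC return target
required_exit = capital_in * target_multiple

print(f"Capital required: ${capital_in / 1e6:.0f}M")
print(f"Exit needed for a {target_multiple}x return: ${required_exit / 1e9:.1f}B")
```

Under these assumptions the fund needs a $1.5B exit, which, set against the thread's earlier observation that no semiconductor company has managed even a $1B exit in a decade, explains the reluctance by itself.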
There **are** niche opportunities out there. But finding a profitable one with big-exit potential that can simultaneously be developed without spending $200M is like looking for that needle in a pile of needles.
If, on the other hand, most venture capital investors weren't so reluctant to embrace an IP business model, there might be more opportunity to develop and deploy valuable technologies and let the startup's customers bear the cost and risk of silicon development.
Every development has a cycle. Computers get faster every day, yet a regular user probably doesn't feel a difference between a three-year-old laptop and today's. However, more products and information are moving to the Internet, and those require software engineers to develop. In addition, today's web-based applications aren't so easy anymore; a software engineer can hardly get by knowing only three languages. Today belongs to the software engineer. Yet when there is a semiconductor breakthrough, such as using a material other than silicon, hardware engineers will thrive again.
There are few of these types of shows that I regret not being able to attend. This one looks like a glaring exception to that rule. I would love to have been there to hear these movers and shakers speak.
>> We are looking into a Plutonium chip which we expect to self power a SOC and external memory.
You may need to use a different description, as most people may be turned off by knowing plutonium is inside your product. What if there is a fire and the thing explodes? Are they safe? Think of why we use "MRI" as a name: to hide what makes people uncomfortable.