Rick great piece as always. First off, I'll say Sanghi has been a rain cloud for some time, but his success as a CEO is unquestioned. I think he's correct in saying there will be more mergers among established semiconductor companies. Companies in general get to a point where they have so much culture and history that they lose the ability to take risks. This is not just in the semiconductor business.
Will there be a very interesting world of semiconductor development in the coming years? Absolutely! Think about it: One data point right now is that systems companies in the cloud and consumer electronics spaces are building their own semiconductor design teams because they can (and must) optimize silicon for their own vast businesses.
At the other end of the spectrum, there are startups. After I bumped into Sanghi at the ACE Awards Tuesday night I then bumped into Andreas Olofsson, CEO of Adapteva. You've written extensively about him and his company and its Kickstarter funding story.
Sitting in between are EDA and IP companies, whose businesses need to change. They're acting on it because everyone knows we need silicon and systems innovation and everyone knows engineering teams need to be more productive. We abstract more, shift value propositions from one sector to the next as the industry matures.
Microchip CEO Steve Sanghi said "The days of forward-pricing chips based on expected advances in process technology are over. The industry has to change its practices -- you have to make money today, because no one will let you make it tomorrow."
Intel recently announced job cuts of over 5,000 employees and closed Fab 42 in Chandler, AZ for the foreseeable future. Intel said the reason was that it could not respond to customers' requirements for tablets and low-power systems on a chip (SoC). In a recent job posting Google said "Our computational challenges are so big, complex and unique we just can't purchase off-the-shelf hardware. We've got to make it ourselves." Really.
I think it is a sad day when chip companies are mothballing fabs, laying off employees and eking out minuscule raises for their design engineers while their potential customers are designing their own chips. I disagree with Steve Sanghi when he says "The go-go days of double-digit revenue growth are over as the chip business settles into a middle age measured by mid-single-digit annual growth." Why can't Intel, Microchip and all the other chip makers respond quickly, creatively and cost effectively to the changing demands of the chip market? Is it because they are still using the chip design tools and processes of the past?
Gabe Moretti commented in his article The Approaching Discontinuity, "The world of EDA is about to change. The subtle signs are there for all to see, and the coming reality is so different to be scary to some. Thus better not to talk about it. The changes will include how ICs are designed, developed, and verified. They will involve designers, tools developers, and manufacturers, and force an integration that the EDA industry has not experienced so far."
So why aren't EDA companies building better tools? Everyone in the industry knows this is needed, but where are the solutions? Do the entrenched EDA tool suppliers simply think they don't need to change? Are the chip design teams and CAD groups so focused on saving their jobs today that they don't see how this will damage the bottom line tomorrow? When Google announces it is going to circumvent the entire industry and do it themselves, shouldn't chip makers hear a voice in their head saying "Danger Will Robinson. Danger!"?
Without pressure on EDA companies to start building the software chip designers need to stay competitive, sales will continue to erode and more layoffs will ensue. If the chip companies are going to survive, they are going to have to make big changes immediately, including dragging their current EDA suppliers kicking and screaming into the 21st century... or finding one that is already there.
It's not design. As the quote in your first paragraph says, "....based on expected advances in process technology are over...." Process technology is the key phrase here. His views are perfectly in line with other articles appearing these days about Moore's law coming to an end. Double-digit growth cannot be sustained indefinitely.
After many years in the industry watching customers try to do their own design and then come back asking us to do it for them, I think Google will probably fail at its quest to build its own. They will probably underestimate the cost and under-resource the effort. Just throw a few software guys together and have them design the chip we need. Seems like a good idea to get what we want. The problem is that they will have to make and learn from all the mistakes that the large semi companies have been through over the years. The delays and costs will eventually kill the effort.
<<Just throw a few software guys together and have them design the chip we need>>
Your prognosis is not how it is being done. Those system companies hire the same design engineers previously employed in the semiconductor companies that are now laying off engineers, they buy their wafers from TSMC, buy their tools from Cadence, and license their ARM cores just like the semiconductor companies do. There is no secret sauce anymore. It's all available off the street. Don't kid yourself.
I had several conversations with chip, EDA and IP people at EE Live, and the one thing I got out of it is that very few know what is going on in the market. There are EDA guys that have built great tools, but they rely on marketing practices that died 20 years ago, so no one really knows what they have (including them). The semi guys are squeezing their tool and material suppliers to cut costs and, at the same time, killing technology advances that could do the cost-cutting job for them, and the IP guys are chasing after each other's customers and ignoring potential new markets for their ideas.
I think Sanghi is probably right about the future, but it doesn't have to be that way.
Sanghi says that the model is broken. Here is an example why.
A 19-year-old can go on the internet, buy a bunch of 50-cent parts overnight, build a prototype system with Bluetooth, WiFi, ARM processors, etc. and test the product interest on Kickstarter. If he's lucky, he'll sell $50,000 worth of product, pull in another $300,000 from keen investors who monitor those sites, and build a sustaining business in less than two years selling a few million dollars in hardware. And the semiconductor companies that enable this--the ones that get 50 cents--well, they get a whopping $10,000, and only if and when the young person's company makes a few million.
The economics of this broken model don't make sense. Intensive research and development with multiple PhDs on the staff, sophisticated multi-million dollar tools, and complex supply chains have little value anymore. It is all reduced to 50 cent parts.
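The chip vendor's sliver of that Kickstarter business can be put on the back of an envelope. This is only an illustrative sketch: the $50 retail price and one 50-cent chip per unit are assumed numbers chosen to match the thread's $10,000 figure, not data from the article.

```python
# Back-of-envelope: the chip vendor's cut of a small hardware business.
# Assumed figures: one 50-cent chip per unit, $50 retail price per unit.

part_price = 0.50        # dollars per chip (from the thread)
chips_per_unit = 1       # assumed
unit_price = 50.00       # assumed retail price of the finished gadget

# Units the young company must sell to gross $1 million.
units_for_a_million = 1_000_000 / unit_price

# Revenue the semiconductor vendor sees on those same units.
vendor_revenue = units_for_a_million * chips_per_unit * part_price

print(f"Units sold: {units_for_a_million:.0f}")
print(f"Chip vendor revenue: ${vendor_revenue:,.0f}")
print(f"Vendor share of system revenue: {vendor_revenue / 1_000_000:.1%}")
```

Under those assumptions the vendor collects $10,000, about 1% of the system maker's million, which is the lopsided split the comment is complaining about.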
Now that it is back to basics for the old-school semiconductor companies, selling piece parts one $70,000 purchase order at a time, it's time to rethink how those 50-cent parts are sold. The idea that the price of a semiconductor device should be based upon the cost of goods is outdated. What's the cost of goods sold for software downloaded on the internet? Music downloaded? Who knows. Who cares. But somehow Wall Street establishes billion-dollar valuations for those IP companies without worrying about COGS. But in the semiconductor space it is all about gross margin, the multiple on cost of goods sold--a business model that died in the middle of the dot-com revolution.
Scott: I don't get something. There must be some reason companies don't just price their components higher, and it's not Wall Street, because Wall Street wouldn't mind higher profits.
Maybe the microcontrollers and Bluetooth devices are mostly commodities and are mostly built into commodity products. If you won't sell cheap, some other guy will.
Heck, if I want I can just go get a Bluetooth core to build my Bluetooth design on. There are 42 such cores available at . Imagine what that must do to prices.
But you, as an MCU seller, say: not true, I just added these great peripherals to this expensive chip. I am not a commodity. So the board designer buys a cheaper chip and uses software, a 50-cent FPGA or a million other tricks to solve his problem for less. Yes, maybe that won't be the most optimal design, but still, it'll be cheaper, and most designs are cost constrained. And anyway, in a minute and a half a competitor will rise.
As for your comparison with software: the marginal price of most software is ZERO. Most of the value in those billion-dollar software companies is in community and network effects. The x86 companies have managed to build those effects into their chips and hence their value, but most chip companies can't.
Another business model for apps is free-to-play, which manipulates users psychologically into paying for virtual goods. I truly hope there won't be such a model applicable to chips.
Hi Alex, you're right. The prices are set by supply and demand. When parts all do roughly the same thing, they are a commodity. You're not going to buy the $1 part when a $0.50 one does the same job. It seems like the parts should sell for a lot more. Once upon a time a transistor was pricey. As soon as one competitor figures out how to drive the cost down, it's offered at a lower price and it's a race to the bottom.
It's very hard to add value to products if the spec is all that matters. One purse that holds 1 L doesn't sell for the same price as other purses that hold 1 L because there is a great difference in perceived value. In electronics this difference is largely zero... trust that the part will work as advertised may still be worth something.
The only way a chip is going to command top dollar is if it can do something important that no other chip/solution can do.
Absolutely Wall Street is working correctly. If you sell a complex product whose price is based upon the cost of goods rather than the perceived value, you are selling something anyone can do. That's the problem that needs to be solved.
A company can ride out the innovations of the original entrepreneurial team only so far. But in the semiconductor business, most of those guys have left the building (like Elvis). Even Sanghi, according to this article, is planning on leaving soon. What is left is generally people who keep that "stone wheel" turning, but that only lasts so long before a new generation of entrepreneurs further up the food chain take over.
The semiconductor companies are exactly where the module companies were in the 1980s when higher performance was available on an IC, many times for less money. And the module companies displaced the discrete component companies before them. Now we are at the point in technology evolution where it is all about systems.
Today, most systems are unique, so it is not generally possible to define a system level IC that is used by many—a conflicting requirement for successful semiconductor products. But that too will eventually change much like the world of mobile computing where only a few companies provide sufficiently adequate solutions. And as you point out, if one has a successful system definition, one can go out and do the IC design themselves for less money using readily available resources. So where does that leave the generalists? The semiconductor companies. Well, if you don't have a secret sauce, you either go mix up a new sauce or plan on obsolescence. Just like the discrete guys and module guys before them.
I'm not sure it's even about systems. Look at Pebble, the smartwatch company. Yes, it'll make some money, but in a short while Archos is going to offer a $50 smartwatch and suck a lot of the money out of that market, unless Pebble has unique value that's hard to copy. Maybe its micro-apps will be that, maybe not.
On the other hand, it's not rocket science to build/invent a smartwatch (they built the prototypes using Arduino), and they did make some money (and had a potential chance of acquisition), so it could be a good return on investment. Not a VC-like return (VCs got spoiled by app companies), but still good.
So maybe those are the right expectations for a system company? And a chip company?
And the missing piece (for chips) is a low cost of invention?
Chips are commodity ingredients. Flour, eggs, and sugar are low-priced commodities. Once someone correctly mixes them together, they can get a branded cookie in nice packaging that tastes great and is ready to eat! No one is going to pay more than bottom dollar for the cookie ingredients but the cookies are another story.
If chips had a low cost of invention, all that would do is bring the cost down. Competition would rapidly destroy the extra margin that could have been realized by a low development cost.
Even with low cost of invention, somewhere between invention and product, the value of the development should be high. Here "value" should be how well it enables the product to be priced higher, e.g., unique or unsurpassed functions or capabilities.
The big problem is that it's hard to increase the perceived value of a semiconductor. Any part that meets the spec is OK. It's not like a handbag where the intangibles can set the price anywhere between $5 and $50k.
You are right that only unique capabilities that are valued by customers will be able to command a respectable price. The problem is that most capabilities can quickly be copied by competent engineers. Patents don't amount to much when there are many ways to skin a cat. Only the execution of an idea is patentable, so that only matters when there's one possible execution.
Most electronic gadgets could be considered miraculous when judged on their own merits, but in a brutally competitive market economy they become commodity junk, obsolete soon after launch.
Maybe we should be optimistic that as it is harder to advance to new technology (because of cost), it is also harder to copy. Whereas the cheaper mature technologies are quite easy to copy. In this sense, for the choice between moving forward and staying behind, your point is well-taken that choosing to stay behind may actually not be the safe choice.
Don't forget that advancing technology doesn't necessarily drive the price up. We get far more powerful gadgets for less money than ever! It's also possible for advanced technology to be undervalued. Look at PCs... speed increases used to be voraciously consumed but now people are keeping old PCs for longer than ever; they simply don't need an upgrade.
Interesting that you bring up the fact that people are keeping old PCs longer than ever, because I've just had to replace one at work and one at home recently. We upgraded the one at work a few weeks ago because it was just too slow; I'm talking >1 minute between mouse clicks - unusable! The one at home, a ~10-year-old generic PC that I use in my home lab running Win 2000, would not install software for programming an RF synthesizer eval board because of a missing .dll, and then when I installed the software for my new USB microscope, it wouldn't run, IIRC also due to a missing .dll. So I sent that PC to Goodwill, since the price I could get for it wouldn't be worth the bother of selling it, and replaced it with my 8-year-old HP tower running XP. Maybe it was Win 2k that was the problem?
The point in both cases is that we upgrade to new hardware not necessarily because we need the faster clock rates or more cores or whatever because we are doing more and need more computing power but because we're running into software compatibility and performance issues. This doesn't bode well for us in the chip business.
Fortunately for me and my employer, the falloff in rate of increase of need for computing power has little effect on us. The need for faster, better electronic communication shows no signs of falling off anytime soon. Mostly we're worried about keeping up with demand, the fierce competition, and internal politics.
Bloated, lazy software drives the need for PC hardware upgrades. You could argue that there are more features as well but I'm not sure how many users care about the kitchen sink kind of features that tend to get added to justify upgrades. Is Word today really that much better than Word 5 years ago?
Around here, the fiber op company is desperate to sign up subscribers. I suspect that most are happy with 20 Mbps cable (at least they aren't willing to pay more for fiber op). I believe that there's a rather low bandwidth limit that consumers will be satisfied with. A substantial bandwidth improvement, with the applications to go with it, will be needed before consumers care. Surfing the web and watching Netflix can be covered pretty well without any insane bandwidth reqs.
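A quick budget supports that last point. The bitrates here are ballpark assumptions on my part (roughly 5 Mbps for an HD stream, 1 Mbps for browsing), not figures from the thread:

```python
# Rough bandwidth budget for a household on a 20 Mbps cable link.
# Bitrates are ballpark assumptions: HD streaming ~5 Mbps, browsing ~1 Mbps.

link_mbps = 20.0
hd_stream_mbps = 5.0
browsing_mbps = 1.0

streams = 2  # two simultaneous HD streams
budget = streams * hd_stream_mbps + browsing_mbps

print(f"Used: {budget} Mbps of {link_mbps} Mbps "
      f"({budget / link_mbps:.0%} utilization)")
```

Even two HD streams plus browsing uses only about half of a 20 Mbps pipe, which is why most subscribers feel no pull toward fiber.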
Being able to surf high-def on-demand shows as quickly as you can with regular cable would be a killer app.
Oh, I agree 100%! As a die-hard hardware guy, I never miss a chance to bag on the (mostly deserving) software weenies! No way are any of us going to even scratch the surface of what most software can do. Note to software weenies: concentrate on making stuff that works rather than adding useless features (those two objectives tend to be mutually exclusive)! Also, make user interfaces so that simple, often used things are readily accessible and easy and simple to do. If the user wants to get deeper into it, fine, but don't force us to jump through hoops just to do the simplest, most common things. To their credit, this principle is starting to be followed; smartphones are fairly easy to use, despite the level of complexity.
I think that there would be bigger opportunities in fiber optic networking for wireless backhaul and Internet infrastructure where you're passing multiple people's video streams simultaneously than in trying to connect fiber to every house. I don't think fiber makes sense in the "last mile". Now, although I officially work for the Mobile and Wireless Group, I do have a background in F/O and hope to do some more work on it in the near future. Optical comms really took a dive with the dot bombs back in 2001, but it sure is a good area now.
Yes, the software needs to catch up with the hardware once again. Software people just don't seem to live in the real world... Sigh.
No need to live in the real world! Software people don't optimize... fast hardware takes care of performance issues! They also don't care if it works out of the gate. No recalls to worry about... just put out a patch.
Software is still hard... just has a different set of things that matter. Nowadays it seems that UI is #1.
@highlander - yes, lowering barriers does decrease growth in prices. But it still increases innovation. Meaningful innovation can open new application areas and markets, making more people/businesses buy more electronic stuff, meaning growth, even with jellybeans.
Another growth option is a shift to online-only commerce, which seriously decreases the share retailers take from products, leaving more to be shared between manufacturers and consumers and/or seriously decreasing prices of electronics (sometimes even around 30-40%) and general products. All of those can easily lead to growth.
Hi Scott, the economics of the industry work as they should. I know it feels like all that sweat and blood should yield more than a 50 cent part. You're struggling with the notion of value. Surely a microcontroller is worth more than a candy bar that requires zero risk or R&D, right? Well... not according to the machinations of the market.
I'm sorry but this story is a joke. Maybe Microchip's business is slowing because of you Sanghi. You are in a market that is easy to do. You are competing with multiple other competitors. So many people make this little $2 or $5 microcontroller. They are jelly beans.
Are you in any mobile phone designs? If not, why not? Is it because you missed the boat on mobile? I see a lot of growth in mobile phones. I see Qualcomm exploding. They can't build new buildings fast enough to keep up with the hiring needed.
If I were a Microchip shareholder, I would basically want you fired for what seems like giving up. If I were a Microchip employee, I would be looking for a new job.
There's nothing actually wrong with the business model. The model is to make chips and sell them for more than it costs to make them. It's better to say it's a brutal business than it is to say the model is broken.
This is a great interview, Rick. Sanghi is always a good interview because he is blunt and straightforward. I do, however, remember a few years ago, when Sanghi asked me, "Why does a toaster need to be connected?"
Well, it doesn't need to.
But he is definitely coming around on the IoT concept, now that he has checked the temperature inside his Tesla!
Reading this article reminded me of a Forbes Magazine piece published back in 1998 called "25 Cool Things you wish you had...and will" (http://onforb.es/QaX4zB). In particular, I thought of the quote from Gene Frantz from TI: "The goal is the Dick Tracy watch."
Now that we are on the verge of making the Dick Tracy watch a reality, it is unclear how many people really want it or need it. I think the larger question is, "What do human beings want to do that can't be accomplished with existing semiconductor technology?" Are human beings becoming more interested in things that don't require new semiconductor technology?
@Zagzagel: Now that we are on the verge of making the Dick Tracy watch a reality, it is unclear how many people really want it or need it.
The idea of the Dick Tracy watch is very cool. The reality that we now know is that it is only as cool as the people who are using it. I want a Dick Tracy watch, but once I have one I need cool friends who also have Dick Tracy watches. Otherwise it's pretty pointless, or worse (just think of when the telemarketing calls start coming in).
But seriously, I think you're spot on with addressing the larger questions of what people want and need.