Here's a look back at my 10 biggest trends of 2012. Though the calendar has just turned the page, I believe my list will drive the global electronics industry in 2013. What's your take? Please read the article and enter your take in the comments box at the bottom.
1. Japan's Decline
I met recently with Brian Toohey, head of the Semiconductor Industry Association. Among other things, we discussed SIA's history and how it was formed 30 years ago in response to the competitive threat posed at the time by Japan's semiconductor industry. Recall that Intel had recently exited the memory business (in hindsight, a brilliant move) and other chip makers like Siemens and Texas Instruments were wishing they had got out at the same time.
Fast forward 30 years and the Japanese semiconductor industry finds itself in dire straits. EE Times international correspondent Junko Yoshida has reported on the many iconic Japanese electronics companies that are consolidating to survive. Sharp is perhaps the weakest, but Fujitsu and Renesas are also struggling.
What happened? My analysis is that making huge bets on commodity businesses is inherently risky, and there are no riskier businesses than memory and displays. Hence, "Japan Inc." put itself in harm's way and underestimated the market power of Samsung and Micron.
Thirty years ago, Japanese companies were respected for their innovation and consumer marketing prowess (think Sony, Canon and Panasonic), but something happened in the 1990s that stifled innovation and slowly asphyxiated Japan Inc. Was it simply the Innovator's Dilemma working on a national scale? Or was the advantage provided by manufacturing excellence simply not sustainable? This is fodder for books, conferences, pundits and policy makers, and I would love to hear your views.
IMHO Japan was on a roll and overgrew itself. The world did not need a dozen vertically integrated electronics giants making everything from chips to drives to TVs and washing machines. The dozen giants, now weighed down by their own heft, are slowly and painfully being pruned back to three or four.
Hopefully the corporate cutbacks will release some of the talent needed to ignite Japan's dormant startup community.
Maybe this is an age-related perspective thing.
I never thought of Japan Inc. as being the great innovator that others mention. Instead, they took technology that already existed (cameras, including high-end cameras, electronics, automobiles) and made it for less. Over time they tweaked the designs too. Japan is now going through the same changes that Europe and the US went through, perhaps 20 to 30 years earlier. I think the term "innovator" is too often used to mean: package a well-known product in a nice, trendy form factor, and call it brand new.
China is today what Japan was right after WWII. No doubt, China will also move more into design. So I see Japan and China as nothing more than two countries going through the same evolutions as the US and European countries went through previously.
The "Internet of Things" is mostly press hype. It's an ongoing trend which arguably started with the telephone network. Perhaps academia also capitalizes on the press hype, as a shorthand term that everyone (thinks they) can understand. The Internet has ALWAYS been "of things." The only change is, as processing has gotten cheaper, the "things" have become more and more numerous. Used to be only mainframe computers, right? Went to minicomputers, then personal computers, then the peripherals of these PCs, then sensors, then smartphones and tablets, why not refrigerators, coffee brewers, and cars?
"The cloud" is another example of hype. "The cloud" is something built up from two-way networks. The Internet, helped considerably when the WWW was introduced, gives this "cloud" a big boost; however, it existed even before the Internet. Don't believe me? Search under "X.25 networks." I was doing "cloud computing" back in the early 1970s, logging onto large mainframes which could have been anywhere, from remote keyboard terminals connected to modems, with paper tape for non-volatile storage.
Bert, I have to agree with you regarding Japan's innovation perspective. The big Japanese firms' power was in tuning production and yields. They typically did not lead the industry but implemented things very quickly and with a rigor aimed at production and yields. The cloud hype is an interesting phenomenon; everyone seems to want to jump on the bandwagon. Until the Internet is really EVERYWHERE and SECURE, I would not trust any mission-critical work or data there.
The comment that Japan and China are simply evolving just as the U.S. and Europe did belies a complete misunderstanding of history and culture. The 19th century belonged to the UK and Europe. The 20th to the US. Both are now in decline. China and India (and their predecessors) dominated the first and second millennium and are returning to power. There was no Japanese century. Japanese demographics are catastrophic and nothing like the West's. Japan is in an inexorable decline; China is in an inexorable rise. And to say China (population 1.3B) is following the same path as Japan (population 0.15B) is silly.
Furthermore, the cloud isn't hype at all. It is an old idea, redeployed, and an idea whose time has come. Genius is not in the novelty or newness of an idea but in the deployment and timing.
Seems like a purely Western vision of history. Surely, China had many centuries when it was supreme, from its point of view. As to its rise, all it would take is another "cultural revolution" to stop its progress tout de suite.
As to the "cloud," even without harking back to the days of X.25 nets, what about the WWW, beginning ca. 1994, was NOT a "cloud"? How long had we been doing "cloud computing" before it was called that? TurboTax, for instance? Web-based e-mail?
These catchy words are used to generate enthusiasm, to spur any number of academic symposia (publish or perish), and to help foster grants for research. But this is EE Times. We need to be able to see beyond the flash, no?
The difference with the "cloud," surely, is that the computer you are using does not belong to either you or your employer, but instead to a third party that has a warehouse full of the things waiting to be used for stuff. To start with, this meant using spare cycles on big farms built for other purposes (e.g. web search engines during the hours the US was asleep), but more commonly now these are dedicated farms for doing such work. Making good use of that resource (and making a profit if you own that resource) are the real challenges.
"the difference with the 'cloud' surely is that the computer you are using does not belong to either you or your employer,"
At first, I thought you were saying that the device in your hands, or on your desk, wouldn't belong to you. Then I thought, maybe you're just saying that the processing power would be "in the cloud," and your own device would only be displaying results?
Either way, though, "the cloud" is at best an evolution of what we already know. Take something like BitTorrent. Is that not "cloud computing"? Or the way PCs can be tied together for SETI research. Or, more along the lines of my previous examples, online banking, online shopping, online language translation, streaming TV online, watching weather radar loops online, etc. etc.
All of these are examples in which the bulk of the processing power you are using is on web servers, or on servers in the Internet in general. You haven't downloaded software to run locally, except at most applets.
All of this can only be considered cloud computing, and yet no one bothered to give it that catchy name. Now that they have, people seem to think it's something all new.
Having been to Japan 30+ times on behalf of several top US IDMs since the late 1980s, I could see what was going on. Japanese domestic demand was stagnating following their real estate bubble, and the US had switched attention from Japan, its loyal vassal since WW II, to Communist China, giving the store away. Japanese consumer electronics companies, especially the semiconductor companies, had hitched their stars to the US market. But outsourcing in the US ditched Japan. At the instigation of Wall St. and Wal-Mart, both US and Japanese technologies were transferred to LCMRs like Taiwan and South Korea, which, unlike Japan, had a fresh crop of US-trained PhDs who could absorb them very quickly. Japanese industry could not keep up with the newcomers because it was used to experimental R&D and could not speed up work by modeling. So, in short, Japan got priced out by neighboring countries who could speak better English, do better math and worked harder.
The word "Quality" has been the mantra for Japanese manufacturers. Now it is becoming a curse for us, especially for big-name manufacturers. We are very slow to adopt new technologies, because we are afraid the technology might be immature or incomplete and so won't meet our "quality standard." Even if just one of hundreds of functions is found to be faulty, we simply do NOT adopt that technology.
The "software is free (freemium)" topic is an interesting one. The drivers and the OS must be free in order to sell hardware, but at the top of the stack, it is the apps that command most of the profits.
Some software is free or "freemium", other software is expensive, and hardware usually falls in the middle.
Working for a major distributor, I can tell you that Japanese suppliers rank lowest due to poor marketing, poor communication and just about every other facet we need to help sell product.
On the other hand, Microchip and Linear Tech product walks out the door in large numbers with little stress. What would I sell... what would customers buy?
Successful sales depend solely on people in the supply chain pleasing each other with goods and services to sell. Many if not most in the channel cannot clip their ticket if the end customer cannot fulfill his need at the moment he chooses.
So why in this day and age do so many in the supply channel still believe they are doing the customer a favour selling to them and making it hard?
As consumer products became more software-based than hardware-based, including cameras, music players and smartphones, they began to change rapidly in shortening cycles. If you are focused on perfecting the means of physical production, you will fall behind in rapid, iterative design. You need a culture of constant change and risk taking to keep up. Charlie Babcock, editor at large, InformationWeek
It's unclear whether listing "Marketing Matters" as a Top Ten anything in 2012 is noteworthy or pathetic. Marketing has always mattered, not just the obvious B2C, but just as importantly B2B. Tech companies (esp. B2B) are almost universally managed by engineers, who have some modest intellectual appreciation of the need for marketing but, in their gut (read: budgets), consider it a waste of time. Why? Because B2B marketing has been more faith-based than data-driven. Thank goodness behavioral economics is now proving what we B2B marketeers have known for years. Humans (even engineers) buy for all sorts of reasons, some logical, some illogical. Fear, greed, love, hate, security, etc. are all powerful emotions which drive the purchase of anything, from cellphones and cars to $50M lithography tools. Anyone who thinks those decisions are driven solely by data does so at their own peril. The landscape is littered with companies with great technology and products but lousy marketing. Remember DEC?
I've never thought of Japan as being brilliantly innovative. Japanese expertise has historically been in *refining* existing products and processes.
What is happening to major Japanese electronics firms now recapitulates what previously happened in other areas, like steel.
Japan is suffering now in electronics because things eventually become fungible commodities, largely equivalent and available from multiple suppliers, and competition reduces to price, where the lowest cost producer wins. For structural reasons, Japan *can't* be the lowest cost producer, and can't compete on price.
The Japanese TV makers are a good example. When big flat-screen TVs were the hot must-have product, they invested heavily in the capacity to make them and did well. As the high end of the market became saturated, prices began to drop, and Japanese manufacturers found themselves in a position where they couldn't make money on TVs at the prices they could sell them for, because their costs were too high.
I'd fault the Japanese TV makers for not seeing the cyclical nature of the market and the handwriting on the wall. They arguably should have looked to reduce their exposure in TV and looked for other markets to address. Sooner or later, what you make will become a commodity competing on price. *Can* you compete on price? Should you try, or should you look to leave the market before that point and find other things to do?
And the same happened to European and US consumer electronics industries. What survives is mostly the super high end industry in this area, while the mass market equipment, whose brand names everyone recognizes, has to go to the lowest cost manufacturing countries.
Kickstarter could be a very valuable addition to the world of start-up finance. It has the potential to fit in a nice spot between family money and angel investors. But, very shortly, it will become a clear example of why investors require business plans and people with the knowledge to execute those plans.
I'm not a huge fan of the VC world, but it is necessary and some of what it does makes sense. The diligence that VCs do before investing weeds out a lot of businesses that have no chance of making it. Even with a good market, plan and team, the odds are long against success.
Kickstarter needs to solve a few problems if it wants to last beyond 2013. As it is, a few high-profile cases like the Pebble watch will make people wary. Then the scammers and opportunists will jump in, and KS will join the long list of over-hyped great ideas with incomplete execution.
They've tried to clean up a little bit with some of the new rules, but I don't think that will be enough.
Internet of Things
I think of a plan put forward by a startup here in the SF Bay Area. Looking at the history of the two principals helped me consider it in context. One was a graduate researcher in robotics at Stanford whose family had been farmers for generations in China. The other had been an engineering team leader for Trimble Agriculture (look into that for a nascent net-integration example). My understanding of their plan is that they aim to deploy many sensor-bearing robots to characterize fields in real time at a fine local grain. Beyond characterizing the fields, it would also include flow and application control systems.
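To make the idea of "characterizing fields at a fine local grain" concrete, here is a minimal sketch of how readings from roving sensors might be binned into a grid and turned into a per-cell irrigation decision. Everything here is invented for illustration: the function names, the 5 m cell size, and the 0.30 moisture threshold are all assumptions, not anything from the startup's actual system.

```python
# Hypothetical sketch: aggregate readings from roving field sensors into a
# fine-grained grid, then flag cells that need water.
from collections import defaultdict

CELL_SIZE_M = 5.0      # grid resolution in meters (assumed)
DRY_THRESHOLD = 0.30   # volumetric soil moisture below this => irrigate (assumed)

def to_cell(x_m, y_m):
    """Map a field coordinate (meters) to a grid-cell index."""
    return (int(x_m // CELL_SIZE_M), int(y_m // CELL_SIZE_M))

def characterize(readings):
    """readings: iterable of (x_m, y_m, moisture) tuples from the robots.
    Returns {cell: mean moisture} for every cell that was sampled."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, moisture in readings:
        cell = to_cell(x, y)
        sums[cell] += moisture
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}

def irrigation_plan(field_map):
    """Cells whose average moisture falls below the threshold get water."""
    return sorted(cell for cell, m in field_map.items() if m < DRY_THRESHOLD)

# Three sample readings: two dry points in one cell, one wet point elsewhere.
readings = [(1.0, 2.0, 0.25), (3.0, 1.0, 0.27), (12.0, 2.0, 0.45)]
field = characterize(readings)
print(irrigation_plan(field))  # prints [(0, 0)]
```

The same per-cell map could feed the "flow and application control" side of the plan, with valves or sprayers addressed by grid cell rather than treating the whole field uniformly.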