I agree that this is not so surprising given the current economic climate, but I also think that we are missing "killer apps" to use some of that surplus computing power. As has been expressed on this forum before, there is not as much innovation coming out of the PC industry these days. Consumers typically don't "need" higher powered computers because there are few compelling applications that require them.
Yes, this may be another signal of the end of the desktop PC. Thinking about notebooks: most people I know love netbooks.
Small, lightweight, they have everything most people need, and they are cheap! But they seem to be disappearing while the industry pushes ultrabooks, which are considerably more expensive. Is it the profit margin that makes the industry push ultrabooks?
I agree with all points made above. The soft economy is undoubtedly a factor making it hard to move premium products in any industry. But, yes, where is the killer app? I am thinking about buying a new PC and think I will probably spend around $500. For the rather mundane tasks that my family and I will use it for, I have no idea why I would be compelled to spend more.
Hey Dylan. You started out great, and then you felt you had to get back to the "post PC era" mantra.
The original point was that people (including myself, btw) are buying mid-level PCs and laptops/notebooks rather than top-end systems (e.g. Intel Core i5 vs. i7). And you gave good reasons for this: fast quad-core performance that can handily run the new apps, codecs, and up-to-date virus shields, where the older single-core systems became an irritating experience.
But as far as I can tell, the last paragraph is a non sequitur. Nothing discussed previously said anything about "post-PC" diddly, as far as I can tell. And I did re-read the article! My take continues to be that the PC and notebook market is essentially saturated, and what you're seeing is mostly replacement purchases. The tablet and smartphone markets aren't there yet. End of story.
@Bert- your point is well taken. I think when people refer to the post PC era, they aren't saying that the PC will disappear. People will continue to use PCs and replace them. But annual PC sales growth will no longer be a given, and the PC will drive less innovation.
I'm not convinced that this is a foregone conclusion. But people are showing that they are willing to buy a tablet instead of adding a new PC, and they are showing that low and mid range PCs--without all the latest and greatest bells and whistles--suit their needs just fine.
Another reason to shun the high end is power consumption. My PC has enough compute and graphics power to suit my needs... and it's completely silent.
The game is evolving to where increased performance must not come at the expense of higher power consumption. On the portable and mobile side, size and weight are king: smaller (thinner) designs leave less room (and weight budget) for the battery.
I like my smartphone for its portability, but it's of limited functionality, at least from a productivity standpoint. Tablets aren't there yet but will be in the not-too-distant future.
I still like my full keyboard, mouse, 27" 2560 x 1440 screen and decent 3.1 sound at home. No need for a high-end quad core though.
The high-end is only needed for serious gaming and various workstation applications, hardly a mass market.
And gamers are not a mass market? I believe most of the people who buy computing platforms (PCs, tablets, consoles, etc.) do so mostly for entertainment, since for work the boss supplies the equipment anyway! In that case the PC is waning, probably for lack of a killer application. Just let a cool killer app surface that requires all the computing power of a high-end PC and pleases the masses, and the PC will be back in the game. For instance, I am a huge fan of space simulation games and would buy a new high-end PC just to play the new Star Citizen/Squadron 42 game currently in its crowdfunding stage ($3.5M USD already raised, 50K-plus eager fans who will probably need their desktops upgraded).
My family's PC computing needs require only a nominal system for email, Facebook, and various game apps. We have absolutely no need for a high-end PC or ultrabook. We have a 3-year-old laptop and a 4-year-old PC that work fine. As others have mentioned, there are no killer apps that require additional computing power or quad-core capability.

Tablets need to improve, but we already own two and plan on purchasing another this holiday season. Our choice of tablets was based solely on their mobility. And even for these we didn't go high-end, and they work fine for our needs (B&N Nook Tablet and Kindle Fire). Yes, there is a bit of lag on these devices, but for the money we don't complain. And yes, the economy does play into our buying decisions -- do more with less, so to speak...

I agree with Les. I am most comfortable with my mouse, keyboard, and monitor, but my young kids like the tablets better. I am not a technical expert, but I think the high-end PC market for the mass population is dead. There is no more chasing the next processor for me, and many people I talk to are saying the same thing. In the past the PC had to try to keep up with software developments, but that is not the case anymore. I sure hope Intel has something else up their sleeve other than ultrabooks...
My work Vaio has a switch to toggle between "Stamina" and "Speed". The trouble is that you have to reboot for the change to take effect. So it stays on "Stamina", because battery life is more useful to me than speed.
Same here, Jack! I also have the VAIO, and it stays on Stamina pretty much permanently, but that's because I have never really noticed it lagging enough to make me want to speed it up. PC gaming has, for the most part, become such a niche that for most of us the speed of a regular Core i5 or Core i7 machine is as fast as we'd ever need. I did crack up a little yesterday when I saw an episode of Dexter where one of the lab guys was using an Alienware gaming PC to run data through... sure, it only has 45 minutes of battery life, but why not?? ;)
The only two places I currently see high-end desktops utilized are:
1) Traditional hardcore gamers, who will continue to demand high-end graphics cards and quad-plus-core CPUs. This market is, however, changing rapidly due to the surge of gaming on handheld devices (iPhone, iPad, etc.).
2) Business workstations. I work for a ~200-person engineering company, and our mechanical/CAD folks typically use high-end PCs for everyday graphics- and compute-intensive applications such as Pro/ENGINEER and Cadence Allegro. I'd imagine it's similar for other businesses, e.g. graphic design. Sure, we have servers running Xeons and the like for long-term simulations, but the high-end PC still sees quite a bit of use.
Further, with PCs having a lifetime of five years or so, a lack of innovation in the PC market, and a weak economy, replacement purchases are being put off. Bad news for the PC industry.
3) Niche. A/V processing/encoding.
I'm building a Win8/RAID 0 SSD box for this purpose, and this purpose alone.
The rest of my world will live on elder Intel Mac Minis and Atom nettop computers for Skype and email.
What a change. Long ago, you couldn't find a PC for less than $1,000!
Intel used to regularly showcase emerging high performance apps that would need its next-gen CPU performance. They haven't said anything on that front in a while.
These days the main apps are all about the web, so network performance, not CPU performance, is key.
[disclaimer: I work for Intel, but my opinions are my own and do not reflect the company's guidance]
You *have* reported Intel's latest thrust in apps, Rick: partnering with Nuance to make voice-driven PCs, and the whole Perceptual Computing initiative, for which an SDK was just released. Convertibles and dockable tablets with touch are just the beginning of the re-invention of the PC. Don't count us out quite yet. :D
I can't imagine that desktops are going away (even if there are now lots of other options for non-power home users), and Microsoft Windows and office products look like they are going to continue to dominate the desktop.
Considering Windows 8 provides a fuller experience with a touch screen (even on a desktop) I agree with the idea of the re-invention of the PC and I think it applies to the desktop as well.
Intel's other problem is that when choosing between a higher processor clock speed (or more cores) and more memory, more memory is almost always the right choice. An SSD might rank above a faster CPU, but SSDs are still too expensive for the $500 PC. Most CPUs have enough performance for the consumer workload, but browsers use lots of memory, and consumers only close windows when they reboot the machine.
I have to disagree with those who assume that any status quo of today will remain the status quo for all time!
Yes, we are in a strange phase now, where lots of casual users are buying up smartphones and tablets, and therefore the manufacturers of computing devices and software are dedicating a lot of time and effort to meet these demands. But it's also true that these handheld devices are approaching mid-level PCs in their computing power.
Therefore, the only logical conclusion is that the current status quo will soon end: apps will demand more power from the handhelds, and consequently also put upward pressure on PC hardware.
After all, growth curves are always S curves. Handheld devices will also eventually saturate the market, and the device makers and software developers are going to want to have something new to sell. Innovation will not end.
Bert, I'm impressed at your ability to see through the fog, look beyond the 'now', and state reality as it is. Your views seem neither biased nor radical compared to most others I've seen post on EETimes.
You're applying concepts from historical trends to still-emerging markets (mobile) and concluding, well before most people have even thought about it, that the mobile market will see its bright light fade sooner than anyone expects.
The only thing missing here is: what's next? People will have cheap, powerful, efficient, quality computing power. Will the computing industry turn into something like the memory industry, where those who can't keep bringing down costs are pushed out rapidly because the margins are so thin?
Any ideas on what you see coming, or even what you would like to see coming?
Thanks, Horta, you make me blush. I'm no better than the next guy at seeing into the future accurately, but it does seem that computing devices will become more or less commodity items, assuming they haven't already done so. The trend was clear when IBM PS/2s had to give way to the cheaper and fast-improving IBM clones. The coolness of the PS/2's MCA bus was soon history: at first it could no longer do double duty as a memory bus, and soon after it was outclassed for peripherals too. That trend won't stop, IMO.
Initially, digital machines were mainframes. Then the machines became more pervasive with minis and later even more so with the single chip processor (e.g. PCs and embedded smarts in common appliances). I think personal digital electronic gadgets are just a continuation of this pervasiveness trend.
What's next, I'll bet, is more of this. What the press likes to call the IoT is simply more of what we have been seeing. Potentially even embedding processors in people (uh, I mean in addition to the brain).
As to applications, especially if extreme weather events keep getting the limelight, I'll bet there will be a lot of eco-oriented control software being developed, and eco-oriented computing embedded in all manner of systems (the grid, home appliances, transportation, traffic control, you name it). More processing, more smart control, in everything we use. Self-driving cars, for instance, I think are a manifestation of this increasing pervasiveness of computing.
We never know exactly what the future will bring but we can extrapolate some trends to get some rough idea.
There was a time when any respectable computer had its own room. Now we try to make them unobtrusive in our work, play or entertainment environments. On their way to becoming ubiquitous they've lost most of their social standing.
I have no doubt that 'computers' will become much more powerful, but not at the expense of regaining intrusiveness. The trend is for them to disappear. 'Computers' will not be a significant topic of conversation in 10 years.
x86 will not be the primary bearer of computational power for much longer. It is running out of steam. With asymmetrical computation, we will find specialized processors doing a greater proportion of the necessary processing.
DWide1, above, points to 'voice-driven' computing and 'Perceptual Computing'. These involve tasks that are best served by specialized cores. Adapteva has developed a 64-core array capable of 100 GFLOPS at only 5 watts of power consumption. With that sort of affordable performance, we will see quite a sea change in what, how, and where computation is brought to bear.