And gamers are not a mass market? I believe most of the people who buy computing platforms (PCs, tablets, consoles, etc.) do so mostly for entertainment; for work, the boss supplies the equipment anyway! In that case the PC is waning probably due to lack of a killer application. It would only take a cool killer app surfacing that requires all the computing power of a high-end PC, and that pleases the masses, and the PC would be back in the game. For instance, I am a huge fan of space simulation games, and would buy a new high-end PC just to play the new Star Citizen/Squadron 42 game currently in its crowdfunding stage ($3.5M already raised, 50K+ eager fans who will probably need their desktops upgraded).
We never know exactly what the future will bring, but we can extrapolate some trends to get a rough idea.
There was a time when any respectable computer had its own room. Now we try to make them unobtrusive in our work, play or entertainment environments. On their way to becoming ubiquitous they've lost most of their social standing.
I have no doubt that 'computers' will become much more powerful, but not at the expense of becoming intrusive again. The trend is for them to disappear. 'Computers' will not be a significant topic of conversation in 10 years.
x86 will not be the primary bearer of computational power for much longer. It is running out of steam. With asymmetric computation, we will find specialized processors doing a greater proportion of the necessary processing.
DWide1, above, points to 'voice driven' and 'Perceptual Computing'. These involve tasks that are best served by specialized cores. Adapteva has developed a 64-core array capable of 100 Gflops at only 5 watts of power consumption. With that sort of affordable performance we will see quite a sea change in what, how, and where computation is brought to bear.
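For scale, the Adapteva figure quoted above works out to 20 Gflops per watt. A quick back-of-the-envelope comparison (the desktop-CPU numbers below are illustrative assumptions for contrast, not from the comment):

```python
# Performance-per-watt sanity check for the figures quoted above.
adapteva_gflops = 100.0   # 64-core array, per the comment
adapteva_watts = 5.0

efficiency = adapteva_gflops / adapteva_watts
print(f"Adapteva array: {efficiency:.0f} Gflops/W")   # 20 Gflops/W

# Hypothetical contemporary desktop CPU (illustrative numbers only):
cpu_gflops = 100.0
cpu_watts = 80.0
print(f"Desktop CPU:    {cpu_gflops / cpu_watts:.2f} Gflops/W")  # 1.25 Gflops/W
```

Roughly an order of magnitude in efficiency is the kind of gap that makes offloading specialized workloads to such cores attractive.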
Thanks, Horta, you make me blush. I'm no better than the next guy at seeing into the future accurately, but it does seem that computing devices will become more or less commodity items, if they haven't already. The trend was clear when IBM PS/2s had to give way to the cheaper and fast-improving IBM clones. The coolness of the PS/2's MCA bus was soon history: first it could no longer do double duty as a memory bus, and soon after it was outclassed for peripherals too. That trend won't stop, IMO.
Initially, digital machines were mainframes. Then the machines became more pervasive with minis and later even more so with the single chip processor (e.g. PCs and embedded smarts in common appliances). I think personal digital electronic gadgets are just a continuation of this pervasiveness trend.
What's next, I'll bet, is more of this. What the press likes to call IoT is simply more of what we have been seeing. Potentially even embedding processors in people (uuh, I mean in addition to the brain).
As to applications, especially if extreme weather events keep getting the limelight, I'll bet there will be a lot of eco-oriented control software being developed, and eco-oriented computing embedded in all manner of systems (the grid, home appliances, transportation, traffic control, you name it). More processing, more smart control, in everything we use. Self-driving cars, for instance, I think are a manifestation of this increasing pervasiveness of computing.
Bert, I'm impressed at your ability to see through the fog, look beyond the 'now', and state reality as it is. Your views seem neither biased nor radical compared to most others I've seen post on EETimes.
From historical trends, you're applying concepts to still-emerging markets (mobile) and concluding, well before anyone else has even thought about it, that the mobile market will see its bright light fade sooner than anyone expects.
The only thing missing here is: what's next? People will have cheap, powerful, efficient, quality computing power. Will the computing industry turn into something like the memory industry, where those that aren't bringing down costs get pushed out rapidly because the margins are so thin?
Any ideas on what you see coming, or even what you would like to see coming?
I can't imagine that desktops are going away (even if there are now lots of other options for non-power home users), and Microsoft Windows and office products look like they are going to continue to dominate the desktop.
Considering that Windows 8 provides a fuller experience with a touch screen (even on a desktop), I agree with the idea of the re-invention of the PC, and I think it applies to the desktop as well.
I have to disagree with those who assume that any status quo of today will remain the status quo for all time!
Yes, we are in a strange phase now, where lots of casual users are buying up smartphones and tablets, and therefore the manufacturers of computing devices and software are dedicating a lot of time and effort to meet these demands. But it's also true that these handheld devices are approaching mid-level PCs in their computing power.
Therefore, the only logical conclusion is that the current status quo will soon end: apps will demand more power from the handhelds, and consequently also put upward pressure on PC hardware.
After all, growth curves are always S curves. Handheld devices will also eventually saturate the market, and the device makers and software developers are going to want to have something new to sell. Innovation will not end.
Intel's other problem is that when choosing between a higher processor clock speed (or more cores) and more memory, more memory is almost always the right choice. An SSD might rank above a faster CPU too, but SSDs are still too expensive for the $500 PC. Most CPUs have enough performance for the consumer workload, but browsers use lots of memory, and consumers only close a window when they reboot the machine.
[disclaimer: I work for Intel, but my opinions are my own and do not reflect the company's guidance]
You *have* reported Intel's latest thrust in apps, Rick: partnering with Nuance to make voice-driven PCs, and the whole Perceptual Computing initiative, for which an SDK was just released. Convertibles and dockable tablets with touch are just the beginning of the re-invention of the PC. Don't count us out quite yet. :D