For home use, a NAS box with tons of storage has become a critical component -- our own "cloud." We have several desktop PCs (never giving those up!), and I've gotten tired of replacing hard drives just because someone's C drive is almost full -- mostly with media files. The laptops and iPad also share that cloud nicely.
My biggest complaint regarding our various devices is the lack of "whole home" sharing of my biggest content-aggregator device -- the cable DVR box. How nice it would be to be able to watch that content on any desktop PC, laptop, tablet or smartphone in the house, and to be able to archive content on the NAS box. The MSOs are slowly addressing the first issue of multi-screen sharing, but in their own closed-system kind of way, and content owners will never allow NAS archiving of their content. If they would, my "whole home" content sharing problem would be solved immediately.
Well, I may be the anomaly here, but:
I don't own an iPhone, or any other smartphone for that matter.
My wife has an iPad.
Between our offices and home, my wife and I use six desktops and laptops.
So my percentages would look more like:
The impression I get is that many folks seem to think shifting "heavy lifting" work from the PC to the cloud would result in more use of tablets and less of PCs. However, these so-called clouds consist of large server farms FULL of PCs! Those rack-mount systems may not look like your regular PC, but they are PCs nonetheless. Many of them are custom-built and as a result may not be counted in typical market surveys. The larger server farms may each have over a million CPUs for "heavy lifting." So no matter how you cut it, the demand for CPU horsepower is increasing, not decreasing. The numbers floating around in the popular press are just a matter of how the accounting is done.
I believe more users will shift to tablets and the cloud for the following reasons:
- Instant on and long battery life (no need for a UPS).
- Easier app installation & removal
- App stores only contain sanitized software (no malware/spyware).
-- If an app is found to contain spyware/malware, the guilty company is banned and the app removed.
- Easier to use.
-- Less training cost and IT costs to maintain.
- More development being done in the app space than in the PC software market.
- More features: multiple cameras, GPS, gyros, multitouch...
- More power than PCs of just a few years ago.
- Lower cost of ownership (e.g. no antivirus subscriptions or hard disk crashes)
-- If it breaks, buy a new one and resync with your cloud based data.
But...what happens when the shrinking PC market isn't big enough to justify the costs of developing bigger, faster processors?
We're down to two or three companies that have survived in the big-processor space. They will soon have to shift to low-end processors or die. If this happens, big iron will be a lot more expensive.
Buy IBM, short Intel & AMD ;)
Guessing the rich array of gestures made possible by the device will have a major impact on Windows 8 and other touch UIs. ("The Leap" appears in January.) 2D touch will surely be extended by a list of standardized 3D gestures.
the end is nigh.... arrrrrrg! Or not...
Come on, things change, evolve, get better, change shape, form factor... that's evolution. It happens to us too. It doesn't mean it's dead or going away. The PC has changed a great deal since its inception too... I don't see the difference between that change and this change.
In my view, what makes tablets so "compelling" is /not/ the UI experience; it's much simpler than that: they are /quiet/.
Build a fanless desktop PC with SSDs that doesn't act like a 300W space-heater, and suddenly it will be a lot more appealing!
Computers should be seen, and not heard.
True enough. Just as many offices have moved to laptops and docks for their bread-and-butter productivity machines, the same could potentially happen with tablets, e.g. the Microsoft Surface. Tablets and docks. Useful tablets, tablets with flexible OSs, not toy tablets.
The "heavy lifting" required of most PCs, I would venture, is needed to run up-to-date applications effectively, like Office, like Acrobat, and don't forget those pesky virus shields (which have a way of monopolizing CPU cycles). Try converting a several-hundred-page document in Acrobat on an old, slow machine. It's painful.
Also, you need power to run multiple large displays, even if the user isn't doing heavy-duty number crunching. Most people around me at work have two or three monitors, and most of them have similar setups at home so they can work from home.
All I'm saying is, we're not just talking about physicists who want to run supercomputer software at home. We're also talking about lots and lots of office workers or any type of professional here, who may not know the first thing about writing a program.
At my local medical center, they used to enter and reference the records from my annual checkup and other medical information on a desktop PC in the examination room. That PC station has now vanished from the examination room, and the doctors and nurses carry iPads instead. I suspect the doctor still has a PC of some type in his office, but all of the PCs in the examination rooms are now gone. I guess they decided that those PCs took up too much space and were overkill for what they needed. This same type of PC replacement is happening all over in many businesses.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.