My own objection is to the refrain that PCs will disappear, Junko. In fact, we talked about this at the dinner table last night. My wife and I went through the things we use PCs for at home, and wondered how anyone would find those same tasks comfortable on an iPad-like tablet (i.e., a pure tablet, not one with a detachable keyboard and mouse, perhaps capable of connecting to a decent-sized screen).
Like, one example: would I feel confident moving my investments around online on a smartphone or tablet? Uuuh, no.
Or another one. We recently did some serious shopping for rugs and rug pads, including research on what kind of pads work on what kind of floors. Would I have selected rugs based on what a smartphone screen showed? Uuuh, no again.
I agree with your point that the two serve different functions. Although I will concede to the "PCs in decline" crowd that in principle, the ergonomic shortcomings of smartphones and tablets can be remedied without requiring a whole 'nother CPU. Either a dock, or at least a large USB 3.0 screen, keyboard, and mouse, can turn a tablet into something more generally useful.
I can see how some notebook PCs are becoming tablets. There is nothing wrong with that, and as Bert said, that is a natural progression.
But that doesn't mean the need for a desktop PC, or a large-screen notebook PC, will go away. For work purposes, the PC as we know it, in its current form factor, will continue to exist.
In my humble opinion, Intel won't be able to rule the tablet market just because a new CPU makes it easier for PC OEMs to build a cool tablet/notebook PC hybrid. For that, we are only talking about a notebook replacement market.
There might be some people out there still thinking that they will only buy tablets when tablets can run full PC functions. But to be honest, I think those people are a minority. A lot of people have come to accept tablets not as a PC replacement (or wishing their tablets were PCs), but as a whole new entertainment device.
For content creation as opposed to mere consumption, the tablet's biggest limitations are ergonomic ones, not computing power ones, and those limitations are primarily on the input side of things, not the output side. The touchscreen keyboard is ok for modest volumes of typing, but not as efficient as a physical keyboard. Using your fingers as a mouse substitute is ok for modest amounts of graphic object manipulation, but not nearly as efficient as a real mouse.
I have run some VNC sessions from my iPad to my "real" computer where I do engineering work, and have viewed schematics, edited text files, run simulations and even viewed simulation waveforms. It's not as productive as doing those tasks on a "real" computer, but I found that the primary reasons for that were simply no keyboard and no mouse. Add those two input devices and I see no reason why a tablet can't be just as productive for content creation as a laptop.
Then again, if you add those two input devices, how is your tablet then any different from a laptop?
I'm not on board with it at all. The issue is form factor.
I do some content creation. I write. I create/manipulate images. I do DTP. I write the odd bit of code. I can't do those things on a tablet at all well.
The sort of things I do need a big screen. (I have a 23" monitor at 1920x1080 resolution, but I wouldn't mind a bigger one at higher resolution.) Keyboards matter. I need a full-size keyboard, and I'm picky about the layout. (I'm typing this while traveling on a netbook, and not too long ago I hit the power button instead of the Del key and put the machine to sleep in the middle of typing a reply to a post. Grrr.) My productivity on this device is a fraction of what I can achieve at my desktop. And I need, and use, a mouse.
Hey, Jon Peddle? "I think the tablet will now go from just a media consumption device to a content creation machine." You first. Get a tablet (powered by whatever processor you like). Restrict yourself to creating all your content on it for a while. Come back and tell us how well you did. I strongly suspect you'll back away from that assertion.
Waiting for a comment about who will actually buy these chips. The only vendors making enough margin on pads to be able to afford high-end Intel chips are Apple and Samsung. Both use proprietary solutions now. There's no money in low-end Android; vendors are beating each other to death on price, with no room in the BOM for expensive processors. Windows has failed on tablets. Who is left?
Gondalf, are things like circuit routing and weather simulation routinely done on mobiles or tablets? Being a workstation-oriented benchmark, SPEC is hardly representative of desktop PCs, let alone mobiles and tablets... Geekbench is not the best benchmark, but it is certainly far more useful than SPEC.
Note that Intel's TDP is neither the maximum power (what a thermal virus could draw), nor the sum of separate CPU and GPU budgets (they often share the same TDP). And Turbo can actually push power draw about 25% above TDP. This makes comparing TDPs quite difficult.
Your 3W TDP will end up being 5-6W once you include Turbo, RAM, regulator inefficiencies and everything else (but the screen). I prefer real values, and 2W total device power for a 1.5GHz quad core Krait while running a CPU intensive benchmark is pretty impressive in my book. It will be interesting to see results for actual Bay Trail devices.
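To put rough numbers on that back-of-the-envelope accounting, here's a minimal sketch. All figures (the 25% Turbo uplift, RAM and miscellaneous draw, regulator efficiency) are illustrative assumptions chosen to match the 3W-becomes-5-6W claim above, not measured values:

```python
# Rough illustration of why a CPU "TDP" understates total device power.
# All numbers below are assumptions for illustration only.

def device_power(tdp_w, turbo_uplift=0.25, ram_w=0.5, other_w=0.5,
                 regulator_efficiency=0.85):
    """Estimate total platform power (excluding the screen) from a CPU TDP."""
    cpu_w = tdp_w * (1 + turbo_uplift)    # Turbo can exceed the nominal TDP
    load_w = cpu_w + ram_w + other_w      # add RAM and miscellaneous logic
    return load_w / regulator_efficiency  # voltage-regulator losses

print(round(device_power(3.0), 2))  # → 5.59, squarely in the 5-6W range
```

With those assumed numbers, a nominal 3W TDP lands at roughly 5.6W of total platform power, which is why the advertised TDP and the measured device power rarely agree.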
And hopefully, at some point in the very near future, maybe, just maybe, consumers will wake up and finally realize the real world is infinitely more fulfilling than dorking around with a piece of electronics. Get out and hike. Tend to a garden. Visit someone in an old folks' home. Bicycle along a canal. Carve up some twisties on a crotch rocket. Jump out of a perfectly good airplane. Explore some backwoods trails on a Honda (you'll meet the nicest people, to borrow an old ad line from Big Red). Schuss a mountain face on a board if you're young or on a pair if you're older. Leap off of a rope swing into that swimming hole. Sail over to that forgotten cove on the lake. Learn to fly fish. Snorkel a reef. Canoe down the Saco. Drive to the top of Mt. Washington. Rake some leaves. Cut the grass. Clean the gutters. Pick the moss off the roof. Learn to carve with a chain saw. Make sand castles at the beach with your kids and fill the moats with hermit crabs. Walk around a museum. Read a good book actually made with paper. Take a martial arts class. Restore that '67 Camaro you saw listed on eBay (OK, you need a computer to get started). Lay down some rubber at the local Tastee Freez.
Agreed, I'm fed up with the benchmarketing and cheating as well. Those who work in the field know SPEC suffers from the same issues as we've recently seen with AnTuTu. ICC is good for one thing, and one thing only: producing artificially high benchmark scores on Intel CPUs. However, on real, non-benchmark code it is slower than GCC (and UTC). And that means that anyone using ICC scores as a true indicator of CPU performance is trying to mislead us.
Yep, if it had an 8+ hour battery I'd buy one in an instant. I don't want to have a tablet for tablet things and a computer for computer things. I want to be sitting and editing video, then grab my screen and walk out onto the patio and read an article, then walk into my kitchen and listen to some music while I cook... all on the same machine without transition.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.