I think the tablet will now go from just a media consumption device to a content creation machine.
I am not entirely on board with that premise -- at least on a personal level. I love my tablet precisely because it's not capable of doing my work. When I am on my tablet, it's all play and no work. I don't even check my e-mails....
I have been on the Media Consumption side of this argument for quite some time, but perhaps my view of Content Creation is biased due to my engineering viewpoint.
What if the vast majority of content being created now is for Facebook, YouTube, blogs, and similar "small" productions? The content from engineers, high-end graphic designers, and video productions may not be generated on a tablet for quite a while, but the vast majority of "content" may actually be in reach. It may also be in reach of an advanced cell phone!
I think 4G, 5G, and beyond will enable more content creation from mobile computers (smartphones/tablets). The ergonomics question will get resolved through standard keyboard, mouse, and display clip-ons for mobile computers, and the computing-power question will get resolved through high-bandwidth cloud services.
Unlike many here, I think the PC as we know it today will be a thing of the past sooner than many people predict. For consumers' sake, I hope Intel and Microsoft will be just one of many many players in the new era. I actually think that's how it will pan out in the end.
I'm not on board with it at all. The issue is form factor.
I do some content creation. I write. I create/manipulate images. I do DTP. I write the odd bit of code. I can't do those things at all well on a tablet.
The sort of things I do need a big screen. (I have a 23" monitor at 1920x1080 resolution, but I wouldn't mind a bigger one at higher resolution.) Keyboards matter. I need a full-size keyboard, and I'm picky about the layout. (I'm typing this while traveling on a netbook, and not too long ago I hit the power button instead of the Del key and put the machine to sleep in the middle of typing a reply to a post. Grrr.) My productivity on this device is a fraction of what I can achieve at my desktop. And I need, and use, a mouse.
Hey, Jon Peddie? "I think the tablet will now go from just a media consumption device to a content creation machine." You first. Get a tablet (powered by whatever processor you like). Restrict yourself to creating all your content on it for a while. Come back and tell us how well you did. I strongly suspect you'll back away from that assertion.
I am absolutely in agreement with you. I would even go as far as suggesting that those that suggest the tablet is taking over everything are trying to make things go that way rather than what is likely to happen naturally. I have a laptop, tablet and full sized PC. I use the laptop once every 2 months, the tablet once a week and the PC daily 10-16 hours. There's just no comparison between the tablet and the PC, but the tablet does do a reasonable job of replacing most of the functionality of the laptop.
I think, for folks to whom "content creation" pretty much means writing newspaper and magazine articles without illustrations, the tablet can function as a content creation platform. And that's about where I'd draw the line. Once you're doing anything in 2D, much less 3D, anything that requires quick reference to other content, anything that's intricate beyond simple text on a 10" screen, the tablet is a no-go.
And that's not going to change; even as tablets get faster and gain more storage, you're still hamstrung by that tiny screen. Sure, with enough power, you may opt to use the tablet as a desktop, as laptop users have for decades. But just as with laptops, the desktop configuration always offers more bang per buck.
I kind of agree. However, as far as 2D goes, a beefy tablet can be fine, especially if you consider the Microsoft Surface tablets. I would happily edit images in Photoshop or do illustration on one. It is roughly analogous to a Cintiq. Sure, I would miss my 23" screen, but hey, I can do it. On the Surface you can also fast-switch back and forth to reference material; again, not optimal but doable.
That is somewhat of an exception though. I've tried illustration on the iPad and didn't like it at all. No pressure sensitivity, not quite as responsive as a desktop, etc.
Depends on your images. I'm kind of hard on my photo tools, and I don't see a 1-2GB tablet being much good at editing 5, 10, 20GB images (I do lots of composites). Not only that, but trying to see correctness and detail on an uncalibrated 10" screen. Sorry, not for me, I like my 27"ers.
Oh absolutely. I think the question, though, is what do you expect out of a portable device. I don't think anyone expects tablets to replace an entire graphics workstation. However, saying that they aren't able to "produce content" because they can't keep up with a 6-core i7 with 64GB of RAM is like saying a scooter isn't transportation because it can't haul the same amount of stuff as a Mack truck.
"...for folks to whom "content creation" pretty much means writing newspaper and magazine articles without illustrations, the tablet can function as a content creation platform."
There are of course all kinds of forms of "content creation" that a tablet can be used for, but in terms of writing specifically, you're selling tablets' capabilities short. Creating/writing content that includes images/illustrations and frequent reference to other content is perfectly within the realm of capabilities for a tablet, especially with the right tools/workflow. (Check out writing apps like Editorial and Writing Kit, to name a couple of iOS examples.) But of course for real "heavy duty" work you will need a desktop or laptop - no one is suggesting that tablets are a desktop replacement.
Wacom has a new pen that allows a tablet to replace the Cintiq drawing screens. So it's pretty clear that tablets can be used for creative work, including drawing, sketching, photo editing, video (recording), etc. This is not to mention communication capabilities (audio and video conferencing, etc.).
While I don't agree that PCs are hard to use, nor that they are obsoleted every two years (not these days, anyway), I think the trend you describe (tablets becoming real PCs) has been going on for some time. Certainly with the advent of the Surface tablet. Good to see that Intel appears to be on the right track.
I have less than zero interest in tablets of the iPad sort: tiny screen, ridiculously compromised I/O, not to mention reliance on silly "apps" that no one ever needed with a PC. But a super-portable PC that can also be docked, for use beyond the mere little gadget, would be very nice. And it seems an almost obvious progression of the PC.
I can see how some notebook PCs are becoming tablets. There is nothing wrong with that, and as Bert said, that is a natural progression.
But that doesn't mean that the need for a desktop PC, or a large-screen notebook PC, will go away. For work purposes, a PC as we know it, in the current form factor, will continue to exist.
In my humble opinion, Intel won't be able to rule the tablet market just because a new CPU makes it easier for PC OEMs to build a cool tablet/notebook PC hybrid. For that, we are only talking about a notebook replacement market.
There might be some people out there still thinking that they will only buy tablets when tablets can run full PC functions. But to be honest, I think those people are a minority. A lot of people have come to accept tablets, not as a PC replacement (or wishing their tablets were PCs), but as a whole new entertainment device.
My own objection is to the refrain that PCs will disappear, Junko. In fact, we talked about this at the dinner table last night. My wife and I went through the things we use PCs for at home, and wondered how anyone would find those same tasks comfortable on an iPad-like tablet (i.e. a pure tablet, not one with detachable keyboard, mouse, and perhaps capable of connecting to a decent size screen).
Like, one example, would I feel confident moving around my investments online, on a smartphone or tablet? Uuuh, no.
Or another one. We recently did some serious shopping for rugs and rug pads, including research on what kind of pads work on what kind of floors. Would I have selected rugs based on what a smartphone screen showed? Uuuh, no again.
I agree with your point that the two serve different functions. Although I will concede to the "PCs in decline" crowd that in principle, the ergonomic shortcomings of smartphones and tablets can be remedied, without requiring a whole 'nother CPU. Either a dock, or at least a USB 3.0 large screen, keyboard, and mouse, can turn a tablet into something more generally useful.
Bert, I don't believe that PCs will disappear either.
You make a good case about how some of us draw a mental distinction between what tablets are for and what PCs are for.
You talk about the ergonomic shortcomings of smartphones and tablets for the stuff you would normally do on PCs; for me, I found the ergonomic shortcomings in PCs. When I want to do a quick search on Google, or catch a glimpse of video clips on YouTube, do I go to my PC to do that? No, I prefer the ergonomic convenience of tablets -- just lying down on my sofa.
We can compare this to desktops and laptops: laptops are a lot more capable than tablets, yet they still haven't fully replaced desktops. But comparing early laptops to today's, they have gained a lot more power and become close enough to desktops to replace them for a large number of use cases.
The tablet is at an early stage, with limited capabilities. It has certainly replaced the desktop/laptop for salespeople who mostly use their devices to check email and prepare documents, for school kids who use it for social media, and maybe for non-technical people who use it to stay connected with others. However, it is still far away from use cases like photo editing, etc.
It will be interesting to see how the tablet evolves and in how many use cases it can beat the laptop/desktop. Maybe tablets will come with a powerful enough CPU and a wireless monitor/keyboard hookup, so that it feels like using a desktop with the system built around the tablet. The Surface is so far the closest to this among all tablets; however, Windows itself is losing its edge (in my opinion).
"Like, one example, would I feel confident moving around my investments online, on a smartphone or tablet? Uuuh, no.
Or another one. We recently did some serious shopping for rugs and rug pads, including research on what kind of pads work on what kind of floors. Would I have selected rugs based on what a smartphone screen showed? Uuuh, no again."
I would use my tablet in both of the above cases. As Junko said, it's just a personal mental distinction created by people.
Current tablets are mostly good for entertainment and maybe reading the occasional email.
I still cannot open 2-3 apps at the same time, so I cannot listen to music and also surf the web or write an email or a doc.
I can't imagine using a tablet for work. I need a real keyboard and mouse, two screens, and a lot of DRAM.
Even when I am crunching data on a remote server or opening a design layout from a secure server, I need my big screens: the design on one screen while keeping an eye on code execution on the other, all the while listening to my music.
Indeed... it's not simply tablets. Once you're doing professional work on a full fledged desktop, even a laptop is an uncomfortable compromise. My home-made desktop runs a 6-core Intel i7, 64GB of DDR3 memory on a quad-bus configuration, a 960GB boot drive SSD, and a 5TB RAID5 data drive. This is for electronics CAD, photo editing, video editing, music creation.
I'd be waiting a month for a couple of hours of high quality H.264 to render on a tablet, even if I had a place to put the input video (that's 40GB/hr out of my Canon 6D in AVC-Intra mode). I have individual composited photos, 40-80 20Mpixel individual shots merged into a single image, that can run into the 20GB range, each. I'm mixing 30-60 tracks of 96kHz/24-bit audio, sometimes with effects on each track. Electronics CAD uses less CPU and disc resources, but I'd go insane without at least two screens (I have two 1440p and one 1200p screen on my home system) -- that's schematics, PCB layout, maybe an FPGA design tool, data sheets, web sites open on DigiKey and Mouser and all those guys, more web pages searching for specs or other things I need.
That's the real work.
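The data rates behind that work are telling on their own. Here's a back-of-envelope sketch (my own arithmetic, using the 40GB/hr video and 60-track audio figures from the comment above; the helper names are made up):

```python
# Back-of-envelope data rates for the media workloads described above.

def video_rate_mb_s(gb_per_hour):
    """Convert a GB/hour recording rate to a sustained MB/s figure."""
    return gb_per_hour * 1000.0 / 3600.0

def audio_rate_mb_s(tracks, sample_rate_hz=96_000, bytes_per_sample=3):
    """Uncompressed multitrack audio throughput in MB/s (24-bit = 3 bytes)."""
    return tracks * sample_rate_hz * bytes_per_sample / 1e6

# ~40 GB/hr of AVC-Intra video works out to roughly 11 MB/s sustained.
print(f"video: {video_rate_mb_s(40):.1f} MB/s")

# 60 tracks of 96 kHz / 24-bit audio is ~17 MB/s before any effects.
print(f"audio: {audio_rate_mb_s(60):.1f} MB/s")
```

Either stream alone is a sustained load that a tablet's storage and I/O path would have to carry for hours, on top of the actual processing.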
Don't get me wrong, I have a decent tablet (Asus Transformer Infinity, 128GB storage), great for reading PDF data sheets in the lab, great for videos on vacation, web browsing in hotel rooms or on the couch, music playback (got most of the library there), works ok for writing or even light coding, with the keyboard add-on, it's nice for note-taking... the best aspects of a PC and a note pad combined. But it's not for creating things. And no 10" screen will ever be sufficient for that, even if they eventually do stuff the power of a high end CPU+GPU of today and a few TB of storage in the thing.
And that's actually ok... tablets are also cheap. I really don't want to take my expensive desktop PC out to a bonfire so I can reference songbooks for a little guitar playing. But the tablet's not a huge issue to replace, and of course, it's going to run all night on a charge. They're both application processors, but each adapted to different use conditions. Making a PC into a tablet is just silly... making a very good tablet that runs SOME PC software, maybe not such a bad thing. After all, while as I said, the Transformer is ok for some light coding work -- and you could even develop commercial apps for Android ON Android, that is not true for iOS or Windows RT. They're specifically not for software work. Windows 8 on a real all-day tablet would let Windows programmers have what Linux programmers already have on Android (or one of the many other Linux-for-mobile projects in the works, but not quite mainstream yet).
I am with Junko on this...the tablet is a nice toy, I have one and use it for fun...but I can't do any work on it...and I am not special, I assert 90% of engineers can't do any real work on tablets, let's take a poll among EE Times readers!...Kris
" I assert 90% of engineers can't do any real work on tablets, "
That's true but what percentage of computing users in the world (smartphone-only users included) are engineers?
One aspect of the issue I did not see addressed in the comments so far is the use of VNC sessions. Sure, I use a laptop for design work but that same design work is performed on powerful machines in remote locations that I merely access screen sessions on. I don't see how a comparatively "wimpy" docked tablet or even phone couldn't do the same. I'm just using this thing to pass mouse and key inputs.
This is absolutely true. When people talk about "working" on a tablet, I often wonder what exactly they mean. If your work is insanely processor- and RAM-intensive, like 3D modeling or simulations, obviously tablets aren't for you right now. Basically, if your work machine is a special system built for your task (like most workstations), a tablet isn't going to cover it.
However, a huge percentage of the business world only needs spreadsheets, documents, calendars, email, etc. Tablets work fine for this. Unfortunately, the form factor can be a bit of a drain after a while if you're a power user and want more information visible at once.
Tablets work effectively as screens for apps that run in the cloud and require more compute power. For example, I use Autodesk 360, which is the cloud app for AutoCAD and other Autodesk applications. It lets you access AutoCAD and other Autodesk apps and documents for viewing or drawing. Because the computing (e.g. 3D rendering, CPU, memory, etc.) is done in the cloud, the compute power is not required on the tablet, and the tablet effectively just acts as an I/O device.
It is extremely handy to have a device that you can bring with you to meetings, in the field, with customers, etc. to go over designs, etc.
I can envision a possible future where we buy a processing box that sits in a closet and all of our devices defer the heavy lifting to it. You just choose which interface is most fitting at the time.
And hopefully, at some point in the very near future, maybe, just maybe the consumers will wake up and finally realize the real world is infinitely more fulfilling than dorking around with a piece of electronics. Get out and hike. Tend to a garden. Visit someone in an old folks home. Bicycle along a canal. Carve up some twisties on a crotch rocket. Jump out of a perfectly good airplane. Explore some backwoods trails on a Honda (You'll meet the nicest people, to borrow an old ad line from Big Red). Schuss a mountain face on a board if you're young or a pair if you're older. Leap off of a rope swing into that swimming hole. Sail over to that forgotten cove on the lake. Learn to fly fish. Snorkel a reef. Canoe down the Saco. Drive to the top of Mt. Washington. Rake some leaves. Cut the grass. Clean the gutters. Pick the moss off the roof. Learn to carve with a chain saw. Make sand castles at the beach with your kids and fill the moats with hermit crabs. Walk around a museum. Read a good book actually made with paper. Take a martial arts class. Restore that '67 Camaro you saw listed on eBay (OK, you need a computer to get started). Lay down some rubber at the local Tastee Freez.
It's the x86-based tablets, using Intel's new Atom-based SoC and AMD's APU, that will allow us to do real work on a tablet. I agree, the Android tablets are impossible and don't have the app base for productivity (sure, you can get by and compromise by unlearning everything you learned over the past ten years). x86 tablets will change the game and our usage modes.
"It's the x86-based tablets, using Intel's new Atom-based SoC and AMD's APU, that will allow us to do real work on a tablet. I agree, the Android tablets are impossible and don't have the app base for productivity (sure, you can get by and compromise by unlearning everything you learned over the past ten years). x86 tablets will change the game and our usage modes."
What does this mean!? Where is the data that support these assertions? All the benchmarks that I have seen show Intel-based solutions for tablets well behind ARM-based ones. That includes performance and performance per battery life.
As for productivity, what are most people using their tablets for? circuit simulation? It's mostly used for email, web surfing, audio and video consumption. Surely there is no need for any "unlearning" here. And if you want PC applications on your tablet, surely the way is not to use processor technology designed for PCs. Yes, you might have the tools but you won't have much time to use them before the battery goes flat!
I am sorry but I have been reading/hearing the above assertions so many times lately and I can't for the life of me see any reason to believe them. Show us the numbers, please!
@jonpeddie, have a look at this EETimes blog: http://www.eetimes.com/author.asp?section_id=36&doc_id=1318857
In it you can see references to a number of benchmarks showing Intel lying well behind ARM-based SoCs, and an explanation of how flawed the latest AnTuTu benchmark is. The academic in many of us is very disappointed with such baseless claims -- fair play to all parties involved, but please let's not bend science and facts to suit our aims.
Anyway, I look forward to the IDF announcement in early September as you said, perhaps we will then get some useful numbers....
Jon, the benchmark was discredited precisely because Intel was cheating it. Intel has a long history of artificially inflating benchmark scores using their ICC compiler. So far there have been a few Geekbench results leaked for Bay Trail. As can be predicted from its limited out-of-order microarchitecture, Bay Trail has lower IPC than the aggressive OoO Cortex-A15, and can only compete at higher frequencies. However it requires a lot of power at high frequencies, e.g. the quad 1.9GHz E3840 has a 10W TDP! Compare that with the new Nexus 7 tablet, which Anand measured as using 2W in total when running CPU-intensive benchmarks.
So you could say that Bay Trail finally narrows the large gap in performance between the current Atom and ARM cores. However calling it a "game changer" is rather premature and wishful thinking.
You're right about the benchmarking fiasco - really unbelievable it's still happening these days.
I'm not alone in my enthusiasm for Bay Trail and Intel's prospects in the Win8 tablet market. Its only competition is AMD, and I think there will be strong demand within enterprise and among prosumers for a Win8 tablet. Also, FYI, we (and I think some other research firms) are counting the Win8-based tablets as PCs, and with that policy we (and others) will start to show improved "PC" sales.
Win8 tablets are effectively PCs indeed, however I very much doubt it will help overall PC sales. My bet is that Win8 tablets will eat marketshare from laptops just like laptops did from desktops, so the overall PC market will continue its slow but inevitable decline. Surface Pro didn't sell well and Bay Trail will be slower than the i5 currently used, so I don't see why it is more suitable for content creation or why it will sell better.
Meanwhile Android and iOS continue to grow to new heights. The fact is we no longer need x86 or Windows to do useful work. For example, the most popular laptop on Amazon is the Cortex-A15 based Chromebook.
Umm, I haven't seen CPU-intensive benchmarks in the power measurements of the Nexus 7; moreover, that figure is not "in total" but rather excludes the screen.
Geekbench is only a small synthetic benchmark, not a thermal virus.
The Krait series is in the 0.7-1.3W/core ballpark, depending on clock speed, as ARM says officially in its own public presentations. So a Snapdragon 600 at 1.5GHz likely draws around 3.5W under full CPU load. A Snapdragon 800 at 2.3GHz draws around 5.2W considering the cores alone, and a lot more with the GPU running.
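Those figures are just per-core power multiplied by core count. A trivial sketch of that arithmetic (my own illustration of the estimate above; the function name is hypothetical):

```python
# Rough SoC CPU-power estimate from a per-core power figure,
# following the 0.7-1.3 W/core range quoted above.

def cluster_power_w(cores, w_per_core):
    """Total CPU-cluster power: cores times per-core draw."""
    return cores * w_per_core

# Quad Krait at ~0.875 W/core under full load -> ~3.5 W
print(cluster_power_w(4, 0.875))  # 3.5

# Quad at the top of the range (1.3 W/core) -> ~5.2 W, cores only
print(cluster_power_w(4, 1.3))    # 5.2
```

Note this counts the CPU cores only; GPU, memory, and the rest of the platform come on top.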
About Bay Trail performance, I don't think a synthetic benchmark, done so-so by an unknown geek, is the best way to judge a CPU. I believe SPEC is the best, and looking at the results around the web, the Silvermont core seems clock-for-clock faster than the A15, so your claim that it needs high clock speeds to match the competition makes no sense.
For your information, the Bay Trail Z series has a TDP of 3W and an SDP of 2W. Your figure is for the D series, aimed at desktop applications.
Agreed, I'm fed up with the benchmarketing and cheating as well. Those who work in the field know SPEC suffers from the same issues as we've recently seen with AnTuTu. ICC is good for one thing, and one thing only: producing artificially high benchmark scores on Intel CPUs. However on real non-benchmarking code it is slower than GCC (and UTC). And that means that anyone using ICC scores as a true indicator of CPU performance is trying to mislead us.
Gondalf, are things like circuit routing and weather simulation routinely done on mobiles or tablets? Being a workstation-oriented benchmark, SPEC is hardly representative for desktop PCs, let alone mobiles and tablets... Geekbench is not the best benchmark, but it is certainly far more useful than SPEC.
Note that Intel's TDP is not the maximum power (what a thermal virus could draw), neither does it include the sum of CPU and GPU components (they often share the same TDP). And Turbo does actually increase the TDP by 25%. This makes comparing TDP's quite difficult.
Your 3W TDP will end up being 5-6W once you include Turbo, RAM, regulator inefficiencies and everything else (but the screen). I prefer real values, and 2W total device power for a 1.5GHz quad core Krait while running a CPU intensive benchmark is pretty impressive in my book. It will be interesting to see results for actual Bay Trail devices.
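To make that estimate explicit, here's a rough sketch of the arithmetic (the Turbo uplift follows the ~25% figure mentioned above; the RAM power and regulator efficiency are assumed illustrative values of mine, not measured data):

```python
# Hypothetical platform-power estimate: SoC TDP, Turbo uplift,
# then RAM power and regulator losses (assumed values).

def platform_power_w(soc_tdp_w, turbo_uplift=0.25,
                     ram_w=0.7, regulator_efficiency=0.85):
    soc = soc_tdp_w * (1 + turbo_uplift)  # 3 W -> 3.75 W with Turbo
    raw = soc + ram_w                     # add memory power
    return raw / regulator_efficiency     # losses in power delivery

print(f"{platform_power_w(3):.2f} W")  # roughly 5.2 W, excluding screen
```

With these assumptions a nominal 3W TDP lands at the low end of the 5-6W platform range claimed above; different overhead assumptions shift the result, which is exactly why total-device measurements are more comparable than TDPs.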
Waiting for a comment about who will actually buy these chips. The only vendors making enough margin on pads to be able to afford high end Intel chips are Apple and Samsung. Both use proprietary solutions now. No money in low end Android, vendors are beating each other to death on price, with no room in the BOM for expensive processors. Windows has failed on tablets. Who is left??
Sorry Wilco, but your useless Geekbench is like MANY other synthetic tests around the web; it is popular because it is cross-platform, not because it is good.
The main concern with this so-so (very so-so) benchmark is that it doesn't stress the L2 cache, and even less the memory controller with its memory subsystem.
The main advantage of SPEC is that it is able to heavily stress the device "at the CPU level" (SoC level, if you prefer), showing the average performance of the core, L1, L2, L3, memory controller, and main memory all together.
SPEC is designed for this, and it's widely used to judge a server CPU (not just the core).
You are simply wrong in saying that a poor synthetic benchmark is better for a phone or a tablet. It's no accident that the old Atom is able to perform, in real-world software, much better than a poor, small synthetic test indicates. In a real-world workload it is the CPU (SoC) that counts; the core is only one component of the equation. It's no secret that the main defect of ARM SoCs is the memory subsystem, and ARM is trying to address this... but it needs time, many years, to match Intel's or AMD's expertise.
About power consumption (which is disclosed now thanks to some official Qualcomm slides), call me when a review shows an ARM SoC running a real stress test a la Prime95, not a very light, so-so benchmark.
I will not comment on your rather unfair reasoning on TDP, which is capable of triggering a stupid flame war.
It's funny to see ARM followers not even trusting Qualcomm. I truly believe this ARM mania is like a religion... no comment.
Gondalf, from your reply the only conclusion one can draw is that you must be confusing Geekbench with Dhrystone. Geekbench not only gets 50% of its score from memory tests (including Stream), it even includes a primality test of the sort you seem to regard so highly... Please do try to understand it before dismissing it.
I agree SPEC is good for evaluating server CPUs, and when 64-bit ARM servers appear I'd imagine we'll see some ARM scores (expect the ICC vs GCC compiler issue to stir up more discussions in the future as AMD, Transmeta and VIA have/had the same issues). But it's just not useful for mobile SoCs as they do not run the same floating point heavy workloads.
It's true that older ARM SoCs suffered from memory bandwidth and prefetching issues, but given the much improved memory scores (the GT-I9500 scores 2.5 times that of the S3 on Stream), it looks like A15 SoCs have now solved those issues and are pretty much level with Bay Trail.
Who says I don't trust QC? I discuss and base my opinion on hard facts, not anonymous claims without evidence. If you do have links that prove Anand's numbers are incorrect, then I'd be interested to see it. However talking about TDP numbers between different SoCs is going to be fruitless due to the completely different methods and definitions used.
Tablets are based on ARM CPUs. So what? "I agree, the Android tablets are impossible and don't have the app base for productivity (sure, you can get by and compromise by unlearning everything you learned over the past ten years)." And a whole new form factor won't require you to unlearn a bunch of things? Exactly how similar do you expect an app designed assuming a large monitor, full keyboard, and mouse to be when translated to a tablet with a much smaller screen, on-screen virtual keyboard, and touchscreen interface? I submit it will look and work very differently, even if it is designed to do the same things, and there will be a substantial learning curve for users trying to use it instead of a desktop or laptop.
CPU architecture is largely irrelevant. If there's a market demand for the sort of apps you are thinking of on tablets, they'll get ported to ARM and Linux. There's nothing magical about X86, and current ARM processor designs have more than enough power.
Here's the thing: tablets, in the Android and iOS model, are used very differently than laptops, even when doing the same kind of things. I don't believe Microsoft was really on-board with this concept. Like many things at Microsoft, this whole mobile computing thing wasn't a home grown idea, it's an import.
So look at their "business" tablet, the Surface Pro. This runs a moderately good laptop CPU, the kind of i5 you find in $500 laptops. So it'll run just about anything a business guy's going to throw its way. But it's got at best a four-hour battery, it's over 2lbs heavy, and it's designed to be used primarily with an add-on keyboard. In other words, it's basically a convertible ultrabook -- it totally misses the point of a tablet.
Then there's the Surface RT... sure, it was way over-priced when it came out, the guts of a $350 Asus Transformer TF300 selling for more than a $499 iPad 4. But it had the SWAP down to what you'd expect for a real tablet, more or less, even though Microsoft still wasn't embracing the whole tablet idea... I mean, the whole Metro UI is best used with a slew of available keyboard shortcuts -- hardly the thing for a tablet OS.
But as of just recently, Intel's at least making credible CPUs for tablets. An x86 tablet doesn't need to be a desktop or even laptop equivalent -- it needs to be a real tablet. That means as close to 1lb as you can make it, all-day battery, ideally a display that's good in bright light (Asus does this pretty effectively with their IPS+ thing), a little rugged because you're going to be dragging this places, etc. This doesn't affect Apple, but putting ARM, AMD, and Intel in a race for the best tablet CPUs will ultimately make things very good for us consumers. Android's at best processor agnostic, and x86 is pretty well supported in the NDK world, not an issue for most apps based on Dalvik. And for Windows, it's a no-brainer... you'll take the Windows 8 tablet. It'll run all the tablet stuff more or less as well as a Windows RT tablet, but you can use it for some level of real work... or just surf the net with your browser of choice, rather than Microsoft's browser of choice.
Back in the early days of personal computing, complex CAD and other applications requiring heavy lifting, used workstations. The term wasn't as generic back then as it is today. Essentially, they were just really powerful (for the day) small computers optimized for computational processing.
As PCs became more powerful, the performance gap decreased and the expense gap increased to the point where a "workstation" just became a more powerful PC.
Perhaps, that category will be returning. The gap between what many, many people need (those that can get by with a tablet) and what engineers and designers need has suddenly gotten quite large again.
My prediction is that the categories will fall out as such:
1) Personal home use computing and light corporate computing needs will be addressed by tablets and tablet/notebook hybrids.
2) Hard core gamers will have more traditional PC type systems with powerful CPUs and GPUs.
3) Engineers, and media producers will have more traditional PC type systems optimized similarly to game systems, but for commercial usage.
Category one is already at risk of replacement, even though it's not fully evolved. Not long from now it will be a person's phone wirelessly connected to dumb displays and input devices.
Categories 2 and 3 will likely keep the form factor for quite a while yet.
Duane, workstation is a nice analogy. Obviously, tablets are getting powerful enough to do PCs' job, just as PCs got powerful enough to do workstations' job. The question, then, is as you pointed out, will "personal home use computing" become all tablets?
Junko, in terms of actual time spent on the PC, most of my "personal use computing" at home is probably watching Internet TV. For that, I'm in the den watching on a 42" HDTV set and separate audio system.
If I can dock a tablet to connect it to the HDTV set and the audio system, then it would make sense. Otherwise, why would I ever want to go to tablet for this?
I guess I'm saying that although tablets are getting more powerful, they're still small gadgets. In place of a book or magazine, sure. In place of a home theater, uh, not so good. There are uses of the home PC that a tablet, by itself, will remain unsuited for.
I find this article full of assertions with no substance. It is about time we saw real solid numbers to have any confidence in Intel's claims. The latest AnTuTu mobile benchmark fiasco does not give me any confidence things are about to change.....
The discussion here is mostly about computing power... for sure, tablets are getting more powerful and can do what PCs could do a few years ago... and PCs in turn became as powerful as old workstations... but this is all semantics... the transition from workstations to PCs was noticed by engineers; for the rest of the world nothing really changed. Yes, prices came down and vendors changed, but a computer remained a computer.
The supposed shift from laptops to tablets is different... in addition to the price, computing power, and vendor shifts, there is one dramatic difference: the tablet is physically small. Using it is not ergonomic. You can't see everything, you can't draw something precise quickly, you can't type very quickly and reliably, and you can't work efficiently for long periods of time... your eyes and fingers are just not suited that well to the tablet... this is the fundamental reason it will not take over from the laptop... for similar reasons, Google's eyeglasses or an Apple iWatch will not take over from tablets... unless humans start growing larger eyes and smaller fingers, but that might take a while ;-)
For content creation as opposed to mere consumption, the tablet's biggest limitations are ergonomic ones, not computing power ones, and those limitations are primarily on the input side of things, not the output side. The touchscreen keyboard is ok for modest volumes of typing, but not as efficient as a physical keyboard. Using your fingers as a mouse substitute is ok for modest amounts of graphic object manipulation, but not nearly as efficient as a real mouse.
I have run some VNC sessions from my iPad to my "real" computer where I do engineering work, and have viewed schematics, edited text files, run simulations and even viewed simulation waveforms. It's not as productive as doing those tasks on a "real" computer, but I found that the primary reasons for that were simply no keyboard and no mouse. Add those two input devices and I see no reason why a tablet can't be just as productive for content creation as a laptop.
Then again, if you add those two input devices, how is your tablet then any different from a laptop?
A tablet is too big to take with you onto a bus from the suburbs to the city; I have observed that most people on the bus are using smartphones. But many home users do not need a PC: it is overkill for them. They might be tablet people if tablets were cheaper. Right now you can buy a PC for $329, while an iPad is $400-600. So it is a problem of price: tablets are going to replace PCs at home if all you need is to get your email or surf the Internet. For now, laptops are preferable for students and even professionals.
I see a lot of people here arguing over whether tablets are consumption devices or creation devices. Up to this point, this has been largely decided by the application makers. The fact is, while your tablet may not be a workstation equivalent, it is actually quite capable of doing many work tasks. It is the software creators or app creators who are deciding what you will do with it.
The Surface Pro has a ton of potential to change how people utilize tablets. The transition from work to play, stationary to mobile, should be seamless. If only it had a battery that lasted more than a few hours.
I am sure the Surface Pro has the potential to change the way people use tablets, and possibly that will eat into notebook PCs. But the clincher here is, as Caleb pointed out:
If only it had a battery that lasted more than a few hours.
Yep, if it had an 8+ hour battery I'd buy one in an instant. I don't want to have a tablet for tablet things and a computer for computer things. I want to be sitting and editing video, then grab my screen and walk out onto the patio and read an article, then walk into my kitchen and listen to some music while I cook... all on the same machine without transition.
I'm told Intel planned to talk about and announce Bay Trail at Hot Chips this week. A presentation was all approved and ready to go, but got pulled at the last minute. Apparently they are saving the news for their IDF event in two weeks.
I have read a lot of discussion and comparison on this forum of Intel and ARM chips, leakage power, etc.
In my experience, low-power designs and architectures are not the same ones you can use in high-performance applications, and vice versa. The same goes for process technology: a process optimized for low power will be most efficient at slower frequencies, and its frequency/power curve would not look the same at 4 GHz, and vice versa.
So when somebody claims that chip xyz has the best perf/watt, much better than some other chip, that is not entirely true. Chip xyz may be best in a certain frequency range, but it will not be the best option for high-performance needs, and vice versa.
(I have worked in the IDM, OEM, ASIC, and foundry businesses and have seen multiple generations of process technology and design.)
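The point that perf/watt rankings depend on frequency range can be sketched with a toy DVFS model. All numbers below are hypothetical, chosen only to show how a ranking can flip: dynamic power scales roughly as C·V²·f, and the voltage a process needs at a given frequency differs, so perf/watt (~1/(C·V²)) changes with the operating point.

```python
# Toy DVFS model: dynamic power ~ C_eff * V^2 * f.
# All coefficients below are hypothetical, picked only to illustrate
# how the perf/watt ranking between two processes can flip with frequency.

def perf_per_watt(freq_ghz, c_eff, volts_at):
    """Perf/watt proxy: frequency divided by modeled dynamic power."""
    v = volts_at(freq_ghz)
    power_w = c_eff * v**2 * freq_ghz
    return freq_ghz / power_w

# Hypothetical low-power process: low voltage at 1 GHz, but voltage
# must rise steeply to reach 4 GHz.
lp_volts = lambda f: 0.8 + 0.25 * f
# Hypothetical high-performance process: higher baseline voltage,
# but a much flatter voltage/frequency curve.
hp_volts = lambda f: 1.0 + 0.05 * f

for f in (1.0, 4.0):
    lp = perf_per_watt(f, c_eff=1.0, volts_at=lp_volts)
    hp = perf_per_watt(f, c_eff=1.2, volts_at=hp_volts)
    print(f"{f} GHz: low-power process {lp:.2f}, high-perf process {hp:.2f}")
```

With these made-up curves, the low-power process wins on perf/watt at 1 GHz but loses at 4 GHz -- exactly the "best in a certain frequency range" caveat above.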
I don't want to get you in trouble but if you were generously provided with a look at Bay Trail covered by a non-disclosure agreement ....how is it you can write and publish an article about it, which is surely disclosure?
In defense of Jon, I don't think he is breaking any NDA here. He is not giving us any more details on Intel's Bay Trail -- other than saying that he thinks it is a game changer. And of course, if he were to share any inside knowledge of Bay Trail (which he couldn't), he could have defended his position better in this forum -- in my opinion.
The comments are my impressions of what I saw, with no technical details. I showed Intel what I was going to say before posting it, something I usually do as a fact check to make sure my scribbled notes match reality and not just what I had for lunch. That's the good news/bad news -- good news, I think it's a game-changing hunk of Si; bad news, I can't tell you why. But IDF is almost here; my post is a heads-up and a tweak for you and the others to think about the future of tablets.
Here's another POV on tablets for you to consider: the <$50 7-in. tablet being sold into the educational market. We -- the world -- will be breeding several generations of kids who grow up knowing nothing but a tablet. They won't be able to read or write cursive, and won't have memorized the multiplication tables. Old timers will protest that they are not getting a good education, to which I say: how's your Latin, and what was the last Greek book you read?
Take a look at this: http://jonpeddie.com/back-pages/comments/may-the-tablet-be-with-youreally/
Yes new generations will grow up using tablets as their first and main computing device just like the last generation grew up using laptops instead of desktops. While I prefer old fashioned keyboard and mouse (I guess many on here grew up with those like I did), I know younger people can be just as productive while texting on a mobile keypad or using a touch keyboard just because they learnt that first rather than typing on a keyboard.
The low price of tablets is one of the reasons why Surface Pro wasn't successful and why Bay Trail won't do any better. Current Atom chips cost $40+, so Bay Trail will likely cost more, while ARM SoCs go as low as $7-10. That's hard to compete with...
Also note that most of the technical details of Bay Trail have been publicly known for a while and won't be covered by your NDA. AnandTech and other tech sites have written very detailed articles about the microarchitecture, performance, etc. Lots of other details were leaked about the parts, TDP, and benchmarks. That's why I am very sceptical about claims of a game changer -- there may well be some new bits of info next week (e.g., actual devices unveiled), but based on what we know already, it isn't going to be revolutionary.
Obviously Bay Trail is another increment in Intel's quest to support mobile computing. Why the only company on the planet able to sell regular consumers $1,000 CPUs is worrying about being competitive against $15 Tegras is beyond me, but I guess, like Microsoft, Intel fears for the future.
The first rule of the CPU business: Intel and the x86 never fail. The x86 wasn't as good as the 68K and a bunch of other CPUs in the 80s... but by the mid-1990s, it was the only desktop CISC processor left standing. Then RISC was going to kill it... but curiously, the ability to sell 100 million CPUs per year rather than 100K allowed Intel (and AMD) to RISCify the x86 enough to kill off most of RISC, at least for desktops and servers. Then 64-bit was going to kill the x86; even Intel was trying to make that happen. And then it was low-power CPUs for laptops and servers; Transmeta's VLIW concept had only 1/4 the transistors of a low-end x86, so it was surely going to take over. Nope... Intel got serious about lower-power chips and killed that off (and stop calling me Shirley).
So if Intel wants a chunk of this market, they'll have it. Not all of it, simply because the numbers for a single tablet or smartphone are enough to justify an SoC designed specifically for that device, as Apple's shown over and over again. But they'll have their piece of it, and at some point pretty soon, the primary source of battery depletion in a mobile device won't be the CPU, but the display, the radios, etc. Wait -- that's already true in highly mobile devices. Intel's going to get low enough on power that, even if ARM's using half as much, it simply won't matter anymore.
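The "it simply won't matter anymore" argument is just a power-budget sum. Here's a back-of-envelope sketch; the 25 Wh battery, 1.5 W display, and 0.7 W radio figures are assumptions for illustration, not measurements of any real device:

```python
# Back-of-envelope battery budget with hypothetical numbers, showing that
# once the CPU is a minor share of total draw, halving its power barely
# moves battery life.

def battery_hours(battery_wh, display_w, radios_w, cpu_w):
    """Runtime = battery capacity / total average power draw."""
    return battery_wh / (display_w + radios_w + cpu_w)

BATTERY_WH = 25.0   # hypothetical tablet battery capacity
DISPLAY_W  = 1.5    # hypothetical average display power
RADIOS_W   = 0.7    # hypothetical Wi-Fi + cellular average

life_cpu_08 = battery_hours(BATTERY_WH, DISPLAY_W, RADIOS_W, cpu_w=0.8)
life_cpu_04 = battery_hours(BATTERY_WH, DISPLAY_W, RADIOS_W, cpu_w=0.4)

print(f"CPU at 0.8 W: {life_cpu_08:.1f} h")  # ~8.3 h
print(f"CPU at 0.4 W: {life_cpu_04:.1f} h")  # ~9.6 h
```

With these assumed numbers, halving CPU power buys only about 15% more runtime -- the display and radios dominate the budget, which is the point.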
The most interesting immediate effect is on Windows. Mobile Intel does nothing for Apple and the iOS world; they're on ARM, making their own ARM chips, and happy that way. Mobile Intel can be a choice for Android, and given that Intel just released their "magic" C++ compiler for Android NDK development, that's stepping up a notch. But still, no Intel advantage here, and for a while anyway, perhaps a disadvantage, as ARM's got the market for now.
But Windows is a different story. Windows RT runs on ARM, and locks you into a Microsoft walled garden composed of Metro/WinRT apps and, well, whatever Win32 things Microsoft wants to release, as the only company allowed to do desktop stuff on ARM. Windows 8 runs all that Metro/WinRT stuff too, but also your existing software. The very first Windows 8 tablet at $299 was just put out there, a pretty crappy 8" tablet from Acer... but still, real Windows 8 on the last-gen Atom. I'm not going to run Altium, Vegas, or Photoshop on that tablet, but I can run Chrome -- stock Windows RT doesn't allow any other web browsers (or for that matter, the JIT compilers necessary to implement a modern web browser). There is no reason anyone without an agenda would choose a Windows RT device over Windows 8, given comparable price and battery life (oh yeah, that 8" tablet also runs around 8 hours on a charge -- an acceptable approximation to "all-day battery").
So I think the real game Bay Trail is changing is Microsoft's game... it's the final nail in Windows RT's coffin.
The technical characteristics of the quad-core processor based on the Silvermont architecture are very promising; tablets need such low-power x86 processors to maximize battery life. Is it possible to run Autodesk software on tablets with such processors? My nephew's Intel Atom processor is too slow for that kind of operation.