I kind of agree. However, as far as 2D goes, a beefy tablet can be fine, especially if you consider the Microsoft Surface tablets. I would happily edit images in Photoshop or do illustration on one; it's roughly analogous to a Cintiq. Sure, I would miss my 23" screen, but hey, I can do it. On the Surface you can also switch quickly back and forth to reference material, again, not optimal but doable.
That is somewhat of an exception, though. I've tried illustration on the iPad and didn't like it at all: no pressure sensitivity, not quite as responsive as a desktop, etc.
I think, for folks to whom "content creation" pretty much means writing newspaper and magazine articles without illustrations, the tablet can function as a content creation platform. And that's about where I'd draw the line. Once you're doing anything in 2D, much less 3D, anything that requires quick reference to other content, anything that's intricate beyond simple text on a 10" screen, the tablet is a no-go.
And that's not going to change; even as tablets get faster and gain storage, you're still hamstrung by that tiny screen. Sure, given enough power, you may opt to use the tablet as a desktop, as laptop users have for decades. But just as with laptops, the desktop configuration always offers more bang for the buck.
Indeed... it's not just tablets. Once you're doing professional work on a full-fledged desktop, even a laptop is an uncomfortable compromise. My home-built desktop runs a 6-core Intel i7, 64GB of DDR3 memory in a quad-channel configuration, a 960GB SSD boot drive, and a 5TB RAID5 data drive. This is for electronics CAD, photo editing, video editing, and music creation.
I'd be waiting a month for a couple of hours of high-quality H.264 to render on a tablet, even if I had a place to put the input video (that's 40GB/hr out of my Canon 6D in AVC-Intra mode). I have individual composited photos, 40-80 20Mpixel shots merged into a single image, that can run into the 20GB range, each. I'm mixing 30-60 tracks of 96kHz/24-bit audio, sometimes with effects on each track. Electronics CAD uses less CPU and disk resources, but I'd go insane without at least two screens (I have two 1440p and one 1200p screen on my home system) -- that's schematics, PCB layout, maybe an FPGA design tool, data sheets, web sites open on DigiKey and Mouser and all those guys, and more web pages searching for specs or other things I need.
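To put those figures in perspective, here's a quick back-of-envelope sketch of the sustained data rates involved (plain Python; decimal gigabytes assumed, and the function names are mine for illustration, not from any tool mentioned here):

```python
# Back-of-envelope data rates for the media workloads described above.
# Figures are illustrative, taken from the numbers in the post;
# decimal (SI) gigabytes are assumed.

def video_rate_mbps(gb_per_hour: float) -> float:
    """Convert a recording rate in GB/hour to megabits per second."""
    return gb_per_hour * 8 * 1000 / 3600  # GB/hr -> Mbit/s

def audio_rate_mb_per_s(tracks: int, sample_rate: int = 96_000, bits: int = 24) -> float:
    """Raw (uncompressed) multitrack audio data rate in MB/s."""
    return tracks * sample_rate * (bits / 8) / 1e6

print(f"Canon 6D AVC-Intra (40 GB/hr): {video_rate_mbps(40):.0f} Mbit/s")
print(f"60 tracks of 96kHz/24-bit audio: {audio_rate_mb_per_s(60):.1f} MB/s")
```

That's roughly 89 Mbit/s of incoming video and over 17 MB/s of raw audio just to play a mix back, before any effects processing -- sustained rates a 2013-era tablet's storage and thermals were never designed for.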
That's the real work.
Don't get me wrong, I have a decent tablet (Asus Transformer Infinity, 128GB storage): great for reading PDF data sheets in the lab, great for videos on vacation, web browsing in hotel rooms or on the couch, and music playback (got most of the library there); it works ok for writing or even light coding with the keyboard add-on, and it's nice for note-taking... the best aspects of a PC and a note pad combined. But it's not for creating things. And no 10" screen will ever be sufficient for that, even if they eventually do stuff the power of a high-end CPU+GPU of today and a few TB of storage into the thing.
And that's actually ok... tablets are also cheap. I really don't want to take my expensive desktop PC out to a bonfire so I can reference songbooks for a little guitar playing. But the tablet's not a huge issue to replace, and of course, it's going to run all night on a charge. They're both application processors, but each adapted to different use conditions. Making a PC into a tablet is just silly... making a very good tablet that runs SOME PC software, maybe not such a bad thing. After all, while the Transformer is ok for some light coding work -- you could even develop commercial apps for Android ON Android -- that is not true for iOS or Windows RT; those are specifically not for software work. Windows 8 on a real all-day tablet would let Windows programmers have what Linux programmers already have on Android (or one of the many other Linux-for-mobile projects in the works, but not quite mainstream yet).
Obviously Bay Trail is another increment in Intel's quest to support mobile computing. Why the only company on the planet able to sell regular consumers $1,000 CPUs is worried about being competitive against $15 Tegras is beyond me, but I guess, like Microsoft, Intel fears for the future.
The first rule of the CPU business: Intel and the x86 never fail. The x86 wasn't as good as the 68K and a bunch of other CPUs in the 80s... but by the mid-1990s, it was the only desktop CISC processor left standing. Then RISC was going to kill it... but curiously, the ability to sell 100 million CPUs per year rather than 100K allowed Intel (and AMD) to RISCify the x86 enough to kill off most of RISC, at least for desktops and servers. Then 64-bit was going to kill the x86; even Intel was trying to make that happen. And then it was low-power CPUs for laptops and servers: Transmeta's VLIW concept had only 1/4 the transistors of a low-end x86, so it was surely going to take over. Nope... Intel got serious about lower-power chips and killed that off (and stop calling me Shirley).
So if Intel wants a chunk of this market, they'll have it. Not all of it, simply because the numbers for a single tablet or smartphone are enough to justify an SOC designed specifically for that device, as Apple's shown over and over again. But they'll have their piece of it, and at some point pretty soon, it won't be the CPU as the primary source of battery depletion in a mobile device, but the display, the radios, etc. Wait -- that's already true in highly mobile devices. Intel's going to get low enough on power that, even if ARM's using half as much, it simply won't matter anymore.
The most interesting immediate effect is on Windows. Mobile Intel does nothing for Apple and the iOS world: they're on ARM, making their own ARM chips, and happy that way. Mobile Intel can be a choice for Android, and given that Intel just released their "magic" C++ compiler for Android NDK development, that's stepping up a notch. But still, no Intel advantage here, and for a while anyway, perhaps a disadvantage, as ARM's got the market for now.
But Windows is a different story. Windows RT runs on ARM and locks you into a Microsoft walled garden composed of Metro/WinRT apps and, well, whatever Win32 things Microsoft wants to release, as the only company allowed to do desktop stuff on ARM. Windows 8 runs all that Metro/WinRT stuff too, but also your existing software. The very first $299 Windows 8 tablet was just put out there, a pretty crappy 8" tablet from Acer... but still, real Windows 8 on the last-gen Atom. I'm not going to run Altium, Vegas, or Photoshop on that tablet, but I can run Chrome -- stock Windows RT doesn't allow any other web browsers (or, for that matter, the JIT compilers necessary to implement a modern web browser). There is no reason anyone without an agenda would choose a Windows RT device over Windows 8, given comparable price and battery life (oh yeah, that 8" tablet also runs around 8 hours on a charge -- an acceptable approximation of "all-day battery").
So I think the real game Bay Trail is changing is Microsoft's game... it's the final nail in Windows RT's coffin.
Yes, new generations will grow up using tablets as their first and main computing device, just as the last generation grew up using laptops instead of desktops. While I prefer an old-fashioned keyboard and mouse (I guess many on here grew up with those, like I did), I know younger people can be just as productive texting on a mobile keypad or using a touch keyboard, simply because they learnt that first rather than typing on a keyboard.
The low price of tablets is one of the reasons why Surface Pro wasn't successful and why Bay Trail won't do any better. Current Atom chips cost $40+, so Bay Trail will likely cost more, while ARM SoCs go as low as $7-10. That's hard to compete with...
Also note that most of the technical details of Bay Trail have been publicly known for a while and won't be covered by your NDA. AnandTech and other tech sites have written very detailed articles about the microarchitecture, performance, etc. Lots of other details were leaked about the parts, TDP, and benchmarks. That's why I am very sceptical about claims of a game changer - there may well be some new bits of info next week (e.g. actual devices unveiled), but based on what we know already, it isn't going to be revolutionary.
Here's another POV on tablets for you to consider: the <$50 7-in tablet being sold into the educational market. We - the world - will be breeding several generations of kids who grow up knowing nothing but a tablet. They won't be able to read or write cursive, and won't have memorized the multiplication tables. Old-timers will protest that they are not getting a good education, to which I say: how's your Latin, and what was the last Greek book you read?
Take a look at this: http://jonpeddie.com/back-pages/comments/may-the-tablet-be-with-youreally/
The comments are my impressions of what I saw, no technical details. I showed Intel what I was going to say before posting it, something I usually do as a fact check to make sure my scribbled notes match reality and not just what I had for lunch. That's the good news/bad news: the good news is I think it's a game-changing hunk of Si; the bad news, I can't tell you why. But IDF is almost here; my post is a heads-up and a tweak for you and the others to think about the future of tablets.
Tablets are based on ARM CPUs. So what? "I agree, the Android tablets are impossible and don't have the app base for productivity (sure, you can get by and compromise by unlearning everything you did learn for the past ten years)" And a whole new form factor won't require you to unlearn a bunch of things? Exactly how similar do you expect an app designed assuming a large monitor, full keyboard, and mouse will be when translated to a tablet with a much smaller screen, an on-screen virtual keyboard, and a touch interface? I submit it will look and work very differently, even if it is designed to do the same things, and there will be a substantial learning curve for users trying to use it instead of a desktop or laptop.
CPU architecture is largely irrelevant. If there's a market demand for the sort of apps you are thinking of on tablets, they'll get ported to ARM and Linux. There's nothing magical about x86, and current ARM processor designs have more than enough power.
I am absolutely in agreement with you. I would even go as far as suggesting that those who claim the tablet is taking over everything are trying to make things go that way, rather than describing what is likely to happen naturally. I have a laptop, a tablet, and a full-sized PC. I use the laptop once every 2 months, the tablet once a week, and the PC 10-16 hours daily. There's just no comparison between the tablet and the PC, but the tablet does do a reasonable job of replacing most of the functionality of the laptop.