I agree with Frank. Most of the time I spend watching TV is with my wife. While we both have iPads, they are used for quick Internet searches or for reading and games. I consider tablet keypads close to useless and do most of my computing on a laptop.
The tablet as a personal TV might already be happening. I still remember when Sony launched a personal portable TV years back. It didn't go very far. Yet the world has changed. The new generation is looking for a more personal experience. Today's kids might want to watch a cartoon show while dad is watching the news or a documentary. I find the personal TV appealing, and I am pretty sure there is a market for it. The question is whether the market is big enough. Personally, I think movies (action movies in particular) deliver better on a big screen. I definitely enjoy watching sports more on a TV than on a tablet.
For solo viewing, a tablet's resolution can beat a large TV or even a movie screen. The problem becomes how to accommodate social viewing. (I've not watched TV in 30 years of business travel; I only watch at home with my wife.) My hunch is that large-screen TVs will disappear when pico-projectors take over. They are small, cheap, and can fill an arbitrary area on a wall.
I don't believe the communal experience of watching a TV in a living room with friends will disappear. But that doesn't make watching crappy video on tablets acceptable. Here's the opportunity for the engineering community to bring the HD experience to mobile!
I agree. I didn't take your article's title to mean that tablets would displace large-screen TVs. More like, they would become another TV.
The thing is, they could and should already be. Why aren't they? We're not talking about a big technological hurdle here, even if their video quality can't initially be UHD. Streaming media protocols that adapt to the capabilities of the link and the appliance have existed for years now.
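The logic behind those adaptive protocols is simple enough to sketch in a few lines of Python (the bitrate ladder and throughput numbers here are hypothetical, just to show the idea, not any particular player's implementation):

    # Minimal sketch of adaptive bitrate selection (hypothetical values).
    # A real HLS/DASH player re-runs this continuously, per segment.

    VARIANTS_KBPS = [400, 1200, 2500, 5000]  # hypothetical bitrate ladder

    def pick_variant(measured_kbps, safety=0.8):
        """Pick the highest variant that fits within a safety margin
        of the measured link throughput."""
        budget = measured_kbps * safety
        usable = [v for v in VARIANTS_KBPS if v <= budget]
        return usable[-1] if usable else VARIANTS_KBPS[0]

    print(pick_variant(3500))  # -> 2500: link can sustain the 2.5 Mbit/s stream

A tablet on Wi-Fi just measures what the link delivers and drops to a lower variant when it has to, which is exactly why the appliance itself is not the obstacle.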
I've seen an odd phenomenon at work on this subject. Back in the 1990s and early 2000s, when digital TV was being developed and then deployed, people were saying absurd things about what digital TV was (or was going to be), somehow confusing DTV with the Internet.
Now that ISP networks have evolved to the point where they really can and do carry TV, everyone seems unable to figure out how it works, seemingly unable to accept that they can move past the old delivery media.
BTW, my hunch is that Steve Jobs, by refusing to support Flash on Apple's handheld toys, delayed TV on these devices by several years. Flash was the lingua franca of Internet TV, and to a large extent still is.
To me, the issue is bandwidth. If I want to watch basketball on my iPad, the quality varies depending on who is sending it. On a good day, the feed is fine at the iPad's resolution.
Apple has a good model with movies, where you download to rent rather than stream. Netflix's streaming model has a problem with bandwidth, depending on network congestion.
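Rough numbers make the difference clear (hypothetical figures): a download only needs the average throughput to hold up, while a stream needs it sustained for the whole show.

    # Why download-to-rent sidesteps congestion (hypothetical numbers).
    stream_mbps = 4            # bitrate of a hypothetical HD stream
    movie_hours = 2
    link_mbps = 5              # average link throughput

    movie_megabits = stream_mbps * movie_hours * 3600
    download_hours = movie_megabits / link_mbps / 3600
    print(f"Download finishes in {download_hours:.1f} h; playback needs "
          f"{stream_mbps} Mbit/s sustained for {movie_hours} h.")

The download finishes in about 1.6 hours and can ride out dips in throughput; the stream stutters the moment congestion pushes the link below the stream's bitrate.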
That problem is resolving itself as we speak. ISP core networks are getting faster, and ISPs are offering ever-faster broadband.
Just about all of my TV viewing has been streaming, mostly from abc.com, cbs.com, fox.com, nbc.com, or hulu.com. If there are bandwidth issues at all, it's usually during the ad breaks, when the player is trying to download the whole next segment and stream the ad at the same time. I also stream from Amazon on rare occasions, and from foreign TV networks, though that's mostly just for the newscasts.
Don't know about Netflix, because frankly, I can get more TV from those sites I mentioned than I can dedicate time to watch anyway.
My only point was: if I can do this on a PC, there's no reason in principle why a pad shouldn't be able to do it as well. This is not a technological leap we're talking about; it's here and now.
I can't think of many other things that are as big a waste of bandwidth and resources as watching television on a tablet device. Then again, I cannot imagine a bigger waste of bandwidth than sending television programs over the Internet at all. Just because a lot of people do it does not make it any less wasteful or any less stupid. If a program is already broadcast over the air, or over cable, why in the world should it also be wasting Internet bandwidth? Because it is more convenient? Or because somebody can profit from selling the service? There are, after all, still a few things that simply should not be done, not even for the money.
It's a paradigm shift. Instead of tying up spectrum with one-way broadcasts, you distribute the vast majority of TV material on demand. The bandwidth consumed is then mostly between servers located at the edges of the ISP networks and individual households.
Eventually, much of the spectrum now taken up by those one-way broadcast streams can be repurposed for two-way service.
Think, for example, of a typical cable system. The majority of frequency channels on that coax are still dedicated to carrying one-way TV programming, broadcast throughout the network. Those can be reorganized into more individual, two-way broadband channels (say, DOCSIS), at the cost, of course, of adding servers at the edges.
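Rough arithmetic shows the scale. Assuming North American 6 MHz channels at 256-QAM, each carrying roughly 38 Mbit/s of payload (actual plants vary), here's the back-of-the-envelope calculation in Python:

    # Capacity freed by repurposing broadcast channels to two-way service.
    # Assumes 6 MHz channels at 256-QAM (~38 Mbit/s each); plants vary.

    CHANNEL_MBPS = 38          # approximate payload of one 256-QAM channel
    reclaimed_channels = 40    # hypothetical: broadcast slots moved on demand

    downstream_mbps = reclaimed_channels * CHANNEL_MBPS
    print(f"~{downstream_mbps} Mbit/s of shared downstream capacity")  # ~1520

Reclaiming even a few dozen broadcast slots yields on the order of a gigabit and a half of shared downstream capacity per service group, which is exactly what on-demand delivery needs.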
The end result is a far more modern network, one that matches what people actually want rather than forcing people to conform to the restrictions of the network. Not just in terms of those old traditional daily time slots, but also in terms of having the freedom to select any TV content source, without being tied to the offerings (and prices) of the one cable or satellite system you happen to subscribe to.