Viewing distance and TV size: OK, mainly in the US it is easier to have these fancy 60-inch TVs with the highest resolutions. If the TV room is as big as I see in US house plans - fine! But here in Europe apartments are smaller, and so is the market for big screens. For me as a house owner, 42 inches is already my limit (windows, furniture, aesthetics, and seating distance do not permit more). And frankly, the content delivered via a SatTV dish hardly justifies the markup of a 4K TV set, which shows only slight visible improvements in picture quality over HD TVs - and even that added quality is only visible with fine-tuned demo material.
A top studio exec told me recently that the studios oppose UHDTV and are lobbying for a more nuanced next-gen format.
I can't wait to see a 4K screen since I want theater quality video.
If the problem is just selling 3D, though, why not pair a 4K TV that everyone in the room can watch with optional 3D glasses (headsets with 3D circuitry and small, eyeball-sized screens built in)? If you like the show, you can sit on the couch, slip on the glasses, and watch in 3D.
I know it doesn't do much for the TV makers' sales, but it would be nice for the headset makers and the consumers.
They could even put small cameras in the 3D glasses (headsets) which project a 'shadow image' of things moving in the room in front of and around the TV set.
I know that the data requirements are already big for 4K, and 4K plus 3D makes them even bigger, but I think that existing Blu-ray can store the data.
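Out of curiosity, here is a back-of-envelope check of that storage claim. All the figures are assumptions (a 50 GB dual-layer disc, a 50 Mbit/s 4K stream, and a 1.5x overhead for the second-eye view), not measured numbers:

```python
# Rough feasibility check: can a dual-layer Blu-ray hold a 2-hour 4K+3D film?
# All constants below are assumed ballpark figures, not spec values.
BLU_RAY_GB = 50          # dual-layer BD capacity, decimal gigabytes
BITRATE_4K_MBPS = 50     # assumed 4K video bitrate (Mbit/s)
STEREO_FACTOR = 1.5      # assumed overhead for the second-eye stream
RUNTIME_S = 2 * 3600     # two-hour feature, in seconds

total_bits = BITRATE_4K_MBPS * 1e6 * STEREO_FACTOR * RUNTIME_S
total_gb = total_bits / 8 / 1e9
print(f"~{total_gb:.0f} GB needed vs {BLU_RAY_GB} GB available")
```

Under these assumptions a standard 50 GB disc actually falls a bit short, so the claim depends on either lower bitrates or higher-capacity discs (the later Ultra HD Blu-ray spec went to 66 GB and 100 GB).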
From a consumer view point you buy the receiver and the 4K monitor...then buy the 3D headsets if you want to go that way. (Or I suppose that you can just buy the 3D headsets without the 60inch monitor...)
Just wondering if this is technically and financially feasible...seems OK to me at first sniff.
A few years ago, 3D was being pushed on us by the industry, or the trade press, or both, and I never could figure out who had asked for it. Not only that, but there were absolutely no elegant solutions for transmitting 3D. So I was not surprised to see it fail, although perhaps for different reasons than those cited in the article. And it seemed a real leap to think that people would want a steady diet of wearing glasses just because they might occasionally enjoy a 3D movie.
Analog TV was pretty dismal, so when an elegant solution finally emerged, i.e. digital RF channels and MPEG-2 compression, it seemed like a no-brainer to me. And as we all know, the refrain about high prices in consumer electronics has a way of evaporating.
UHD also has elegant solutions for transmission and storage. Digital transmission channels already exist, and new compression algorithms such as H.265, and other potentially even better ones already available today, will allow UHD content to be transmitted or stored in no more space (or channel width) than HD. Good deal! And the screen looks gorgeously smooth, even compared with HD.
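The "no more space than HD" claim is worth a quick sanity check. Using two commonly quoted (assumed, not measured) figures - UHD has 4x the pixels of 1080p, and HEVC is often said to need roughly half the bitrate of H.264 at equal quality:

```python
# Rough sanity check of UHD/HEVC bandwidth vs HD/H.264, with assumed figures.
HD_PIXELS = 1920 * 1080
UHD_PIXELS = 3840 * 2160
HEVC_EFFICIENCY = 0.5   # assumed HEVC bitrate ratio vs H.264 at equal quality

relative_bitrate = (UHD_PIXELS / HD_PIXELS) * HEVC_EFFICIENCY
print(f"UHD/HEVC needs ~{relative_bitrate:.0f}x the bandwidth of HD/H.264")
```

Under these assumptions UHD in H.265 lands near 2x an H.264 HD stream rather than parity, though higher resolutions tend to compress better per pixel, which can narrow the gap.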
So yeah, HD and UHD made/make me look forward to fun new toys. The 3D hype instead made me grumpy.
The big fly in the ointment for 4K is viewing distance. People are in the habit of watching their TVs from a great distance, many screen diagonals away, from where high screen resolution simply cannot be perceived.
On the other hand, where seeing detail is necessary, people routinely watch their 1080 16x9 PC monitors from a distance of less than one screen diagonal. Following this logic, one should view a 4K screen from half the screen diagonal (30 inches for a 60-inch screen) to fully appreciate the detail. Logical as that may be, the public is simply not going to accept that close a viewing distance for their TVs. I can almost hear my mother saying, "Get back from the TV or you'll go blind."
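The acuity side of this argument can be sketched numerically. Assuming 20/20 vision resolves about 1 arcminute and a 16:9 panel (both standard rules of thumb, not anything from the article), the farthest distance at which individual pixels are still distinguishable works out to:

```python
import math

# Farthest viewing distance at which one pixel still subtends ~1 arcminute
# (assumed 20/20 acuity limit), for a 16:9 panel of a given diagonal.
ARCMIN = math.radians(1 / 60)

def max_useful_distance_in(diagonal_in, horizontal_px):
    """Distance (inches) beyond which the screen's full detail is wasted."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width
    pixel_in = width_in / horizontal_px               # one pixel's width
    return pixel_in / math.tan(ARCMIN)

for px, label in [(1920, "1080p"), (3840, "4K")]:
    d = max_useful_distance_in(60, px)
    print(f'60" {label}: sit within ~{d / 12:.1f} ft to resolve full detail')
```

By this estimate a 60-inch 4K set demands roughly half the viewing distance of a 1080p set of the same size, which is exactly why habitual couch distances undercut the format.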
In movie theaters, everyone crowds into the back rows as if they were in church. To counteract this viewer ignorance, France has a law mandating that the last row of a movie theater be closer than 3 screen diagonals, forcing the French to better appreciate movie resolution. IMAX theaters also force a close relative viewing distance. Unfortunately, there's no chance of changing the public's habitual TV viewing distances.
4K's original design specification in EE Times actually called for viewing close enough that the viewer's view of the screen subtends an angle of 120 degrees! That's how close you have to be to fully appreciate the 4K resolution, and get immersed in the scene as intended. I'm willing to bet that at this intended viewing distance most 4K TVs will appear quite dim at the edges.
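For scale, the 120-degree figure can be turned into an actual distance with simple trigonometry. Assuming the angle refers to the horizontal field of view of a 16:9 panel (my reading, not stated in the spec quote):

```python
import math

# Distance at which a 16:9 screen's width subtends a given horizontal FOV.
# Assumes the quoted 120 degrees is the horizontal viewing angle.
def distance_for_fov(diagonal_in, fov_deg):
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width
    return (width_in / 2) / math.tan(math.radians(fov_deg / 2))

print(f'{distance_for_fov(60, 120):.0f}" from a 60-inch screen')
```

That puts the viewer around 15 inches from a 60-inch screen, which makes the point about edge dimness (and implausibility) rather vividly.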
While 4K certainly won't have the inconvenience of 3D glasses, its resolution will be as unused as the 300 hp or more of the engines our car manufacturers brag about, or the 1,200 watts of our speaker amplifiers.
Are these companies going crazy? I feel ashamed of the technology companies pushing 4K TVs, 32-megapixel consumer cameras, and multi-core 2+ GHz mobile phones. The numbers sound bigger, but they bring little additional benefit. All they want is to take money out of consumers' wallets. Does anyone know how much electronic garbage humans create these days?
Junko, I agree with you in regards to the win if the difference is noticeable. I smiled at the idea that kids were a part of the equation, as they are not the consumers doing the purchasing. If the parents can see the difference then we have a winner, otherwise it will take some time before we know what the true market share will be (or not).
Honestly speaking, I think the most likely reason 4K looks better than 2K is that 4K sets use other advanced technologies. You can easily see quality differences between 2K TV sets at different price ranges, too. I highly doubt that 4K itself makes much difference at all unless you are close enough to the panel.
There is a fairly short shelf-life to some of these. My first CRT-projection HDTV, a 65" monster, was not long for this world; I dumped it after 6-7 years. Earlier this month, my DLP went out... DLPs are not long-lasting, due to their electromechanical nature. Early plasmas fade substantially in a few years. LCDs from 5-19 years ago will keep working, but they look as bad as they did back then, particularly compared to today's LCD/LED hybrids.
So there's replacement, and the rate is certainly higher than it was in the old 27" tube days, based on both replacement and technology advancement.
The 3D thing was greed... the industry had seen some good years: early adopters' HDTVs, the mainstream moving to HD, the early adopters buying their replacements, then the secondary televisions being replaced with cheap LCDs, etc. But once you've got a fairly modern LCD/CFL, or particularly LCD/LED, the TV is going to last 25 years. Again. They had to think of something else to drive upgrades en masse. 3D wasn't it.
I'm not sure 4K is, either. It's much like the upgrade from CD to... what, exactly? DTS-CD, SACD, DVD-Audio, Blu-ray Audio... none of these caught on big, mostly because CD was plenty good enough for most listeners. I think HDTV may be plenty good enough for most viewers... even DVD-quality SD took away most of the complaints about analog NTSC/PAL.
And the real reason HDTV took off: football. That needs OTA HD - ATSC+ or whatever, H.265. Another ten-plus years of tech development, because you see the networks buying in.