Many years ago, while in university, I had the privilege of being a research assistant with NASA. The object of the research was to determine whether TV broadcast satellites in geosynchronous orbit could be spaced at 1.5-degree intervals to improve utilization of the resource. We recorded a set of standard MPAA still images mixed with an action movie (synchronized to place the vertical and horizontal retrace intervals within the video trace area of the desired image) as the interfering source, at various attenuations with respect to the desired content.
These images were then presented on TV monitors to a group of subjects (from the psychology department, of course), who were asked to rate the acceptability of the pictures. The results were quite interesting: very nearly all subjects found images acceptable when the interfering source was attenuated 12 dB with respect to the presented image. This validated spacing the satellites at 1.5 degrees, since a twelve-foot dish was able to attenuate RF from a satellite 1.5 degrees away by at least 19 dB.
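For readers who want to see the margin in those figures, here is a minimal sketch of the standard decibel-to-power-ratio conversion (the 12 dB and 19 dB values are from the study described above; the helper function name is my own):

```python
def db_to_power_ratio(db):
    """Convert a decibel value to a linear power ratio (P_desired / P_interferer)."""
    return 10 ** (db / 10)

# Acceptability threshold from the study: interference 12 dB below the signal,
# i.e. the desired signal is roughly 16x stronger than the interferer.
threshold = db_to_power_ratio(12)

# Antenna discrimination: a twelve-foot dish attenuates a satellite
# 1.5 degrees away by at least 19 dB, roughly an 80x power ratio.
dish = db_to_power_ratio(19)

# The dish exceeds the viewer-acceptability threshold by 7 dB.
margin_db = 19 - 12
print(round(threshold, 1), round(dish, 1), margin_db)
```

In decibel terms the margin is simply the difference, 19 − 12 = 7 dB, which is why the 1.5-degree spacing was considered validated with room to spare.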
What became apparent to me was that most people were interested in the content being presented on the screen, and once someone is involved with the presentation, the actual quality of the image becomes much less important. NTSC video was almost always subject to interference, often from multipath reflections, but also from atmospheric sources and the degradation of signal with distance from the broadcast antenna. Unless the interference was very egregious, most people tended to become "blinded" to it as they engaged with the presented content.
Most people would still be watching CRT-based television if it were available in the format engendered by the switch to the ADTV standard. The standard was developed in part to try to restore margins to the television manufacturers, but it wasn't successful, as can be seen from the troubles experienced by the likes of Sony and Sharp and the transformation of RCA into a Chinese-held company.
The only driving forces for new technology in television displays are making it less expensive than the current technology du jour or legislating it into existence. We would still be watching NTSC video on CRTs if the government hadn't stepped in and mandated the transition. Most consumers were quite happy with what they had.
I'll leave you with a couple of questions. Do you watch television because of the picture quality, or do you watch it for the content being presented? If everything the director includes in a production can still engage you when the image is gracefully degraded by interference, does the interference detract from the experience?
This is redundant, but I'm one of those who hold the view that the demise of plasmas is at least five years overdue. As aptly noted, sunk-cost decisions are the hardest, but they can also be the least rational. So even though lots of very cool technology and innovative design went into plasmas, in the long term plasmas simply got outcompeted in the market by very different approaches. The time will come for LCDs also, of course. As also noted here, technologies with OLED-like properties are more likely to carry us into the Minority Report world of active-wall malls and talking cereal boxes. It's worth watching that movie again just to contemplate how magical swishing images seemed before they became an everyday feature of the phones we carry. Remarkable times we live in, indeed.
But those things (technology changes/shifts) never happen in a linear fashion. What's fascinating to me, and I'm sure to a lot of our readers -- both in engineering and management -- is how best to sense, capture and capitalize on the moment when the tide starts to change.
I can only compare this plasma holdout mentality to the same phenomenon with respect to tube electronics and vinyl records. The proponents will insist that the quality of their favorite technology is leagues ahead of the newer competition.
I think you hit the nail on the head, Junko, when you said that some things are simply more important than others to different consumers. To me, for instance, it has to be a balance. A system that is inelegant, touchy, or wasteful is unacceptable, even if the proponents claim that its sound or image quality has no equal.
Oh, and that blade sharpener wasn't dangerous at all. It was all enclosed. A very clever gadget. I think it was made in Switzerland. You moved a handle back and forth, making a clackety-clack sound as it sharpened the blade. One of those lasting memories from very young days.
Even though it was not mentioned in the article, Panasonic has had quality control problems, with their plasma displays going bad just beyond the warranty period. At least that was the experience I had with my own Panasonic plasma display.
Actually, this conversation is getting interesting. (I need to confess that I too am one of those biased people, firmly believing that plasma has been dead for the last few years....)
But in my own defense, I used to cover really esoteric stuff like the merits of the AC plasma display panel -- many moons ago. (In those days, I thought plasma was really hot stuff, with a real future in large displays.)
At any rate, why do I find this conversation interesting? Because it presents a classic example of how and when one technology dies and loses the market to another. I don't think it's just marketing. I don't think it's purely an R&D matter, either.
There must be a lesson we can all learn from this.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.