I was pondering a poser just the other day, when I conceived a way to (a) sell lots and lots of FPGAs and (b) make the world a better place for a lot of people.
Over the last few years I've become very interested in biological and robotic vision systems. For example, at some stage along the evolutionary path – let's say 800 million years ago on a Wednesday afternoon following a small lunch – some multi-cell organisms managed to develop photoreceptors that gave them the ability to detect and respond to some form of light.
It may be that these first photoreceptors were cone cells that were primarily sensitive to light in the ultraviolet (UV) portion of the spectrum. (Cone cells are so-named because of their shape under a microscope.) Alternatively, the first photoreceptors may have had their peak sensitivity way up in the red region at the opposite end of the spectrum; both of these scenarios are consistent with existing data. Be this as it may, during the course of the next several hundred million years, the creatures that were to evolve into vertebrates, dinosaurs, mammals, primates and – ultimately – humans developed four different types of color photoreceptors (cone pigments). Thus, these creatures would be known as tetrachromats.
Sad to relate, however, sometime between 310 and 125 million years ago our ancestors lost first one and then two of these pigments. We don't know exactly when or why, although one possibility is that these creatures became nocturnal. This explains why most of today's mammals are dichromats with only two types of color photoreceptors.
Stick with me – we're almost home. Sometime between 45 and 30 million years ago, the primates that were to evolve into humans "split" one of their color photoreceptors into two different types. Thus, typical humans have three different types of color photoreceptors and are known as trichromats. This gives us a much richer visual experience than our dichromat cousins, such as dogs and mice, who tend to perceive images only in terms of blues and yellows.
A typical human trichromat with three types of color photoreceptors enjoys a rich visual experience.
A mammalian dichromat – such as a dog or mouse – with only two types of color photoreceptors has a more limited visual experience.
So, why am I waffling on about all of this? Well, depending on the source of one's data, it may be that anywhere between 1-in-12 and 1-in-20 people are color blind in one form or another. For example, approximately 1 to 2 percent of all men on the planet are missing one of their three color cones completely. Another 7 percent have all three cones, but one or more are abnormal in some way, such that they aren't able to discern the same amount of color information as the rest of us. For example, a person with a form of red/green deficit known as protanopia might perceive images only as hues of blue and yellow, in much the same way as a dichromat.
If you visit the Vischeck website, you can run their Vischeck software on your own images to see how they appear to people with different types of color deficiencies. Of particular interest to us here is the fact that the folks at Vischeck also have a program called Daltonize. [This is named after the English chemist John Dalton (1766-1844), who was one of the first to describe color blindness.] This program is really clever. If you tell it what form of color deficiency you have and provide it with an image, it will analyze the image and replace the colors you can't see with other colors you can. Furthermore, it does this in such a way as to accentuate details that would otherwise be hidden from you.
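To give a flavor of how a Daltonize-style algorithm can work, here's a minimal sketch in Python. It follows the commonly published approach – convert RGB into LMS cone space, simulate the response of a viewer missing the L (red) cone, then shift the "lost" information into the channels the viewer can see. The matrices and the 0.7 error-spread factor below are typical values from the daltonization literature, not Vischeck's actual implementation.

```python
import numpy as np

# RGB -> LMS cone-response matrix (commonly published values)
RGB2LMS = np.array([[17.8824,   43.5161,  4.11935],
                    [ 3.45565,  27.1554,  3.86714],
                    [ 0.0299566, 0.184309, 1.46709]])
LMS2RGB = np.linalg.inv(RGB2LMS)

# Protanopia simulation: the missing L-cone response is reconstructed
# from M and S, which collapses red/green distinctions
SIM_PROTAN = np.array([[0.0, 2.02344, -2.52581],
                       [0.0, 1.0,      0.0    ],
                       [0.0, 0.0,      1.0    ]])

# Redistribute the error (detail the protanope can't see) into G and B
ERR_SHIFT = np.array([[0.0, 0.0, 0.0],
                      [0.7, 1.0, 0.0],
                      [0.7, 0.0, 1.0]])

def daltonize_protan(rgb):
    """rgb: array of shape (..., 3), channel values in [0, 1]."""
    rgb = np.asarray(rgb, dtype=float)
    lms = rgb @ RGB2LMS.T
    sim = (lms @ SIM_PROTAN.T) @ LMS2RGB.T   # image as a protanope sees it
    err = rgb - sim                          # the information that is lost
    return np.clip(rgb + err @ ERR_SHIFT.T, 0.0, 1.0)
```

With these particular matrices, pure blue lies close to a color the protanope already sees correctly, so it passes through almost unchanged, while pure red picks up green and blue components that make previously hidden detail visible.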
Another very clever tool is Visolve from the folks at Ryobi System Solutions. This is special software that takes colors on a computer display that cannot be discriminated by people with various forms of color blindness and transforms them into colors that can be discriminated. In addition to a variety of transformations and filters, you can also instruct the software to apply different hatching patterns to different colors. This really is very clever technology and you should take a moment to check it out.
So . . . my idea is that it would be really useful if someone looking at a television or computer monitor could inform the display of any color vision problems they have, and for the display to use Daltonize- or Visolve-type algorithms to correct for this on the fly. In the case of television, of course, you would probably leave it on the "normal" setting if multiple viewers (perhaps a family) were enjoying a show. But consider the case of an individual viewer with a color disability – say "Dad" watching a football game on his own while the other members of the family were "out-and-about". Such a viewer might really appreciate the enhanced amount of detail that could be provided by this form of image processing.
Meanwhile, in the case of computer displays used by individuals, I think this capability would be a "no-brainer". If one display manufacturer had this technology, a lot of users would be interested, so this manufacturer would sell more units, which would drive the other manufacturers to follow suit. In fact, this could be advantageous for any form of graphics display, such as the GPS systems in cars and trucks.
And what would be a good way to implement these algorithms? Well, you need massive parallelism and humongous amounts of processing power, which screams "FPGAs" to me. Of course, this may never come to pass; on the other hand, it may be that the "eggheads" working in the top-secret underground bunkers at the FPGA companies are working feverishly on this as we speak. Just remember, if you start seeing this sort of capability appear in the not-so-distant future, you saw it here first!
Last but not least, if you are interested in further reading, I have an ever-evolving paper on color vision that includes links to sites that allow you to test yourself for color blindness or to experience how color blind folks perceive things that most of us take for granted; check out the "Color Vision" topic on the "More Cool Stuff" page of my www.DIYCalculator.com website.
Questions? Comments? Feel free to email me – Clive "Max" Maxfield – at firstname.lastname@example.org. And, of course, if you haven't already done so, don't forget to sign up for our weekly Programmable Logic DesignLine Newsletter.