Image sensors are used in a wide range of applications, from cell phones and video surveillance products to automobiles and missile systems. Almost all of these applications require white-balance correction (also referred to as color correction) in order to produce images with colors that appear correct to the human eye regardless of the type of illumination—daylight, incandescent, fluorescent and so on.
Implementing automatic white-balance correction in a programmable logic device such as a Xilinx FPGA or Zynq-7000 All Programmable SoC is likely to be a new challenge for many developers who have previously used ASIC or ASSP devices. Let’s look at how software running on an embedded processor, such as the ARM Cortex-A9 processing system on the Zynq-7000 All Programmable SoC, can control custom image- and video-processing logic to perform real-time, pixel-level white-balance correction.
To set the stage for how this is done, it’s helpful to first examine some basic concepts of color perception and camera calibration.