Projected capacitive touchscreens continue to see ever-higher market demand and have become, in just a few years' time, the de facto user interface for smartphones. Over the next couple of years, we will see this growth trend continue, with the technology further trickling down to lower-end feature phones. The industry has developed a well-established touch supply chain for this technology, including discrete sensor vendors patterning indium tin oxide (ITO) films or other transparent conductor materials, touch controller IC suppliers, cover lens glass vendors, and display module vendors (LCD or OLED based). The various components of a touch module assembly are then integrated by a combination of contract manufacturers (CMs) for smartphone OEMs, sensor or cover lens vendor subcontractors, or display vendors. Touch module integration involves attaching the flex cable with the touch controller IC to the sensor, laminating the sensor assembly to the display module, and attaching the cover lens.
Display vendors, anxious to capture more value, have allocated significant R&D budgets over the past few years to integrate the touch function inside their LCD modules. Display vendors already use ITO deposition for the interconnection of pixel elements across the display. Each pixel requires a thin-film transistor (TFT), with a source, gate, and drain that must be driven with the appropriate signals to hold pixel data values. With display-integrated touch, the discrete sensor vendor is effectively bypassed, simplifying the supply chain.
Early efforts at display-integrated touch focused on optical "sensor-in-pixel" technology, which expanded the circuitry inside the TFT LCD cell to create an optical touch-sensitive cell. These efforts have been largely abandoned and replaced by projected capacitive touch in recent years. Projected capacitance is the same mainstream technology used in traditional discrete, or "on-stack", sensors. The "cell" is now considered in a broader sense as the area between the top color filter glass and the bottom TFT array glass, the two substrates that enclose the liquid crystal material. As long as the touch function is contained between both substrates, the industry considers the display an "in-cell" design. This means the touch layers can still be discrete layers, either separate from or shared with the layers driving the display, as long as they are contained within the LCD module. The industry has effectively turned to a more practical approach, with the first phones implementing this type of in-cell touch in production this year.
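Whether the electrodes live on a discrete film or inside the LCD cell, projected capacitive sensing works the same way at the controller: drive electrodes are pulsed one at a time while sense electrodes measure the mutual capacitance at each grid intersection, and a finger locally reduces that capacitance. The sketch below is purely illustrative (not any vendor's firmware); the measurement counts, the `find_touch` helper, and the threshold value are all hypothetical.

```python
# Illustrative sketch of projected capacitive touch detection on a
# drive/sense electrode grid. A touch controller compares each node's
# raw measurement against a stored no-touch baseline; a finger lowers
# the mutual capacitance, so the measured count drops. All values and
# names here are hypothetical, for explanation only.

def find_touch(baseline, raw, threshold=30):
    """Return (drive_row, sense_col) of the strongest touch, or None.

    baseline, raw: 2D lists of measurement counts, indexed
    [drive_row][sense_col]; threshold is the minimum count drop
    treated as a real touch rather than noise.
    """
    best = None
    best_delta = threshold
    for r, (base_row, raw_row) in enumerate(zip(baseline, raw)):
        for c, (b, v) in enumerate(zip(base_row, raw_row)):
            delta = b - v  # a touch lowers the measured count
            if delta >= best_delta:
                best_delta = delta
                best = (r, c)
    return best

# Simulated 3-drive x 4-sense panel with a finger at node (1, 2).
baseline = [[200] * 4 for _ in range(3)]
raw = [row[:] for row in baseline]
raw[1][2] -= 55
print(find_touch(baseline, raw))  # -> (1, 2)
```

A real controller adds noise filtering, multi-touch peak separation, and sub-node interpolation, but the baseline-subtraction scan above is the core idea shared by on-stack, on-cell, and in-cell implementations.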
Figure 1 shows in more detail the comparison between the traditional "on-stack" sensor and the display-integrated variants resulting in "on-cell" and "in-cell" touch panels. The on-stack sensor is shown in a version with the touchscreen cover lens used as the sensor substrate ("sensor-on-lens").
Figure 1: On-Stack, On-Cell, and In-Cell Panel Diagrams
Limiting the touch area to the display size is acceptable for some user interfaces. However, many user interfaces require an active touch area beyond the display, typically for fixed-function buttons or an additional gesture area surrounding the display. Making the touch sensor elements part of the display therefore restricts the type of user interface a product can offer.