Matching Human Visual Inspection for Automated Display Inspection

Author:
Anne Corning

Human perception is the ultimate standard for determining the visual quality of a display. For this reason, human inspectors have traditionally been used for quality control inspection of products like display devices. However, using human inspection can be problematic because of the statistical variation between observers. Advances in photometric technology and image processing in the last decade have provided an inspection alternative that can match the level of visual acuity of the human eye to detect display defects.

Limitations of Human Display Inspection

Human vision is subjective, unquantifiable, and difficult to replicate. In an inspection setting, this variability may increase the risk of accepting defective devices or rejecting good ones, both outcomes that add cost to the manufacturing process. Human inspectors on the production line cannot capture detailed quantitative information about all defect types and occurrences with sufficient repeatability, especially because human observers tend to classify only the most obvious defects.

These inspection shortfalls have implications when evaluating devices that have a predominantly visual impact—for instance, display devices. Displays have become the pivotal user interface in consumer devices, from smartphones and computers to televisions and automotive interiors, so quality control measures that take human visual experience into account are particularly important in display manufacturing.

Automating Display Inspection

Automated visual inspection (AVI) methods have emerged in recent decades that are able to closely match the visual perception of the human eye, while providing the added benefits of speed and consistency. Sometimes referred to as automated optical inspection (AOI), this approach relies on a combination of camera technology, lighting, software algorithms, and data processing. By the 1980s, AOI was commonly used for inspection of printed circuit boards (PCBs). The approach is now being applied to many other electronic components with complex inspection needs such as displays.

Visual display testing is automated using photometric and colorimetric imaging systems that are capable of objectively quantifying visual qualities like brightness, color, and contrast of displays. These systems can detect defects like stuck-on or stuck-off pixels, lines, and mura (a term used for non-uniform areas or blobs in a display). Imaging systems with light and color measurement capabilities (imaging colorimeters) can measure the spatial tolerances (size, position, location) of a display’s visual qualities, enabling precise identification of mura and defects.
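As a rough illustration (not a description of any vendor's implementation), a measured luminance image that has been registered to the display's pixel grid could be screened for stuck-on and stuck-off pixels with a simple deviation test. The array names and tolerance below are assumptions made for this sketch only.

    import numpy as np

    def find_stuck_pixels(measured, expected, tol=0.5):
        """Flag pixels whose measured luminance deviates strongly from the
        commanded (expected) luminance. 'tol' is an illustrative relative
        tolerance, not a standard threshold."""
        deviation = (measured - expected) / np.maximum(expected, 1e-6)
        stuck_on = deviation > tol     # far brighter than commanded
        stuck_off = deviation < -tol   # far darker than commanded
        return stuck_on, stuck_off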

A key element of automated visual inspection is the imaging system. Typically, one or more imaging systems are used to capture 2D or 3D images of a device under test (DUT). The requirements of a specific inspection task determine the choice of imaging system (e.g., camera, photometer, colorimeter, high-resolution photometric imager), the lighting, and the inspection setup. For inspecting today's pixel-dense consumer display devices, which span technologies from LCD and OLED to newer microLED, high-resolution imaging systems are necessary to capture pixel-level detail that is discernable to the human eye.
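To give a sense of that resolution requirement, the sizing question reduces to how many sensor pixels are available for each display pixel. The factor of three used below is an illustrative assumption, not a vendor specification.

    def required_sensor_resolution(display_width, display_height,
                                   sensor_pixels_per_display_pixel=3):
        """Estimate the measurement-image resolution needed to resolve
        individual display pixels (illustrative sizing only)."""
        f = sensor_pixels_per_display_pixel
        return display_width * f, display_height * f

    # For example, a 1920 x 1080 panel imaged at 3 sensor pixels per display
    # pixel calls for roughly a 5760 x 3240 pixel measurement image.
    print(required_sensor_resolution(1920, 1080))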

Modeling Human Visual Sensitivity: JND

Quantifying human visual perception—the ultimate display quality standard—was a challenge until the National Aeronautics and Space Administration (NASA) developed a method to measure Just Noticeable Difference (JND) based on a Spatial Standard Observer (SSO).1 This method was developed with numerous potential applications in mind, “most notably…the inspection of displays during the manufacturing process.”2

The JND model added a spatial element to metrology, creating a tool for measuring the visibility of an element—or the “visual discriminability” of two elements—within a given area. Based on a sampling of human observers, the JND scale is defined such that a JND difference of 1 would be statistically “just noticeable” by an observer. On an absolute scale, a JND value of 0 represents no visible spatial contrast and an absolute JND value of 1 represents the first noticeable spatial contrast. By creating a JND map of the image of an illuminated display, random and variable mura defects can be graded according to their severity with a direct correlation to human visual perception.
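As a hedged sketch of how such a map could be used, the severity of a mura region might be graded by its peak JND value. The grade boundaries below are illustrative only and are not part of the SSO definition.

    import numpy as np

    def grade_mura_severity(jnd_map):
        """Grade a defect region from a 2D array of JND values.
        The grade boundaries are illustrative examples only."""
        peak = float(np.max(jnd_map))
        if peak < 1.0:
            return "below the just-noticeable threshold"
        elif peak < 2.0:
            return "marginally noticeable"
        else:
            return "clearly noticeable"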

Using the JND method, display manufacturers can accurately determine whether mura or defects on a DUT will be discernable to human users, and therefore decide whether the device should be pulled from production for rework or (if defects are severe) discarded.

Raw JND analysis of a display measurement image captured by a ProMetric® Imaging Colorimeter and analyzed in TrueMURA™ Software (part of Radiant's TrueTest™ family of automated visual inspection software). The image clearly shows the mura at the center of the screen. Some artifacts (light leakage and dark spots) at the edge of the display are also visible. This analysis highlights mura that were barely visible in the original image.

From this false-color representation of the JND map (in TrueMURA software), it is clear that the two spots in the center of the display and some areas along the bottom of the display have JND values greater than 1 (the threshold value for being “just noticeable”). The speckled area across most of the display—apart from the mura at the center and along the edge—represents JND values of about 0.7 or lower and could be accepted or rejected based on user-defined pass/fail tolerance settings in the software.
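The pass/fail step itself can be pictured as a simple tolerance check on the JND map. The function and threshold below are an illustrative sketch, not TrueMURA's actual API, which instead exposes user-defined tolerance settings.

    import numpy as np

    def passes_jnd_tolerance(jnd_map, max_allowed_jnd=1.0):
        """Return True if no region of the JND map exceeds the user-defined
        tolerance (1.0 mirrors the 'just noticeable' level)."""
        return bool(np.max(jnd_map) <= max_allowed_jnd)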


To learn more about the application of JND algorithms for fully automated display testing, read our article published in a recent issue of Quality Magazine: “Using ‘Just Noticeable Difference’ to Automate Visual Inspection of Displays According to Human Visual Perception”.


CITATIONS

  1. Spatial Standard Observer, United States Patent 7,783,130 B2, March 20, 2012
  2. Spatial Standard Observer (SSO), Technology Solution, NASA 2015