Think About Optical Filters First as You Design Your Vision Application
Optical filters fix the lighting anomalies that complicate most vision applications. That’s why they are so fundamental to designing effective machine vision applications.
Unfortunately, many vision application designers learn the value of optical filters too late in the game. They get so caught up in the details of combining cameras, sensors, lenses and software that they don’t start thinking about optical filters until their application has been deployed.
That’s when they discover that ambient light or lack of visual contrast is ruining their application. Then they have to find the proper optical filter while reconfiguring the application.
Thus, it makes much more sense to weigh your optical filter options at the beginning of the application design process. Understanding a few key principles of optical filtering will help clarify their pivotal role in machine vision applications.
The Basics: Why Optical Filters are Essential
A machine vision system must accurately identify what it sees when an object moves into camera range. Some systems inspect objects like welds on a production line or product labels on conveyor belts. Other systems enable industrial robots to align components properly.
But all vision systems share a common trait: They must determine whether to “pass” or “fail” an object being photographed. Ambient light from fluorescent fixtures and manufacturing processes can trip up a machine vision system, creating over- or underexposed images.
Moreover, some objects have information imprinted upon them that is invisible to a camera in some conditions but perfectly visible with the right combination of lighting and filters.
Thus, the core role of optical filters is to debug your lighting environment to reduce the likelihood of incorrect pass/fail ratings.
How an Optical Filter Works
An optical filter is typically a thin disk of colored glass mounted on the external end of a machine vision camera’s lens. The filter ensures that only certain kinds of light reach the camera’s image sensor.
Digital cameras respond to light within the wavelengths of the human visual spectrum. They also respond to light in the ultraviolet (UV) and near-infrared (NIR) wavelengths, which are invisible to the human eye.
Optical filters manipulate the light rays that reach the camera’s image sensor, typically a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) chip. The sensor sends this data to the camera’s imaging processor, and machine vision software reads the resulting images and translates them into pass/fail ratings.
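The sensor-to-software hand-off above can be sketched in a few lines. This is a hypothetical, simplified model, assuming a grayscale frame of 0–255 pixel intensities and an exposure-window rule that real inspection software would replace with far more sophisticated checks:

```python
import numpy as np

# Hypothetical sketch: a sensor frame arrives as an array of pixel
# intensities (0-255). The software "passes" the frame only if its
# mean brightness falls inside a usable exposure window; the 40/215
# thresholds here are illustrative, not from any real system.
def pass_fail(frame, lo=40, hi=215):
    """Return 'pass' if the frame is neither under- nor overexposed."""
    mean = frame.mean()
    return "pass" if lo <= mean <= hi else "fail"

well_exposed = np.full((480, 640), 128)   # mid-gray frame
overexposed = np.full((480, 640), 250)    # glare-washed frame
print(pass_fail(well_exposed))  # pass
print(pass_fail(overexposed))   # fail
```

The point of the sketch: an unfiltered lighting problem shows up to the software only as bad pixel statistics, which is why the fix belongs in front of the lens rather than in code.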
Problems Optical Filters Fix
These are some of the most prevalent visual anomalies that optical filters help overcome:
- Shadows. Light rays cast shadows on three-dimensional objects like machine parts or electronic components. An optical filter blocks some of these light rays to overcome shadows.
- Visual contrast. Components often have serial numbers or other identifiers etched into their surface. If that surface is colored, uneven or otherwise difficult to photograph with available light, optical filters can create contrast that makes these numbers easier to interpret with machine vision software.
- Glare. Shiny surfaces and bright lights can overexpose large sections of an image and render it useless to machine vision software. An optical filter can reduce or remove glare in much the same way that a pair of sunglasses does.
- Optical distance. Visual applications often operate in tight spaces that place the camera lens extremely close to the object being photographed. Small distances can play havoc with digital imagery without the right optical filter.
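The glare case above has a clean physical model. Glare bouncing off flat surfaces is largely polarized, and a polarizing filter transmits it according to Malus’s law, I = I₀·cos²θ, where θ is the angle between the light’s polarization and the filter’s transmission axis. A minimal sketch:

```python
import math

# Malus's law: a polarizer transmits I = I0 * cos^2(theta) of
# polarized light. Rotating the filter so its axis is crossed
# (90 degrees) with the glare's polarization blocks nearly all of it,
# which is how a polarizing filter "wears sunglasses" for the camera.
def transmitted(i0, theta_deg):
    return i0 * math.cos(math.radians(theta_deg)) ** 2

print(transmitted(100.0, 0))    # aligned: all 100.0 units pass
print(transmitted(100.0, 90))   # crossed: effectively zero passes
```

This is why polarizers are usually mounted in rotatable rings: the installer turns the filter until the glare visibly drops out of the image.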
The testing phase of visual application design provides a prime opportunity to identify lighting problems and find the best filter to deal with them.
Common Varieties of Optical Filters
Optical filters come in a vast variety because every production ecosystem has unique lighting characteristics and photography challenges. These are some of the most common optical filters:
Polarizing filters. Visual glare typically results from light rays rebounding from flat, shiny surfaces. Polarizing filters clarify the field of view by preventing these rebounding light rays from reaching the camera’s sensor.
Band-pass filters. A band-pass filter transmits only a specific slice of the light spectrum, allowing wavelengths within that band to reach the camera sensor while blocking everything outside it.
Band-pass filtering often works like this: The filters are paired with colored lights (like LEDs) to highlight certain things in an image and block everything else. Consider a wiring harness with red, green, blue and yellow wires. Strategic use of LEDs and band-pass filters can target each of these colors, allowing a vision application to do inspections that ensure all the correct wires are right where they belong.
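The wiring-harness example can be modeled as a simple wavelength-window check. This is an illustrative sketch only: the LED peak wavelengths are typical textbook values and the pass band is invented, not taken from any vendor’s filter specification:

```python
# Hypothetical sketch: model a band-pass filter as a (low, high)
# wavelength window in nanometers and ask which LED illumination it
# lets through. LED peak wavelengths below are typical approximations.
LED_WAVELENGTHS_NM = {"blue": 470, "green": 525, "yellow": 590, "red": 630}

def passes(filter_band, wavelength_nm):
    lo, hi = filter_band
    return lo <= wavelength_nm <= hi

red_bandpass = (620, 640)  # illustrative band that admits only red light
visible_colors = [color for color, wl in LED_WAVELENGTHS_NM.items()
                  if passes(red_bandpass, wl)]
print(visible_colors)  # ['red']
```

Pairing one such filter with matching LED lighting makes the red wires bright and everything else dark, so the inspection software only has to find one color at a time.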
Neutral-density filtering. Some inspection applications require LEDs to fully illuminate an object. Alas, LED lights often overwhelm a camera’s image sensor, creating oversaturated pictures. Neutral-density filters reduce the intensity of incoming light evenly across all wavelengths, correcting the overexposure without shifting the image’s colors.
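Neutral-density filters are commonly rated by optical density (OD), where transmission is 10^(−OD): OD 0.3 passes roughly half the light and OD 1.0 passes 10%. A minimal sketch of that attenuation math:

```python
# Neutral-density filters cut all wavelengths equally. A filter's
# transmission fraction is 10**(-OD), where OD is its optical density
# (OD 0.3 ~ 50%, OD 1.0 = 10%, OD 2.0 = 1%).
def nd_transmission(optical_density):
    return 10 ** -optical_density

def attenuate(intensity, optical_density):
    return intensity * nd_transmission(optical_density)

print(round(attenuate(1000.0, 1.0), 1))  # OD 1.0 passes 10%: 100.0
```

Because the attenuation is wavelength-neutral, the sensor sees a dimmer but otherwise unchanged scene, which is exactly what an oversaturated LED-lit inspection needs.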
Ultimately, designing a visual application is a bit like baking bread: You have to add yeast at the right time or the dough will never rise. Indeed, that’s why it’s imperative to bake optical filters into your visual application designs.