A Quick Look at Color Lighting and Filters for Vision Sensors
Manufacturers who use machine vision systems to inspect parts, guide robotic arms and sort packages confront the same reality known to every weekend shutterbug: It’s all about getting the light right.
The trouble is that a factory floor is a photographic minefield. Shop windows let in too much light at high noon but cast long shadows when the second shift clocks in. Shiny parts reflect light back into the camera, spoiling the image.
The solution to these lighting quirks lies in machine vision filters, diffuse lighting and other tactics that ensure machine vision systems produce the clear digital images that factory automation inspection requires.
Common Lighting Challenges in Automated Factories
Machine vision cameras, like all digital cameras, use a lens to refract light onto a vision sensor that works with a digital signal processor (DSP) chip to translate light waves into pixels that reproduce images. These images allow industrial robots to identify parts and move them to their proper location in the production sequence. All these components must work together quickly and accurately to keep the production line running at peak efficiency.
That’s not easy because of all the lighting variables in a production environment. The most common lighting challenges in industrial automation settings include:
Ambient light: A factory’s ambient-light levels fluctuate throughout the day. Operations like welding create white-hot blasts that can affect the lighting on nearby automated production processes.
Though conventional fixes like adjusting the cameras’ shutter speeds and aperture widths can help, vision systems also can deploy band-pass filters that allow only a narrow spectrum of light to reach the camera’s vision sensor. This ensures the camera sees only what the vision system needs to see.
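The idea behind a band-pass filter can be illustrated with a minimal sketch. This is a toy model with idealized, assumed numbers (a perfectly sharp 20 nm pass band centered on a 660 nm red LED), not the transmission curve of any real filter:

```python
# Toy model of an optical band-pass filter: wavelengths inside the
# pass band are transmitted; everything outside it is blocked.
def band_pass(wavelength_nm, center_nm=660.0, width_nm=20.0):
    """Return the fraction of light transmitted (idealized: 0 or 1)."""
    half = width_nm / 2.0
    return 1.0 if center_nm - half <= wavelength_nm <= center_nm + half else 0.0

# Light from a 660 nm red LED reaches the sensor...
print(band_pass(660))  # 1.0
# ...while green-dominated ambient light around 550 nm is rejected.
print(band_pass(550))  # 0.0
```

Pairing such a filter with an LED of the matching wavelength is what lets the camera "see only what the vision system needs to see": the scene is lit by a known narrow band, and everything else is filtered out.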
Colors: The color of a part in production can affect a robot’s ability to see it with a camera and then manipulate it. On a box or packing label, some colors reflect light while others absorb it. That can have a huge effect on the performance of a logistics company’s distribution system. In electronics manufacturing, the color of wires can send critical signals to a vision system.
Machine vision cameras often use colored LED lights to create visual contrast and counteract the effects of colors on surfaces. Color filters can enhance or complement the effects of LED lights.
Reflections: Shiny surfaces on metals and plastics create blobs of excess light that prevent effective imaging. But reflections aren’t all bad: They can be used to redirect light into an otherwise dark area that other lighting tactics can’t reach.
Polarizing filters are one of the best fixes for reflections. Another tactic called dark field illumination sheds light on a surface at a shallow angle to reduce reflectiveness. Using light diffusion spreads light rays over an object to decrease the number of direct rays hitting the shiny surface. Cameras also can be angled and lit specifically to spot flaws in reflective areas.
Shadows: Light from almost any angle can cause shadows that reduce the clarity of a machine vision image. Three-dimensional parts (as opposed to a two-dimensional product label, for instance) are especially prone to creating shadows in digital photographs.
Though shifting cameras’ mounting angles can overcome some shadows, an imaging technique called high dynamic range (HDR) is becoming more popular. HDR imaging combines multiple exposures of the same scene, digitally darkening bright areas and brightening dark ones to create more visual consistency and reduce shadows.
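The core of the HDR approach can be sketched in a few lines. This is a simplified exposure-fusion sketch, not a production HDR pipeline: each pixel is blended from several exposures, weighted by how close it already is to mid-gray, so well-exposed pixels dominate the result. The sample pixel values are assumptions for illustration:

```python
import numpy as np

def hdr_fuse(exposures):
    """Blend several exposures of the same scene, weighting each pixel
    by how close it is to mid-gray (0.5), so well-exposed pixels win."""
    stack = np.stack([e.astype(float) for e in exposures])   # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))  # favor mid-tones
    weights /= weights.sum(axis=0, keepdims=True)             # normalize per pixel
    return (weights * stack).sum(axis=0)

# Dark and bright exposures of the same 2x2 scene (values in [0, 1]).
dark = np.array([[0.05, 0.10], [0.40, 0.45]])
bright = np.array([[0.50, 0.55], [0.95, 0.98]])
fused = hdr_fuse([dark, bright])  # shadows lifted, highlights tamed
```

Because each output pixel is a weighted average of the input exposures, deep shadows are pulled up by the brighter frame and blown-out highlights are pulled down by the darker one, which is exactly the "visual consistency" the technique is after.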
Angles: Complex parts like engine blocks and exhaust manifolds have lots of angles that create a host of lighting challenges.
A camera mount can be angled to account for many of these issues. Placing parts, cameras and lights at specific angles also can help identify flaws in a product.
Surface defects: Drills and milling machines leave burrs on parts that must be removed. A robotic weld might leave a chunk of molten metal where it doesn’t belong. Machine vision cameras must be able to flag these defects before they cause problems.
A diffusion filter helps distribute light across these three-dimensional surfaces to identify anomalies. On flat surfaces, dark-field photography might produce a better image.
Dimensions and optical distance: The size of an object can pose substantial lighting challenges, whether in a microscopic environment like a silicon wafer or in a macro environment like a commercial aircraft factory. Moreover, optical distance between the camera and its subject can affect image quality and lighting requirements.
Short optical distances typically require less light than long optical distances. Thus, cameras must be mounted strategically to optimize machine vision effectiveness.
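The relationship between optical distance and required light follows the inverse-square law for an idealized point source. A minimal sketch (assumed, simplified physics; real lighting geometries deviate from a pure point source):

```python
# Inverse-square law: the light intensity needed for the same exposure
# grows with the square of the camera-to-part distance (idealized
# point-source model with assumed unit reference distance).
def relative_intensity_needed(distance, reference_distance=1.0):
    """Light needed at `distance`, relative to `reference_distance`."""
    return (distance / reference_distance) ** 2

# Doubling the working distance needs roughly 4x the light.
print(relative_intensity_needed(2.0))  # 4.0
```

This is why mounting position matters so much: moving a camera or light even modestly farther from the part can demand a disproportionately brighter source.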
Hardware and software: Ultimately, machine vision comes down to selecting the proper cameras, lenses and software that produce the best outcome for a specific automation need. For instance, cameras can be equipped with multiple colored LEDs to project a blue light on one section of a part and a red light on another section. Cameras with liquid lenses have no moving parts, reducing maintenance costs.
Some applications require infrared or ultraviolet cameras that capture light outside of the human vision spectrum. High-performance machine vision software pulls all these variables together into a cohesive solution.
- The Basics of Machine Vision Lighting
- Interactive Lighting Advisor shows how different lighting strategies produce specific outcomes. This tool provides guidance for machine vision, barcode scanning and object verification.
- Sensors in Factory Automation explains how vision sensors work and why they are so valuable.