Adding vision to process monitoring


Sciemetric Instruments’ software manages the massive volume of image files generated by camera-based quality monitoring and management systems.

February 8, 2018
Robert Schoenberger

Carefully analyzing feature measurement data can tell manufacturers when a product is falling out of spec. Monitoring quality variations over time can help experts identify process problems, telling manufacturers where to invest in new equipment or systems.

But analyzing measurement data after the fact has limits. Users can tell when a product is nearing the edge of its specifications, but they often don’t know exactly what’s causing the drift. It could be tool wear, heat generated in a continuous manufacturing process, variations in base materials, or other causes.

Loads of analytical data are great, but identifying root causes generally requires an up-close look at processes. Seeing the process over time and in real time can provide insights that charts and tables alone cannot. So, many manufacturers have been adding cameras to their process-monitoring systems. However, managing that image data and making it usable is a huge task, says Mat Daniel, vice president of operations for Sciemetric Inc., a Canadian process control software company recently purchased by the TASI Group in Cincinnati, Ohio.

“The cost of cameras, the cost of processing, the ability of software to do some really smart stuff, and be a really reliable source of data, that’s been driving what we do,” Daniel says.

As digital cameras have become cheaper and more capable, they’ve become more useful in industrial settings. The challenge, he adds, is tying those images to the process data already being collected.

Matching images to processes

Call it the Industrial Internet of Things (IIoT) or Industry 4.0: the push to connect production machinery with monitoring systems has been a massive trend over the past decade. Manufacturers are collecting more data from equipment, hoping to glean important information that will boost productivity and quality.

“The knee-jerk reaction when people start getting into Industry 4.0 is to collect all the data you can and figure out what to do with it later. A lot of people are collecting data into these big, huge databases with no plans for how to use it,” Daniel says. “There’s a better way of doing things, and vision can be a very important part of it.”

Sciemetric’s QualityWorX Vision software pairs images from cameras mounted in and around production machines with process data from that equipment. If quality managers at a manufacturing plant notice process changes, such as higher cutting forces on a machining center, they can call up images from that machine in real time to identify what’s causing the variation.

“We can merge image data into a consolidated part history,” Daniel says. “With that, you can compare part images from early in a part’s production history to the ones coming off the line months or years later. We take the complexity of managing all of that image data out of the manufacturer’s hands.”
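The article doesn’t describe Sciemetric’s data model, but the consolidated part history Daniel describes can be pictured as records keyed by a part’s serial number, so measurements and image references always come back together. The class and field names below are illustrative only, not QualityWorX structures.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Illustrative sketch, not Sciemetric's schema: a part's serial number keys both
# its process measurements and references to images captured at each station.

@dataclass
class ImageRecord:
    station: str            # e.g., "bore-honing-cam-2" (hypothetical station name)
    captured_at: datetime
    path: str               # where the stored reference JPEG lives

@dataclass
class PartHistory:
    serial_number: str
    measurements: dict = field(default_factory=dict)   # e.g., {"cutting_force_N": 412.7}
    images: list = field(default_factory=list)

    def attach_image(self, record: ImageRecord) -> None:
        """Tie an image to this part so it can be recalled alongside process data."""
        self.images.append(record)

# Usage: when a quality manager sees higher cutting forces, the same serial number
# retrieves the images captured during that operation, early or late in production.
part = PartHistory(serial_number="ENG-000123")
part.measurements["cutting_force_N"] = 412.7
part.attach_image(ImageRecord("bore-honing-cam-2", datetime.now(), "/images/ENG-000123/op40.jpg"))
```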

Data management

One problem with recording image data is its sheer size compared to process information. Measurements from a scanner or touch probe are generally recorded as simple text values measured in bytes or kilobytes. Modern digital cameras generate megabytes of data in seconds.

For example, an automotive engine producer can generate 40MB of image data every 40 seconds as equipment monitors bore honing and other machining steps. At full resolution, that 1MB-per-second rate adds up to nearly 29GB of data in a single eight-hour shift. Manufacturers who want that level of data collection have to figure out where to store that much information in the short run, how to transfer such large files to servers, and how much data they want to keep long term.
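That math is easy to reproduce. The sketch below uses only the figure above (roughly 1MB of image data per second) and adds two assumptions of its own for illustration – an eight-hour shift and a 90-day, two-shift raw retention window – to show how quickly the volume grows per camera.

```python
# Back-of-the-envelope storage sizing. Only the 1 MB/s rate comes from the example
# above; the shift length and retention scenario are assumptions for illustration.

MB_PER_SECOND = 1.0
SHIFT_SECONDS = 8 * 60 * 60                      # 28,800 s in an eight-hour shift

per_shift_gb = MB_PER_SECOND * SHIFT_SECONDS / 1000
print(f"Raw image data per shift: about {per_shift_gb:.0f} GB")        # ~29 GB

# Keeping raw data for roughly three months (90 days) at two shifts per day:
retention_tb = per_shift_gb * 2 * 90 / 1000
print(f"90-day raw retention: about {retention_tb:.1f} TB per camera") # ~5.2 TB
```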

Daniel says Sciemetric’s engineers work with customers to determine the best approaches to those storage and processing questions on a case-by-case basis. A few best practices have emerged, but the answer generally comes down to each company’s goals for its quality initiatives.

Full, raw image data tends to have a short shelf life, he explains – typically about three months. Quality problems that require enlarging images up to 800x magnification to look for fine wear or production markings tend to show up early in the process. Once that window has passed, most manufacturers want reference data to compare parts and processes over time. In those cases, a lower-resolution photo works well.

“Capturing low-res versions in one database, alongside the process data, eliminates the need for huge raw files,” Daniel says.

Sciemetric’s software saves JPEG images at about 75% quality to a database, tying those images to process information. Daniel explains that the lower-resolution images tend to be more than adequate for quality control and for legal, corporate, and warranty accountability.
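The article doesn’t specify how those reference copies are produced. A minimal sketch using the Pillow imaging library – with an assumed 50% downscale and placeholder file paths – would look something like this:

```python
from PIL import Image  # Pillow; assumes the raw frame is in a format Pillow can read

def store_reference_copy(raw_path: str, out_path: str, scale: float = 0.5) -> None:
    """Save a smaller JPEG at roughly 75% quality for long-term reference.
    The 0.5 downscale factor is an assumption, not a Sciemetric setting."""
    with Image.open(raw_path) as img:
        w, h = img.size
        small = img.resize((int(w * scale), int(h * scale)))
        small.convert("RGB").save(out_path, "JPEG", quality=75)

# The output file (or a database BLOB) would then be linked to the part's process
# record, e.g., through a serial-number-keyed history like the one sketched earlier.
store_reference_copy("raw/ENG-000123_op40.png", "ref/ENG-000123_op40.jpg")  # placeholder paths
```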

Vision management system design

System architectures vary depending on company goals, existing hardware, and other factors. Daniel says vision management systems tend to have similar components (a simple configuration sketch follows the list):

  • Cameras – placed inside equipment, within quality stations, or along material handling systems; capture image data as parts move by after processing
  • Gateways – computers connected to cameras and process monitors; collect image and process data
  • Network – typically wired or wireless Internet connections between gateways and central servers
  • Servers – computer systems that can be on-site, off-site, or in the cloud; collect data from gateways for processing
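As one way to picture how those pieces fit together, the deployment description below lists cameras, gateways, network, and server in a single structure. The field names and values are illustrative only, not a Sciemetric configuration format.

```python
# Hypothetical deployment description covering the four components listed above.
vision_system = {
    "cameras": [
        {"id": "bore-honing-cam-1", "location": "machining center 4", "resolution": "5 MP"},
        {"id": "final-inspect-cam-1", "location": "quality station 2", "resolution": "12 MP"},
    ],
    "gateways": [
        {"id": "gw-line-4", "cameras": ["bore-honing-cam-1"], "local_cache_gb": 500},
        {"id": "gw-qs-2", "cameras": ["final-inspect-cam-1"], "local_cache_gb": 500},
    ],
    "network": {"type": "wired", "bandwidth_mbps": 1000},
    "server": {"location": "on-site", "role": "consolidated part history database"},
}
```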

“If the network’s down, the gateway allows you to cache the data. You can store images at the gateway level, so you’re not moving these huge data files across the network,” Daniel says.
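That caching behavior is essentially a store-and-forward pattern. The sketch below is not Sciemetric’s implementation; the spool directory and the upload function are hypothetical stand-ins for whatever transfer mechanism a site uses.

```python
import os
import shutil

CACHE_DIR = "/var/gateway/cache"   # hypothetical local spool directory on the gateway

def cache_locally(image_path: str) -> str:
    """Keep a captured image on the gateway so nothing is lost if the network is down."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    return shutil.copy(image_path, CACHE_DIR)

def flush_cache(upload) -> None:
    """When the link to the central server returns, forward cached files and delete them.
    `upload` is whatever transfer function the site uses (assumed to return True on success)."""
    for name in sorted(os.listdir(CACHE_DIR)):
        path = os.path.join(CACHE_DIR, name)
        if upload(path):
            os.remove(path)
```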

How users allocate funds and equipment to the various parts of the image system can determine overall cost and functionality. Smaller manufacturers that operate one or two plants can invest more heavily in gateways, using floor-level machines to collect and analyze data. Network and server spending can be minimized in those cases, as smaller companies can use those portions of the system for backup and occasional analysis.

Large, multinational users will likely invest more in networks and servers, giving quality managers in corporate offices the ability to pull up real-time images and data from processes running simultaneously in the United States, Europe, and Asia.

To get the most out of QualityWorX Vision or any quality management system, Daniel says many companies are moving responsibility for data from information technology (IT) departments to stand-alone data processing divisions. Those data managers are the ones who typically decide what data to collect, how to design the systems, and what improvements the company should pursue.

“It takes someone with a vision for data to drive the process. Typically, a lot of organizations have been good at building equipment or systems monitoring,” Daniel says. “But we’re seeing more need for a data person who thinks through how to architect and manage their data strategically.”

Sciemetric Inc.

TASI Group

About the author: Robert Schoenberger is the editor of TMV and can be reached at 216.393.0271 or