Motorola Uses Machine Learning to Boost Quality
This article originally appeared on June 5, 2019.
As manufacturers across the globe increasingly automate and connect their production operations, quality has become one of the key differentiators among products and companies’ brand reputations. This reality is leading more companies to explore advanced technologies to improve their quality processes. The technology most prevalent in this area today is machine learning.
We recently covered Frito-Lay’s use of machine learning to test the quality of its chips and streamline its potato weighing processes. Now we’ve learned that Motorola is applying the technology to its mobile phone production operations.
Motorola is using machine learning software from Instrumental to discover design and production issues faster, strengthen quality control on the line, and streamline its issue response to deliver new products on demanding schedules.
The role of vision
In January 2018, Motorola began working with Instrumental by first identifying a handful of mobile phone assembly states that highlighted all of the key components of the phone as it was built. With this information in hand, Instrumental built and deployed inspection stations consisting of cameras, tunable lighting, and customized fixtures in less than three weeks.
Explaining the use of cameras in these inspection stations, Anna-Katrina Shedletsky, CEO of Instrumental, said, “Vision in industry is used very specifically, for example, to measure a gap; but this is a different use of vision. Because we were going where there wasn’t any [pre-existing] vision system, we built a low-cost station using a 20-megapixel FLIR camera with no IP [intellectual property] in it and integrated it with our software to work as a test station on the line. We use the cameras to scan the bar code and take pictures while the software does the analysis in real time to give a pass/fail result.”
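The station loop Shedletsky describes (scan a barcode, capture an image, analyze in real time, report pass/fail) can be sketched roughly as follows. This is a hypothetical illustration; the function names and the toy brightness check are assumptions, not Instrumental’s actual API or analysis.

```python
# Hypothetical sketch of an inspection-station loop: identify the unit by
# barcode, capture an image, analyze it, and record a pass/fail result.
def run_station(units, analyze):
    results = {}
    for unit in units:
        serial = unit["barcode"]   # stand-in for a barcode scan
        image = unit["image"]      # stand-in for a camera capture
        results[serial] = "pass" if analyze(image) else "fail"
    return results

# Toy analyzer: pass if average pixel brightness is near an expected level.
analyze = lambda image: abs(sum(image) / len(image) - 128) < 10

units = [
    {"barcode": "SN001", "image": [126, 130, 129]},
    {"barcode": "SN002", "image": [40, 45, 50]},
]
print(run_station(units, analyze))  # {'SN001': 'pass', 'SN002': 'fail'}
```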
Another differentiator of Instrumental’s use of vision technology is that, as soon as images were collected, they were uploaded to Instrumental’s database and made available in the Instrumental web application to Motorola engineers around the world. According to Instrumental, this complete data record is a key differentiator between Instrumental and traditional industrial vision systems, where the applications must be incredibly specific and the data remains trapped on the local machine in the factory, unavailable to the team.
Dark blotches on the PCB
One of the initial production operations to which Motorola applied Instrumental’s software was on the mobile phones’ printed circuit boards (PCBs). The industry standard for PCB inspection is automatic optical inspection (AOI) systems, which compare an image of a circuit board to the digital CAD file to make sure that each part is present. Shedletsky said one limitation of AOI is that it does not find new defects or damage and cannot analyze PCBs that have undergone additional assembly steps, something that is incredibly common in modern miniaturized devices.
In this first application of Instrumental’s machine learning algorithms, dark blotches were detected on empty areas of a subset of the PCBs. When engineers examined this, they determined that the blotches aligned with buried vias (areas that connect two or more inner layers of the PCB) that make up the board circuitry itself. Armed with this insight, the engineers investigated further and found that the boards were thicker than specified, which would have created serious problems with their critical tolerance stack-ups (calculations of the maximum and minimum distance between two features or parts) and could have been very difficult to track down as the source of the problem.
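A worst-case tolerance stack-up like the one the engineers worried about can be illustrated with a simple calculation: sum the nominal dimensions in the chain, then add and subtract the summed tolerances. The parts and values below are hypothetical, not Motorola’s actual figures.

```python
# Worst-case tolerance stack-up for a chain of stacked parts.
# Each part is (nominal_mm, plus_minus_tolerance_mm); values are illustrative.
parts = [
    (1.00, 0.05),  # PCB thickness
    (0.30, 0.02),  # adhesive layer
    (0.50, 0.03),  # shield can standoff
]

nominal = sum(n for n, _ in parts)
tol = sum(t for _, t in parts)
max_height = nominal + tol
min_height = nominal - tol
print(f"stack height: {nominal:.2f} mm, range {min_height:.2f}-{max_height:.2f} mm")
```

A board running thicker than its specified nominal shifts the whole range upward, which is why an out-of-spec PCB thickness can quietly break a stack-up several assembly steps later.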
“These PCBs had gone through AOI and had passed, even though the blotches were clearly visible,” said Shedletsky. “Using Instrumental data, it was easy to see that the defective PCBs were all from one vendor, enabling the Motorola team to work with their supplier to correct the issue quickly.”
Self-programmed algorithms
Shedletsky said that once the first 30 units from the build were completed, Motorola engineers could use Instrumental’s machine-learning algorithms to find new defects that they weren’t previously aware of. Once a defect was found, every subsequent unit could be set up to test for the same failure mode. Failures were then automatically sorted into collections, where defect rates and trends were calculated in real time.
“The machine-learning methods that drive Instrumental technology also enable each Monitor [Instrumental’s software app] to learn the difference between a typical and an anomalous unit. This makes it possible to set up tests that can find unforeseen defects automatically, something traditional industrial vision systems cannot do,” she added.
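The sorting of failures into collections with real-time defect rates can be sketched as follows. This is a hypothetical illustration; the record format and names are assumptions, not Instrumental’s actual data model.

```python
from collections import defaultdict

# Hypothetical sketch: group flagged units by failure mode and compute
# per-mode defect rates over all inspected units.
# Each inspection record: (serial, failure_mode or None for a passing unit)
inspections = [
    ("SN001", None),
    ("SN002", "shifted part"),
    ("SN003", None),
    ("SN004", "shifted part"),
    ("SN005", "tilted switch"),
]

collections = defaultdict(list)
for serial, mode in inspections:
    if mode is not None:
        collections[mode].append(serial)

total = len(inspections)
rates = {mode: len(units) / total for mode, units in collections.items()}
print(rates)  # {'shifted part': 0.4, 'tilted switch': 0.2}
```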
Explaining Monitor, Shedletsky described it as a software application that lets an engineer set up a test that runs only in the cloud. “When Monitor validates that it’s catching the things the engineers want to identify, we have a drop-down selection that will push the self-programmed algorithm into a product’s recipe at the edge,” she said. “Training of the algorithm takes place in the cloud, with compute done at the edge on a normal graphics card in an off-the-shelf computer.”
The self-programmed algorithm capability is a key feature of Instrumentalâs machine learning technology that removes the need for a company to employ a data scientist to apply machine learning to their production operations.
“We’ve designed the software so that an engineer provides the expertise, looking at a part to determine what is defective. The engineer then visits a web app to view images and sort or filter them by key parameters or places along the line,” Shedletsky said. “Engineers tell the system where to look and then the algorithms run and return a stack rank from most anomalous to least anomalous. Users of the software can then draw a threshold between what’s bad or good and give it a name; for example, shifted part, tilted switch, etc.”
She added that Instrumental’s algorithms can find known issues as well as unknown issues. “We don’t need a failing example to set up a test. You can build a test based on all good products, so that the result becomes the baseline for what to look for,” Shedletsky explained. “We can start with as few as 30 images, whereas most machine learning systems typically require thousands or tens of thousands of images.”
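The approach Shedletsky describes (building a baseline from known-good units, then ranking new units from most anomalous to least and letting an engineer draw a pass/fail threshold) can be sketched as below. This is a hypothetical illustration using simple per-pixel statistics, not Instrumental’s actual algorithm; the serial numbers and threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a per-pixel baseline from ~30 known-good images (toy 8x8 grayscale).
good = rng.normal(128, 2, size=(30, 8, 8))
mean, std = good.mean(axis=0), good.std(axis=0) + 1e-6

def anomaly_score(image):
    # Mean absolute z-score against the good-unit baseline:
    # higher means the unit deviates more from typical good units.
    return float(np.abs((image - mean) / std).mean())

units = {
    "SN101": rng.normal(128, 2, size=(8, 8)),         # typical unit
    "SN102": rng.normal(128, 2, size=(8, 8)) + 20.0,  # "blotchy" unit
}

# Stack rank from most anomalous to least, then apply an engineer-drawn threshold.
ranked = sorted(units, key=lambda sn: anomaly_score(units[sn]), reverse=True)
threshold = 5.0
labels = {sn: ("fail" if anomaly_score(units[sn]) > threshold else "pass")
          for sn in units}
print(ranked, labels)
```

The key design point the quote highlights is that no failing example is needed: the test is defined entirely by statistics of good units, so a handful of images can bootstrap it.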
About the Author
David Greenfield, Editor in Chief
