
Isaac Maw


Machine Learning System Detects Manufacturing Defects Using Photos

A conversation with former Apple engineer Anna Shedletsky

Published: Monday, August 19, 2019 - 11:03

Machine learning can be used for more than violating your privacy for a social media challenge. For example, one fascinating application has been developed by Instrumental AI, which uses machine learning to detect defects and anomalies in photographs of parts during various stages of assembly, primarily in the electronics manufacturing industry.

Instrumental was founded by Anna Shedletsky, a former Apple engineer with two degrees from Stanford. A mechanical engineer by training, Shedletsky led system products design for the Apple Watch for six years. “I led the team that designed the physical product itself, as well as being responsible for the first production line,” says Shedletsky. “We had to prove mass production yields at mass production speeds on the first line.”

Anna Shedletsky, CEO, Instrumental

In this role at Apple, she noticed how very small defects on the line could cause huge delays and problems in the product life cycle and profitability. So, she set out to build technology tools that could address these defects. “I had this kind of naïve, but correct, insight that there aren’t great tools for this; that we struggle by just throwing people and money at these problems. And if Apple didn’t have it, it didn’t exist.”

Shedletsky left Apple in 2015 to start Instrumental, a software company that has developed a toolkit for detecting manufacturing defects in digital photos taken on the manufacturing line. “We’ve been at it for four years,” she says. “We work primarily with Fortune 500 companies in the consumer electronics and electronics space. We do have customers outside of electronics, but consumer electronics is kind of the hardest beast, because the volumes are pretty high. The products themselves are high value, but they go through production super fast. So, some of our customers only run products in production for six months. It’s just these super-fast cycles of development and production. So that’s where we kind of specialize today. We build technology that helps accelerate the development process and then maintain control in production.”

The system creates a database of images taken during selected stages of assembly. This database can then be analyzed and searched to find specific defects or uncategorized anomalies.

Engineering.com sat down with Shedletsky to ask questions about the technology and applications.

Would a system like this typically be used to trace issues in components prior to assembly, or during final assembly?

All of the above. Process issues, part quality issues, workmanship issues, and design issues. We primarily do our first deployment in the new product introduction stage. We’ll primarily be looking for a combination of part quality and design issues in the design phase. Some of our customers, like Motorola, have actually pushed us up in their supply chain. So they’re actually using this technology with some of their suppliers to prevent poor-quality parts from leaking into their final assembly stage. We had a customer where we’re in three phases of their supply chain. Those parts start in Thailand, they go to Korea, and then they end up in China.

We work with those factories, as well as the main China assembly facility. So, we can look at both quality issues created by suppliers, as well as quality issues that are created in on-site final assembly.

How are all these data collected? Does it require human input, such as a SKU barcode that’s scanned?

The unique insight that we had is that often when a problem occurs, there’s not the right kind of high-resolution data available as soon as the problem is discovered. So, then an engineer has to go collect additional data before you can even get started on solving the problem. What Instrumental does is we actually collect data proactively. Specifically, we take images of every single unit at key stages of assembly. And those images, as you might have heard, are worth a thousand words. You don’t need to know what you’re looking for when you take the image, but that image could be very valuable for identifying these types of issues that we were discussing.

A prototype L16 inside an Instrumental station. Photo Credit: Light

The way that we get these images is with drop-in camera boxes, or we can take images off of preexisting automation equipment, if our customers have cameras already. We read barcodes out of the images to make those traceable, or you can manually scan a barcode if there’s no barcode on the part.

You mentioned to me that the technology uses machine learning to identify defects that conventional vision systems cannot. So, what are you detecting that conventional systems can’t?

Conventional systems are usually used for very specific applications. For example, measuring a gap, or detecting whether certain screws are present. For these systems, there’s a preconceived notion of what the value of that system is to justify buying it and putting it on the line. That’s how vision has been used in the past. These vision scenarios are very rules-based: The screw is either there or not. But that detection won’t find whether the screw is stripped, for example, unless it’s been preprogrammed to do that as part of that specific deployment.

Instrumental is a generalized tool. It can actually discover issues that it’s never seen before and that our customers have never seen before. So that means that we ultimately go into places where industrial vision or human inspectors typically haven’t been able to provide value in the past. For example, we’ll go after major subassemblies that have been built and do inspection across those entire subassemblies. And we can identify a wide variety of different types of issues that would be difficult to wrap specific specifications around.

For example, in a system that measures a gap, you could specify 0.4 plus or minus 0.1; there is a very specific specification. But what about glue volume and glue dispense? What about bubbles in that glue? What about solder? Does the solder look right? Is it cold solder, or is it going to make a good connection? These things could be inspected with conventional vision if it was preplanned. However, often it’s too expensive to actually deploy vision systems to do this, because each individual algorithm you set up on a machine vision system will cost you many hours of a consultant’s time, whereas Instrumental programs itself.

So, you drop our system in, we look at 30 units of data, which usually we get in the first day of being on the line, and our algorithms start to program themselves to be able to find defects, both types they know about and ones they don’t know about yet.

Can those initial 30 units be normal?

Yes. We don’t need “red rabbit” defects to be able to set up these algorithms. We can certainly take and use red rabbit data, but we don’t need it to be effective. A key point: those 30 units, which you called normal units, don’t have to be golden, either. That’s often another constraint in development: you can’t even build one golden unit, let alone 30. So just normal input is good. There can be defects in it, too. That’s fine, no problem. It doesn’t have to be special in any way.

Is the system retraining over time and getting beyond those initial 30 data points?

Yes. Thirty is when we allow the algorithm to start working, because there are some use cases in which that’s enough and ready for the production line. And then as we get more data, we can tune and improve these algorithms. They effectively tune and improve themselves. What we do need to know is what is defective, because what we find is difference, and our customers need to tell us which differences actually correspond to defects.
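The approach Shedletsky describes, learning what “normal” looks like from a few dozen reference units and then flagging differences, can be illustrated with a minimal anomaly-detection sketch. This is our own simplified illustration on synthetic data, not Instrumental’s actual algorithm: it models each pixel’s normal brightness distribution from 30 reference images and scores new units by how far they deviate.

```python
import numpy as np

def fit_reference(images):
    """Learn per-pixel mean and std from a small set of 'normal' unit images."""
    stack = np.stack(images).astype(float)
    return stack.mean(axis=0), stack.std(axis=0) + 1e-6  # epsilon avoids div-by-zero

def anomaly_score(image, mean, std):
    """Mean absolute z-score of the image against the reference distribution."""
    return float(np.abs((image.astype(float) - mean) / std).mean())

# Synthetic example: 30 'normal' 8x8 grayscale units with slight noise,
# standing in for the ~30 units collected on the first day on the line.
rng = np.random.default_rng(0)
normal_units = [100 + rng.normal(0, 2, (8, 8)) for _ in range(30)]
mean, std = fit_reference(normal_units)

good = 100 + rng.normal(0, 2, (8, 8))
bad = good.copy()
bad[2:5, 2:5] = 0  # simulate a defect: a dark patch where a component is missing

print(anomaly_score(good, mean, std))  # low score: within normal variation
print(anomaly_score(bad, mean, std))   # high score: flagged as different
```

Note that, as in the interview, the score only measures *difference*; a human still has to decide which differences correspond to real defects.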

Can I generate, for example, Pareto charts of what defects are appearing?

Yes. We have Paretos. We have trends, because we know what these defect rates are over time. So, we can show run charts that show different types of defects and when they’re occurring.
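A Pareto analysis of the kind mentioned here simply ranks defect categories by frequency and tracks the cumulative share, so the few categories causing most failures stand out. A small sketch, using made-up defect labels for illustration:

```python
from collections import Counter

def pareto(defects):
    """Return (label, count, cumulative %) rows sorted by descending count."""
    counts = Counter(defects).most_common()
    total = sum(c for _, c in counts)
    rows, cum = [], 0
    for label, c in counts:
        cum += c
        rows.append((label, c, round(100 * cum / total, 1)))
    return rows

# Hypothetical defect log from one build day
observed = (["missing screw"] * 5 + ["glue overflow"] * 3 +
            ["scratched housing"] + ["cold solder"])
for row in pareto(observed):
    print(row)  # e.g. ('missing screw', 5, 50.0) appears first
```

Plotted over successive builds, the same counts become the run charts of defect rates over time that Shedletsky describes.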

The trend functionality in Instrumental software.

A lot of times, our customers believe that when they roll into production, things shouldn’t change. Like, “It’s production! Everything’s stable. Everything’s fixed.”

But actually, at least in the consumer electronics industry, things are not fixed, because operators are turning over at very high rates, so things are constantly changing. For example, one of our customers found that every Monday at 8 a.m., they would have a spike of defects such as a missing screw or something silly like that. And no matter what they did on training, they’ve always had this spike. Being able to monitor that and stop those units is a key value that this technology provides.

I used to do this as an engineer: You get a problem, figure out what’s going on, do failure analysis, create corrective actions, validate that it’s good, and then make the change, and do that many times over during development. So, our technology is augmenting and supporting that by identifying these issues more quickly, identifying trends to help understand whether your issues are being fixed or whether they’re increasing, and essentially taking multiple data sets together to identify correlations.

Is it possible to close the loop and have the system identify a defect and then change the machine control to prevent the defect without human intervention?

That’s the vision of what we’re trying to do. With the software that we’ve built today, vision is just our first application. There’s this whole data pipeline that manufacturers are pushing additional data into, so the idea is that someone else will build the automation. These automated lines are going to be filled with different vendors of automation. So, you’re going to need one common place for these data to go into, and you want this common place to be somewhere you can access from all of the sites in the supply chain.

We’re thinking bigger than just one factory floor. We’re thinking about all the upstream suppliers and also the downstream customer experience with the products, like quality escapes and things like that. So, that’s the vision of Instrumental: to create self-optimization feedback loops across this entire supply chain. We’re a manufacturing data company that doesn’t sell to manufacturers. We actually sell to brands, like Motorola. The reason is that they own the supply chain; the factories don’t.

Is it challenging sometimes to get a manufacturer to allow a network-connected, cloud-connected box onto their production floor?

As I mentioned, we work with brands. The brand is our paying customer, and then they often have a factory partner, companies like Flex, Foxconn, or Pegatron, where they actually build the stuff. So, we come into the factory on two well-worn ruts. The first rut is that we’re just a test station on the line, and we happen to require a data line. So, our customer specifies, “Oh yeah, we want an Instrumental station here. They need two outlets and a data line.” And the factory says, “OK. It’s a test station.” And it’s just kind of part of their normal flow. The other well-worn rut is third-party QA.

So, these are services such as Underwriters Laboratories (UL). Factories are constantly having people come and essentially look over the data and what they’re doing. So, we come in in these ways. We haven’t ever been in a situation where our customer couldn’t get internet access and we were prevented from being able to run. We designed this to work in China, after all.

What kind of benefits has Motorola seen using this technology?

They’re piloting the system in production, and they expect to see dollars-and-cents benefits, because we’re essentially intercepting defects that they don’t have good ways to catch.

We work on every phone they’ve made since 2018. It’s about six or seven products at this point, and key values for them include enabling them to accelerate product maturity during development, meaning they find issues faster, and they fix them. They find them in the first build instead of the third build, and there are so many stories. On each particular program there seem to be 10 or 20 different situations in which each of those problems would have cost them $150K or more, such as finding a tooling issue on the first day of the build instead of all the way at the end, after you’ve already replicated tools.

That product maturity acceleration matters: The repetition of experiments, the mistakes involving tooling specifically, and the DOEs that have to be run to fix things have all been significantly reduced with Instrumental. They also saw an acceleration in the ramp. Their time to stability was much shorter over the seven programs that used Instrumental versus the ones that didn’t. So, they saw a significant delta there. It means their yields are higher faster, so they’re essentially saving a lot of money they would normally spend on rework and potentially shipping bad units to customers.

Motorola is interesting in that they have lost some of the gleam on their brand name in recent times, but from an engineering and manufacturing process standpoint, Motorola is “the OG” [original gangster]. Apple’s process was replicated off of Motorola’s. We work now with a bunch of Fortune 500 companies, and they all use the Motorola process. Six Sigma was invented at Motorola. They really are leaders and innovators in the product development and manufacturing process.

How important is cybersecurity when you are storing high-res images of the product during various stages of assembly?

Super important. We have preproduction images of products in highly competitive spaces, and so from the beginning we built this system with enterprise-grade security. In many ways, cloud security can be more secure than on-premises systems, because on-prem systems don’t necessarily get patches and updates. Cloud systems that are fully encrypted are getting updated constantly with the latest security patches. That’s part of working with a new customer. We go through extensive security reviews to make sure our system is up to par. Again, as I mentioned, we work with these Fortune 500 brands that like to be protected. The key point is making sure the system is secure for their data.

I guess lastly, because you’re in the machine learning space, this is just kind of a side question: What are your thoughts on the term “AI” being used? I think there’s a lot of hype surrounding it. What do you think about using the term AI?

I hate AI! I hate industry 4.0 and industrial IoT, too. I think the key point is that AI is a collection of technologies, of which machine learning is one. Computer vision is another. Then, frankly, more than 50 percent of it is just math and statistics and Pearson coefficients. But those sound a lot less sexy.

I’m pretty skeptical of the buzzwords. We do use them, because that’s how people find us, because they’re looking for these types of solutions, but personally I’d rather just be 100-percent clear and concise about what it is we do. We find defects. That’s where we try to position ourselves, but that’s my personal vendetta against buzzwords. Especially industry 4.0 and industrial IoT. I don’t know. What do you think about it? You probably use it all the time?

I don’t know. I agree about industrial IoT, because I think that you can usually be a lot more specific. But we already have industry 5.0, so the buzzwords clearly aren’t slowing down!

Yeah, well. I think that people at various larger brands are trying to co-opt these terms as well. The term “Industry 4.0” was invented in Germany, but [a certain American conglomerate] kind of took it over and spent all the marketing dollars on it. But then they never delivered anything, so I’m skeptical of something that was a big marketing push with no delivery on the promise. I think that industry 4.0 is frankly a little shortsighted around the capabilities, and that’s probably what 5.0 means, but I’ve seen several different definitions of 5.0, so it’s clearly not clear.

The fifth one, as far as I understand it, is about human and machine communication, cooperation, and collaborative technologies.

In my opinion, the next thing after automation is full autonomy. It’s not humans in the loop. That’s just people who are trying to make themselves feel better that they’re going to get replaced. I think it’s about autonomy and intelligence, and essentially creating systems that can do this stuff. Sure, a human can be in the loop to push the button to make the change, to make sure that the machines don’t go awry. But the collaborative stuff, I don’t think that’s the future. That’s just part of 4.0.

That’s an interesting take! How are the parts usually presented to the camera? For example, on a conveyor, as you might have with a conventional vision system?

It can depend. Usually, as I mentioned, most of our customers don’t have vision in the places that would be most valuable for the way Instrumental works. So, we’ll drop in a station that has a fixture. The part will be manually loaded, and we’ll capture the image.

Image credit: Light

Consumer electronics is still 98-percent manual assembly. There are a lot of people on these lines. So, in a production line, we sit between two operators, so that it’s neutral. A person loads the part. We scan the barcode, take the picture, and do the analysis, and then pass the results to the downstream operator.

If our customers have automation equipment or cameras that are already taking images from conveyor lines, for example, we can ingest and process those images and then give real-time results back to the MES for that line, so they can actually intercept those parts.

Are there other industry applications besides electronics manufacturing?

Essentially any discrete manufacturing of serialized products. We think in general of modest volumes, with $50 or more in value in the product itself. Today we primarily work on electronics applications, but we have medical device customers whose products don’t have any electronics in them, such as pump systems and things like that. Some of those parts are as cheap as $3, but the reason there is return on investment is that the cost of a defective unit getting through is very, very high in the medical industry.

We also have customers in the automotive industry, and we believe that there’ll be some really interesting applications in defense and government. And we’re just kind of starting that now.

First published July 25, 2019, on the engineering.com blog.


About The Author

Isaac Maw

Isaac Maw is Associate Editor of Manufacturing at ENGINEERING.com. He is a graduate of the University of Toronto with a Bachelor of Arts degree in Communication. He is a passionate writer and a skilled online researcher and marketer.