How AI enhances NBP’s production operations
Here at New Berlin Plastics, we’re obsessed with driving variability out of our processes. Doing so helps us improve part quality while reducing scrap, downtime, and labor costs and streamlining our operations. That’s why we’re starting to experiment with artificial intelligence (AI) in two areas: training vision systems to inspect parts and making our production robots smarter.
Here’s a closer look at the benefits we’re seeing from adopting this cutting-edge technology.
Training vision systems
When certain parts come out of a press, they must be inspected to verify dimensional accuracy and determine whether an insert is present. One inspection method uses touch sensors: after a robot removes a part from the press, it “presents” the part to a fixture containing a set of touch-sensitive sensors. This automated step validates, in-cycle, whether each part is good or bad.
During the last several years, New Berlin Plastics has moved away from this inspection method, because it has several limitations:
“The touch sensors required extremely tight tolerance between the part and the inspection fixture. If the robot arm was slightly out of position or if there was even a slight amount of arm movement, the parts would be rejected,” recalls automation engineer Adam Machajewski.
This physical inspection method was also relatively slow. “The robot arm could move fairly quickly from the press to the inspection fixture. But then it needed to slow down so it didn’t crash into it. On average, that movement and positioning added about two seconds to the cycle time,” he adds.
The other inspection method, which is faster and more reliable, is to use vision systems to inspect the parts. Until recently, though, training those systems required a time-consuming setup.
“With presence detection for inserts, the sensor takes a picture of the part. Its software then looks for a certain number of pixels that are the color of the insert,” Machajewski explains. “If the image contains that many pixels or more, it’s a good part. With earlier-generation cameras, we had to manually define the colors of each pixel, which was very time-consuming. The cameras could also be fooled by sunlight coming in through the windows of the building at certain times of the day. Sometimes, that caused an increased number of false rejects.”
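For readers who want to picture that older approach, here is a minimal sketch of a pixel-counting presence check, assuming the camera delivers an RGB image as a NumPy array. The color bounds and pixel threshold are illustrative placeholders, not values from NBP’s cameras.

```python
import numpy as np

def insert_present(image: np.ndarray,
                   lower_rgb=(150, 60, 20),    # hand-tuned lower color bound (illustrative)
                   upper_rgb=(210, 110, 60),   # hand-tuned upper color bound (illustrative)
                   min_pixels=500) -> bool:
    """Legacy-style check: count pixels whose color falls inside the manually
    defined range for the insert, then compare that count to a threshold."""
    lower, upper = np.array(lower_rgb), np.array(upper_rgb)
    # Mask of pixels whose R, G, and B values all fall within the defined range
    mask = np.all((image >= lower) & (image <= upper), axis=-1)
    # The part passes only if enough pixels match the insert's color
    return int(mask.sum()) >= min_pixels
```

Because the result hinges entirely on those hand-tuned color bounds, a change in ambient light can push the insert’s pixels outside the range, which is how sunlight through the windows produced false rejects.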
Today’s AI-enhanced vision systems have solved these problems. “All I need to do is draw a circle around the part of the image that represents the insert. That tells the AI where to look for it. Next, I train it by presenting it with a series of parts. I tell it which ones are good and bad. After three to five parts, it learns the differences and can accurately accept or reject parts,” he emphasizes. If there’s low contrast between the color of the insert and the part, it could take up to 10 or 12 parts to train the AI. But it’s still much faster than the previous process.
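The camera vendor’s AI model is proprietary, so the sketch below only mimics the workflow Machajewski describes: draw a region of interest (ROI) around the insert, label a handful of good and bad parts, and let the system judge new ones. The nearest-example classifier over ROI color histograms is a stand-in of our own, not the camera’s actual algorithm, and all names here are hypothetical.

```python
import numpy as np

class RoiPresenceClassifier:
    """Illustrative stand-in for an AI vision tool: learn from a few labeled
    parts using color histograms of a user-drawn region of interest (ROI)."""

    def __init__(self, roi):
        self.roi = roi                      # (top, bottom, left, right) in pixels
        self.samples, self.labels = [], []

    def _features(self, image):
        top, bottom, left, right = self.roi
        crop = image[top:bottom, left:right]
        # Normalized per-channel histograms, so brightness shifts from ambient
        # light matter less than the overall color distribution
        return np.concatenate([
            np.histogram(crop[..., c], bins=16, range=(0, 255), density=True)[0]
            for c in range(3)
        ])

    def teach(self, image, is_good: bool):
        """Operator presents a part and tells the system whether it is good."""
        self.samples.append(self._features(image))
        self.labels.append(is_good)

    def predict(self, image) -> bool:
        """Accept or reject a new part based on the closest taught example."""
        feats = self._features(image)
        distances = [np.linalg.norm(feats - s) for s in self.samples]
        return self.labels[int(np.argmin(distances))]
```

In use, an operator would call teach() on three to five labeled parts, or a few more when the insert and part colors are close, and then predict() on every part coming out of the press.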
This AI-enhanced process is also significantly faster than using fixtures and touch sensors to inspect parts.
“The vision system camera isn’t dependent on precise positioning of the robot’s arm and end-of-arm tool. It can adapt. It also completes the inspection in less than a second without ever touching the part – which is significantly faster than the old technology,” he says.
The newer vision system cameras aren’t as easily fooled by changes in ambient light, either. That means fewer false rejects. As a result, they are saving NBP’s automation team many hours and helping to streamline part production.
Smarter robots
Several years ago, NBP standardized on Star Automation for overhead Cartesian robots and Universal Robots for collaborative robots for its production lines. Like the vision systems used to inspect parts, today’s robots have become smarter than ever, using AI to help maximize production uptime.
At the most fundamental level, a robotic arm and end-of-arm tool are designed to repeat the same motions, over and over, with a high level of precision. But what happens when the robot’s physical components start to wear out? That may lead to slight variability in the robot’s motions. In the past, these slight changes would go unnoticed. Wear and tear would gradually increase – until a robot broke down and caused unplanned downtime.
Today’s AI-enhanced robots perform self-monitoring and alert the automation team to any unusual hardware or software variability. The team can then decide whether to simply keep an eye on the issue or schedule downtime to fix it.
“I may get an alert from a robot that tells me that its arm missed a position target, for example. I can then decide what to do with that information,” Machajewski points out.
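As a rough illustration of what that self-monitoring implies, the sketch below compares each commanded position with what the robot reports and flags motions whose error is creeping up. The warning and stop thresholds are invented for the example, not diagnostics published by Star Automation or Universal Robots.

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    target_mm: float     # commanded position along one axis
    actual_mm: float     # position the robot actually reached

def position_alerts(samples, warn_mm=0.2, stop_mm=1.0):
    """Illustrative drift monitor: small errors are worth watching,
    large ones are worth scheduling downtime for."""
    alerts = []
    for s in samples:
        error = abs(s.actual_mm - s.target_mm)
        if error >= stop_mm:
            alerts.append(f"Missed position target by {error:.2f} mm: schedule downtime")
        elif error >= warn_mm:
            alerts.append(f"Position error of {error:.2f} mm: keep monitoring this axis")
    return alerts
```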
Machajewski also says the current generation of Universal Robots is more adaptable than earlier-generation models. “If we need to move a robot from one press to another, there are always slight variations in positioning and measurement between the two workstations. These new robots can easily adapt to those differences,” he adds. That decreases setup time, so part production can begin sooner.
Today’s AI-enhanced robots can also adapt intelligently to unplanned situations. Machajewski shares the example of a collaborative robot used to pack parts into a cardboard shipping box. If an older-generation robot collided with a flap sticking up inside the box, it would stop, requiring a technician to restart it.
A new AI-enhanced collaborative robot can be trained to recognize that a box flap is not an obstruction and that it should continue pushing with a low amount of force to move the flap and place the part in the box.
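Here is a minimal sketch of that kind of decision, using made-up force thresholds rather than real Universal Robots parameters: sustained light resistance is treated as a flap the robot can push through at low force, while a force spike is treated as a hard obstruction that stops the motion.

```python
def flap_or_obstruction(force_readings_n, flap_force_n=8.0, max_force_n=25.0):
    """Classify contact during a packing move from the tool's force readings (newtons):
    soft resistance suggests a cardboard flap, a spike suggests a hard collision."""
    if any(f > max_force_n for f in force_readings_n):
        return "stop: hard obstruction, alert an operator"
    if any(f > flap_force_n for f in force_readings_n):
        return "continue: soft resistance, push the flap aside at low force"
    return "continue: no contact"

# A flap brushing the tool produces moderate force; a jam produces a spike.
print(flap_or_obstruction([2.1, 9.5, 10.2, 3.0]))   # continue: soft resistance ...
print(flap_or_obstruction([2.1, 31.0, 5.4]))        # stop: hard obstruction ...
```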
The bottom line
As New Berlin Plastics takes its first steps into the world of AI, it’s already obvious that the technology can make a significant impact on our operations, especially when it comes to removing human variability from the equation.
“When a press operator is working with a collaborative robot, it’s usually the human that causes the mistake or the variation. Plus, each operator does things slightly differently,” Machajewski indicates. “We’re getting to the point where the robots can adapt to those differences and keep running. That means our team can spend more time adding value to the business and less time manually diagnosing what’s causing false rejects and restarting robots.”
Ultimately, AI is improving our ability to deliver high-quality parts at affordable prices to our customers.