Robotic Packaging: More Vision, More Brains, Fewer Parts

March 11, 2013
Integrated and affordable vision systems are bringing sophisticated vision-assisted robotic capabilities to a wider range of packaging applications.

The technology that is rapidly changing today’s packaging industry seems almost like science fiction. Allow me to enlist the aid of Hollywood to make the point. Suppose we hop into Doc Brown’s time-traveling 1981 DeLorean from the movie Back to the Future, jump back a decade or two, and look at robotics technology.

Let’s start with bin picking, the ability of a robot to accurately select and manipulate randomly oriented objects. Having this capability greatly enhances robot productivity, but it’s not simple to implement: It requires compact, robust and affordable three-dimensional (3D) vision technology integrated with the robot in a high-speed, computation-intensive control environment. In the early 1990s, that was a problem without a satisfactory solution.
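Conceptually, the core calculation is simple once a 3D camera has isolated a single part: determine where the part sits and which way it is facing so the gripper can be oriented to match. The sketch below, in Python with synthetic data, illustrates that idea using a centroid plus principal-axis fit; it is an illustration of the concept, not any particular vendor’s algorithm.

```python
# Minimal sketch (illustrative only): estimate a pick position and gripper
# yaw for one part from a segmented 3D point cloud (an N x 3 array of
# x, y, z points). Assumes the cloud already contains just one part.
import numpy as np

def estimate_pick_pose(points: np.ndarray):
    """Return the part centroid and a gripper yaw angle in radians."""
    centroid = points.mean(axis=0)
    # Principal axis of the part's footprint via a 2D covariance fit (PCA).
    cov = np.cov(points[:, :2].T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major_axis = eigvecs[:, np.argmax(eigvals)]      # direction of longest spread
    yaw = np.arctan2(major_axis[1], major_axis[0])   # rotate gripper to match it
    return centroid, yaw

# Synthetic cloud standing in for one camera frame.
rng = np.random.default_rng(0)
cloud = rng.normal([0.40, 0.10, 0.05], [0.05, 0.01, 0.005], size=(500, 3))
position, yaw = estimate_pick_pose(cloud)
print(position, np.degrees(yaw))
```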

Today, however, “integrating 3D vision with a robotic system remains complicated, and it is still in its infancy, but it is definitely gaining traction. And we will see more 3D applications, including bin picking, in the near future,” says Amy Peters, who works in business strategy and portfolio planning for the control & visualization business at Rockwell Automation, Milwaukee.

Cost-effective cameras are fueling this trend, but Peters says the increasing sophistication of sensors also is playing a major enabling role. And it’s not just the sensors we are accustomed to encountering on the factory floor; powerful sensors developed for the consumer electronics world are figuring into the equation. She cites the Xbox Kinect as an example.

In the near future, observes Jim Anderson, vision product manager for sensor specialist Sick Inc., Minneapolis, “smaller, more effective vision sensors that can deliver data directly to the robot’s controller will become more common.” In terms of the emerging 3D capabilities, Anderson notes that, in addition to bin picking, 3D vision benefits other applications, such as making the picking of objects more reliable. This is possible because 3D vision allows greater precision in locating the position and edges of objects, which is particularly valuable when the product to be handled is in bags, such as flour or coffee.
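As a rough illustration of that “locate the position and edges” step, the sketch below uses the open-source OpenCV library to find the outline, center, and orientation of the largest object in a single camera image. It is a 2D stand-in; a 3D system adds height information. The synthetic test image and threshold choice are assumptions made for the example.

```python
# Illustrative 2D stand-in for the "locate position and edges" step, using
# the open-source OpenCV library: find the outline, center and orientation
# of the largest object in a camera image. A 3D system adds height data.
import cv2
import numpy as np

def locate_product(gray: np.ndarray):
    """Return ((cx, cy), (w, h), angle) of the largest blob, or None."""
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    return cv2.minAreaRect(largest)   # oriented bounding box: center, size, angle

# Synthetic image standing in for a camera frame of a bag-shaped product.
frame = np.zeros((480, 640), dtype=np.uint8)
cv2.ellipse(frame, (320, 240), (120, 60), 30, 0, 360, 255, -1)
print(locate_product(frame))
```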

With this combination of vision-enabled precision and robotic flexibility opening up new application possibilities, Anderson envisions baked goods packaging lines that could switch quickly “from baguettes to bagels to croissants with little downtime.”

Doing more with less
A significant advantage of these new, advanced packaging systems is that even though they sport more technology, their powerful controllers can often perform the required tasks with less equipment and in a smaller footprint. For instance, Anderson says, some applications that today depend on several cameras may soon be able to dispense with those extra cameras and the associated hardware. “It is possible to have the robot move the camera to different positions and angles so only one camera is required instead of an array of them,” he says.

Rick Tallian, consumer market segment manager for ABB Robotics, Auburn Hills, Mich., expands on that thought: “As vision guidance and inspection improve, we will have the ability to reduce the number of devices that separate and organize products for recognition and handling. This will dramatically reduce the ancillary equipment required for pick-and-place operations.”

Craig Souser, president of automated packaging system provider JLS Automation in York, Pa., stresses that this is already happening. He says that vision-guided robotic technology, coupled with an agile design approach, allows the elimination of many “product handling mechanisms, collators, pattern formers, etc. All this allows for a much simpler mechanical system, fewer change parts—if any—and no dialing in the process after changeover.”

This is all becoming possible because robotic complexity is moving away from the hardware and into the software. To illustrate, Souser notes that, in one design review meeting, a customer’s engineer insisted that JLS had messed up by forgetting to include a collator in the packaging system it was offering to build. However, JLS had not messed up; the robot simply did not need a collator.

In Souser’s experience, quality also improves with vision-aided robotics. “We can do quality inspection on the product as well as the package,” he says. “Undersized or misshapen products or packages that leak or have missing labels are detectable with the same vision system that is guiding the robots.”

This QC capability even extends to misoriented products, which is a potential quality issue as well as a production issue. “ADCO’s primary use of vision in combination with robots is to check product orientation and spacing,” says Scott M. Reed, vice president of packaging machine builder ADCO Manufacturing Inc., Sanger, Calif. “For example, we have used vision to check the orientation and spacing of small, individually wrapped snack packs as they were fed into the robotic loading area.”

In this high-speed application, when the vision system sensed an improperly oriented product, it signaled the robot, which then picked the product and properly reoriented it. The vision system was also used to determine whether the product was right side up. If not, the robot picked the product and, instead of packing it, placed it on a tipper bar that righted it; the product was then recirculated back into the production flow.

“Lastly,” says Reed, “the vision system was used to check spacing.” If two or more products were inadvertently stacked, or they were too close to one another for the vision system to get a clear picture, the system would communicate this to the robot. “Instead of picking the products, the robot was signaled to let them pass to a reject station. The reject station would then receive a signal letting it know to discharge the product out of the production flow.”
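The decision flow Reed describes lends itself to a few lines of logic. The sketch below is purely illustrative; the field names, the orientation tolerance, and the action labels are assumptions, not ADCO’s actual interface.

```python
# Illustrative summary of the decision flow described above. Field names,
# the 5-degree tolerance and the action labels are assumptions, not ADCO's
# actual interface.
from dataclasses import dataclass

@dataclass
class Detection:
    x_mm: float             # position along the infeed conveyor
    angle_deg: float        # deviation from the nominal orientation
    right_side_up: bool
    clear_spacing: bool     # False if stacked or too close to a neighbor

def dispatch(d: Detection) -> str:
    if not d.clear_spacing:
        return "do not pick; let it pass to the reject station"
    if not d.right_side_up:
        return "pick and place on the tipper bar for recirculation"
    if abs(d.angle_deg) > 5.0:
        return "pick, reorient, then pack"
    return "pick and pack"

for d in [Detection(120, 2, True, True), Detection(185, 40, True, True),
          Detection(250, 1, False, True), Detection(310, 0.5, True, False)]:
    print(d.x_mm, "->", dispatch(d))
```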

“While robots are inherently flexible,” Reed stresses, “the addition of a vision system can maximize this flexibility. Robots without a vision system will generally perform an exacting ‘point-to-point’ task. However, with the application of vision guidance, a robot can be signaled to respond to variations based on camera input.”
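A minimal way to picture the distinction Reed draws: a point-to-point robot always returns to the same taught position, while a vision-guided robot corrects that position with the camera’s measured offset and rotation on every cycle. The numbers below are placeholders.

```python
# Placeholder numbers only: a point-to-point pick always uses the taught
# position; a vision-guided pick corrects it with the camera's measurement.
TAUGHT_PICK = (400.0, 200.0, 0.0)   # x (mm), y (mm), yaw (deg), taught offline

def vision_guided_pick(dx_mm, dy_mm, dtheta_deg):
    x, y, yaw = TAUGHT_PICK
    return (x + dx_mm, y + dy_mm, yaw + dtheta_deg)

print(TAUGHT_PICK)                          # fixed point-to-point target
print(vision_guided_pick(3.2, -1.5, 12.0))  # corrected target for this cycle
```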

“The big idea here is to utilize vision as a way to more directly and intelligently control production processes on the fly,” says Matt Lecheler, motion specialist for Beckhoff Automation LLC, Burnsville, Minn. “This sort of active engagement of the vision system with the manufacturing process is a leap beyond the traditional way of thinking. The vision system gets more involved in the automation system as a control input instead of a control result, making it possible to adjust machines without having to shut the system down for changeovers or troubleshooting.”

The communication needed to do this demands tight integration between the robot and the vision system. The traditional approach, by contrast, has been to bolt on a dedicated vision processor. Of course, “this means more specialized hardware, software, and slower networks (i.e., basic Ethernet) to add a vision system to the packaging machine,” says Lecheler. It’s a situation that can benefit the suppliers of those dedicated systems but impedes the more dynamic uses of vision discussed above.

Robot suppliers have gotten this message and are addressing the problem by integrating vision functions within the robot controller. Along with potential performance benefits, this is proving to be a major step forward in terms of speeding deployment, ease of use, and accompanying cost benefits. Still, as welcome a development as this has been, the robot/vision system still stands apart from the rest of the packaging control system.

Dick Motley, senior account manager for Fanuc Robotics America Corp., Rochester Hills, Mich., is well aware of this disconnect and sees the need to address this situation as one of “the major drivers affecting the packaging industry.” He sees a move toward “easier integration among control platforms, with users seeing a much more seamless connection.”

Like many others, he feels that this integration will be helped by the growing acceptance of the PackML state model. “For the end user, this model provides a more consistent ‘look and feel’ among pieces of equipment from different manufacturers with potentially different control platforms. This facilitates a more intuitive line control for operators, faster troubleshooting by maintenance personnel, and better data monitoring and collection. In theory, this also translates to faster startups, higher uptime over the equipment life and reduced training requirements for personnel.” 

Motley adds that, for the packaging OEM and integrator, PackML can provide a standard, reusable framework for machine control that can help reduce re-engineering costs related to controls and software through the use of standard tools and templates available from control vendors.
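As background, PackML defines a common vocabulary of machine states (Idle, Execute, Stopped, Aborted and so on) and the commands that move equipment between them. The sketch below walks through a simplified subset of that state model; it illustrates the concept and is not a vendor implementation, and the full ANSI/ISA-TR88.00.02 model also defines Holding/Held, Suspending/Suspended, Unholding and Unsuspending states that are omitted here for brevity.

```python
# Simplified subset of the PackML state model (ANSI/ISA-TR88.00.02); the full
# model also defines Holding/Held, Suspending/Suspended, Unholding and
# Unsuspending states, omitted here for brevity.
from enum import Enum, auto

class State(Enum):
    STOPPED = auto()
    RESETTING = auto()
    IDLE = auto()
    STARTING = auto()
    EXECUTE = auto()
    COMPLETING = auto()
    COMPLETE = auto()
    STOPPING = auto()
    ABORTING = auto()
    ABORTED = auto()
    CLEARING = auto()

# command -> {current state: next state}; "sc" is the automatic
# state-complete transition fired when a state finishes its work.
TRANSITIONS = {
    "reset": {State.STOPPED: State.RESETTING, State.COMPLETE: State.RESETTING},
    "start": {State.IDLE: State.STARTING},
    "stop":  {State.EXECUTE: State.STOPPING, State.IDLE: State.STOPPING},
    "abort": {s: State.ABORTING for s in State
              if s not in (State.ABORTING, State.ABORTED)},
    "clear": {State.ABORTED: State.CLEARING},
    "sc": {
        State.RESETTING: State.IDLE,
        State.STARTING: State.EXECUTE,
        State.EXECUTE: State.COMPLETING,
        State.COMPLETING: State.COMPLETE,
        State.STOPPING: State.STOPPED,
        State.ABORTING: State.ABORTED,
        State.CLEARING: State.STOPPED,
    },
}

def next_state(current: State, command: str) -> State:
    """Apply a command if it is valid in the current state; otherwise stay put."""
    return TRANSITIONS.get(command, {}).get(current, current)

machine = State.STOPPED
for cmd in ["reset", "sc", "start", "sc"]:   # bring the machine to Execute
    machine = next_state(machine, cmd)
    print(cmd, "->", machine.name)
```

Because every machine on the line exposes the same states and commands, an operator or a line controller can start, stop, and diagnose equipment from different vendors in the same way, which is the “look and feel” benefit Motley describes.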

Available today
Lest all this discussion of advanced robots with integrated vision capabilities seem merely like a prescription for future success, we should hark back to the beginning of this article and stress that the future is already beginning to appear. It was in evidence at the recent Automate 2013 trade show in Chicago, where attendees at the ABB palletizing demonstration could see some of this technology in action.

ABB robots were palletizing and depalletizing various configurations of corrugated cases, with the complete robot cell under the control of ABB’s recently introduced PalletPack 460 function package. Ancillary software made it possible to configure the complete palletizing system via the configuration wizard on the robot’s FlexPendant, eliminating the need for traditional robot programming. The software also provided a FlexPendant HMI (human-machine interface) for cell operation, and an integral ABB PLC (programmable logic controller) controlled the flow of product on the infeed and pallet conveyors as well as gripper operation. Gripper operation was preconfigured in the software, and simulation models could be transferred directly to the robot controller.

The ability to preconfigure, or simulate, is nothing new of course, but the increasing sophistication of simulation software is helping drive the deployment speed, flexibility, and production efficiencies that characterize an integrated control environment. Sumeet Vispute, robotic project development specialist for Schneider Packaging Equipment Co., Brewerton, N.Y., can testify to that. Speaking of PickPRO, the Fanuc simulation package, Vispute notes that it allows Schneider to simulate virtual Fanuc robots running PickTool, which, crucially, is the same software the actual robots employ. The simulation software gives him the ability to change system parameters such as conveyor speeds, indexing rates, grip-and-drop delays, payloads, approach vectors, and load balancing ratios; in short, the entire spectrum of robot operation.
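To give a sense of why those parameters matter, here is a toy throughput check, emphatically not PickPRO itself, that estimates whether a given number of robots can keep pace with an infeed rate under a single averaged cycle-time assumption. The formula and all figures are illustrative assumptions.

```python
# A toy throughput check, not PickPRO: can the robots keep up with the infeed
# under a single averaged cycle time? All figures and the formula are
# illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class CellParams:
    infeed_rate_ppm: float        # parts arriving per minute
    pick_move_s: float            # average robot move time per pick (s)
    grip_delay_s: float           # grip-and-drop dwell time (s)
    load_balance: List[float]     # share of the stream assigned to each robot

def capacity_ppm(p: CellParams) -> float:
    """Picks per minute one robot can sustain."""
    return 60.0 / (p.pick_move_s + p.grip_delay_s)

def keeps_up(p: CellParams) -> bool:
    """True if every robot's balanced share stays within its capacity."""
    return all(share * p.infeed_rate_ppm <= capacity_ppm(p)
               for share in p.load_balance)

cell = CellParams(infeed_rate_ppm=240, pick_move_s=0.55, grip_delay_s=0.25,
                  load_balance=[0.25, 0.25, 0.25, 0.25])
print(capacity_ppm(cell), keeps_up(cell))   # 75 picks/min per robot -> True
```

A real simulation package models robot kinematics, conveyor tracking, and pick windows rather than one averaged cycle time, which is precisely why tools of this kind are valuable for evaluating load balancing ratios and new robot models before committing to hardware.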

As an example of the real-world impact of this type of capability, Vispute says, “We recently sold a high-speed picking system made up of four robots handling 240 parts per minute. These were uniquely shaped parts packed together in an intricate pattern.” The Fanuc simulation software was used to model the system before quoting it. “The customer would now like to buy more copies of this system with some modifications to the load balancing ratios and pack patterns,” he adds. “Also, newer and faster robot models have been launched by Fanuc since the last system was built for this customer. We are using PickPRO to model these modifications and also to evaluate the new robot models for increased throughput.”

“Adding vision to this mix is where the industry stands to grow the most in the near future,” says Beckhoff’s Lecheler. He feels the chief benefits for both machine builders and end users are clear. “Lower system cost opens up the possibility to use more advanced robotic and vision systems on a much wider range of packaging machines. Also, relying on just one software platform and one network across these elements reduces the burden on engineering teams, simplifies support efforts and results in fewer points of failure.”
