Collaborative Robot Safety

July 23, 2020
How Veo Robotics uses speed and separation monitoring to make industrial robots collaborative.

Earlier this year, Automation World met with Veo Robotics co-founders Clara Vu and Patrick Sobalvarro to discuss their approach to collaborative robots (cobots). As described in the article, Making Industrial Robots Collaborative, Veo Robotics approaches cobot development by making standard industrial robots capable of working alongside humans, rather than by building the kind of specialized collaborative robots that make up the bulk of the cobot market.

In that article, we described Veo Robotics’ FreeMove system, which uses multiple camera sensors and an algorithmic computing platform to transform industrial robots into cobots. Now, Clara Vu is offering more insights into Veo Robotics’ application of the ISO speed and separation monitoring standard to achieve this.

Cobot standards
The first thing to understand is that speed and separation monitoring (SSM) is one of the four collaborative operation methods defined in the ISO 10218 robot safety standards and detailed in ISO/TS 15066. The other three are safety-rated monitored stop, hand guiding, and power and force limiting.

Veo Robotics uses SSM because it works with standard industrial robots and has fewer limitations on end-effectors, speed, and payloads, according to Vu. “With SSM, no contact is allowed between the robot and human while the robot is moving. A moving robot is assumed to be hazardous; a stationary robot is assumed to be safe. SSM requires a protective separation distance (PSD) between the robot and human so that it is always possible to bring the robot to a stop before contact with a human. The PSD must take into account the time the robot takes to stop and the distance it will travel during that time, as well as the distance that the humans can move while the robot is stopping. SSM is fundamentally a perception problem because it relies on understanding where humans and robots are in the scene. The system needs to identify the position of each robot joint as well as all the places the robot could reach before it is brought to a stop. It must also understand the location of any humans in the proximity of the robot and where they could move.”
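
The stopping-distance reasoning Vu describes corresponds to the protective separation distance formula spelled out in ISO/TS 15066. The following sketch shows the shape of that calculation; the 1.6 m/s figure is the walking speed commonly assumed for an approaching person, and every other value in the example is a placeholder rather than a Veo Robotics figure.

    # Minimal sketch of the protective separation distance (PSD) used in
    # speed and separation monitoring, following the ISO/TS 15066 form
    # S = S_h + S_r + S_s + C + Z_d + Z_r. All numbers in the example
    # call are illustrative placeholders, not Veo Robotics figures.

    def protective_separation_distance(
        human_speed_mps,       # v_h: assumed human approach speed
        robot_speed_mps,       # v_r: robot speed toward the human
        reaction_time_s,       # T_r: sensing and processing latency
        stopping_time_s,       # T_s: time for the robot to come to rest
        stopping_distance_m,   # S_s: distance the robot travels while braking
        intrusion_margin_m,    # C: intrusion distance margin
        human_uncertainty_m,   # Z_d: uncertainty in the human's measured position
        robot_uncertainty_m,   # Z_r: uncertainty in the robot's position
    ):
        # Distance the human can cover while the system reacts and the robot stops
        s_h = human_speed_mps * (reaction_time_s + stopping_time_s)
        # Distance the robot travels before braking even begins
        s_r = robot_speed_mps * reaction_time_s
        return (s_h + s_r + stopping_distance_m + intrusion_margin_m
                + human_uncertainty_m + robot_uncertainty_m)

    # Example with the commonly assumed 1.6 m/s human walking speed;
    # every other number is a placeholder.
    psd = protective_separation_distance(1.6, 1.0, 0.1, 0.3, 0.4, 0.1, 0.1, 0.05)
    print(f"Full robot speed is allowed only beyond {psd:.2f} m")

Note that heavier payloads and higher speeds lengthen both the stopping time and the stopping distance, which in turn pushes the required separation distance outward.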

To create a safe perception system, Veo Robotics’ FreeMove system uses 3D time-of-flight sensors positioned on the periphery of the work cell to capture “rich image data of the entire space,” said Vu. “The architecture of the sensors ensures reliable data with novel dual imagers that observe the same scene so the data can be validated at a per-pixel level. With this approach, higher level algorithms will not need to perform additional validation. This 3D data can then be used to identify key elements in the work cell, including the robot, workpieces, and humans.”
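
Vu did not detail the validation logic, but cross-checking two imagers that observe the same scene can be sketched roughly as follows; the registration to a shared pixel grid, the agreement tolerance, and the NaN flag for unreliable pixels are all illustrative assumptions, not Veo Robotics’ actual design.

    import numpy as np

    # Hypothetical per-pixel cross-check between two depth imagers that view
    # the same scene (assumed already registered to a common pixel grid).
    # The 2 cm agreement tolerance and the use of NaN to mark unreliable
    # pixels are assumptions for illustration only.

    def validate_depth(depth_a, depth_b, tolerance_m=0.02):
        """Keep only pixels where the two float depth maps (in meters) agree."""
        validated = np.full_like(depth_a, np.nan)
        agree = np.abs(depth_a - depth_b) <= tolerance_m
        # Where the imagers agree, keep an averaged measurement; everywhere
        # else the pixel stays NaN so downstream logic treats it conservatively.
        validated[agree] = 0.5 * (depth_a[agree] + depth_b[agree])
        return validated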

Safety system requirements

Beyond the FreeMove system’s need for reliable data, Vu noted that the data must be “processed with safety in mind. Most algorithms that use depth images from active IR (infrared) sensing identify regions of space as either empty or occupied. However, this is inadequate for a safety system because a safety system requires that humans be sensed affirmatively; therefore, a part of a human body not showing up in sensor data does not mean there isn’t a human there.” 

Because dark fabrics may not always be accurately detected by active IR sensors, the FreeMove system was designed to classify each volume of space in the work cell as one of three states: empty, meaning the sensors can see something behind it; occupied; or unknown.

“When examining volumes of space, if the sensors do not get a return from a space but cannot see through the space, that space is classified as unknown and treated as occupied until the system can determine it to be otherwise,” explained Vu.
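
In rough pseudocode terms, the empty/occupied/unknown logic Vu describes might look like the following per-ray sketch; the function names and structure are hypothetical, not taken from the FreeMove software.

    from enum import Enum
    import math

    class VolumeState(Enum):
        EMPTY = "empty"        # the sensor saw a return behind this volume
        OCCUPIED = "occupied"  # a return landed inside this volume
        UNKNOWN = "unknown"    # no usable return, or the view is blocked

    # Illustrative only: classify one volume of space along a single sensor
    # ray, given the distances to the volume's near and far bounds and the
    # measured depth (math.nan when the sensor got no return, e.g. from
    # dark fabric).
    def classify_volume(near_m, far_m, measured_depth_m):
        if math.isnan(measured_depth_m):
            return VolumeState.UNKNOWN      # no return at all
        if measured_depth_m < near_m:
            return VolumeState.UNKNOWN      # something blocks the view in front
        if measured_depth_m <= far_m:
            return VolumeState.OCCUPIED     # the return is inside the volume
        return VolumeState.EMPTY            # the sensor saw past the volume

    # For SSM, anything not positively confirmed empty is treated as occupied.
    def treat_as_occupied(state):
        return state is not VolumeState.EMPTY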

She noted that this approach also addresses static and dynamic occlusions. “In a work cell with a standard-size six-axis robot moving workpieces around, there will always be some volumes of space that are occluded from or outside the field of view of all the sensors, either temporarily or permanently,” she said. “Those spaces could, at some point in time, contain a human body part. For example, a human could be reaching their arm into a space near the robot that none of the sensors can observe at that moment. So they are also treated as occupied for SSM purposes.”
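
Extended across the several sensors surrounding the cell, the same conservative principle applies: a volume that no sensor can positively confirm as empty, whether occluded or simply outside every field of view, stays unknown and is handled as occupied. A minimal sketch of such a fusion rule, assuming a simple per-sensor verdict, might be:

    # Hypothetical fusion across the cell's multiple sensors. Each element of
    # per_sensor_states is one sensor's verdict for the same volume of space:
    # "empty", "occupied", or "unknown" (unknown also covers volumes occluded
    # from, or outside the field of view of, that sensor). The conservative
    # rule is an illustrative assumption consistent with the article.
    def fuse_volume_states(per_sensor_states):
        if any(s == "occupied" for s in per_sensor_states):
            return "occupied"
        if any(s == "empty" for s in per_sensor_states):
            return "empty"
        return "unknown"  # no sensor could confirm the volume either way

    # A volume the fusion cannot prove empty is kept inside the protective
    # separation distance calculation as if a person could be there.
    print(fuse_volume_states(["unknown", "empty", "unknown"]))    # -> "empty"
    print(fuse_volume_states(["unknown", "unknown", "unknown"]))  # -> "unknown"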

About the Author

David Greenfield, Editor in Chief

David Greenfield joined Automation World in June 2011. Bringing a wealth of industry knowledge and media experience to his position, David’s contributions can be found in AW’s print and online editions and custom projects. Earlier in his career, David was Editorial Director of Design News at UBM Electronics, and prior to joining UBM, he was Editorial Director of Control Engineering at Reed Business Information, where he also worked on Manufacturing Business Technology as Publisher. 
