Take a safe approach to collaborative robots
Technology enables humans and robots to work together
By Carole Franklin
Not many years ago, the idea of collaborative robotics, with a robot and human worker sharing an active workspace, was met with strong skepticism. To ensure the safety of human workers, a variety of safeguarding systems prevented direct physical contact between the robot and its operator while the system was operational. Safeguarding might take the form of a physical barrier, a light curtain that would shut down the robot system if the operator intruded into the safeguarded space, or a variety of other technologies.
The conversation has changed, however. Today, as long as we can ensure that it will not cause pain or injury, we are comfortable with the idea of a robot or its tooling or workpiece touching a human. And technologies have advanced to the point where we can be more confident of our ability to prevent pain or injury from such contact. The change in available technology and attitude has helped usher in an era of automation that enables humans and robots to work more closely together, while still being safe.
This possibility of safe, close proximity is what we mean by collaborative robotics. Collaborative robotic applications are intended to optimize the use of human workers and robots, using both to their greatest advantage. The capability becomes important when attempting to automate processes that include delicate or compliant materials, for instance, which are difficult for robots to handle. In a collaborative robot system, we gain the benefit of the strength and precision of the robot, together with the creative problem solving, flexibility, and sensitivity of the human operator.
This approach is certainly gaining momentum. An ABI Research study predicts the collaborative robotics market will surge to $1 billion by 2020, populating factories and businesses with more than 40,000 collaborative robots. While a key selling point for these robots is their ability to work side by side with humans, typically without fencing or guarding, care must still be taken to ensure safety. What the industry calls a "collaborative robot" (sometimes termed a "cobot") is simply one that is designed for use in a collaborative workspace. These robots are designed to have safe contact with humans through the implementation of safety features of the robot or the control system, such as power and force limiting (PFL). These types of robots are typically made from lightweight materials, have force and torque sensing in their joints, and may have soft, padded skins or rounded corners.
But regardless of how a robot was designed or marketed, its actual use might not be safe for collaborative operation if appropriate risk assessments have not been performed, or if the workspace has not been carefully planned and integrated. Some tasks are simply not well suited for collaborative operation, even if the robot performing the task is power and force limited and is marketed as a collaborative robot.
For example, it is important to remember that the robot arm by itself cannot do any work. The robot system or workstation also includes the end effector, the workpiece, and the presence of other robots or equipment in a cell. All these factors and more must be considered when companies plan for a safe, collaborative robot system. A robot that is operating a welding torch or is moving razor-sharp sheets of metal presents significant opportunities for injury if people are in close proximity. In this example, the robot system as a whole, including the workpiece, end effector, and so on, is not appropriate for collaborative operation, regardless of whether the robot arm itself is a "collaborative" type.
When using a robot designed for collaborative use, safety standards require companies to complete a risk assessment and mitigate any risks identified in the system. The highly anticipated technical specification for collaborative robotics was released in February 2016. ISO/TS 15066:2016 Robots and Robotic Devices - Collaborative Robots provides data-driven guidelines for designers, integrators, and users of human-robot collaborative systems on how to evaluate and mitigate risks. (The full technical specification is available at the Robotic Industries Association [RIA] bookstore.)
Scott Fetzer Electrical Group in Tennessee has installed Universal Robots’ UR5 and UR10 collaborative robot arms on the assembly line. The robots work in tandem with employees, picking up parts at the end of the line for wire cutting and outbound conveyor placement.
Four methods of collaborative operation
Under the ANSI/RIA 15.06 and ISO 10218 harmonized robot safety standards and the new TS 15066, there are four methods of collaborative operation that reflect different use scenarios:
- Safety-rated monitored stop
- Hand guiding
- Speed and separation monitoring
- Power and force limiting
These tend to be the most misunderstood aspects of human-robot collaboration. It is important to gain a thorough understanding of what each collaborative method requires. For instance, a safety-rated monitored stop requires that the robot does not move at all if a person enters the shared space. The benefit is a quicker restart after the human leaves, compared to a noncollaborative system. But in this case, it is not a situation in which human and robot are working together at the same time and in the same space, which is what most people think of when they think of "collaborative robots."
Hand guiding, meanwhile, closely resembles a common method of "teaching" the robot its tasks. When used to describe a type of collaborative operation, however, hand guiding indicates a condition in which the robot and person occupy a shared space and the robot moves only when it is under the direct control of the person.
In speed and separation monitoring, both the robot and the person can be present in the space, but if the distance between them becomes too small, the robot first slows and then stops; once stopped, the situation is effectively a safety-rated monitored stop. In power and force limiting, there can be contact between the person and the robot, but the robot's power and force are limited and the robot is sufficiently padded, so that any contact causes neither pain nor injury. It is also possible to combine some or all of these four methods of collaborative operation in one robot system.
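The gating logic implied by the four methods can be sketched as a simple decision function. This is an illustrative sketch only: the mode names follow the standards, but the function, its parameters, and the conditions are simplified assumptions, and a real system enforces these rules in safety-rated hardware, not application code.

```python
from enum import Enum, auto

class CollabMode(Enum):
    """The four collaborative methods named in ISO 10218 / TS 15066."""
    SAFETY_RATED_MONITORED_STOP = auto()
    HAND_GUIDING = auto()
    SPEED_AND_SEPARATION_MONITORING = auto()
    POWER_AND_FORCE_LIMITING = auto()

def robot_may_move(mode,
                   human_in_workspace=False,
                   hand_guide_active=False,
                   separation_m=0.0,
                   min_separation_m=0.0,
                   pfl_certified=False):
    """Illustrative gating: may the robot move right now under this mode?"""
    if mode is CollabMode.SAFETY_RATED_MONITORED_STOP:
        # Robot must hold a monitored stop while a person shares the space.
        return not human_in_workspace
    if mode is CollabMode.HAND_GUIDING:
        # Robot moves only under the operator's direct control.
        return hand_guide_active
    if mode is CollabMode.SPEED_AND_SEPARATION_MONITORING:
        # Robot slows, then stops, as the protective distance is violated.
        return separation_m >= min_separation_m
    if mode is CollabMode.POWER_AND_FORCE_LIMITING:
        # Contact is permitted because power and force are limited by design.
        return pfl_certified
    return False
```

As the article notes, a single system may combine several of these modes; in that case each active mode's condition would be checked in turn.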
The new TS 15066 specification includes formulas for calculating the protective separation distance for speed and separation monitoring. But perhaps the most interesting part of the technical specification is Annex A. It contains guidance on pain threshold limits for various parts of the body, for use when designing power- and force-limiting applications. These pain thresholds were established by a study from the University of Mainz, Germany, using male and female volunteer human test subjects of a variety of ages, sizes, and occupations. The data can be used to set limits on levels of power and force used by the collaborative robot system or application.
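The simplified constant-speed form of the protective separation distance can be sketched as follows. The structure of the formula follows the technical specification, but every default number here (robot speed, reaction time, stopping distance, uncertainties) is an illustrative assumption; real values must come from the robot's safety data and the application's risk assessment. The 1.6 m/s operator speed is the walking-speed assumption commonly taken from ISO 13855.

```python
def protective_separation_distance(
    v_human=1.6,     # operator approach speed, m/s (ISO 13855 walking-speed assumption)
    v_robot=1.0,     # robot speed toward the operator, m/s (example value)
    t_reaction=0.1,  # system reaction time, s (example value)
    t_stop=0.3,      # robot stopping time, s (example value)
    s_stop=0.2,      # robot stopping distance, m (example value)
    c=0.1,           # intrusion distance, m (example value)
    z_d=0.05,        # operator position uncertainty, m (example value)
    z_r=0.05,        # robot position uncertainty, m (example value)
):
    """Simplified constant-speed form of the TS 15066 separation distance:

        S_p = v_h * (t_r + t_s) + v_r * t_r + s_s + C + Z_d + Z_r
    """
    s_human = v_human * (t_reaction + t_stop)  # operator travel while robot reacts and stops
    s_react = v_robot * t_reaction             # robot travel during its reaction time
    return s_human + s_react + s_stop + c + z_d + z_r
```

With these example inputs the function returns about 1.14 m: if the person comes closer than that, the robot must already be stopping.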
ISO/TS 15066 and its US adoption, RIA TR R15.606, which explain safety requirements specific to collaborative robots and robot systems, establish pain thresholds that guide the appropriate use of safeguards and protective devices.
Risk assessment – Application, not robot
The most important aspect of any collaborative robot integration is a risk assessment. But it is important to remember that when assessing risk, the application, not the robot, is the main concern. In fact, the standard document rarely uses the term "robot." Instead, it discusses collaborative work cells or collaborative applications: all the elements inside the cell, including the cables, jigs, clamps, the robot, and the gripper.
If the application requires somewhat higher force or power than what is stated in the document, it does not necessarily mean the application is unsafe. The technical specification relates to pain, while ISO 10218 requires that no injury occur, and there is a difference between pain and injury. Tests could show that even if the impact exceeds the values stated in 15066, the application may still be safe if it can be proven that the robot cannot hurt or injure people in those circumstances.
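The pain-versus-injury distinction suggests a two-tier check in validation: contact below the pain-onset limit passes outright, while contact above it is not automatically unsafe but demands further evidence. A minimal sketch, using hypothetical body-region limits (the real quasi-static pressure and force limits come from ISO/TS 15066 Annex A and must be confirmed by measurement):

```python
# Hypothetical quasi-static force limits (N) by body region, for illustration
# only; they stand in for the ISO/TS 15066 Annex A data, which must be
# consulted directly and confirmed by measurement during validation.
PAIN_ONSET_FORCE_N = {
    "hand_finger": 140.0,  # placeholder value, not an official Annex A figure
    "lower_arm": 160.0,    # placeholder value
}

def assess_contact(body_region, measured_force_n, limits=PAIN_ONSET_FORCE_N):
    """Classify a measured quasi-static contact force against a region limit.

    Exceeding the pain-onset limit does not automatically mean injury; per
    the article, it means the application needs further proof that no
    injury can occur in those circumstances.
    """
    limit = limits[body_region]
    if measured_force_n <= limit:
        return "within pain-onset limit"
    return "exceeds pain-onset limit: further validation required"
```

In practice the measured quantity is taken with a force-pressure measurement device at each foreseeable contact point identified in the risk assessment.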
Another common misconception is that if the robot is "inherently safe," then the operation is safe. The term "inherently safe" is similar to the term "collaborative robot": it describes built-in safety features of the robot's design. Again, no matter how "safe" or "collaborative" your robot arm, it needs to be assessed as integrated into a complete robot system, and the system as a whole may not be safe for collaborative use. For instance, if the operation requires your robot to manipulate sharp objects, then it is not safe to have a human beside it, no matter how small, rounded, or padded the robot arm itself might be, without additional protective safety measures. Another case is a robot handling a heavy object, which could cause injury if it were dropped or if it became a projectile at higher speeds.
These issues are covered in the ANSI-registered technical report RIA TR R15.306-2016, Task-based Risk Assessment Methodology. TR 306, updated in 2016, describes one risk assessment method that complies with the requirements of the 2012 R15.06 standard.
The FANUC CR-35iA collaborative robot has six-axis articulation and a 35-kg payload. In this palletizing stacking operation, its soft cover and force sensors protect workers who are in direct contact with the robot for training or operation.
Gripper safety guidelines still to come
Although the ISO committee has such guidelines in the works, there are currently no specific safety guidelines for robot end effectors or end-of-arm tooling in collaborative applications. In the interim, designers and integrators should follow the guidance in TS 15066, such as the requirement that an operator must not be trapped by the robot under any circumstances. If there is no power to the robot and a person is trapped, the person must be able to escape by applying minimal force to the robot to free the trapped part of the body. This applies to the gripper as well; for instance, if a person's fingers are caught between the gripper jaws, he or she must be able to escape from the jaws to avoid danger, such as a fire.
A study of pain thresholds for PFL applications was done at the University of Mainz in Germany. It covered 100 human test subjects of both genders and a wide range of ages and body dimensions.
Sources: ISO/TS 15066:2016, Annex A and www.dguv.de/ifa/fachinfos/kollaborierende-
Annex A: “The Body Model” incorporates important data from the study, with maximum permissible pressure values that represent the 75th percentile.
What about cybersecurity?
With the rise of Industry 4.0 and the Industrial Internet of Things, robots and other automation equipment are increasingly connected to each other and to other computer systems, networks, and applications. And with continued news of hackers taking control of financial or industrial systems, medical devices, and vehicles, we are increasingly aware of the tight connection between security and safety, not to mention the need to protect sensitive company data collected by automated systems. Now that robots are no longer isolated devices, serious information technology concerns are arising.
There is an entire body of standards describing requirements for cybersecurity developed through decades of experience with software. For example, a good place to start is IEC 62443, a set of standards describing cybersecurity in an industrial setting. RIA will offer at least one presentation on cybersecurity and industrial robots at this year's National Robot Safety Conference, set for 10-12 October 2017, in Pittsburgh, Penn.
New standard provides data-driven safety guidance to manage risk
When robots work alongside humans, companies have a responsibility to ensure that the application does not put a human in danger. Until the release of ISO/TS 15066, robot system suppliers and integrators only had general information about requirements for collaborative systems. Now they have the specific, data-driven safety guidance they need to evaluate and control risks.