
Collaborative Robots Learn to Collaborate

Accessible 3D vision unlocks the potential of machine learning for making our autonomous partners more humanlike.


To be truly collaborative, robots must be capable of more than working safely alongside human beings. Russell Toris, director of robotics at Fetch Robotics, says robots also need to act (and “think”) more like people.

This is particularly true of autonomous mobile robots (AMRs) like those manufactured by Fetch. Typically employed for material transport and data collection (such as counting inventory), these wheeled systems use vision sensors and navigation software to dynamically adapt to new environments and situations. Increasingly common in warehouses and distribution centers, this technology is likely to spread to other applications and industries, including our own. In fact, our January issue’s coverage of JIMTOF in Japan touched on the promise of machine-tending robot arms on wheels. Whatever the application, ensuring that a robot can safely occupy the same spaces as humans is an entirely different proposition from ensuring that its behavior does not make humans uncomfortable, a reaction that can hinder implementation and waste resources.

Mr. Toris cites the example of two people approaching each other from opposite directions in a narrow hallway. They typically will acknowledge each other in some way before passing, even if only by a nod or a glance. One or both will likely slow down, and perhaps step aside to allow the other a wider berth. A robot with a myopic focus on moving as efficiently as possible from point A to point B will not be nearly as considerate. It might not collide with the person striding toward it, but its movements will seem as cold as they are efficient, and possibly even threatening.

The robot is “aggressive.” The robot is “rude.” The robot is “acting drunk.” We cannot help but assign human traits to inanimate objects, particularly objects that act autonomously and purport to be our collaborators. This tendency influences our behavior, Mr. Toris says. Employees might lose time keeping a wary eye out for rampaging robots. They might even stop working entirely to observe odd behavior. Whatever the specifics of the situation, we are less likely to use any technology to its full potential, or even use it at all, if it evokes hesitation or intimidation.

Instead, what if AMRs could maintain a comfortable distance as they pass? What if they could differentiate between a person, a forklift and a pallet, and adjust their behavior accordingly? Robots may not be able to nod or glance, but what if they could use sound or light (say, a turn signal) to notify people of their intentions? Making behavior more natural and more predictable is a primary design philosophy at Fetch Robotics. “We design robots for people, not robots for robots,” Mr. Toris says.
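The idea of adjusting behavior by obstacle class can be sketched as a simple policy lookup. The class names, speeds and clearances below are invented for illustration; they are not Fetch Robotics’ actual control logic or parameters.

```python
# Hypothetical sketch: an AMR adjusts speed and lateral clearance based
# on what its vision system reports ahead. All values are illustrative.

# Policy table: (max speed in m/s, minimum lateral clearance in m)
BEHAVIOR = {
    "person":   (0.5, 1.0),   # slow down, give people a wide berth
    "forklift": (0.3, 1.5),   # forklifts move unpredictably; be cautious
    "pallet":   (1.0, 0.3),   # static object; pass closely at full speed
}

def plan_pass(detected_class: str) -> dict:
    """Return motion limits and a signaling action for a detected obstacle."""
    # Unrecognized objects get the most conservative treatment.
    speed, clearance = BEHAVIOR.get(detected_class, (0.2, 1.5))
    return {
        "max_speed": speed,
        "clearance": clearance,
        # Signal intent (light or sound) only for humans, the way a
        # driver might use a turn signal.
        "signal_intent": detected_class == "person",
    }

print(plan_pass("person"))
# {'max_speed': 0.5, 'clearance': 1.0, 'signal_intent': True}
```

A real system would drive these limits from a trained classifier rather than a hand-written table, but the lookup makes the design goal concrete: the same path can be traversed differently depending on who, or what, is in the way.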

This is possible through the intersection of two inherently intertwined technologies. The first is 3D vision systems, which are more affordable than ever due to advances in seemingly unrelated fields like autonomous vehicles, Mr. Toris says. Although the 2D laser sensors used for most AMRs are extremely accurate and capable of detecting distant objects, their vision is limited to a shin’s-eye view of the most basic geometric shapes. Add 3D cameras to complement the 2D sensors, as Fetch has done, and the robots can paint a more comprehensive picture of their environments. More robust visual data is critical not only for distinguishing objects, but also for fueling the machine-learning algorithms that enable the robots to determine how best to respond to those objects.
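The complementary roles of the two sensor types can be shown with a toy example: a planar 2D laser only sees points near its scan height, while a 3D camera catches anything inside the robot’s height envelope, such as an overhanging tabletop. The heights and point cloud below are assumptions for illustration, not Fetch’s perception stack.

```python
# Toy illustration (not Fetch's perception code): why a shin-height
# 2D laser misses overhanging obstacles that a 3D camera catches.

LIDAR_HEIGHT = 0.2   # the 2D laser scans a single plane near shin height (m)
ROBOT_HEIGHT = 1.0   # anything below this height can collide with the robot (m)

def obstacles_2d(points):
    """A planar 2D laser only 'sees' points near its scan plane."""
    return [p for p in points if abs(p[2] - LIDAR_HEIGHT) < 0.05]

def obstacles_3d(points):
    """A 3D camera sees any point inside the robot's height envelope."""
    return [p for p in points if 0.0 <= p[2] <= ROBOT_HEIGHT]

# Point cloud of (x, y, z) in meters: a pallet edge on the floor and a
# tabletop overhanging at 0.8 m.
cloud = [
    (2.0, 0.0, 0.2),   # pallet edge at shin height: both sensors see it
    (1.5, 0.5, 0.8),   # tabletop overhang: invisible to the 2D laser
]

print(len(obstacles_2d(cloud)))  # 1 -- the laser misses the tabletop
print(len(obstacles_3d(cloud)))  # 2 -- the camera catches both
```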

Fetch must teach its robots in order for them to learn, and teaching requires masses of data. To collect that data, the company has constructed a mock warehouse to train AMRs at its facility in San Jose, California. Mobile robots have been navigating the aisles for four years now, filtering rich vision-sensor feedback through artificial neural networks (ANNs) to distinguish obstacles and determine not only how to navigate around them, but how to navigate around them appropriately. These ANNs consist of layer upon layer of interconnected, computerized nodes, creating a vast web that filters data from the robot’s sensors (2D lasers complemented by 3D cameras). Each time the robot identifies and/or responds correctly to an obstacle, individual nodes are weighted accordingly. This makes the same outcome more likely in the future, even when the ANN is tasked with filtering novel sensor data from a novel environment.
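The weighting idea can be reduced to its simplest possible form: a single perceptron-style node whose weights are nudged whenever it misclassifies, so the correct outcome becomes more likely next time. The features and labels below are invented for illustration; a real ANN stacks many layers of such nodes and uses far richer sensor data.

```python
# Minimal sketch of weight adjustment in a single artificial neuron.
# Features and labels are hypothetical; a production ANN is far deeper.

def predict(weights, features):
    """Fire (1) if the weighted sum of sensor features crosses a threshold."""
    s = sum(w * f for w, f in zip(weights, features))
    return 1 if s > 0 else 0

def train(samples, epochs=20, lr=0.1):
    """Perceptron rule: shift weights toward outcomes that were correct."""
    weights = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for features, label in samples:
            error = label - predict(weights, features)
            # Adjust each weight in proportion to its input and the error,
            # making the correct response more likely in the future.
            weights = [w + lr * error * f for w, f in zip(weights, features)]
    return weights

# Toy sensor features: (bias, height in m, moving?). Label 1 = person, 0 = pallet.
samples = [
    ((1.0, 1.70, 1.0), 1),
    ((1.0, 1.80, 1.0), 1),
    ((1.0, 0.15, 0.0), 0),
    ((1.0, 0.12, 0.0), 0),
]
w = train(samples)
print(predict(w, (1.0, 1.75, 1.0)))  # 1 -- a novel tall, moving object reads as a person
```

The key property the article describes survives even at this scale: after training, the node generalizes to sensor readings it has never seen, because the weights encode the pattern rather than the examples.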

Four years’ worth of data from the mock warehouse ensures that the latest AMRs will benefit from all the experience of their predecessors, Mr. Toris says. Four years from now, the dataset will be even more robust, and new machine-learning techniques likely will be available. Whatever the future of AMRs in CNC machine shops, it is well worth considering how robot design might change as a result of both technological developments and changes in thinking about the nature of automation.

