Robots learn basic 'body language' to work together


Friday, 30 September, 2016
Researchers at the KTH Royal Institute of Technology in Sweden have completed a project that enables robots to work cooperatively with one another. For us humans, sometimes all it takes to get help from someone is a wave or a pointed finger, and now robots can do much the same.

Dimos Dimarogonas, an associate professor at KTH and project coordinator for RECONFIG, said the research project has developed protocols that enable robots to ask for help from each other and to recognise when other robots need assistance — and change their plans accordingly.

“Robots can stop what they’re doing and go over to assist another robot which has asked for help,” said Dimarogonas. “This will mean flexible and dynamic robots that act much more like humans — robots capable of constantly facing new choices and that are competent enough to make decisions.”

As autonomous machines take on more responsibilities, they are bound to encounter tasks that are too big for a single robot. Shared work could include lending an extra hand to lift and carry something, or holding an object in place, but Dimarogonas said the concept can be scaled up to include any number of functions in a home, a factory or other kinds of workplaces.

The project was completed in May 2016, with project partners at Aalto University in Finland, the National Technical University of Athens in Greece and the École Centrale Paris in France.

Dimarogonas said that common perception among the robots is one key to this collaborative work.

“The visual feedback that the robots receive is translated into the same symbol for the same object,” he said. “With updated vision technology they can understand that one object is the same from different angles. That is translated to the same symbol one layer up to the decision-making — that it is a thing of interest that we need to transport or not. In other words, they have perceptual agreement.”
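The "perceptual agreement" Dimarogonas describes can be sketched as a shared grounding table: each robot recognises an object from its own angle and produces a local label, and those labels are mapped to one common symbol used at the decision-making layer. The sketch below is a minimal illustration of that idea; the names and labels are assumptions for illustration, not taken from the RECONFIG project.

```python
# Hypothetical sketch of "perceptual agreement": two robots map their own
# viewpoint-dependent observations of an object onto one shared symbol.
# All labels and names here are illustrative, not from the project.

# A shared grounding table maps each robot's local percept to a common symbol.
GROUNDING = {
    ("robot_a", "red_box_front"): "BOX_1",
    ("robot_b", "red_box_side"):  "BOX_1",
    ("robot_a", "blue_cup_top"):  "CUP_1",
}

def ground(robot: str, local_label: str) -> str:
    """Translate a robot's local percept into the shared symbol."""
    return GROUNDING[(robot, local_label)]

# Two robots viewing the same box from different angles agree on the symbol,
# so the planning layer can treat it as one object of interest.
assert ground("robot_a", "red_box_front") == ground("robot_b", "red_box_side")
```

With a mapping like this in place, the decision layer can reason purely over shared symbols ("transport BOX_1 or not") rather than raw, viewpoint-dependent vision data.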

In another demonstration, two robots carry an object together. One leads the other, which senses what the lead robot wants by the force it exerts on the object, he said.

“It’s just like if you and I were carrying a table and I knew where it had to go,” he explained. “You would sense which direction I wanted to go by the way I turn and push, or pull.”
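One simple way to realise this kind of force-guided following is admittance control: the follower measures the force the leader transmits through the shared object and yields in that direction. The sketch below illustrates the idea under assumed gain and time-step values; the function and parameter names are illustrative, not the project's actual controller.

```python
# Minimal admittance-control sketch of the leader-follower carry described
# above: the follower moves its grip point along the direction of the force
# the leader exerts through the object. Gain and dt are assumed values.

def follower_step(sensed_force, position, gain=0.05, dt=0.1):
    """Advance the follower's grip point one time step.

    sensed_force, position: (x, y) tuples. The follower's velocity is
    proportional to the sensed force, so it "gives way" toward where the
    leader is pushing or pulling. Returns the new (x, y) position.
    """
    vx = gain * sensed_force[0]
    vy = gain * sensed_force[1]
    return (position[0] + vx * dt, position[1] + vy * dt)

# The leader steadily pushes the object toward +x; the follower yields,
# and its grip point drifts in that direction step by step.
pos = (0.0, 0.0)
for _ in range(10):
    pos = follower_step(sensed_force=(2.0, 0.0), position=pos)
print(pos)  # the grip point has drifted toward +x, y unchanged
```

Just as in the table-carrying analogy, no explicit messages are exchanged: the force through the object is the only channel of communication.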

The important point is that all of these actions take place without human interaction or help, he said.

“This is done in real time, autonomously,” he said. The project also uses a novel communication protocol that sets it apart from other collaborative robot concepts. “We minimise communication. There is a symbolic communication protocol, but it’s not continuous. When help is needed, a call for help is broadcast and a helper robot brings the message to another robot. But it’s a single shot.”
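The "single shot" protocol he describes can be sketched as a one-off broadcast: a robot sends one help request, the first available peer commits to it and changes its plan, and no continuous message stream follows. The toy model below is an assumption-laden illustration of that pattern; the class and message names are invented for the example.

```python
# Toy sketch of the single-shot help protocol described above: one broadcast,
# one commitment, no continuous communication. Names are illustrative only.

class Robot:
    def __init__(self, name):
        self.name = name
        self.busy = False
        self.task = None

    def request_help(self, task, peers):
        """Broadcast a single help message; the first free peer commits."""
        for peer in peers:
            if not peer.busy:
                peer.busy = True
                peer.task = task   # the helper changes its own plan
                return peer        # single shot: no further messages sent
        return None                # no helper is available right now

a, b, c = Robot("a"), Robot("b"), Robot("c")
b.busy = True                      # b is occupied with its own task
helper = a.request_help("hold_object", peers=[b, c])
print(helper.name)  # → c
```

Keeping communication to a single broadcast, rather than a continuous stream, is what distinguishes this scheme from collaborative-robot designs that coordinate by constantly exchanging state.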

Image: Two off-the-shelf robots used to demonstrate how robots can pick up each other’s signals for assistance, and even set aside their own tasks in order to ‘lend a hand’. (Photo: Peter Larsson)

