MIT researchers develop robotic assistants for factory workers

Monday, 18 June, 2012

Wish you had a robot to help with repetitive tasks at work? Researchers at MIT are working on it.

Professor Julie Shah, who leads the Interactive Robotics Group in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), and her colleagues have devised an algorithm that enables a robot to quickly learn an individual worker’s preferences for how a task should be carried out and adapt accordingly.

The team tested the algorithm on spar assembly, the process of building the main structural element of an aircraft’s wing.

“If the robot can provide tools and materials so the person doesn’t have to walk over to pick up parts and walk back to the plane, you can significantly reduce the idle time of the person,” said Shah.

“It’s really hard to make robots do careful refinishing tasks that people do really well. But providing robotic assistants to do the non-value-added work can actually increase the productivity of the overall factory.”

The process of spar assembly can be highly individualised: workers can order the same set of tasks in different ways and still achieve the same outcome. The MIT team developed a decision tree, with each branch representing a choice the worker might make. For example, a worker may apply sealant to a hole, insert a bolt and hammer it in before moving on, or apply sealant to every hole before hammering in any of the bolts.

“If the robot places the bolt, how sure is it that the person will then hammer the bolt or just wait for the robot to place the next bolt?” said Shah. “There are many branches.”

The team trained a robot to observe a worker’s chain of preferences then adapt to that person’s work habits, either applying sealant or fastening a bolt accordingly.
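The article does not spell out how the algorithm works internally, but the basic idea of learning a worker’s preferred ordering and choosing a complementary robot action can be sketched with a simple first-order transition model. The sketch below is illustrative only, assuming the robot can recognise discrete worker actions: the action names, the ROBOT_SUPPORT mapping and the counting scheme are all invented for the example and are not the MIT team’s actual method.

```python
from collections import defaultdict


class PreferenceModel:
    """Toy first-order model of a worker's preferred task ordering.

    Counts action-to-action transitions observed in past assembly
    sequences and predicts the most likely next action. Illustrative
    only; the article does not describe the MIT algorithm in detail.
    """

    def __init__(self):
        # counts[a][b] = number of times action b followed action a
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, sequence):
        """Record one observed work sequence, e.g. from a past build."""
        for current, nxt in zip(sequence, sequence[1:]):
            self.counts[current][nxt] += 1

    def predict_next(self, last_action):
        """Return (most likely next action, estimated probability)."""
        options = self.counts.get(last_action)
        if not options:
            return None, 0.0
        total = sum(options.values())
        action, count = max(options.items(), key=lambda kv: kv[1])
        return action, count / total


# Hypothetical supporting moves the robot could make for each predicted
# worker action (names invented for illustration).
ROBOT_SUPPORT = {
    "apply_sealant": "hand over the sealant gun",
    "insert_bolt": "place the next bolt within reach",
    "hammer_bolt": "hold the part steady",
}

if __name__ == "__main__":
    # Worker A seals every hole first, then fastens all the bolts.
    batch_worker = PreferenceModel()
    batch_worker.observe(["apply_sealant"] * 3 + ["insert_bolt", "hammer_bolt"] * 3)

    # Worker B alternates: seal, bolt, hammer, one hole at a time.
    alternating_worker = PreferenceModel()
    alternating_worker.observe(["apply_sealant", "insert_bolt", "hammer_bolt"] * 3)

    for name, model in [("batch", batch_worker), ("alternating", alternating_worker)]:
        action, p = model.predict_next("apply_sealant")
        print(f"{name}: next worker action likely {action} (p={p:.2f}); "
              f"robot should {ROBOT_SUPPORT.get(action, 'wait')}")
```

Run on two contrasting demonstration sequences, the sketch predicts a different next action, and therefore a different supporting move, for a worker who seals every hole first than for one who alternates between sealing and fastening, which is the kind of per-worker adaptation the research describes.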

Many workers already wear RFID (radio-frequency identification) tags, which Shah says could be a way for robots to identify individual workers.

Shah says the research could eventually be applied to medical robotic assistants that hand surgeons the appropriate instruments during lengthy procedures in the operating room. With the right algorithms, she envisions a future in which robots and humans work side by side, but acknowledges that future is a long way off.

“We have hardware, sensing, and can do manipulation and vision, but unless the robot really develops an almost seamless understanding of how it can help the person, the person’s just going to get frustrated and say, ‘Never mind, I’ll just go pick up the piece myself’,” Shah said.

The research group will present its findings at the Robotics: Science and Systems Conference in Sydney in July. The research was supported in part by Boeing Research and Technology and conducted in collaboration with ABB.

MIT has released a YouTube video of the robot at work.
