CU Boulder


A robotic helping hand

Robotic claw demonstration

Connor Brooks, a graduate student in computer science, demonstrates a robotic system that responds to spoken commands. (Credit: Glenn Asakawa/CU Boulder)

"Robot, point to the screwdriver next to the clamp."

Daniel Pendergast, a graduate student in CU Boulder's ATLAS Institute, issues the command, and a few feet away a four-foot-tall robot obeys. The machine whirs to life, bending and twisting its one arm to hover over a table crowded with assorted tools, where it points its claw at a screwdriver right next to a clamp.

Daniel Szafir

The action might seem simple, something that people do every day, but in the field of robotics, Pendergast's pointing system is a big step forward. That's because it's not easy for robots to understand the messy and often vague nature of human language, said Daniel Szafir, Pendergast's advisor and an assistant professor at ATLAS.

What, for example, does a person mean when they say "next to"?

In trying to answer those questions, Szafir and his colleagues are working in a rapidly growing area of study called human-robot interaction. The field addresses the huge gulf that seems to exist between people and their robot helpers: Robots don't always understand people, and people often don't want to be around moving, learning machines.

There's a lot to be gained from helping the two get along, Szafir said. In the case of the screwdriver-locating robot, which the team developed, Szafir's goal is to design automated machines that could help people take on a range of tasks, from caring for elderly relatives to assembling toy castles for their kids on Christmas morning.

"There was always something that fascinated me about this idea of automated assistants," said Szafir, also in the Department of Computer Science. "It seems like such a powerful way to improve the quality of life for people at all stages. It can help out in healthcare and rehabilitation. It can help us around the house and free us up for pursuits that we'd really like to be doing."

[video: https://www.youtube.com/watch?v=…]

Flying eyes

If the idea of a world filled with robotic assistants wigs you out, Szafir acknowledged that you're not alone. Many people feel uncomfortable around robots, in part because humans are used to working with beings with expressive eyes and complex body language.

"The robot in our lab only has one arm," he said. "You can do certain kinds of gestures with that, but people have two arms."

Szafir, who was named to the Forbes 30 Under 30 list in 2017, is trying to cross that valley. He has experimented, for example, with using augmented reality headsets to help people understand what robots are going to do next. In one case, he made it easier for humans to anticipate the movements of flying robots by visually signaling their intended flight paths.

He imagines that similar technologies could help disaster responders fight wildfires, using augmented reality displays to track and manage fleets of drones flying around blazes. Szafir and his colleagues recently landed a $1.1 million grant from the U.S. National Science Foundation to experiment with how workers in dangerous fields could use those sorts of tools.

But he also focuses on designing robots that can better interpret human gestures and language. As Szafir put it, in the field of human-robot interaction, the human is just as important as the robot.

That's not easy. Take the task of building a toy castle on Christmas morning. If you're working with a human assistant, you can signal that you want a screwdriver in many different ways: you might say "hand me that," grunt and point, or just direct your gaze.

"People are so good at interpreting highly ambiguous statements and gestures," Szafir said. "So while I can tell a person, 'Can you pass me that thing?' for a robot, it would be really hard to know what that meant."

[video: https://vimeo.com/280123074]

Helping hands

To get to that point, Szafir and his colleagues took an unusual approach: they asked people to teach their robotic system for them.

They solicited human volunteers to describe the locations of objects in a series of illustrations of messy workbenches, similar to the one in Szafir's lab. The team then fed those sentences into a computer algorithm that analyzed and learned the speech patterns that people use when they want something but can't reach it.
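The flavor of that pipeline can be sketched in a few lines. Everything below is hypothetical, a stand-in for the real system: the training pairs mimic crowdsourced descriptions labeled with the geometric relation that actually held in an illustration, and "grounding" a phrase just means applying the relation it most often signaled.

```python
import math
from collections import Counter

# Hypothetical (phrase, relation) pairs standing in for the volunteers'
# workbench descriptions; one label is deliberately noisy.
training = [
    ("next to", "nearest"), ("next to", "nearest"), ("beside", "nearest"),
    ("far from", "farthest"), ("next to", "farthest"),
]

# "Learning": for each phrase, keep the relation it most often co-occurred with.
phrase_to_relation = {}
for phrase in {p for p, _ in training}:
    votes = Counter(r for p, r in training if p == phrase)
    phrase_to_relation[phrase] = votes.most_common(1)[0][0]

# Toy workbench: object name -> (x, y) position on the table.
scene = {"screwdriver": (1.0, 0.0), "clamp": (1.5, 0.2),
         "hammer": (5.0, 2.0), "wrench": (4.0, 4.0)}

def ground(phrase, landmark, scene):
    """Resolve a spatial phrase to an object, relative to the landmark."""
    relation = phrase_to_relation[phrase]
    distances = {name: math.dist(scene[name], scene[landmark])
                 for name in scene if name != landmark}
    pick = min if relation == "nearest" else max
    return pick(distances, key=distances.get)
```

With this scene, `ground("next to", "clamp", scene)` returns `"screwdriver"`, the object closest to the clamp, even though one of the "next to" training labels was wrong; majority voting over many noisy human descriptions is what makes the crowdsourcing approach workable.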

The claw isn't perfect. So far, it points to the right objects about 70 percent of the time. And it can't understand certain types of descriptions, such as those involving negatives: "Hand me the screwdriver that isn't next to the clamp." But, Szafir said, it's a leap above existing systems of this kind.

The researchers presented their findings at CU Boulder.

And the team hasn't stopped at spoken words. In related research, Szafir and his colleagues are working to develop robots that can understand the language of human shrugs, head scratching and pointing.

They have designed a system that scans people as they complete a basic assembly task, say building a tower out of wooden blocks and screws. Based on how the builders move and where their eyes are pointing, the robot tries to guess at the tools those people might need next.
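The guessing step can be caricatured as a decision rule over coarse observations. The labels and rules below are entirely hypothetical; the actual system learns this mapping from recorded motion and gaze data rather than hand-written conditions.

```python
# Toy intent predictor for the block-tower task: map coarse observations
# of a builder (gaze target, hand activity) to the tool a robot should
# proffer next. Hypothetical labels and rules, for illustration only.
def predict_next_tool(gaze_target, hand_action):
    if gaze_target == "screws" and hand_action == "aligning parts":
        return "screwdriver"   # fastening is the likely next step
    if gaze_target == "wooden blocks" and hand_action == "reaching":
        return "block"         # still stacking the tower
    return None                # not confident: better to wait than to guess
```

Returning `None` when no rule fires reflects a real design constraint in this kind of assistant: handing over the wrong tool costs the builder more time than handing over nothing.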

"It would recognize when they wanted to fasten things together, and it would hand them a screwdriver," Szafir said. He recently presented the results of that research at a conference in Madrid.

There's a lot of work to be done, but Szafir hopes that automated assistants will be coming to workplaces and homes near you in the decades ahead. Such feats of engineering may seem mundane in a world where drones can fly over the surface of Mars and robots can run on treadmills.

But, Szafir said, the pursuit of everyday robot coworkers is about conserving something that all humans cherish: "The one limited resource that we all have is our time."

[video: https://youtu.be/cKktizb-4ac]