Robotic limb learns to walk by itself without explicit programming

Dunya News

The astounding achievement could help future robots navigate the world independently.

(Web Desk) – Researchers at the University of Southern California (USC) have developed the first AI-controlled robotic limb that can learn how to walk without being explicitly programmed to do so.
The algorithm they used is inspired by real-life biology. Just like animals that can walk soon after birth, this robot can figure out how to use its animal-like tendons after only five minutes of unstructured play.

"The ability for a species to learn and adapt their movements as their bodies and environments change has been a powerful driver of evolution from the start," explains co-author Brian Cohn, a computer scientist at USC.

"Our work constitutes a step towards empowering robots to learn and adapt from each experience, just as animals do."

Inspired by animals like impala and wildebeest, whose young become skilled runners within a short time of birth, the limb uses a bio-inspired artificial intelligence algorithm to learn about its environment and refine its mobility.

Today, most robots take months or years before they are ready to interact with the rest of the world. But with this new algorithm, the team has figured out how to make robots that can learn by simply doing. This is known in robotics as "motor babbling" because it closely mimics how babies learn to speak through trial and error.
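To make the idea concrete, here is a minimal Python sketch of motor babbling. It is illustrative only, not the USC team's published algorithm: the limb_response function is a toy stand-in for a real tendon-driven limb, and all names and dynamics are assumptions made for the example. The robot issues random tendon commands, records what each one did, and keeps the command that worked best.

```python
import random

# A minimal sketch of the motor-babbling idea: issue random motor
# commands, observe what each one does, and keep whatever worked best.
# Everything here is illustrative; limb_response is a toy stand-in
# for a real tendon-driven limb, not the USC system.

def limb_response(tensions):
    """Toy stand-in for the limb: maps three tendon tensions to a
    forward displacement. A real robot would measure this physically."""
    a, b, c = tensions
    return 0.8 * a - 0.3 * b + 0.5 * a * c  # arbitrary toy dynamics

def motor_babble(n_trials=300, seed=0):
    """Random exploration: try commands, remember their outcomes."""
    rng = random.Random(seed)
    experience = []  # (command, observed displacement) pairs
    for _ in range(n_trials):
        cmd = tuple(rng.uniform(0.0, 1.0) for _ in range(3))
        experience.append((cmd, limb_response(cmd)))
    # Exploit the experience: keep the command that moved the limb furthest.
    return max(experience, key=lambda e: e[1])

if __name__ == "__main__":
    cmd, move = motor_babble()
    print(f"best tendon tensions after babbling: {cmd}, displacement {move:.2f}")
```

In a few hundred trials of pure trial and error, the sketch already finds a usable command, which is the essence of learning by simply doing.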

The limb developed by USC biomedical engineer Francisco Valero-Cuevas and his team is built with animal-like tendons that enable it to recover its footing after being tripped. Within five minutes of “free play,” the algorithm behind the robot is able to find its footing without additional programming. A robot that doesn’t need extensive programming has significant advantages over those that do, since programmers can’t foresee every possible scenario the robot might end up in.

“Similar to infants and juvenile vertebrates, our system starts creating an understanding about itself and its environment by random exploration of its limbs and their interaction with the environment,” Valero-Cuevas said. “But it is interesting to note that its performance can improve just by repeating a task and becoming more familiar with it. [It’s] like adopting a habit.”

Valero-Cuevas thinks his system could be used in a number of applications, including assistive robotics, search and rescue, and planetary exploration.

All of this means that roboticists writing code no longer need exact equations, sophisticated computer simulations, or thousands of repetitions to refine a task.

Instead, with this new technology, a robot can build its own internal map of its body and environment, perfecting the movement of its three-tendon, two-joint limb and its interactions with its surroundings as it grows and learns.
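As a rough illustration of that idea, the hedged sketch below pairs a remembered “map” of babbled commands with a simple repeat-and-refine loop, echoing the habit formation Valero-Cuevas describes. The function names, the toy limb dynamics, and the refinement rule are all assumptions made for the example, not the published method.

```python
import random

# Hedged sketch of the "internal map" idea: after babbling, the robot
# keeps a memory of (command -> outcome) pairs and refines the closest
# remembered command with small tweaks each time it repeats a task.
# The toy limb model and refinement rule are illustrative assumptions.

def limb_response(tensions):
    """Toy stand-in for the three-tendon limb's dynamics."""
    a, b, c = tensions
    return 0.8 * a - 0.3 * b + 0.5 * a * c

def build_map(n_trials=200, seed=1):
    """Motor babbling: remember what each random command did."""
    rng = random.Random(seed)
    memory = []
    for _ in range(n_trials):
        cmd = tuple(rng.uniform(0.0, 1.0) for _ in range(3))
        memory.append((cmd, limb_response(cmd)))
    return memory

def repeat_and_refine(memory, target, repetitions=50, seed=2):
    """Habit formation: start from the closest remembered command and
    keep small random tweaks that bring the outcome nearer the target."""
    rng = random.Random(seed)
    cmd, out = min(memory, key=lambda m: abs(m[1] - target))
    for _ in range(repetitions):
        tweak = tuple(min(1.0, max(0.0, t + rng.gauss(0, 0.05))) for t in cmd)
        if abs(limb_response(tweak) - target) < abs(out - target):
            cmd, out = tweak, limb_response(tweak)
    return cmd, out

if __name__ == "__main__":
    memory = build_map()
    cmd, out = repeat_and_refine(memory, target=0.6)
    print(f"refined command {cmd} reaches {out:.3f} (target 0.6)")
```

Because the refinement starts from whatever random experience the babbling happened to produce, two runs with different seeds settle on different commands for the same task, a toy analogue of the “personal gaits” described next.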

The researchers noticed that, depending on their first moments of life, some robots even developed personal gaits.

"You can recognize someone coming down the hall because they have a particular footfall, right?" explains co-author Francisco Valero-Cuevas, a biomedical engineer at USC.

"Our robot uses its limited experience to find a solution to a problem that then becomes its personalized habit, or ‘personality’ - We get the dainty walker, the lazy walker, the champ... you name it."

It’s a feat that biologists and roboticists have long dreamed of, and the authors claim it could give future robots the "enviable versatility, adaptability, resilience and speed of vertebrates during everyday tasks."

The possibilities for the technology are only constrained by our imaginations.

With this powerful new algorithm, we might be able to provide more responsive prosthetics for people with disabilities, or send robots to safely explore space or carry out search-and-rescue missions in dangerous or unknown terrain.

"I envision muscle-driven robots, capable of mastering what an animal takes months to learn, in just a few minutes," says Dario Urbina-Melendez, another member of the team, and biomedical engineer at USC.

"Our work combining engineering, AI, anatomy and neuroscience is a strong indication that this is possible."

A paper detailing the research was recently published in the journal Nature Machine Intelligence.