
RoboGrammar System Automates and Optimizes Robot Design


The shape of a robot determines the types of tasks it can perform and the environments it can operate in. With current technology, it is impractical to build and physically test every possible form, but a new system developed by researchers at MIT makes it possible to simulate a huge number of candidate forms and then pick out the best performers.

The new system is called RoboGrammar. The first step is to tell it which robot parts are available, such as wheels and joints. You then indicate the type of terrain the robot will operate on, and that is essentially all the input required. RoboGrammar then generates an optimized structure and control program.

Advancing the Field of Robotic Design

The new system is a big step forward in a field where design work is still largely done by hand.

Allan Zhao is lead author of the research and a PhD student in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL).

“Robot design is still a very manual process,” Zhao says. The RoboGrammar system is “a way to come up with new, more inventive robot designs that could potentially be more effective.”

The research is set to be presented at the SIGGRAPH Asia conference this month.

According to Zhao, robots are built for all sorts of tasks but “they all tend to be very similar in their overall shape and design. When you think of building a robot that needs to cross various terrains, you immediately jump to a quadruped. We were wondering if that’s really the optimal design.”

The team believed that a newer, more innovative design could produce better results and improve functionality, so they constructed a computer model to take on the task. The system was not constrained by prior convention, though some basic rules were applied.

Zhao writes that the collection of possible robot forms is “primarily composed of nonsensical designs. If you can just connect the parts in arbitrary ways, you end up with a jumble.”

RoboGrammar: Graph Grammar for Terrain-Optimized Robot Design

The Graph Grammar

The team set out to develop a “graph grammar,” a set of rules that restricts how the robot’s components can be arranged. The purpose is simply to ensure that each computer-generated design works at a basic level, with constraints such as preventing leg segments from being connected directly to each other rather than through joints.

Zhao was inspired by animals, specifically arthropods, when designing the rules of the graph grammar.

Arthropods are “characterized by having a central body with a variable number of segments. Some segments may have legs attached,” Zhao says. “And we noticed that that’s enough to describe not only arthropods but more familiar forms as well.”
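
That description maps naturally onto a small set of rewrite rules. As a rough illustration, the sketch below encodes an arthropod-style grammar in Python, where each rule expands a symbol into smaller parts; the rule names and productions here are assumptions made for illustration, not RoboGrammar's published rule set.

    import random

    # Hypothetical arthropod-style grammar: each rule rewrites a symbol into a
    # sequence of child symbols. Names and productions are illustrative only,
    # not RoboGrammar's actual rule set.
    RULES = {
        "Robot":   [["Body"]],
        "Body":    [["Segment"], ["Segment", "Body"]],    # variable number of segments
        "Segment": [["Core"], ["Core", "LegPair"]],       # some segments carry legs
        "LegPair": [["Joint", "Leg", "Joint", "Leg"]],    # legs attach through joints
        "Leg":     [["Link"], ["Link", "Joint", "Leg"]],  # a leg is a chain of links
    }
    TERMINALS = {"Core", "Joint", "Link"}  # symbols that map to physical parts

    def derive(symbol, rng):
        """Expand one symbol into a flat list of physical parts by repeatedly
        applying randomly chosen grammar rules until only terminals remain."""
        if symbol in TERMINALS:
            return [symbol]
        parts = []
        for child in rng.choice(RULES[symbol]):
            parts.extend(derive(child, rng))
        return parts

    print(derive("Robot", random.Random(0)))
    # e.g. ['Core', 'Joint', 'Link', 'Joint', 'Link', 'Core', ...]

Because every design is produced by applying rules like these, the “jumble” of arbitrarily connected parts that Zhao describes is ruled out by construction.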

Using the graph grammar, RoboGrammar operates in three steps. First, it defines the problem. Second, it generates possible robot designs. Third, it selects the optimal ones.

Human users are responsible for the problem definition, inputting the set of available robotic components, such as motors, legs, and connecting segments. The user also specifies the type of terrain the robot will operate on.

“That’s key to making sure the final robots can actually be built in the real world,” Zhao says.
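
In practice, that problem definition can be as simple as listing the available parts and the target terrain. The snippet below is a hypothetical sketch of what such an input might look like; the field names and values are assumptions for illustration, not RoboGrammar's actual interface.

    # Hypothetical problem definition supplied by the user. The field names and
    # values are illustrative, not RoboGrammar's actual input format.
    problem = {
        "components": ["body_segment", "roll_joint", "elbow_joint",
                       "long_limb", "short_limb", "wheel"],
        "terrain": "ridged",           # the surface the robot must traverse
        "objective": "forward_speed",  # what the optimized controller rewards
    }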

Hundreds of Thousands of Structures

RoboGrammar takes the graph grammar rules and generates hundreds of thousands of possible robot structures with widely varying appearances.

“It was pretty inspiring for us to see the variety of designs,” Zhao says. “It definitely shows the expressiveness of the grammar.”

However, not all of the designs are good, and choosing the best ones requires evaluating how each robot moves and functions.

“Up until now, these robots are just structures,” Zhao says.

The team developed a controller for each robot using an algorithm called Model Predictive Control, which prioritizes rapid forward movement; the controller is what sets the structures in motion.

“The shape and the controller of the robot are deeply intertwined, which is why we have to optimize a controller for every given robot individually,” says Zhao.
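
In its simplest sampling-based form, Model Predictive Control repeatedly imagines short bursts of future actions, scores each burst in simulation, and executes the first action of the best one. The sketch below shows that generic pattern with a toy point-mass “robot” standing in for a real physics simulator; it illustrates the idea of rewarding forward progress, not the specific controller used in the paper.

    import numpy as np

    class ToyRobotSim:
        """Toy stand-in for a physics simulator: the 'robot' is a point whose
        forward velocity is the mean of its actuator commands."""
        def __init__(self, x=0.0):
            self.x = x
        def copy(self):
            return ToyRobotSim(self.x)
        def step(self, action, dt=0.1):
            self.x += dt * float(np.mean(action))
        def x_position(self):
            return self.x

    def mpc_step(sim, n_samples=64, horizon=10, n_actuators=8, seed=0):
        """Random-shooting MPC: sample short action sequences, roll each one out
        in a copy of the simulator, and return the first action of the rollout
        that travels furthest forward."""
        rng = np.random.default_rng(seed)
        best_score, best_first = -np.inf, None
        for _ in range(n_samples):
            actions = rng.uniform(-1.0, 1.0, size=(horizon, n_actuators))
            rollout = sim.copy()
            for a in actions:
                rollout.step(a)
            score = rollout.x_position()   # prioritize rapid forward movement
            if score > best_score:
                best_score, best_first = score, actions[0]
        return best_first

    sim = ToyRobotSim()
    for t in range(20):                    # run the controller for 20 time steps
        sim.step(mpc_step(sim, seed=t))
    print(round(sim.x_position(), 3))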

The researchers then use a neural network algorithm to find high-performing robots. The algorithm samples and evaluates different sets of robots and learns which designs work for what tasks.
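
One way to picture that search loop: fully simulating every candidate with its own controller is expensive, so a cheap model is fitted to the scores observed so far and used to decide which designs deserve a full evaluation next. The sketch below illustrates this pattern with a plain least-squares surrogate and a synthetic score function; both are simplified stand-ins for the neural network and simulator described here, not the researchers' actual algorithm.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate(design):
        """Expensive ground-truth evaluation (in reality, an MPC rollout on the
        chosen terrain). Here a cheap synthetic score stands in for it."""
        true_weights = np.array([0.5, -0.2, 0.8, 0.1])
        return float(design @ true_weights + 0.05 * rng.standard_normal())

    # 1. Sample a large pool of candidate designs, summarized as feature vectors
    #    (e.g. counts of segments, legs, and joints).
    pool = rng.uniform(0.0, 1.0, size=(500, 4))

    # 2. Fully evaluate a small random batch and fit a cheap surrogate to it.
    idx = rng.choice(len(pool), size=32, replace=False)
    X, y = pool[idx], np.array([simulate(d) for d in pool[idx]])
    w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

    # 3. Let the surrogate rank the whole pool, then spend simulation time only
    #    on the most promising candidates.
    predicted = np.c_[pool, np.ones(len(pool))] @ w
    shortlist = pool[np.argsort(predicted)[-8:]]
    best = max(shortlist, key=simulate)
    print("best shortlisted design:", np.round(best, 2))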

Everything mentioned up until this point takes place without human intervention.

“This work is a crowning achievement in the 25-year quest to automatically design the morphology and control of robots,” says Hod Lipson, a mechanical engineer and computer scientist at Columbia University. He was not involved with the research. “The idea of using shape-grammars has been around for a while, but nowhere has this idea been executed as beautifully as in this work. Once we can get machines to design, make and program robots automatically, all bets are off.”

According to Zhao, RoboGrammar is “a tool for robot designers to expand the space of robot structures they draw upon.”

The team now plans to build and test some of the robots in the real world, and Zhao says that the system could move beyond terrain traversing and into areas such as virtual worlds.

“Let’s say in a video game you wanted to generate lots of kinds of robots, without having an artist to create each one. RoboGrammar would work for that almost immediately,” Zhao says.

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.