MIT debuts a major language-inspired method for teaching robots new skills

MIT this week unveiled a new model for training robots. Rather than the standard, focused datasets used to teach robots new tasks, the method goes big, mimicking the massive troves of information used to train large language models (LLMs).

The researchers note that imitation learning, in which an agent learns by following an individual performing a task, can fail when presented with even small challenges, such as different lighting, a new environment, or unexpected obstacles. In these scenarios, robots simply do not have enough data to draw on in order to adapt.

The team looked to models like GPT-4 for a kind of brute force data approach to solving problems.

“In language, all data is just sentences,” says Lirui Wang, lead author of the new study. “In robotics, given the heterogeneity in the data, if you want to pre-train in a similar way, we need a different architecture.”

The team introduced a new architecture called heterogeneous pre-trained transformers (HPT), which pools information from different sensors and different environments. A transformer then aggregates that data into training models. The larger the transformer, the better the output.
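Conceptually, the design puts embodiment-specific encoders in front of one shared transformer. Below is a minimal PyTorch sketch of that idea, assuming a stem-trunk-head split: per-robot "stems" project heterogeneous sensor data into a shared token space, a shared "trunk" is pre-trained across all robots, and per-robot "heads" decode actions. Every class, dimension, and parameter name here is illustrative, not the authors' implementation.

```python
# Minimal sketch of a stem-trunk-head layout in the spirit of HPT.
# All names and sizes are hypothetical, for illustration only.
import torch
import torch.nn as nn

D = 256  # shared token dimension the stems project into

class Stem(nn.Module):
    """Embodiment-specific encoder: maps one robot's raw sensor
    features (cameras, proprioception, etc.) into shared tokens."""
    def __init__(self, sensor_dim: int, n_tokens: int = 16):
        super().__init__()
        self.proj = nn.Linear(sensor_dim, D * n_tokens)
        self.n_tokens = n_tokens

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sensor_dim) -> (batch, n_tokens, D)
        return self.proj(x).view(x.shape[0], self.n_tokens, D)

class Head(nn.Module):
    """Embodiment-specific decoder: turns trunk features into
    an action vector sized for that robot."""
    def __init__(self, action_dim: int):
        super().__init__()
        self.proj = nn.Linear(D, action_dim)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # Pool tokens, then predict actions: (batch, action_dim)
        return self.proj(tokens.mean(dim=1))

class HPTSketch(nn.Module):
    """One shared transformer trunk, many stems and heads."""
    def __init__(self, robots: dict):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=D, nhead=8, batch_first=True)
        self.trunk = nn.TransformerEncoder(layer, num_layers=4)
        self.stems = nn.ModuleDict(
            {name: Stem(cfg["sensor_dim"]) for name, cfg in robots.items()})
        self.heads = nn.ModuleDict(
            {name: Head(cfg["action_dim"]) for name, cfg in robots.items()})

    def forward(self, robot: str, obs: torch.Tensor) -> torch.Tensor:
        tokens = self.stems[robot](obs)     # heterogeneous -> shared tokens
        features = self.trunk(tokens)       # shared pre-trained trunk
        return self.heads[robot](features)  # shared -> robot-specific actions
```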

Users then enter the robot’s design, configuration, and the task they want it to accomplish.
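In that spirit, a query against the sketch above might look like the following, where the robot's configuration selects the matching stem and head. The interface is an assumption for illustration, not MIT's actual tooling.

```python
# Hypothetical usage: the user supplies each robot's configuration
# (sensor/action sizes) and asks the shared model for actions.
robots = {
    "arm_6dof":  {"sensor_dim": 64,  "action_dim": 6},
    "quadruped": {"sensor_dim": 128, "action_dim": 12},
}
model = HPTSketch(robots)

obs = torch.randn(1, 64)         # one observation from the arm
action = model("arm_6dof", obs)  # -> tensor of shape (1, 6)
print(action.shape)
```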

“Our dream is to have a universal robot brain that you can download and use for your robot without any training at all,” David Held, an assistant professor at Carnegie Mellon University, said of the research. “While we are only in the early stages, we will continue to push hard and hope that scaling will lead to a breakthrough in robotic policy, as has happened with large language models.”

This research was funded in part by the Toyota Research Institute (TRI). Last year at TechCrunch Disrupt, TRI debuted a method for training robots overnight. More recently, it struck a watershed partnership that will unite its robotics learning research with Boston Dynamics’ hardware.
