
Boston Dynamics gave its Atlas robot an AI brain

The electric Atlas robot from Boston Dynamics

Boston Dynamics and Toyota Research Institute (TRI) announced on Tuesday that they are partnering to develop general-purpose humanoid robots. Boston Dynamics will contribute its new electric Atlas robot to the task, while TRI will utilize its industry-leading Large Behavior Models.

Boston Dynamics, which launched in 1992 as an offshoot from the Massachusetts Institute of Technology (MIT), has been at the forefront of robotics development for more than 30 years. It burst into the mainstream in 2009 with the BigDog and LittleDog quadrupedal systems and debuted the first iteration of its bipedal Atlas platform in 2013. Atlas’ capabilities have undergone a steady evolution over the past decade, enabling the robot to perform increasingly difficult acrobatics and dexterity tasks, from dancing and doing backflips to conquering parkour courses and navigating simulated construction sites.

Farewell to HD Atlas

In April 2024, the company retired its long-standing hydraulic Atlas platform in favor of a new generation driven by electric servos. The company describes the electric Atlas as “one of the most advanced humanoid robots ever built,” one that is “able to move in ways that exceed human capabilities.”


TRI, on the other hand, stands at the forefront of Large Behavior Model (LBM) development. LBMs are to robotics as LLMs are to chatbots. Just as LLMs are trained on massive multimodal datasets to respond to a human as a human would, LBMs are trained on enormous corpora of human behaviors, enabling robots to move and act in a humanlike manner. They also help robots learn new behaviors and generalize across tasks. Per the announcement blog, “TRI’s work on LBMs aims to achieve multitask, vision-and-language-conditioned foundation models for dexterous manipulation.”

“Recent advances in AI and machine learning hold tremendous potential for advancing physical intelligence,” Gill Pratt, chief scientist for Toyota and CEO of TRI, said in a statement. “The opportunity to implement TRI’s state-of-the-art AI technology on Boston Dynamics’ hardware is game-changing for each of our organizations as we work to amplify people and improve quality of life.”

This news comes amid an increasingly crowded field of companies looking to incorporate robots into the future workforce. Agility Robotics’ Digit and Figure’s 01 and 02 models, for example, are already being tested in industrial settings such as BMW’s Spartanburg plant in South Carolina and a Spanx production facility in Flowery Branch, Georgia. Tesla’s Optimus is ostensibly also in the running, though even the most recent models still need to be remotely operated for anything more than the most basic tasks.

Andrew Tarantola