Boston Dynamics is strengthening its partnership with NVIDIA as the robotics company works to advance artificial intelligence capabilities for its humanoid robot platform.
The collaboration centers on integrating NVIDIA’s Jetson Thor computing platform and robotics development tools into the next generation of Boston Dynamics’ Atlas humanoid. The companies say the effort aims to accelerate development of AI systems that let robots operate more autonomously in real-world environments.
The announcement highlights a broader shift underway in robotics development, where advances in computing hardware and simulation are becoming central to building more capable physical AI systems.
Boston Dynamics, long known for its highly dynamic robots, is increasingly pairing its expertise in mobility and mechanical design with large-scale AI training frameworks and high-performance edge computing.
A New Computing Backbone for Atlas
At the center of the collaboration is the adoption of NVIDIA’s Jetson Thor robotics computing platform.
The compact system is designed to run advanced AI models directly on robots, enabling real-time perception, decision-making, and control. For Atlas, the additional computing power allows developers to integrate multimodal AI models that combine visual perception, environmental awareness, and motion control.
These models work alongside Boston Dynamics’ existing whole-body control systems and manipulation software, which coordinate the robot’s movements across its limbs and joints.
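To make the division of labor concrete, the sketch below shows one tick of a sense-infer-act loop in which a perception model proposes a target and a whole-body controller turns it into joint commands. Every class, method, and parameter here is a hypothetical stand-in for illustration, not a Boston Dynamics or NVIDIA API.

```python
import numpy as np

# Illustrative only: a minimal sense -> infer -> act control loop.
# All names below are hypothetical placeholders, not real robot APIs.

class PerceptionModel:
    """Stub for a multimodal model fusing camera frames and joint state."""
    def infer(self, image, joint_state):
        # A real model would output scene features and a task-level goal;
        # this stub just returns a dummy target pose.
        return {"target_pose": np.zeros(3)}

class WholeBodyController:
    """Stub for a controller mapping a target into per-joint commands."""
    def solve(self, target_pose, joint_state):
        # A real controller coordinates all limbs and joints; as a
        # placeholder, nudge each joint toward zero.
        return -0.1 * joint_state

def control_step(perception, controller, image, joint_state):
    """One tick of the real-time loop: perceive, decide, command."""
    percept = perception.infer(image, joint_state)
    return controller.solve(percept["target_pose"], joint_state)

perception = PerceptionModel()
controller = WholeBodyController()
joint_state = np.ones(4)       # toy 4-joint robot
image = np.zeros((8, 8))       # toy camera frame
command = control_step(perception, controller, image, joint_state)
```

The point of the structure is the separation of concerns the article describes: the learned model decides *what* to do each cycle, while the whole-body controller decides *how* the joints move, and the onboard computer must run both fast enough to keep the loop real-time.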
According to Aaron Saunders, chief technology officer at Boston Dynamics, the integration of high-performance computing is essential for the next phase of humanoid robotics.
The current electric version of Atlas is designed as a research and development platform for exploring complex physical tasks. With the addition of more powerful onboard AI computing, the robot can process more sophisticated models while maintaining real-time responsiveness.
Training Robots in Virtual Worlds
The collaboration also extends to the development of new AI skills using NVIDIA’s Isaac Lab framework.
Isaac Lab provides a simulation environment where locomotion and dexterity policies can be trained with reinforcement learning. Built on NVIDIA’s Isaac Sim and Omniverse technologies, the platform allows engineers to run large numbers of experiments in physically accurate virtual environments.
Training in simulation has become a key strategy in robotics development. It enables robots to practice tasks thousands or millions of times without the risk of hardware damage, while also exposing AI systems to a wide range of scenarios that would be difficult to reproduce physically.
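The core idea, stripped to a toy example, is that a simulator makes trials nearly free: a policy can be evaluated thousands of times in software before it ever touches hardware. The snippet below searches over a single policy parameter on a one-dimensional simulated "robot"; the dynamics, policy, and reward are invented for illustration and bear no relation to Isaac Lab's actual APIs.

```python
import numpy as np

# Illustrative only: why simulated training scales. The environment and
# policy below are toy stand-ins, not Isaac Lab or Isaac Sim APIs.
rng = np.random.default_rng(0)

def rollout(gain):
    """Simulate a 1-D system driven toward the origin by a proportional
    policy u = -gain * x; return negative accumulated cost as reward."""
    x, cost = 1.0, 0.0
    for _ in range(20):
        u = -gain * x
        x = x + 0.1 * u          # simple forward dynamics step
        cost += x * x
    return -cost

# Run many cheap simulated trials -- impractical on physical hardware --
# and keep the best policy parameter found.
candidates = rng.uniform(0.0, 5.0, size=1000)
rewards = [rollout(g) for g in candidates]
best_gain = candidates[int(np.argmax(rewards))]
```

Real training pipelines replace the random search with reinforcement-learning algorithms and the scalar gain with neural-network policies, but the economics are the same: millions of rollouts in simulation cost compute time, not broken robots.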
Boston Dynamics says the platform is already producing advances in learned dexterity and locomotion, areas that remain among the most challenging aspects of humanoid robotics.
By combining simulation training with real-world testing, developers aim to shorten the time it takes robots to acquire new capabilities.
Expanding AI Across Boston Dynamics’ Robot Fleet
While Atlas is the centerpiece of the collaboration, the company is also applying new AI capabilities across its broader robotics portfolio.
Boston Dynamics has introduced reinforcement learning tools and new AI models for Spot, its quadruped robot widely used for industrial inspections and safety applications. The company says these systems improve locomotion control and help the robot detect and avoid hazards during autonomous operation.
In parallel, the company continues developing Orbit, its cloud-based platform for managing fleets of robots and analyzing operational data.
Together, these systems reflect a larger trend in robotics: the convergence of advanced mobility hardware with increasingly powerful AI infrastructure.
As humanoid robots move from experimental prototypes toward potential commercial deployment, partnerships between robotics developers and AI computing companies are becoming more central to the industry’s progress.
For Boston Dynamics, the collaboration with NVIDIA underscores how the next phase of robotics development may depend not only on better robot bodies, but also on the computational platforms that allow those machines to learn and operate in the real world.