NVIDIA has announced a suite of AI technologies designed to accelerate the development of humanoid robots. The portfolio includes Isaac GR00T N1, which NVIDIA describes as the world’s first open, fully customisable foundation model for general-purpose humanoid reasoning and skills.
Alongside this core model, NVIDIA is introducing simulation frameworks and blueprints, such as the NVIDIA Isaac GR00T Blueprint for generating synthetic data. The company also unveiled Newton, an open-source physics engine under development in collaboration with Google DeepMind and Disney Research and purpose-built for developing robots.
The GR00T N1 model is available now and is the first in a planned family of fully customisable models that NVIDIA intends to pre-train and release to robotics developers globally. The initiative aims to dramatically accelerate the transformation of industries grappling with labour shortages estimated at more than 50 million workers worldwide.
“The age of generalist robotics is here,” declared Jensen Huang, Founder and CEO of NVIDIA. “With NVIDIA Isaac GR00T N1 and new data-generation and robot-learning frameworks, robotics developers everywhere will open the next frontier in the age of AI.”
NVIDIA GR00T N1: AI for the development of humanoid robots
The GR00T N1 foundation model boasts a sophisticated dual-system architecture, drawing inspiration from the principles of human cognition. The first component, dubbed “System 1”, is a rapid-thinking action model, mirroring human reflexes or intuition. Complementing this is “System 2”, a slower-thinking model designed for deliberate and methodical decision-making.
Powered by a vision language model, System 2 analyses its surroundings and the instructions it has been given to formulate action plans. System 1 then translates these plans into precise and continuous robot movements. Critically, System 1 is trained on both human demonstration data and a vast amount of synthetic data generated by the NVIDIA Omniverse platform.
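To make that division of labour concrete, the minimal Python sketch below mimics the dual-system idea described above: a slow, deliberate planner that interprets the scene and instruction, and a fast policy that turns each plan step into continuous motor commands. All class and method names here are hypothetical illustrations for this article, not the actual GR00T N1 API.

```python
# Illustrative sketch of a dual-system control loop (hypothetical names, not the GR00T N1 API).
from dataclasses import dataclass

import numpy as np


@dataclass
class Observation:
    image: np.ndarray   # camera frame from the robot
    instruction: str    # natural-language task description


class VisionLanguagePlanner:
    """Stand-in for 'System 2': slow, deliberate planning from vision and language."""

    def plan(self, obs: Observation) -> list[str]:
        # A real planner would query a vision language model here.
        return ["locate_object", "grasp_with_left_arm", "place_in_bin"]


class FastActionPolicy:
    """Stand-in for 'System 1': fast mapping from a plan step to motor commands."""

    def act(self, step: str, obs: Observation) -> np.ndarray:
        # A real policy would emit continuous joint targets at high frequency.
        return np.zeros(7)  # placeholder command for a 7-DoF arm


def control_loop(obs: Observation) -> None:
    planner, policy = VisionLanguagePlanner(), FastActionPolicy()
    for step in planner.plan(obs):        # System 2: plan at low frequency
        command = policy.act(step, obs)   # System 1: act at high frequency
        print(f"{step}: sending command {command}")


if __name__ == "__main__":
    control_loop(Observation(image=np.zeros((224, 224, 3)), instruction="tidy the table"))
```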
GR00T N1 generalises readily across common tasks, such as grasping, manipulating objects with one or both arms, and transferring items from one arm to the other. It can also perform complex, multi-step tasks that require long-horizon context and combinations of general skills. These abilities have potential applications across sectors including material handling, packaging, and quality inspection.
Developers and researchers have the flexibility to further train GR00T N1 using their own real-world or synthetic data, tailoring it to their specific humanoid robot or task requirements.
During his keynote address at the recent GTC conference, Huang showcased a humanoid robot from 1X autonomously performing household tidying tasks. This demonstration utilised a post-trained policy built upon the GR00T N1 model, highlighting the effectiveness of the AI training collaboration between 1X and NVIDIA.
“The future of humanoids is about adaptability and learning,” stated Bernt Børnich, CEO of 1X Technologies. “While we develop our own models, NVIDIA’s GR00T N1 provides a significant boost to robot reasoning and skills.
“With minimal post-training data, we fully deployed on NEO Gamma—advancing our mission of creating robots that are not just tools, but companions capable of assisting humans in meaningful, immeasurable ways.”
Other developers of humanoid robots who have gained early access to GR00T N1 include Agility Robotics, Boston Dynamics, Mentee Robotics, and NEURA Robotics.
NVIDIA, Google DeepMind, and Disney Research collaborate on Newton physics engine
In a further boost to the robotics ecosystem, NVIDIA announced a collaborative effort with Google DeepMind and Disney Research to develop Newton, an open-source physics engine designed to enable robots to learn how to handle intricate tasks with greater accuracy.
Built upon the NVIDIA Warp framework, Newton will be optimised for robot learning and will be compatible with popular simulation frameworks such as Google DeepMind’s MuJoCo and NVIDIA Isaac Lab. Additionally, the collaborating companies intend to integrate Disney’s proprietary physics engine into Newton.
Google DeepMind and NVIDIA are also working together to develop MuJoCo-Warp, a technology expected to accelerate robotics machine learning workloads by over 70 times. This advancement will be made available to developers through Google DeepMind’s MJX open-source library, as well as through Newton.
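Newton builds on NVIDIA Warp, NVIDIA’s open-source Python framework for writing GPU-accelerated simulation kernels. As a rough illustration of the kind of code Warp enables (and not Newton’s actual implementation), a minimal Warp kernel that integrates particle positions in parallel might look like this:

```python
# Minimal NVIDIA Warp example: integrate particle positions in parallel.
# Illustrative only; Newton's internals are not public yet.
import numpy as np
import warp as wp

wp.init()  # uses the GPU if available, otherwise falls back to the CPU


@wp.kernel
def integrate(positions: wp.array(dtype=wp.vec3),
              velocities: wp.array(dtype=wp.vec3),
              dt: float):
    tid = wp.tid()  # one thread per particle
    positions[tid] = positions[tid] + velocities[tid] * dt


n = 1024
positions = wp.zeros(n, dtype=wp.vec3)
velocities = wp.array(np.tile([0.0, -9.8, 0.0], (n, 1)), dtype=wp.vec3)

# Advance every particle by one 60 Hz timestep in parallel.
wp.launch(kernel=integrate, dim=n, inputs=[positions, velocities, 1.0 / 60.0])
wp.synchronize()
print(positions.numpy()[:3])
```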
Disney Research will be among the first to leverage Newton to further its robotic character platform, which powers next-generation entertainment robots. Examples include the expressive Star Wars-inspired BDX droids that joined Huang on stage during his GTC keynote.
“The BDX droids are just the beginning. We’re committed to bringing more characters to life in ways the world hasn’t seen before, and this collaboration with Disney Research, NVIDIA, and Google DeepMind is a key part of that vision,” said Kyle Laughlin, SVP at Walt Disney Imagineering Research & Development.
“This collaboration will allow us to create a new generation of robotic characters that are more expressive and engaging than ever before—and connect with our guests in ways that only Disney can.”
Furthermore, NVIDIA and Disney Research – in conjunction with Intrinsic – have announced an additional collaboration focused on building OpenUSD pipelines and establishing best practices for robotics data workflows.
NVIDIA addresses lack of data for training humanoid robots
The availability of large, diverse, and high-quality datasets is crucial for effective robot development, but capturing such data can be expensive. For humanoid robots, the amount of real-world human demonstration data is inherently limited.
To address this challenge, NVIDIA unveiled the Isaac GR00T Blueprint for synthetic manipulation motion generation. Built on NVIDIA Omniverse and the NVIDIA Cosmos Transfer world foundation models, the blueprint allows developers to generate exponentially large quantities of synthetic motion data for manipulation tasks from a small number of human demonstrations.
Using the blueprint’s first available components, NVIDIA generated 780,000 synthetic trajectories – equivalent to 6,500 hours, or nine continuous months, of human demonstration data – in just 11 hours. Combining this synthetic data with real-world data then improved GR00T N1’s performance by 40% compared with using real data alone.
To further empower the developer community with valuable training resources, NVIDIA is releasing the GR00T N1 dataset as part of a broader open-source physical AI dataset, also announced at GTC and now accessible on Hugging Face.
So, when is all this available?
The NVIDIA GR00T N1 training data and task evaluation scenarios for humanoid robots are available for immediate download from Hugging Face and GitHub. The NVIDIA Isaac GR00T Blueprint for synthetic manipulation motion generation is also now available as an interactive demonstration on build.nvidia.com or for download from GitHub.
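For developers who prefer to pull the training data programmatically rather than through the website, a hedged sketch using the huggingface_hub library is shown below. The repository name is a placeholder, since the article does not specify it; check NVIDIA’s Hugging Face organisation for the actual dataset.

```python
# Hypothetical sketch: downloading an open dataset from Hugging Face.
# The repo_id below is a placeholder, not the real name of NVIDIA's GR00T N1 dataset.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="nvidia/<gr00t-n1-dataset>",  # placeholder id
    repo_type="dataset",
)
print(f"Dataset files downloaded to: {local_path}")
```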
The NVIDIA DGX Spark personal AI supercomputer, also announced today at GTC, offers developers a ready-to-use system to expand the capabilities of GR00T N1 for new robots, tasks, and environments without requiring extensive custom programming.
The Newton physics engine is anticipated to be available later this year.
See also: Gemini Robotics: Google DeepMind aims for helpful AI robots
