
The Case for AI-Native Humanoid Robots, According to Humanoid


Humanoid wants to make one thing clear. The future of humanoid robotics is no longer a distant lab experiment. It is already being tested, trained, and deployed in environments that look a lot like the real world.


The UK-based AI and robotics company is accelerating the pace of humanoid robot development by building deeply on NVIDIA's robotics ecosystem. By integrating NVIDIA Jetson Thor, NVIDIA Isaac Sim, and NVIDIA Isaac Lab into its core technology stack, Humanoid has moved quickly from concept to capable alpha systems that are already operating outside controlled research settings. Those efforts recently culminated in a successful proof of concept with Schaeffler, signaling that these robots are beginning to cross the line from experimental to operational.

At the heart of Humanoid’s approach is Jetson Thor, NVIDIA’s latest edge computing platform, which powers the HMND 01 Alpha robots in two distinct form factors. One is a wheeled system designed for industrial environments, where mobility, robustness, and reliability matter most. The other is a newly released bipedal robot, aimed primarily at research and development for future service and household applications. In both cases, Jetson Thor functions as a powerful AI-native brain at the edge, allowing Humanoid to run large robotic foundation models directly on the robot rather than relying on remote infrastructure.

That local intelligence matters. By simplifying system architecture and reducing wiring complexity, Jetson Thor improves manufacturability, reliability, and serviceability in the field. It also unlocks a level of autonomy and responsiveness that brings the robots closer to human environments, not just factory floors.
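
To make that concrete, the sketch below shows, in rough outline, what an on-device perception-to-action loop looks like when the model runs at the edge rather than in a remote data center. The policy network, the sensor state, and the actuator hook are placeholders for illustration, not Humanoid's actual software.

```python
# Illustrative sketch only: a minimal on-device perception-to-action loop of the
# kind an edge AI computer such as Jetson Thor makes possible. The model, sensor
# state, and actuator call below are hypothetical placeholders.
import torch
import torch.nn as nn

class TinyPolicy(nn.Module):
    """Stand-in for a large robotic foundation model running locally."""
    def __init__(self, obs_dim: int = 64, act_dim: int = 24):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 256), nn.ReLU(), nn.Linear(256, act_dim))

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

def control_loop(policy: nn.Module, steps: int = 5) -> None:
    # Everything runs on the local device; no round trip to remote infrastructure.
    device = "cuda" if torch.cuda.is_available() else "cpu"
    policy = policy.to(device).eval()
    with torch.no_grad():
        for _ in range(steps):
            obs = torch.randn(1, 64, device=device)  # placeholder for fused sensor state
            action = policy(obs)                      # joint targets computed at the edge
            # send_to_actuators(action)  # hardware call omitted in this sketch

if __name__ == "__main__":
    control_loop(TinyPolicy())
```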


To get there faster, Humanoid has adopted a simulation-first mindset. Using NVIDIA Isaac Lab and NVIDIA Isaac Sim, the team develops, tests, and refines robot behavior long before hardware touches the ground. Isaac Lab plays a central role in reinforcement learning, improving locomotion and manipulation skills through large-scale simulated training. Meanwhile, Isaac Sim enables high-fidelity digital twins that mirror the real robots down to their interfaces, allowing engineers to move seamlessly between virtual and physical systems.
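
Humanoid's own training runs on Isaac Lab's GPU-parallel environments; as a rough, generic illustration of what reinforcement-learning-based locomotion training looks like, the sketch below uses the standard Gymnasium Humanoid benchmark and Stable-Baselines3 PPO as stand-ins, not the HMND 01 pipeline.

```python
# Rough illustration of RL-based locomotion training. Isaac Lab trains across
# thousands of GPU-parallel simulated robots; this stand-in uses the standard
# Gymnasium "Humanoid-v4" benchmark and Stable-Baselines3 PPO instead.
import gymnasium as gym
from stable_baselines3 import PPO

def train_locomotion_policy(total_timesteps: int = 100_000) -> PPO:
    env = gym.make("Humanoid-v4")            # MuJoCo humanoid as a simulation stand-in
    model = PPO("MlpPolicy", env, verbose=1)  # policy improves by trial and error
    model.learn(total_timesteps=total_timesteps)
    model.save("humanoid_locomotion_policy")  # checkpoint for later evaluation
    return model

if __name__ == "__main__":
    train_locomotion_policy()
```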

That continuity is key. Engineers can test middleware, control systems, teleoperation, navigation, and full-body control in simulation, then deploy the same software stack on real robots with confidence. Navigation and manipulation policies are validated months ahead of on-site deployment, reducing surprises and accelerating iteration once the robots arrive in real environments.
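
One way to picture that continuity: the control stack is written against a single robot interface, and only the backend changes between the digital twin and the physical machine. The interface and class names below are hypothetical, not Humanoid's middleware.

```python
# Illustrative sketch of sim-to-real continuity: the control stack targets one
# interface, and only the backend differs between the digital twin and the
# physical robot. All names here are hypothetical.
from typing import Protocol
import numpy as np

class RobotBackend(Protocol):
    def read_state(self) -> np.ndarray: ...
    def send_command(self, joint_targets: np.ndarray) -> None: ...

class SimulatedRobot:
    """Backed by a digital twin (an Isaac Sim scene, in the real system)."""
    def read_state(self) -> np.ndarray:
        return np.zeros(24)
    def send_command(self, joint_targets: np.ndarray) -> None:
        pass  # forwarded to the simulator in a real implementation

class PhysicalRobot:
    """Backed by the robot's drivers; same interface, different transport."""
    def read_state(self) -> np.ndarray:
        return np.zeros(24)
    def send_command(self, joint_targets: np.ndarray) -> None:
        pass  # forwarded to motor controllers in a real implementation

def run_controller(robot: RobotBackend, steps: int = 100) -> None:
    # The same navigation / manipulation / whole-body code runs unchanged
    # whether `robot` is the simulated or the physical backend.
    for _ in range(steps):
        state = robot.read_state()
        command = -0.1 * state            # placeholder controller
        robot.send_command(command)

if __name__ == "__main__":
    run_controller(SimulatedRobot())      # validate in simulation first...
    run_controller(PhysicalRobot())       # ...then deploy the identical stack
```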

Simulation also shapes the hardware itself. By analyzing forces, motion, stability, and torque requirements in virtual environments, Humanoid can make informed decisions about actuator selection, joint strength, mass distribution, and balance before building physical prototypes. For the bipedal robot alone, the team evaluated six different leg configurations in simulation, narrowing down designs and reducing risk well ahead of manufacturing.
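
As a toy illustration of that kind of trade study, the snippet below sweeps a few made-up leg geometries and estimates the static knee torque each would need to hold in a deep squat. Humanoid's real analysis runs in full physics simulation; the configurations and single-joint statics here are invented for illustration only.

```python
# Toy illustration of simulation-driven design trade-offs: estimate the static
# knee torque each candidate leg geometry must hold during a deep squat.
# Configurations and the single-joint statics are made up for illustration.
from dataclasses import dataclass

G = 9.81  # gravitational acceleration, m/s^2

@dataclass
class LegConfig:
    name: str
    thigh_m: float            # thigh length in metres
    shank_m: float            # shank length in metres
    supported_mass_kg: float  # mass carried by one leg

def static_knee_torque(cfg: LegConfig, knee_offset_m: float) -> float:
    """Very rough single-joint estimate: torque = weight * horizontal lever arm."""
    return cfg.supported_mass_kg * G * knee_offset_m

candidates = [
    LegConfig("short-thigh", 0.38, 0.40, 35.0),
    LegConfig("long-thigh", 0.44, 0.40, 35.0),
    LegConfig("long-shank", 0.40, 0.46, 35.0),
]

for cfg in candidates:
    # Assume the centre of mass sits 30% of the thigh length ahead of the knee
    # at the deepest point of the squat (a placeholder assumption).
    torque = static_knee_torque(cfg, 0.3 * cfg.thigh_m)
    print(f"{cfg.name}: needs roughly {torque:.0f} N*m of knee torque")
```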

The payoff is speed. Humanoid designed and delivered its first alpha prototypes in just seven months, a dramatic compression of the typical eighteen-to-twenty-four-month hardware development cycle. That pace is further amplified by NVIDIA AI infrastructure, which Humanoid uses to train Vision-Language-Action models. What once required extensive post-training pipelines now takes just a few hours, allowing the team to move from training to deployment at remarkable speed.

In practical terms, this means tasks that traditionally took months and large teams of specialists can now be completed far more quickly. Today, Humanoid can train a policy from scratch and deploy it on a real robot within twenty-four hours. That kind of velocity changes how robotics development feels, shifting it closer to modern software iteration than traditional hardware engineering.

NVIDIA’s open libraries and frameworks also play a critical role in surfacing problems early. Using Isaac Sim, the team validated navigation and SLAM performance, stress-tested stability and torque limits, and optimized sensor and camera placement to ensure reliable perception in complex real-world environments. These insights feed directly back into both hardware and software decisions, tightening the feedback loop across the entire system.


This matters because Humanoid is not building robots to live only in demos or labs. As a commercially focused company, its strategy is to place robots into real operational environments as early as possible, gather feedback, and iterate fast. That approach is already producing results: Humanoid has received 20,500 pre-orders, completed six proofs of concept, and currently has three additional pilot programs underway.

According to Jarad Cannon, Chief Technology Officer at Humanoid, NVIDIA’s open robotics development platform is helping the industry move beyond legacy industrial communication standards and toward modern, software-defined systems. The company is working closely with NVIDIA and other partners on a new robotics networking system built on Jetson Thor and the Holoscan Sensor Bridge. The goal is to establish an open network standard for AI-enabled robots, one that could reshape how robotic systems communicate, scale, and evolve across industries.

That vision comes to life on the CES show floor. Humanoid will publicly demonstrate the HMND 01 Alpha robot at Schaeffler Booth 7301 in the West Hall of the Las Vegas Convention Center. Founder Artem Sokolov will present the robot performing a near-real-world industrial task: bin-picking metallic bearing rings from a cluttered environment in a near-production setting. Demonstrations will take place on Tuesday, January 6 at 10 a.m., 11 a.m., noon, 2 p.m., and 3 p.m.

It is a small task on the surface, but a telling one. Behind that bin-picking demo sits a broader signal about where humanoid robotics is headed: faster development cycles, AI-native systems, simulation-driven design, and robots that are increasingly ready to leave the lab and step into everyday environments.
