Humanoid Robots Prepare to Challenge Human Records at Beijing Android Half Marathon

Humanoid robots will compete in the world’s second half marathon for androids in Beijing, with teams aiming to dramatically improve performance and potentially challenge human running records.

By Rachel Whitman | Edited by Kseniia Klichova
Humanoid robots run during a long-distance endurance test ahead of Beijing’s android half marathon, where teams will evaluate locomotion, stability, and autonomous navigation. Photo: Beijing Humanoid Robot Innovation Center

Humanoid robots will return to Beijing next month for the world’s second half marathon designed specifically for androids, as developers push robotic locomotion toward levels once considered uniquely human.

Organizers say the event will bring together multiple robotics teams aiming to significantly improve performance compared with last year’s competition. Some developers have even suggested that robots may eventually challenge the pace of elite human runners.

The race has become an unusual but revealing testing ground for humanoid robotics. Unlike short demonstrations or controlled laboratory trials, long-distance running places sustained stress on nearly every component of a robot’s design – from motors and batteries to control algorithms and thermal management systems.

A Race to Close the Speed Gap

The inaugural humanoid half marathon in Beijing last year was won by Tiangong Ultra, a robot developed by the Beijing Humanoid Robot Innovation Center, which completed the course in two hours, 40 minutes, and 42 seconds.

That time remains far behind the human half-marathon world record of 57 minutes and 20 seconds, but robotics developers say progress has accelerated rapidly.
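A rough back-of-envelope calculation, using the standard half-marathon distance of 21.0975 km and the two times reported above, shows the size of the gap in average speed (the helper function here is purely illustrative):

```python
# Compare average speeds over the half-marathon distance (21.0975 km)
# using the finishing times reported in the article.

HALF_MARATHON_KM = 21.0975

def avg_speed_kmh(hours: int, minutes: int, seconds: int) -> float:
    """Average speed in km/h for a half marathon finished in the given time."""
    total_hours = hours + minutes / 60 + seconds / 3600
    return HALF_MARATHON_KM / total_hours

robot = avg_speed_kmh(2, 40, 42)   # Tiangong Ultra: 2:40:42
human = avg_speed_kmh(0, 57, 20)   # human world record: 57:20

print(f"Robot: {robot:.1f} km/h, human record: {human:.1f} km/h")
print(f"Record pace is about {human / robot:.1f}x the robot's speed")
```

This works out to roughly 7.9 km/h for the robot versus about 22.1 km/h for the human record holder, so the record pace is still nearly three times the winning robot's average speed.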

Tang Jian, chief technology officer of the company behind last year’s winning robot, said teams have upgraded both hardware and software ahead of this year’s race. Improvements include stronger joint torque, higher explosive power, and redesigned cooling systems intended to maintain stable performance during extended high-speed movement.

Developers have also refined motion control algorithms to produce a gait closer to human running mechanics, improving energy efficiency and reducing mechanical strain over long distances.

Battery technology has also advanced. Some robots competing this year may be able to complete the race without stopping for recharging, a significant improvement over earlier endurance tests.

From Controlled Experiments to Autonomous Racing

Another major change this year involves how the robots navigate the course.

During the previous race, some machines relied on human pacemakers or remote control to guide their movement along the track. For the upcoming event, teams are shifting toward greater autonomy.

Participants are expected to use onboard perception systems combined with electronic mapping tools to interpret the environment and plan their own routes in real time.

This shift mirrors broader trends in robotics development, where autonomy and environmental awareness are becoming as important as raw mechanical capability.

The course itself is also expected to be more complex than last year’s route, introducing terrain variations that will test robots’ ability to adapt their movement dynamically.

Endurance as a Test for Real-World Robots

While the idea of robots running a marathon may appear symbolic, developers argue that endurance competitions provide valuable engineering insights.

Long-distance running tests the stability of locomotion systems, the reliability of sensors and controllers, and the efficiency of power management. These are the same factors that determine whether humanoid robots can eventually operate reliably in workplaces and public environments.

Developers say that sustained movement under real-world conditions can reveal weaknesses that shorter demonstrations fail to expose.

The event also highlights how quickly robotic athletic capabilities are advancing. Some robotics researchers believe humanoid robots could soon approach human-level sprint performance as well.

Unitree Robotics founder Wang Xingxing recently suggested that humanoid robots may eventually run a 100-meter sprint in under 10 seconds, a pace that would rival elite human athletes.

Whether robots can reach such milestones remains uncertain. But as competitions like Beijing’s android marathon continue to evolve, they are increasingly serving as real-world laboratories for the development of faster, more capable humanoid machines.


Genesis AI Unveils GENE-26.5 Foundation Model and Human-Scale Robotic Hand for Dexterous Manipulation

Genesis AI has unveiled GENE-26.5, a robotics foundation model paired with a human-scale robotic hand and a data-collection glove that enables 1:1 skill transfer from humans to robots, targeting complex long-horizon manipulation tasks at commercial scale.

By Laura Bennett | Edited by Kseniia Klichova

Genesis AI has unveiled GENE-26.5, a robotics foundation model designed to give robots human-level dexterous manipulation capability, alongside a proprietary robotic hand and data-collection glove system built to generate training data at scale. The San Carlos, California-based company, which emerged from stealth with $105 million in funding last year, simultaneously announced that a first general-purpose robot built on the technology will be revealed soon.

The announcement addresses what Genesis AI frames as the central bottleneck in physical AI development: the shortage of high-quality manipulation data, and the gap between human hand capability and what robotic end effectors can reliably execute.

What GENE-26.5 Can Do

Genesis AI released a demonstration video showing GENE-26.5 performing a range of complex, multi-step manipulation tasks. These include cooking a 20-step meal involving chopping, one-handed egg cracking, and two-hand coordination; preparing a smoothie with mid-air serving; conducting laboratory experiments requiring pipetting and liquid transfer with delicate instrumentation; wire harnessing described by the company as one of the most difficult tasks in electronics manufacturing; solving a Rubik’s Cube through continuous in-air manipulation; simultaneously grasping four objects of varying sizes with one hand and sorting them into bins; and playing piano at human performance level.

The task range spans from domestic service to precision industrial applications – a deliberate demonstration that the model generalizes across contexts rather than being optimized for a single domain.

The Hardware System

The robotic hand mirrors the human hand in form and function, designed to close the embodiment gap that has historically limited robots’ ability to learn from human demonstration data. It pairs with a data-collection glove equipped with tactile-sensing electronic skin. When worn by a human operator, the glove creates a 1:1:1 mapping between the glove, the human hand, and the robotic hand – allowing human task execution to translate directly into robot training data without the lossy conversion that conventional teleoperation introduces.

Genesis AI says the glove costs 100 times less than typical data-collection hardware and achieves up to five times greater data collection efficiency than traditional teleoperation methods in internal testing. The company is engaging partners to deploy the glove in real-world work environments, where workers wearing the device during normal operations would continuously generate new categories of training data – building what Genesis AI describes as a potential global human skill library.

The data engine also draws on egocentric video from humans wearing cameras and large-scale internet video of human activity, giving the model exposure to the full range of how people interact with physical environments.

Simulation and the Sim-to-Real Gap

Genesis AI has developed a proprietary simulation system using hyper-realistic physics and rendering to narrow the gap between synthetic training environments and real-world conditions. The system allows teams to train and evaluate models significantly faster than physical testing, which is slow, expensive, and difficult to scale.

“General-purpose robotics stands to reshape the global economy while opening an entirely new chapter for AI,” said Eric Schmidt, former CEO of Google and an investor in Genesis AI. “This marks an important milestone for their team and the robotics industry more broadly.”

Genesis AI is backed by Eclipse, Bpifrance, and HSG, alongside Schmidt, Xavier Niel, and AI researchers Daniela Rus and Vladlen Koltun.


1X Technologies Begins Full-Scale NEO Humanoid Production at Hayward, California Facility

1X Technologies has launched full-scale production of its NEO humanoid robot at a 58,000-square-foot facility in Hayward, California, with consumer shipments scheduled for 2026 and early units already operating on the factory floor to generate training data.

By Rachel Whitman | Edited by Kseniia Klichova

1X Technologies has launched full-scale production of its NEO humanoid robot at a new 58,000-square-foot facility in Hayward, California. The factory is the primary manufacturing hub for NEO, a home-focused humanoid designed to operate quietly – at a noise level below that of a modern refrigerator – while navigating domestic spaces. Consumer shipments are scheduled to begin in 2026, with the first year’s production allocation already sold out: pre-orders filled within five days of the October launch.

The Hayward facility is vertically integrated, processing raw materials into finished components on-site rather than sourcing from global suppliers. 1X describes the operation as a “factory OS” – a real-time production management system covering every stage from raw input to completed robot.

What the Factory Produces

The production floor is organized around specialized zones. Copper coils are wound on automated lines to produce custom motors. A joint and limb assembly area constructs the robot’s tendon-driven actuators and 3D-lattice cushioned limbs. At a final integration stage, robots stand for the first time and receive their machine-washable nylon knit suits, available in tan, gray, and dark brown. A dedicated reliability lab subjects hardware to more than 20 million stress-test cycles to identify failure modes before units reach consumers.

Early NEO units are already operating inside the Hayward facility, handling internal logistics and stocking parts. The dual function is deliberate: the robots perform useful labor while generating real-world operational data that feeds back into training the NEO Cortex, the robot’s AI brain built on the NVIDIA Jetson Thor platform.

“Humanoid robots require high-performance, real-time AI inference and continuous training and testing in simulation for safe and reliable operation,” said Deepu Talla, Vice President of Robotics and Edge AI at NVIDIA. “By using NVIDIA Jetson Thor as the brain and the NVIDIA Isaac open robotics platform as its training ground, 1X is able to accelerate the development and deployment of intelligent robots like NEO that can work safely alongside humans.”

Strategic Positioning

1X Technologies, headquartered in Palo Alto and backed by Norwegian investors, is positioning the Hayward facility as evidence that consumer humanoid production can be built and scaled in the United States. The vertically integrated model gives 1X tighter control over component quality and supply chain risk than an assembly-only approach would allow, at the cost of higher capital requirements and operational complexity.

“This is more than just a factory opening – it’s proof that the future of humanoid robotics is being built right here in the U.S.,” said Bernt Børnich, founder and CEO of 1X. “We’re not dreaming about abundance; we’re manufacturing it.”

The consumer home robotics market that NEO is targeting remains largely unproven at scale. 1X is among the first companies globally to move a home humanoid from pre-order into production, placing it ahead of most competitors on manufacturing readiness while still carrying the uncertainty of whether consumer demand at the necessary price point will materialize as units reach homes later this year.


Morgan Stanley: China’s Humanoid Robot Lead Will Drive Its Global Manufacturing Share to 16.5% by 2030

A Morgan Stanley report argues that China’s early investment in humanoid robotics mirrors its EV strategy a decade ago, and projects the country’s share of global manufacturing will rise from 15% to 16.5% by 2030 as robotics deployment and supply chain dominance accelerate.

By Daniel Krauss | Edited by Kseniia Klichova
Humanoid robots operating on a Chinese manufacturing floor, part of a national strategy to expand automated production capacity across export industries. Photo: UBTECH Robotics

China’s early investment in humanoid robotics will help drive the next phase of its global manufacturing dominance, according to new research from Morgan Stanley. Economists led by Chetan Ahya, the bank’s chief Asia economist, project China’s share of global manufacturing will expand from 15% today to 16.5% by 2030, powered by humanoid robot deployment across its industrial base.

The report draws an explicit parallel with China’s EV strategy. A decade ago, China identified electric vehicles as a strategic growth driver and invested ahead of the curve in manufacturing capacity, battery technology, and supply chain control – a playbook that produced the world’s dominant EV and battery industries. Morgan Stanley’s economists argue that humanoid robotics has followed the same path.

Supply Chain as the Structural Advantage

China’s edge in humanoid robotics is not primarily a matter of individual robot performance but of supply chain depth. The country is building capacity across the full humanoid supply chain – from harmonic reducers and joint motors to sensors, actuators, and AI chips – giving it cost and scale advantages that competitors including the U.S., Japan, and South Korea cannot easily replicate in the near term, as those countries often rely on Chinese-manufactured components.

Government procurement is accelerating domestic adoption, creating a high-volume testing ground that is simultaneously a market and a development environment. Chinese tech parks, factories, and universities are among the most active humanoid deployment sites in the world, generating the operational data that improves AI systems faster than lab-based development allows.

“China has a track record of spotting the next big growth areas early and planning ahead,” Ahya wrote. “The robotics industry has followed a similar path.”

U.S. vs China: Different Approaches

The Morgan Stanley report contrasts the two countries’ strategic approaches. American firms, including Tesla, have focused on high-specification prototypes with an emphasis on testing and validation before scaling production. Chinese manufacturers have moved faster to deploy models at scale, using the domestic market as a live proving ground. The speed-versus-rigor tradeoff reflects different risk tolerances and capital structures – U.S. companies are more exposed to liability and regulatory scrutiny in production environments, while Chinese firms benefit from state support that absorbs some of the commercial risk of early deployment.

Protectionism as a Constraint

The report identifies trade restrictions as a meaningful risk to China’s humanoid export ambitions. Chinese EVs have encountered tariffs and market access barriers across the U.S., Europe, and other regions, and similar dynamics could emerge in robotics as the technology matures and strategic dependence concerns rise. However, the economists note that humanoid robotics is a newer industry with less existing domestic production to protect in importing countries, meaning the protectionist impulse may be slower to develop than it was with EVs.

The scale of the market at stake is significant. Morgan Stanley has separately projected the global humanoid robot market could reach $5 trillion by 2050. Whether China captures a dominant share of that market – or faces the same trade friction that has constrained its EV exports – will depend on how quickly other governments identify robotics as a strategic sector requiring protection.


Meta Develops Agentic AI Assistant and Internal Agent “Hatch” as It Accelerates Physical AI Push

Meta is developing a highly personalized agentic AI assistant powered by its Muse Spark model, alongside an internal agent codenamed Hatch, as the company builds toward an autonomous AI and robotics strategy across its consumer platforms.

By Laura Bennett | Edited by Kseniia Klichova
More than 1 billion people use Meta AI every month. Photo: Meta

Meta is developing a highly personalized AI assistant designed to carry out everyday tasks autonomously for its billions of users, the Financial Times reported on May 5, citing people familiar with the matter. The assistant is powered by Meta’s new Muse Spark AI model and is currently being tested internally. The goal, according to the report, is to build a product with capabilities similar to OpenClaw, the agentic AI system owned by OpenAI that can connect hardware and software tools and learn from the data it generates with minimal human intervention.

Separately, The Information reported that Meta is training an internal AI agent codenamed Hatch, also inspired by OpenClaw, with a target of completing internal testing by the end of June. Meta is also planning to integrate a standalone agentic shopping tool into Instagram, with a launch targeted before Q4 2026.

What Agentic AI Means for Meta’s Robotics Ambitions

The agentic assistant development is strategically connected to Meta’s accelerating push into physical AI. Last week, Meta acquired Assured Robot Intelligence, a humanoid robotics startup whose founding team – Lerrel Pinto and Xiaolong Wang – joined Meta’s Superintelligence Labs division. The ARI acquisition was explicitly framed around developing foundation models for whole-body humanoid control and robot self-learning.

An agentic AI layer that can autonomously manage tasks, connect tools, and learn from real-world data is a prerequisite for robots that operate independently in human environments. The same architecture that allows a digital assistant to book a restaurant, manage a calendar, and process a purchase can, in a physical AI context, allow a robot to perceive a task, plan an approach, and execute it without continuous human instruction. Meta’s simultaneous development of consumer-facing agentic software and humanoid robot intelligence reflects a coherent underlying strategy – building the AI capability stack from both ends.

Competitive Context

Meta faces intensifying competition in the agentic AI space. OpenAI’s OpenClaw has established a reference point that multiple companies are now building toward. Google, Apple, and Amazon are each developing agentic systems with varying degrees of hardware integration. For Meta, whose primary distribution channel is social media rather than devices or cloud infrastructure, the Instagram shopping agent represents the most immediate commercial application – a constrained, high-intent environment where agentic AI can generate measurable revenue.

The broader ambition – a personalized assistant that manages everyday tasks across Meta’s platforms for billions of users – requires the kind of persistent personalization and real-world task completion that current AI assistants have not reliably achieved at scale. Meta has not responded to requests for comment on the reports.

Agibot A2 Humanoid Robot Makes Fashion Debut at Met Gala Pre-Event with Alexander Wang

Agibot deployed its A2 humanoid robot at The Mark Hotel in New York ahead of the 2026 Met Gala, in collaboration with designer Alexander Wang, marking the first appearance of an embodied AI robot at a Met Gala pre-event.

By Daniel Krauss | Edited by Kseniia Klichova
A full-size humanoid robot posing for photographers outside a luxury hotel in New York during a high-profile fashion event, dressed and styled for a red carpet appearance. Photo: AGIBOT

Chinese robotics company Agibot deployed its A2 humanoid robot at The Mark Hotel in New York on May 5, ahead of the 2026 Met Gala, in a collaboration with fashion designer Alexander Wang. The appearance marked the first time an embodied AI robot has attended a Met Gala pre-event, drawing significant attention from media, photographers, and bystanders gathered outside the hotel – a traditional staging ground for celebrity arrivals before the evening’s main event.

The A2 robot posed for photographers, changed posture on request, held items, and served drinks to guests – a set of tasks that demonstrated real-world manipulation and crowd navigation capability in one of the more operationally unpredictable environments a humanoid robot has been deployed in publicly. At one point the robot became briefly stuck exiting an elevator and required assistance from handlers before continuing. The moment was captured on video and circulated widely on social media alongside footage of the robot’s red carpet posing.

A2’s Operational Profile

The A2 is Agibot’s full-size humanoid platform, built with human-like proportions and bipedal locomotion designed for stability in crowded, uncontrolled environments. Its perception and decision-making systems allowed it to navigate the dense scene outside The Mark – cameras, crowds, and continuous movement – without significant incident beyond the elevator delay. The drink-serving and object-handling tasks performed during the event reflect the same dexterous manipulation capabilities Agibot has been demonstrating in its industrial deployments at Longcheer Technology’s electronics manufacturing facility.

Technology Meets Fashion

The collaboration with Alexander Wang aligned with the Met Gala’s 2026 theme of “Fashion is Art”. For Agibot, the appearance served a dual purpose: demonstrating the A2’s capability in a chaotic, high-visibility public environment, and positioning embodied AI as a presence in cultural spaces beyond its established industrial context.

The Met Gala deployment is the latest in a series of public-facing appearances by Chinese humanoid robots – from the Beijing half-marathon to the Spring Festival gala – that serve as demonstration platforms in front of large audiences. The A2’s appearance at one of the most globally watched fashion events of the year extends that pattern into Western cultural territory, with the added dimension of a commercial fashion partnership providing the access and framing.

Agibot has previously stated its ambition to expand embodied AI deployment from manufacturing and logistics into service and consumer environments. The Met Gala appearance, whatever its promotional character, put the A2 in a setting that tested navigation, interaction, and object handling in conditions no factory floor replicates.
