Morgan Stanley: China’s Humanoid Robot Lead Will Drive Its Global Manufacturing Share to 16.5% by 2030

A Morgan Stanley report argues that China’s early investment in humanoid robotics mirrors its EV strategy a decade ago, and projects the country’s share of global manufacturing will rise from 15% to 16.5% by 2030 as robotics deployment and supply chain dominance accelerate.

By Daniel Krauss | Edited by Kseniia Klichova
Humanoid robots operating on a Chinese manufacturing floor, part of a national strategy to expand automated production capacity across export industries. Photo: UBTECH Robotics

China’s early investment in humanoid robotics will help drive the next phase of its global manufacturing dominance, according to new research from Morgan Stanley. Economists led by Chetan Ahya, the bank’s chief Asia economist, project China’s share of global manufacturing will expand from 15% today to 16.5% by 2030, powered by humanoid robot deployment across its industrial base.

The report draws an explicit parallel with China’s EV strategy. A decade ago, China identified electric vehicles as a strategic growth driver and invested ahead of the curve in manufacturing capacity, battery technology, and supply chain control – a playbook that produced the world’s dominant EV and battery industries. Morgan Stanley’s economists argue that humanoid robotics has followed the same path.

Supply Chain as the Structural Advantage

China’s edge in humanoid robotics is not primarily a matter of individual robot performance but of supply chain depth. The country is building capacity across the full humanoid supply chain – from harmonic reducers and joint motors to sensors, actuators, and AI chips – giving it cost and scale advantages that competitors including the U.S., Japan, and South Korea cannot easily replicate in the near term, as those countries often rely on Chinese-manufactured components.

Government procurement is accelerating domestic adoption, creating a high-volume testing ground that is simultaneously a market and a development environment. Chinese tech parks, factories, and universities are among the most active humanoid deployment sites in the world, generating the operational data that improves AI systems faster than lab-based development allows.

“China has a track record of spotting the next big growth areas early and planning ahead,” Ahya wrote. “The robotics industry has followed a similar path.”

U.S. vs China: Different Approaches

The Morgan Stanley report contrasts the two countries’ strategic approaches. American firms, including Tesla, have focused on high-specification prototypes with an emphasis on testing and validation before scaling production. Chinese manufacturers have moved faster to deploy models at scale, using the domestic market as a live proving ground. The speed-versus-rigor tradeoff reflects different risk tolerances and capital structures – U.S. companies are more exposed to liability and regulatory scrutiny in production environments, while Chinese firms benefit from state support that absorbs some of the commercial risk of early deployment.

Protectionism as a Constraint

The report identifies trade restrictions as a meaningful risk to China’s humanoid export ambitions. Chinese EVs have encountered tariffs and market access barriers across the U.S., Europe, and other regions, and similar dynamics could emerge in robotics as the technology matures and strategic dependence concerns rise. However, the economists note that humanoid robotics is a newer industry with less existing domestic production to protect in importing countries, meaning the protectionist impulse may be slower to develop than it was with EVs.

The scale of the market at stake is significant. Morgan Stanley has separately projected the global humanoid robot market could reach $5 trillion by 2050. Whether China captures a dominant share of that market – or faces the same trade friction that has constrained its EV exports – will depend on how quickly other governments identify robotics as a strategic sector requiring protection.


Genesis AI Unveils GENE-26.5 Foundation Model and Human-Scale Robotic Hand for Dexterous Manipulation

Genesis AI has unveiled GENE-26.5, a robotics foundation model paired with a human-scale robotic hand and a data-collection glove that enables 1:1 skill transfer from humans to robots, targeting complex long-horizon manipulation tasks at commercial scale.

By Laura Bennett | Edited by Kseniia Klichova

Genesis AI has unveiled GENE-26.5, a robotics foundation model designed to give robots human-level dexterous manipulation capability, alongside a proprietary robotic hand and data-collection glove system built to generate training data at scale. The San Carlos, California-based company, which emerged from stealth with $105 million in funding last year, simultaneously announced that a first general-purpose robot built on the technology will be revealed soon.

The announcement addresses what Genesis AI frames as the central bottleneck in physical AI development: the shortage of high-quality manipulation data, and the gap between human hand capability and what robotic end effectors can reliably execute.

What GENE-26.5 Can Do

Genesis AI released a demonstration video showing GENE-26.5 performing a range of complex, multi-step manipulation tasks. These include cooking a 20-step meal involving chopping, one-handed egg cracking, and two-hand coordination; preparing a smoothie with mid-air serving; conducting laboratory experiments requiring pipetting and liquid transfer with delicate instrumentation; wire harnessing described by the company as one of the most difficult tasks in electronics manufacturing; solving a Rubik’s Cube through continuous in-air manipulation; simultaneously grasping four objects of varying sizes with one hand and sorting them into bins; and playing piano at human performance level.

The task range spans from domestic service to precision industrial applications – a deliberate demonstration that the model generalizes across contexts rather than being optimized for a single domain.

The Hardware System

The robotic hand mirrors the human hand in form and function, designed to close the embodiment gap that has historically limited robots’ ability to learn from human demonstration data. It pairs with a data-collection glove equipped with tactile-sensing electronic skin. When worn by a human operator, the glove creates a 1:1:1 mapping between the glove, the human hand, and the robotic hand – allowing human task execution to translate directly into robot training data without the lossy conversion that conventional teleoperation introduces.

Genesis AI says the glove costs roughly one-hundredth as much as typical data-collection hardware and, in internal testing, achieved up to five times the data-collection efficiency of traditional teleoperation. The company is engaging partners to deploy the glove in real-world work environments, where workers wearing the device during normal operations would continuously generate new categories of training data – building what Genesis AI describes as a potential global human skill library.

The data engine also draws on egocentric video from humans wearing cameras and large-scale internet video of human activity, giving the model exposure to the full range of how people interact with physical environments.

Simulation and the Sim-to-Real Gap

Genesis AI has developed a proprietary simulation system using hyper-realistic physics and rendering to narrow the gap between synthetic training environments and real-world conditions. The system allows teams to train and evaluate models significantly faster than physical testing, which is slow, expensive, and difficult to scale.

“General-purpose robotics stands to reshape the global economy while opening an entirely new chapter for AI,” said Eric Schmidt, former CEO of Google and an investor in Genesis AI. “This marks an important milestone for their team and the robotics industry more broadly.”

Genesis AI is backed by Eclipse, Bpifrance, and HSG, alongside Schmidt, Xavier Niel, and AI researchers Daniela Rus and Vladlen Koltun.


1X Technologies Begins Full-Scale NEO Humanoid Production at Hayward, California Facility

1X Technologies has launched full-scale production of its NEO humanoid robot at a 58,000-square-foot facility in Hayward, California, with consumer shipments scheduled for 2026 and early units already operating on the factory floor to generate training data.

By Rachel Whitman | Edited by Kseniia Klichova

1X Technologies has launched full-scale production of its NEO humanoid robot at a new 58,000-square-foot facility in Hayward, California. The factory is the primary manufacturing hub for NEO, a home-focused humanoid designed to operate quietly – at a noise level below that of a modern refrigerator – while navigating domestic spaces. Consumer shipments are scheduled to begin in 2026, and the first year’s production allocation is already sold out, with pre-orders filling it within five days of the October launch.

The Hayward facility is vertically integrated, processing raw materials into finished components on-site rather than sourcing from global suppliers. 1X describes the operation as a “factory OS” – a real-time production management system covering every stage from raw input to completed robot.

What the Factory Produces

The production floor is organized around specialized zones. Copper coils are wound on automated lines to produce custom motors. A joint and limb assembly area constructs the robot’s tendon-driven actuators and 3D-lattice cushioned limbs. At the final integration stage, robots stand for the first time and receive their machine-washable nylon knit suits, available in tan, gray, and dark brown. A dedicated reliability lab subjects hardware to more than 20 million stress-test cycles to identify failure modes before units reach consumers.

Early NEO units are already operating inside the Hayward facility, handling internal logistics and stocking parts. The dual function is deliberate: the robots perform useful labor while generating real-world operational data that feeds back into training the NEO Cortex, the robot’s AI brain built on the NVIDIA Jetson Thor platform.

“Humanoid robots require high-performance, real-time AI inference and continuous training and testing in simulation for safe and reliable operation,” said Deepu Talla, Vice President of Robotics and Edge AI at NVIDIA. “By using NVIDIA Jetson Thor as the brain and the NVIDIA Isaac open robotics platform as its training ground, 1X is able to accelerate the development and deployment of intelligent robots like NEO that can work safely alongside humans.”

Strategic Positioning

1X Technologies, headquartered in Palo Alto and backed by Norwegian investors, is positioning the Hayward facility as evidence that consumer humanoid production can be built and scaled in the United States. The vertically integrated model gives 1X tighter control over component quality and supply chain risk than an assembly-only approach would allow, at the cost of higher capital requirements and operational complexity.

“This is more than just a factory opening – it’s proof that the future of humanoid robotics is being built right here in the U.S.,” said Bernt Børnich, founder and CEO of 1X. “We’re not dreaming about abundance; we’re manufacturing it.”

The consumer home robotics market that NEO is targeting remains largely unproven at scale. 1X is among the first companies globally to move a home humanoid from pre-order into production, placing it ahead of most competitors on manufacturing readiness while still carrying the uncertainty of whether consumer demand at the necessary price point will materialize as units reach homes later this year.


Meta Develops Agentic AI Assistant and Internal Agent “Hatch” as It Accelerates Physical AI Push

Meta is developing a highly personalized agentic AI assistant powered by its Muse Spark model, alongside an internal agent codenamed Hatch, as the company builds toward an autonomous AI and robotics strategy across its consumer platforms.

By Laura Bennett | Edited by Kseniia Klichova
More than 1 billion people use Meta AI every month. Photo: Meta

Meta is developing a highly personalized AI assistant designed to carry out everyday tasks autonomously for its billions of users, the Financial Times reported on May 5, citing people familiar with the matter. The assistant is powered by Meta’s new Muse Spark AI model and is currently being tested internally. The goal, according to the report, is to build a product with capabilities similar to OpenClaw, the agentic AI system owned by OpenAI that can connect hardware and software tools and learn from the data it generates with minimal human intervention.

Separately, The Information reported that Meta is training an internal AI agent codenamed Hatch, also inspired by OpenClaw, with a target of completing internal testing by the end of June. Meta is also planning to integrate a standalone agentic shopping tool into Instagram, with a launch targeted before Q4 2026.

What Agentic AI Means for Meta’s Robotics Ambitions

The agentic assistant development is strategically connected to Meta’s accelerating push into physical AI. Last week, Meta acquired Assured Robot Intelligence, a humanoid robotics startup whose founding team – Lerrel Pinto and Xiaolong Wang – joined Meta’s Superintelligence Labs division. The ARI acquisition was explicitly framed around developing foundation models for whole-body humanoid control and robot self-learning.

An agentic AI layer that can autonomously manage tasks, connect tools, and learn from real-world data is a prerequisite for robots that operate independently in human environments. The same architecture that allows a digital assistant to book a restaurant, manage a calendar, and process a purchase can, in a physical AI context, allow a robot to perceive a task, plan an approach, and execute it without continuous human instruction. Meta’s simultaneous development of consumer-facing agentic software and humanoid robot intelligence reflects a coherent underlying strategy – building the AI capability stack from both ends.

Competitive Context

Meta faces intensifying competition in the agentic AI space. OpenAI’s OpenClaw has established a reference point that multiple companies are now building toward. Google, Apple, and Amazon are each developing agentic systems with varying degrees of hardware integration. For Meta, whose primary distribution channel is social media rather than devices or cloud infrastructure, the Instagram shopping agent represents the most immediate commercial application – a constrained, high-intent environment where agentic AI can generate measurable revenue.

The broader ambition – a personalized assistant that manages everyday tasks across Meta’s platforms for billions of users – requires the kind of persistent personalization and real-world task completion that current AI assistants have not reliably achieved at scale. Meta has not responded to requests for comment on the reports.

Agibot A2 Humanoid Robot Makes Fashion Debut at Met Gala Pre-Event with Alexander Wang

Agibot deployed its A2 humanoid robot at The Mark Hotel in New York ahead of the 2026 Met Gala, in collaboration with designer Alexander Wang, marking the first appearance of an embodied AI robot at a Met Gala pre-event.

By Daniel Krauss | Edited by Kseniia Klichova
A full-size humanoid robot posing for photographers outside a luxury hotel in New York during a high-profile fashion event, dressed and styled for a red carpet appearance. Photo: AGIBOT

Chinese robotics company Agibot deployed its A2 humanoid robot at The Mark Hotel in New York on May 5, ahead of the 2026 Met Gala, in a collaboration with fashion designer Alexander Wang. The appearance marked the first time an embodied AI robot has attended a Met Gala pre-event, drawing significant attention from media, photographers, and bystanders gathered outside the hotel – a traditional staging ground for celebrity arrivals before the evening’s main event.

The A2 robot posed for photographers, changed posture on request, held items, and served drinks to guests – a set of tasks that demonstrated real-world manipulation and crowd navigation capability in one of the more operationally unpredictable environments a humanoid robot has been deployed in publicly. At one point the robot became briefly stuck exiting an elevator and required assistance from handlers before continuing. The moment was captured on video and circulated widely on social media alongside footage of the robot’s red carpet posing.

A2’s Operational Profile

The A2 is Agibot’s full-size humanoid platform, built with human-like proportions and bipedal locomotion designed for stability in crowded, uncontrolled environments. Its perception and decision-making systems allowed it to navigate the dense scene outside The Mark – cameras, crowds, and continuous movement – without significant incident beyond the elevator delay. The drink-serving and object-handling tasks performed during the event reflect the same dexterous manipulation capabilities Agibot has been demonstrating in its industrial deployments at Longcheer Technology’s electronics manufacturing facility.

Technology Meets Fashion

The collaboration with Alexander Wang aligned with the Met Gala’s 2026 theme of “Fashion is Art.” For Agibot, the appearance served a dual purpose: demonstrating the A2’s capability in a chaotic, high-visibility public environment, and positioning embodied AI as a presence in cultural spaces beyond its established industrial context.

The Met Gala deployment is the latest in a series of public-facing appearances by Chinese humanoid robots – from the Beijing half-marathon to the Spring Festival gala – that serve as demonstration platforms in front of large audiences. The A2’s appearance at one of the most globally watched fashion events of the year extends that pattern into Western cultural territory, with the added dimension of a commercial fashion partnership providing the access and framing.

Agibot has previously stated its ambition to expand embodied AI deployment from manufacturing and logistics into service and consumer environments. The Met Gala appearance, whatever its promotional character, put the A2 in a setting that tested navigation, interaction, and object handling in conditions no factory floor replicates.


Boston Dynamics Demonstrates Production-Ready Atlas Performing Handstand and Gymnastics in First Live Video

Boston Dynamics has released the first live demonstration video of its production-ready Atlas humanoid robot, showing it performing a handstand and L-sit, as Hyundai accelerates toward a 2028 commercial deployment target for its manufacturing plants.

By Rachel Whitman | Edited by Kseniia Klichova
Boston Dynamics Atlas named 'Best Robot' in Best of CES™ 2026 awards by CNET Group. Photo: Hyundai Motor Group

Boston Dynamics released a video on May 6 showing its production-ready Atlas humanoid robot performing a handstand and transitioning into an L-sit – a static hold in which the robot supports its full body weight on its hands alone. The footage marks the first live demonstration of the mass-produced version of Atlas in motion, a distinction that separates it from earlier research prototype demonstrations of backflips and parkour.

The release comes as Hyundai, which acquired Boston Dynamics in 2021, faces mounting pressure to deliver humanoid robots at industrial scale. Hyundai shares closed 2% higher in Seoul on May 6 and are up more than 85% in 2026, driven in part by the Atlas iteration unveiled at CES in January.

Hardware Specifications

The production-ready Atlas features human-scale hands with tactile sensing and fully rotational joints. It can lift up to 50 kilograms and operate across a temperature range of minus 20 to 40 degrees Celsius – a thermal envelope designed to cover the conditions found in automotive manufacturing environments. The fully rotational joints allow the robot to achieve positions beyond the limits of human skeletal anatomy, which is what enables the handstand and L-sit movements demonstrated in the video.

The maneuvers are not presented as commercial use cases but as demonstrations of balance precision and joint control – capabilities that translate directly into the stability and repeatability required for complex industrial assembly tasks.

Commercial Deployment Timeline

Hyundai plans to deploy Atlas at its manufacturing plants beginning in 2028, including at its facility in the U.S. state of Georgia. The company has invested billions of dollars into its robotics business, and the commercialization of Atlas is central to Hyundai’s broader repositioning as a mobility solutions provider amid tariff pressure and intensifying competition from Chinese automakers.

Yoo Jiwoong, an analyst at DAOL Investment and Securities, described the video as the clearest evidence yet that Hyundai is approaching commercial scale for humanoid production. Boston Dynamics is currently producing approximately four Atlas robots per month, a figure that remains well below the 30,000-unit annual capacity it announced at CES – a gap that reflects the engineering and manufacturing challenges of the transition from research hardware to production volume.

Boston Dynamics acknowledged the difficulty of that transition in a post on X: “Balancing commercial goals and robotics research can be tricky, but with Atlas we’re making it work.”

Competitive Context

The humanoid robot market that Hyundai and Boston Dynamics are entering is increasingly competitive. Chinese manufacturers currently lead in production volume and deployment scale, with companies including Agibot, Unitree, and UBTECH shipping thousands of units annually. Tesla’s Optimus program is targeting volume production from July 2026. For Boston Dynamics, which built its reputation on research robotics, the transition to commercial manufacturing at automotive scale represents a fundamental operational shift – one the May 6 video is designed, at least in part, to signal is underway.
