Japan Targets Humanoid Robot Mass Production by 2027

Japan has launched a national effort to mass-produce humanoid robots by 2027, bringing together universities, electronics firms, and government support to address labor shortages and restore its robotics leadership.

By Daniel Krauss | Edited by Kseniia Klichova
A collaborative alliance of Japanese universities and electronics firms is developing domestically produced humanoid robots for disaster response and industrial use, marking a national push into physical AI. Photo: Waseda University

Japan is preparing to mass-produce humanoid robots by 2027 as part of a coordinated effort between universities, electronics manufacturers, and semiconductor companies. The initiative reflects a broader national strategy to address labor shortages while reestablishing Japan’s role in a rapidly advancing global robotics sector.

The effort, led by a consortium known as the Kyoto Humanoid Association, brings together robotics developers such as Waseda University and tmsuk alongside major industrial firms including Murata Manufacturing, Renesas Electronics, and Sumitomo Heavy Industries. The group aims to complete a working prototype by March 2026, followed by commercial-scale manufacturing the following year.

The project marks one of Japan’s most ambitious attempts in recent decades to translate its robotics expertise into large-scale deployment of humanoid systems capable of operating in real-world environments.

A National Effort to Address Labor Shortages

Japan’s humanoid robotics initiative is driven in part by demographic pressure. The country’s working-age population continues to shrink as birth rates decline and the population ages, creating persistent labor shortages in construction, manufacturing, infrastructure maintenance, and emergency response.

Recent labor reforms limiting overtime hours have intensified workforce constraints, accelerating interest in automation solutions capable of operating in human-designed environments. Humanoid robots are seen as uniquely suited for such roles because their physical structure allows them to use existing tools, navigate standard buildings, and perform tasks without requiring major infrastructure changes.

The consortium is developing two primary humanoid platforms. One is a large disaster-response robot standing approximately 250 centimeters tall with the ability to lift loads exceeding 50 kilograms. The second is a research-oriented humanoid with human-like proportions and greater mobility, designed to accelerate software and AI development.

Government support is expected to play a central role in scaling these systems. Japan’s Cabinet Office plans to introduce a national AI Robotics Strategy in fiscal year 2026, aimed at accelerating deployment and creating early demand in areas such as disaster response, inspection, and infrastructure maintenance. The government’s Moonshot Research and Development Program is also funding work toward a general-purpose humanoid AI platform by 2030.

Physical AI Becomes a Strategic Priority

The initiative reflects Japan’s recognition that robotics is entering a new phase defined by the integration of artificial intelligence and physical systems, often described as physical AI. While Japan historically led humanoid robotics through projects such as Honda’s ASIMO and Murata Manufacturing’s balancing robot, recent advances in AI-driven perception and motion control have shifted momentum toward companies in the United States and China.

Advances in machine learning, sensor technology, and simulation have made humanoid robots more viable for practical applications, but the key challenge remains data. Humanoid robots must learn to operate in complex environments by gathering visual and spatial information and adjusting their movements in real time. Training these systems requires large volumes of real-world data collected from human-centered perspectives.

Industry leaders involved in the consortium emphasize that collaboration is essential to overcoming these challenges. Japanese companies are contributing specialized technologies, including precision motors, sensors, and microcontrollers, to accelerate development and ensure compatibility across hardware and software layers.

This coordinated approach differs from earlier robotics efforts that were often driven by individual companies. By pooling expertise across universities, semiconductor firms, and heavy industry, Japan is attempting to build a scalable ecosystem capable of competing internationally.

A Race to Reclaim Robotics Leadership

Japan’s move comes amid intensifying global competition in humanoid robotics. Companies such as Tesla, Figure AI, and Agility Robotics in the United States, along with rapidly advancing Chinese robotics firms, are pushing toward commercial deployment of humanoid systems in factories and logistics operations.

For Japan, the current initiative represents both an economic opportunity and a strategic necessity. Humanoid robots could help stabilize productivity as the labor force declines while strengthening domestic capabilities in semiconductor integration, AI, and precision manufacturing.

The success of the effort will depend on whether Japan can move beyond prototype development and achieve reliable, cost-effective production at scale. Mass production by 2027 would mark a shift from demonstration systems to deployable machines capable of performing meaningful work in industry and public infrastructure.

If successful, the initiative could signal Japan’s reemergence as a central player in humanoid robotics, this time driven not only by mechanical engineering but by the convergence of AI, semiconductor technology, and industrial-scale manufacturing.


A&K Robotics Raises CAD $8 Million to Scale Autonomous Passenger Mobility in Airports

A&K Robotics has closed a CAD $8 million Series A round led by BDC’s Industrial Innovation Venture Fund and Vantage Futures, funding the expansion of its Cruz autonomous mobility pods into permanent airport deployments globally.

By Daniel Krauss | Edited by Kseniia Klichova
An autonomous passenger mobility pod navigating a busy airport terminal, carrying a traveler toward a gate through crowded indoor space. Photo: A&K Robotics

A&K Robotics, a Canadian company building autonomous mobility infrastructure for airports, has closed a CAD $8 million Series A investment. The round was led by BDC Capital’s Industrial Innovation Venture Fund and Vantage Futures, the corporate venture arm of Vantage Group, a global airport and transportation infrastructure operator. Additional investors include RiSC Capital, Grep VC, Nimbus Synergies, and Dan Gelbart, co-founder of Creo and Kardium.

The funding will support A&K’s transition from pilot programs to permanent deployments, the expansion of production capacity, and the acceleration of adoption across major airport networks globally.

The Problem and the Product

Roughly 17% of the global population lives with mobility challenges, and requests for airport assistance are growing 10-15% annually – a rate that consistently outpaces overall passenger growth. Airports face a structural gap between rising accessibility demand and the labor available to address it.

A&K’s response is Cruz, a self-driving mobility pod purpose-built for dense indoor environments. Passengers select a destination and the vehicle navigates autonomously using onboard sensors and AI, dynamically adjusting its path to move safely through crowds. The system is designed for continuous operation, enabling airports to deliver consistent accessible mobility without scaling headcount proportionally.

Cruz is powered by the company’s Kinesos AI platform, which A&K describes as a system for socially intelligent autonomy – enabling vehicles to move naturally through dynamic, unpredictable human environments rather than following fixed routes or requiring controlled corridors.

Current Deployments

Cruz is already operating in live airport environments. Deployed customers include Vancouver International Airport, ranked Best Airport in North America 15 times by aviation research firm Skytrax, and Madrid-Barajas Airport, operated by Aena – the world’s largest airport operator by passenger volume, serving more than 380 million travelers annually. The combination of a North American and a major European reference deployment gives A&K a geographic footprint that supports broader enterprise sales conversations.

“While others focus on roads, we’re tackling the harder problem – navigating dense, unpredictable airport crowds,” said Jessica Yip, COO of A&K Robotics. “Autonomous mobility is already standard in warehouses. We are bringing it into the most complex indoor environments.”

Expanding Manufacturing Capacity

The Series A will fund the establishment of a third facility in Surrey, British Columbia, expanding into a 55,000 square-foot site at Manterra Technologies. The move is expected to increase manufacturing output from dozens to hundreds of autonomous vehicles per year. A new rapid prototyping and R&D facility has also been established to accelerate iteration on deployment-ready systems.

The manufacturing expansion reflects a deliberate shift in focus from technology validation to production scale. Airport operators evaluating permanent deployments require supply chain certainty that pilot-stage companies cannot always provide – the new facility is designed to address that constraint directly.

“Their ability to deploy in dense, high-traffic airport environments positions them as a key partner for operators looking to improve operational efficiency, enhance passenger experience, and scale autonomous mobility across global networks,” said Matthew Handford, Executive Managing Director at Vantage Futures.


Zoomlion Debuts Robot Ops Platform at Hannover Messe 2026, Targeting Industrial AI Deployment

Zoomlion has made the global debut of Robot Ops, an embodied intelligence operating system designed to standardize and accelerate robot deployment across industrial, construction, and logistics applications, at Hannover Messe 2026.

By Laura Bennett | Edited by Kseniia Klichova
A wheeled humanoid robot and logistics mobile robot collaborating on a sorting task during a live demonstration at an industrial technology exhibition. Photo: Zoomlion

Zoomlion Heavy Industry Science and Technology, the Hong Kong-listed Chinese industrial machinery manufacturer, made the global debut of Robot Ops at Hannover Messe 2026 this week. The platform is a full-stack embodied intelligence operating system designed to standardize the development and deployment of robots across industrial, logistics, construction machinery, and autonomous driving applications.

The launch positions Zoomlion – better known for cranes, concrete equipment, and agricultural machinery – as an active participant in the industrial AI software layer, not just a hardware maker. The company is exhibiting at the show alongside Amazon Web Services and is participating in the China Pavilion’s Invest in China launch ceremony.

What Robot Ops Does

Robot Ops is built around an engineering concept the company describes as “Data, Software, and Agents,” integrating three operational disciplines – DevOps, DataOps, and AgentOps – into a unified platform. The system covers the full lifecycle of robot deployment: data collection, model training, simulation verification, application development, and ongoing deployment maintenance.

The platform comprises four modules covering basic development tools, imitation learning, reinforcement learning, and task orchestration. Zoomlion says the system improves closed-loop iteration efficiency by more than 50% and is designed to lower the technical barrier for organizations building and deploying robotic systems at scale.
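The closed-loop lifecycle described above – data collection feeding training, simulation gating deployment, and deployment generating new data – can be sketched as a simple orchestration loop. This is an illustrative sketch only; every name and function below is a hypothetical stand-in, not a Zoomlion Robot Ops API.

```python
# Hypothetical sketch of a closed-loop robot-deployment lifecycle of the kind
# Robot Ops describes. All identifiers here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class LifecycleState:
    iteration: int = 0
    dataset: list = field(default_factory=list)
    model_version: int = 0
    deployed: bool = False

def collect_data(state: LifecycleState) -> None:
    # A real system would pull sensor logs from deployed robots.
    state.dataset.append(f"batch-{state.iteration}")

def train_model(state: LifecycleState) -> None:
    # Stand-in for imitation- or reinforcement-learning training runs.
    state.model_version += 1

def verify_in_simulation(state: LifecycleState) -> bool:
    # Gate deployment on simulated task success; stubbed as always-pass here.
    return True

def run_iteration(state: LifecycleState) -> LifecycleState:
    collect_data(state)
    train_model(state)
    if verify_in_simulation(state):
        state.deployed = True
    state.iteration += 1
    return state

state = LifecycleState()
for _ in range(3):
    state = run_iteration(state)
print(state.model_version, len(state.dataset))  # → 3 3
```

The point of the loop structure is that each deployment round produces the data for the next training round, which is the "closed-loop iteration" the platform claims to speed up.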

The platform directly targets four challenges that have slowed industrial robotics adoption: high technical barriers to entry, difficulty migrating robot behaviors across different scenarios, data bottlenecks in training pipelines, and the absence of structured lifecycle management tools.

Live Demonstration at Hannover

At the show, Zoomlion is running live multi-robot demonstrations under Robot Ops scheduling. A wheeled humanoid robot and a logistics mobile robot collaborate on a logistics-sorting scenario, with the platform managing algorithm coordination, task orchestration, and on-site execution across both systems simultaneously. The company’s first-generation mass-produced humanoid robot, the Z1, is also on display performing dynamic motion-control demonstrations.

The multi-robot setup is designed to demonstrate Robot Ops’ capacity to coordinate heterogeneous robot types – different hardware, different task profiles – within a single orchestration layer, which is the core engineering claim the platform is built around.

Broader Industrial AI Context

Zoomlion is also presenting its Industry 5.0 intelligent manufacturing solutions at the show, including its Smart Industrial City initiative, which integrates digital twins, intelligent scheduling, industrial AI, and end-to-end logistics automation into manufacturing operations.

The Robot Ops debut reflects a pattern visible across Hannover Messe 2026 more broadly: established industrial companies using the event to announce software and AI platforms that sit above their existing hardware operations, targeting the orchestration and deployment layer rather than competing purely on robot specifications. For Zoomlion, whose core business is heavy construction equipment, the move into embodied intelligence software represents a deliberate effort to participate in the higher-margin, faster-growing segment of the industrial automation stack.


Agibot Declares 2026 “Deployment Year One”, Unveils Five Robot Platforms and Open AI Architecture

Agibot used its annual partner conference to declare 2026 the first year of large-scale commercial deployment for embodied AI, unveiling five new robotic platforms, eight AI models, and an open-source development architecture called AIMA.

By Rachel Whitman | Edited by Kseniia Klichova
A lineup of humanoid and wheeled robotic platforms displayed at an embodied AI product conference. Photo: Agibot

Agibot used its annual partner conference, APC 2026, to declare 2026 as “Deployment Year One” for embodied AI – the point at which the industry transitions from validating robot capabilities to generating measurable productivity at scale. The company unveiled five new robotic platforms, eight AI models, and a full-stack open development architecture called AIMA, framing the announcements as the infrastructure layer for its next phase of commercialization.

The declaration is grounded in operational data. Agibot rolled out its 10,000th humanoid robot in March 2026, and its humanoid robot revenue grew more than 22-fold in 2025 to become the company’s largest revenue stream, according to figures the company has previously reported.

Five Platforms, One Unified Architecture

Agibot positioned itself at APC 2026 as the only company offering a full-series lineup spanning humanoids, wheeled platforms, and multi-form robots across different sizes and deployment scenarios. The five new platforms are built on what the company calls a “One Robotic Body with Three Intelligences” framework, integrating motion intelligence, interaction intelligence, and operation intelligence into a unified system.

The architecture is designed to address a core limitation of earlier robotic deployments: systems optimized for a single task or environment. By coupling perception, decision-making, and physical execution within one hardware and software stack, Agibot argues that robots can generalize across complex real-world environments rather than operating within narrowly defined parameters.

Seven Commercial Solutions

To support faster enterprise adoption, Agibot introduced seven standardized productivity solutions targeting specific industrial scenarios: loading and unloading, industrial handling, logistics sorting, guidance and retail assistance, retail service stations, security patrol, and industrial and commercial cleaning. Each solution bundles hardware, AI models, and data infrastructure into a repeatable deployment package, reducing the integration complexity that has historically extended robotics pilot timelines.

“The industry is moving from proving what robots can do, to proving what value they can consistently deliver at scale,” said Edward Deng, Founder, Chairman, and CEO of Agibot.

AIMA: An Open Architecture for Embodied AI

The most structurally significant announcement was the launch of AIMA – AI Machine Architecture – described as the first complete open technology system for embodied intelligence. Built on a “1+3+X” design, AIMA consists of a unified robot operating system called Link-U OS, three development platforms covering motion creation, interaction design, and task development, and an extensible layer supporting third-party applications and the AGIBOT Embodied Agent Framework.

The architecture provides an end-to-end toolchain from low-level system control to high-level application development, and Agibot intends to continue open-sourcing components to attract developers and partners. The company plans to invest more than 2 billion yuan over five years to expand the ecosystem, targeting partnerships with universities, industry operators, and a large-scale developer community.

A Three-Stage Industry Roadmap

Agibot also presented a long-term framework for the embodied AI industry structured around three development curves. The current phase, running through 2026, covers foundational development and early adoption. The period from 2026 to 2030 is characterized as a deployment growth stage, during which robot productivity is expected to approach human levels and scenario-based deployment scales significantly. From 2030 onward, the company projects a qualitative leap in generalization capability, collective intelligence, and robots beginning to surpass human productivity in selected domains.

The roadmap is a strategic positioning exercise as much as a technical forecast. With more than 150 humanoid robot manufacturers active in China alone, the company that can credibly claim platform and ecosystem leadership – rather than competing on individual robot specifications – is likely to capture a disproportionate share of the emerging enterprise market.


NVIDIA and Partners Demonstrate Production-Ready AI Manufacturing Systems at Hannover Messe 2026

NVIDIA and more than a dozen industrial partners used Hannover Messe 2026 to demonstrate AI-driven manufacturing systems already operating in live production environments, from humanoid robots in electronics factories to vision AI agents on automotive assembly lines.

By Daniel Krauss | Edited by Kseniia Klichova
Robotic systems and AI-driven automation equipment operating on a factory floor during a major industrial technology exhibition. Photo: HANNOVER MESSE

NVIDIA used Hannover Messe 2026 to present a coordinated set of industrial AI deployments across robotics, simulation, vision AI, and edge computing – alongside partners spanning Siemens, Microsoft, ABB, Dassault Systèmes, and a range of specialized software and hardware firms. The common thread across the demonstrations was an emphasis on systems already operating in production rather than technology previews, with several partners presenting quantified outcomes from live deployments.

The show ran April 20-24 in Hannover, Germany, and served as a staging ground for NVIDIA’s physical AI ecosystem, built around its Omniverse, Isaac, Jetson, and IGX compute platforms.

Humanoid Robots in Live Production

The most concrete robotics demonstration came from Humanoid, whose HMND 01 wheeled humanoid has completed autonomous logistics operations at a Siemens electronics factory in Erlangen, Germany – described as a first proof of concept within a live production environment. The robot runs NVIDIA’s Jetson Thor edge AI module for on-device compute and was developed using Isaac Sim and Isaac Lab for simulation and reinforcement learning.

A simulation-first development approach compressed what typically takes up to two years of hardware development down to seven months, according to Humanoid.

A second notable humanoid deployment involves Hexagon Robotics, whose AEON robot is preparing for assembly operations at a BMW plant in Leipzig – one of the first humanoid deployments in a German production environment. The system was developed using NVIDIA’s Physical AI Data Factory Blueprint and IGX Thor for industrial-grade edge compute with functional safety certification.

SCHUNK’s GROW automation cell demonstrated a standardized, deployable form of physical AI for small and medium-sized manufacturers. The system uses NVIDIA Omniverse and Isaac simulation to train and validate robot behavior before deployment, with Wandelbots’ NOVA platform managing continuous refinement on the shop floor and EY designing the operating model for European SME rollout.

Vision AI Agents on the Factory Floor

Several partners demonstrated vision AI systems built on NVIDIA’s Metropolis and Cosmos platforms, targeting quality control, safety monitoring, and operational intelligence.

Invisible AI launched its Vision Execution System at the show, an agent-based platform that captures and analyzes every production cycle in real time using the NVIDIA Metropolis VSS Blueprint and Cosmos Reason 2 models. The system is already deployed at major automotive manufacturers including Toyota.

Tulip Interfaces showcased Factory Playback, which synchronizes machine telemetry, operator workflows, quality events, and video into a searchable operational timeline. Terex, an industrial equipment manufacturer operating more than 40 plants, uses the platform and is projected to achieve a 3% yield increase and 10% reduction in rework.

Fogsphere demonstrated vision AI deployment in high-risk industrial environments, with Saipem using the platform to detect and respond in real time to safety and environmental events on energy infrastructure.

Sovereign AI Infrastructure

Underlying many of the deployments on display is the Industrial AI Cloud, built in Germany by Deutsche Telekom on NVIDIA infrastructure and designed as a sovereign AI platform for European industry. The facility provides a secure foundation for running AI workloads – from factory-scale digital twins to software-defined robotics – under European data governance requirements.

ABB, Dassault Systèmes, Kongsberg Digital, Microsoft, and Siemens each demonstrated digital twin capabilities built on NVIDIA Omniverse libraries, with applications ranging from real-time asset performance analysis to stress-testing factory configurations before physical changes are made.

QNX expanded its collaboration with NVIDIA to cover safety-critical edge AI, with QNX OS for Safety 8.0 now integrated on NVIDIA IGX Thor alongside the NVIDIA Halos safety stack – a combination targeting robotics, medical, and industrial applications where functional safety certification is a deployment requirement.

NEURA Robotics and AWS Partner to Scale Physical AI Training and Deploy Robots in Amazon Fulfillment Centers

NEURA Robotics and Amazon Web Services have announced a strategic collaboration to train, validate, and deploy cognitive robots at scale, with Amazon exploring deployment of NEURA systems in select fulfillment centers as a real-world data source for Physical AI development.

By Laura Bennett | Edited by Kseniia Klichova
A cognitive humanoid robot operating in a warehouse fulfillment environment alongside human workers. Photo: NEURA Robotics

NEURA Robotics, the German cognitive robotics company, and Amazon Web Services have announced a strategic collaboration to accelerate the development and global deployment of physical AI systems. AWS will serve as the primary cloud provider for NEURA’s Neuraverse platform, handling AI training, real-world data processing, and shared intelligence across robot fleets. Amazon will separately explore deploying NEURA robotic systems in select fulfillment centers, providing production-environment data to accelerate the development of new robotic capabilities in logistics and warehouse operations.

The partnership addresses what both companies describe as the central constraint on physical AI progress: the data gap. Unlike large language models trained on trillions of internet-sourced data points, robotic AI systems have access to a fraction of that volume, and the data they need can only be generated through real-world operation.

Three Areas of Collaboration

The agreement spans cloud infrastructure, AI development, and real-world validation. AWS will provide the computational backbone for the Neuraverse, NEURA’s platform for training and sharing robotic intelligence across fleets. NEURA Gym – a purpose-built training environment where robots practice complex tasks in controlled settings alongside high-fidelity simulation – will integrate with Amazon SageMaker to accelerate joint training pipelines across NEURA and partner use cases.

The real-world validation component is the most strategically significant element. Amazon’s fulfillment centers represent one of the most operationally demanding and data-rich environments available for robotic deployment – high throughput, variable product mix, and continuous operation at global scale. Each deployment generates the kind of sensor data, task variety, and edge-case exposure that controlled training environments cannot replicate.

“Physical AI will only reach its full potential if intelligence can be trained, validated, and continuously improved in the real world,” said David Reger, CEO and founder of NEURA Robotics. “With AWS, we gain the infrastructure to scale the Neuraverse globally. With Amazon, we have the opportunity to bring Physical AI into one of the most advanced operational environments in the world.”

The Data Infrastructure Problem

The collaboration is built around a structural challenge that applies across the robotics sector. Simulation can approximate physical environments but cannot fully replicate the variability of real-world conditions – surface irregularities, lighting changes, unexpected object configurations, and human interaction patterns. Continuous feedback loops between simulation and real-world deployment are required to close that gap over time.

AWS’s role is to make those loops faster and more scalable. By running the Neuraverse on cloud infrastructure with global reach, NEURA can distribute trained intelligence across its entire robot fleet in near real time, so improvements derived from one deployment environment propagate across all systems.
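The propagation pattern described here – improvements learned at one site merged centrally and pushed to the whole fleet – resembles federated-style parameter aggregation. The sketch below is a minimal illustration of that idea under assumed names; the averaging scheme, parameter keys, and functions are not NEURA or AWS APIs.

```python
# Illustrative sketch of fleet-wide intelligence propagation: per-site
# parameter deltas are averaged into a global model, then broadcast to
# every robot. All names and the merge rule are assumptions.

def merge_updates(global_params: dict, site_updates: list[dict]) -> dict:
    """Average per-parameter deltas from each site into the global model."""
    merged = dict(global_params)
    for key in merged:
        deltas = [update.get(key, 0.0) for update in site_updates]
        merged[key] += sum(deltas) / len(site_updates)
    return merged

def broadcast(fleet: list[dict], global_params: dict) -> None:
    """Push the merged parameters to every robot in the fleet."""
    for robot in fleet:
        robot["params"] = dict(global_params)

global_params = {"grasp_gain": 1.0, "nav_margin": 0.5}
site_updates = [
    {"grasp_gain": 0.2},                      # site A tuned grasping
    {"grasp_gain": 0.0, "nav_margin": 0.1},   # site B tuned navigation
]
fleet = [{"id": i, "params": {}} for i in range(3)]

global_params = merge_updates(global_params, site_updates)
broadcast(fleet, global_params)
print(global_params)  # → {'grasp_gain': 1.1, 'nav_margin': 0.55}
```

The design choice worth noting is the separation of merge and broadcast: a gain discovered in one deployment environment (site A's grasping tweak) reaches robots that never operated there, which is the "shared intelligence across robot fleets" the article describes.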

Ecosystem and Scale

The AWS partnership extends a network NEURA has been assembling across cloud, semiconductors, and industrial deployment. The company’s existing partners include Kawasaki – one of the ten largest robotics companies globally – alongside Schaeffler, Bosch, and Qualcomm Technologies. The stated goal is to enable millions of cognitive robots by 2030.

NEURA has not disclosed the financial terms of the AWS agreement, the timeline for Amazon fulfillment center deployments, or the specific robotic systems under consideration for those pilots. The fulfillment center component is framed as exploratory, meaning commercial deployment at scale remains contingent on performance outcomes from the initial trials.
