Siemens Partnership Launches UK’s First Customizable AMR Manufacturing Capability

Siemens and two robotics partners have launched the UK’s first fully customizable autonomous mobile robot manufacturing capability, aimed at modernizing factory logistics and industrial automation.

By Laura Bennett | Edited by Kseniia Klichova
A new Siemens-led collaboration is enabling the UK’s first fully customizable autonomous mobile robot manufacturing capability for industrial logistics. Photo: Siemens

Siemens has partnered with Expert Technologies Group and RMGroup to create the United Kingdom’s first manufacturing capability dedicated to fully customizable autonomous mobile robots (AMRs). The initiative aims to provide domestic manufacturers with flexible automation systems designed to improve factory logistics and material handling.

The collaboration marks a strategic effort to strengthen the UK’s industrial automation ecosystem by developing robotics systems locally rather than relying on imported technologies. By combining Siemens’ automation software with robotics integration expertise from the two partner companies, the project creates an end-to-end platform for deploying AMR systems in manufacturing and warehouse environments.

Autonomous mobile robots are increasingly used to move materials within factories and distribution centers, offering an alternative to traditional automated guided vehicles that rely on fixed tracks or dedicated infrastructure.

Flexible Robotics for Modern Manufacturing

AMRs differ from earlier factory transport systems by navigating environments using onboard sensors, real-time mapping, and intelligent path planning. This allows them to move dynamically through busy industrial settings while avoiding obstacles and adapting to layout changes.
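The dynamic path planning described above can be illustrated with a minimal sketch: a robot plans a route over an occupancy grid, a sensor marks a newly detected obstacle, and the robot replans around it. This is an illustrative A* example only, not the navigation stack used by the Siemens partnership; all names are hypothetical.

```python
import heapq

def astar(grid, start, goal):
    """A* search over a 2D occupancy grid (0 = free cell, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    def h(p):  # Manhattan-distance heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start, None)]
    came_from, cost = {}, {start: 0}
    while frontier:
        _, g, node, parent = heapq.heappop(frontier)
        if node in came_from:
            continue
        came_from[node] = parent
        if node == goal:
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g + 1
                if ng < cost.get(nxt, float("inf")):
                    cost[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), ng, nxt, node))
    return None

# Plan, then adapt to a layout change: a sensor detects an obstacle mid-route,
# so the cell is marked occupied and the route is recomputed.
grid = [[0] * 5 for _ in range(5)]
path = astar(grid, (0, 0), (4, 4))
grid[2][2] = 1  # newly detected obstacle
replanned = astar(grid, (0, 0), (4, 4))
```

Real AMR fleets layer traffic management and fleet coordination on top of per-robot planning like this, but the replan-on-sensing loop is the core idea that distinguishes AMRs from track-bound AGVs.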

The robots produced through the new partnership are designed to be customizable for different industrial environments. Manufacturers can configure systems to support tasks such as delivering components to assembly lines, transporting finished goods to storage areas, or supplying materials to production cells.

The technology platform integrates Siemens’ SIMOVE control software, which enables robots to coordinate movements and manage logistics operations across facilities. Expert Technologies Group contributes its FlexDrive AMR platform, providing modular drive systems and navigation capabilities. RMGroup adds robotics integration and safety systems designed for industrial environments.

Together, these technologies allow factories to deploy robotics fleets tailored to their operational requirements, rather than adapting workflows to standardized equipment.

Building Domestic Robotics Capability

The partnership reflects growing interest in strengthening domestic robotics manufacturing in the UK. Many automation systems used in British factories are imported, which can create challenges related to integration, maintenance, and technical support.

By building robots locally and supporting them with domestic engineering teams, the collaboration aims to provide manufacturers with more reliable deployment and long-term support. The partners say this approach addresses a common problem in automation projects where overseas suppliers cannot provide sufficient integration support.

Local development also allows robotics systems to be adapted more easily to specific industry requirements, including aerospace, automotive, food processing, and logistics operations.

In addition to robotics hardware, the system incorporates wireless connectivity technologies such as industrial Wi-Fi and 5G to support real-time communication between robots, factory infrastructure, and digital management systems.

Autonomous Logistics Becomes Central to Industrial Automation

Material movement inside factories and warehouses represents a significant operational challenge for manufacturers. Traditionally, these tasks rely heavily on manual labor or fixed automation systems that are difficult to modify when production changes.

Autonomous mobile robots offer a more adaptable solution by enabling dynamic logistics operations. Fleets of AMRs can coordinate movements, optimize routes, and respond to changing workloads without requiring major infrastructure modifications.

These capabilities also allow manufacturers to scale automation gradually, starting with small deployments and expanding fleets as operational needs grow.

As manufacturing becomes more digital and data-driven, AMRs are increasingly integrated with digital twin simulations and factory management systems. This allows companies to analyze workflows, optimize operations, and improve productivity.

The Siemens-led collaboration represents a step toward building a domestic robotics ecosystem capable of supporting these advanced manufacturing technologies. By combining automation software, robotics hardware, and integration expertise, the partnership aims to help UK manufacturers deploy flexible, intelligent logistics systems as automation becomes an essential component of modern industrial production.

Business & Markets, News, Robots & Robotics, Science & Tech

China Establishes First National Standards for Humanoid Robots

China has introduced its first national standard system for humanoid robotics, aiming to unify technical specifications and accelerate commercial deployment across industries.

By Laura Bennett | Edited by Kseniia Klichova
Officials and industry experts gather in Beijing to unveil China’s first national standard system for humanoid robotics, aimed at accelerating commercialization and ensuring safety alignment.

China has formally introduced its first national standard system for humanoid robotics, marking a coordinated effort to structure one of the country’s fastest-growing technology sectors.

The framework was unveiled at the Humanoid Robots and Embodied Intelligence Standardization meeting in Beijing. It establishes unified technical guidelines intended to streamline development, reduce fragmentation, and accelerate the transition from pilot projects to commercial deployment.

The move signals that policymakers view humanoid robotics not as an experimental field, but as an emerging industrial category requiring formal governance.

Six Pillars for Industrial Alignment

The standard system is organized around six core pillars: foundational and common standards, neuromorphic and intelligent computing, limbs and key components, full-system integration, application scenarios, and safety and ethics.

Together, these categories define technical specifications, interface protocols, and evaluation benchmarks. Committee experts involved in the initiative said the goal is to reduce coordination friction between suppliers, lower production costs, and shorten iteration cycles across the value chain.

By clarifying interfaces and performance metrics, the framework is designed to enable interoperability between hardware platforms, software systems, and embodied AI models. It also embeds safety and ethical considerations into early-stage development, reflecting regulatory awareness as robots move into workplaces and homes.

From Prototypes to Scaled Deployment

According to China’s Ministry of Industry and Information Technology, 2024 marked the country’s first year of humanoid robot mass production. More than 140 domestic companies released over 330 models, with deployments expanding into manufacturing, household services, healthcare, and elderly care.

Until now, much of that growth has occurred in a relatively fragmented environment, with companies developing proprietary architectures and evaluation criteria. National standards are expected to impose structure on a rapidly expanding ecosystem.

The framework could also serve a strategic function. As Chinese firms compete globally in embodied AI and humanoid robotics, standardized technical benchmarks may strengthen export readiness and ecosystem coordination.

While many humanoid deployments remain in early stages, the introduction of national standards suggests the industry is entering a new phase, where commercialization and regulatory alignment advance in parallel.

News, Policy & Regulation, Robots & Robotics

University of Southampton Develops Adaptive Robot Fin for Underwater Stability

Researchers at the University of Southampton have developed a flexible robotic fin with embedded electronic skin that automatically adapts to changing water currents, improving underwater robot stability and efficiency.

By Daniel Krauss | Edited by Kseniia Klichova
The adaptive robotic fin developed at the University of Southampton integrates electronic skin and hydraulic actuation to automatically counteract flow disturbances in underwater environments. Photo: University of Southampton

Autonomous underwater vehicles are built to withstand unpredictable ocean conditions, but their rigid fins often require significant energy to counteract sudden currents and turbulence. Researchers at the University of Southampton are proposing a different approach: fins that sense water flow and adjust their shape in real time.

The team has developed a flexible robotic fin embedded with electronic skin capable of detecting subtle changes in water movement. The system automatically modifies the fin’s stiffness and curvature to stabilize underwater robots while reducing energy consumption.

The research, published in npj under the title “Harnessing proprioception in aquatic soft wings enables hybrid passive-active disturbance rejection,” reflects a broader push toward soft robotics and adaptive control in marine environments.

Inspired by Biological Sensing

The design draws from biological proprioception mechanisms observed in birds and fish. Birds detect airflow changes through sensory feedback in their feathers, while fish rely on lateral line systems and fin rays to perceive water disturbances.

To replicate similar sensing capabilities, the Southampton engineers embedded flexible liquid metal wiring inside a silicone fin. When water flow deforms the fin, the integrated electronic skin registers changes in electrical resistance. These signals are transmitted to a hydraulic system inside the robot’s body, which adjusts internal pressure through connected hoses to alter the fin’s shape.

Rather than relying solely on active propulsion corrections, the system combines passive flexibility with active hydraulic adjustment.
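The sense-and-adjust loop described above can be sketched as a simple proportional controller: a resistance reading from the electronic skin is compared against an undeformed baseline, and the deviation is mapped to a small, clamped hydraulic pressure correction. This is a hedged illustration of the principle, not the Southampton team's actual controller; the function name, gain, and values are hypothetical.

```python
def pressure_correction(resistance, baseline, gain=0.5, max_step=1.0):
    """Map a strain-induced resistance deviation to a hydraulic pressure
    adjustment; the sign of the deviation indicates bending direction."""
    error = resistance - baseline
    step = gain * error
    # Clamp the command so active correction stays small; the fin's
    # passive flexibility absorbs the rest of the disturbance.
    return max(-max_step, min(max_step, step))

# Simulated readings: a flow disturbance bends the fin, raising resistance,
# then the flow subsides and the reading returns to baseline.
baseline = 10.0
readings = [10.0, 10.4, 11.2, 10.6, 10.0]
corrections = [pressure_correction(r, baseline) for r in readings]
```

The clamp reflects the hybrid passive-active idea in the paper's title: the actuator only trims what the compliant structure cannot reject on its own.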

Reducing Energy Use in Turbulent Waters

Rigid AUVs typically expend substantial energy to maintain orientation when struck by waves or shifting currents. According to the researchers, the adaptive fin significantly improves disturbance rejection.

In controlled tests, the fin reduced unwanted buoyancy effects caused by sudden water flow by 87 percent compared with a similar vehicle using rigid fins. The robot demonstrated improved self-stabilization and maneuverability while consuming less energy to maintain position.

The findings suggest potential advantages for underwater inspection, environmental monitoring, and defense applications where energy efficiency and stability are critical.

Technical Constraints Remain

Despite promising results, integration challenges remain. Scaling the flexible system to larger vehicles and embedding it into rigid hull designs could complicate deployment. Long-term durability of the electronic skin and hydraulic components in harsh marine environments also requires further validation.

The researchers note that more robust actuators and structural refinements may help address these constraints.

The project illustrates how bio-inspired sensing and soft robotics are reshaping underwater vehicle design. As offshore energy, marine research, and subsea infrastructure monitoring expand, adaptive control systems such as this may become increasingly relevant to improving endurance and operational stability in dynamic ocean conditions.

News, Robots & Robotics, Science & Tech

MWC 2026 Marks Shift From AI Apps to AI Native Hardware

Mobile World Congress 2026 highlighted a decisive shift as AI moved beyond apps and into physical devices, from humanoid robots and AI glasses to smartphones with mechanical motion systems.

By Rachel Whitman | Edited by Kseniia Klichova
Humanoid robots, AI glasses and AI-integrated smartphones on display at MWC 2026 reflect a broader industry shift toward AI-native hardware design. Photo: MWC

Mobile World Congress 2026 underscored a structural change in the AI industry: artificial intelligence is no longer confined to apps running on smartphones. It is beginning to reshape the hardware itself.

Across the exhibition floor in Barcelona, companies presented humanoid robots controlled entirely by voice, AI glasses positioned as daily computing devices, and smartphones equipped with mechanical camera systems that physically move. The theme was consistent: large AI models are evolving from software layers into defining elements of device architecture.

Smartphone Makers Enter Robotics

Several Chinese smartphone manufacturers used MWC to demonstrate ambitions beyond handsets.

Honor unveiled its first humanoid robot during its global launch event, showcasing AI-driven motion control and multimodal interaction. The demonstration included acrobatic movements and coordinated choreography, signaling technical progress in embodied control systems.

Xiaomi, which introduced its CyberOne humanoid in 2022, did not display a robot on the show floor but reported new milestones. According to the company, its humanoid systems have begun operating in automotive factories, performing tasks such as self-tapping nut installation and material transport. Chairman Lei Jun said large-scale deployment in Xiaomi’s factories could occur within five years.

The move into robotics comes as smartphone growth slows. IDC estimates that China’s smartphone shipments reached roughly 284 million units in 2025, a slight year-on-year decline. For manufacturers with in-house chips, operating systems, and AI models, robotics represents an adjacent growth market built on overlapping technologies.

Lu Weibing, president of Xiaomi’s mobile division, has argued that investments in proprietary silicon, operating systems, and foundational AI are interconnected and transferable to robotics platforms.

Other technology firms are also advancing embodied systems. At MWC, iFlytek demonstrated a humanoid guide robot powered by upgraded multimodal voice interaction, eliminating the need for handheld remote controls. China Mobile presented an unmanned restaurant concept in which embodied robots collaborated on ordering, food preparation, and delivery.

These deployments suggest that large models are increasingly acting as real-time control interfaces rather than conversational add-ons.

AI Glasses and the Search for Monetization

While AI apps saw a surge in daily active users during China’s Spring Festival promotions, retention and revenue models remain uncertain. Several internet companies are now shifting attention toward AI hardware.

Alibaba’s Qwen brand introduced its first AI glasses at MWC, embedding large language models into wearable devices capable of translation, transcription, photography, and object recognition. The devices are positioned for both consumer and professional use.

IDC forecasts that global smart glasses shipments will exceed 23 million units by 2026, including nearly 5 million units in China. Compared with heavily subsidized AI apps, glasses offer a direct hardware revenue stream and clearer monetization path.

iFlytek also debuted lightweight AI glasses weighing approximately 40 grams, emphasizing multimodal recording and translation capabilities.

Redefining the Smartphone Form

AI integration is also altering the smartphone itself.

ZTE showcased AI-powered devices that embed assistants directly into the system layer, enabling cross-application control via natural language. Rather than functioning as standalone apps, these AI agents are integrated into core operating system workflows.

Honor introduced a more experimental concept: a “Robot Phone” featuring a motorized multi-axis gimbal paired with a 200-megapixel sensor. The device can physically rotate and track users during video calls, combining AI vision with mechanical motion.

The common thread across categories is the shift from AI-enabled hardware to AI-defined hardware. Large models are beginning to influence device structure, interaction methods, and mechanical design.

MWC 2026 did not present a single dominant form factor. Instead, it revealed a competitive search for the most natural interface between AI systems and the physical world. Whether that interface proves to be humanoid robots, wearable glasses, or reengineered smartphones remains unsettled. What is clear is that AI is no longer just inside devices. It is beginning to shape what those devices become.

Artificial Intelligence (AI), News, Robots & Robotics, Science & Tech

Georgia Tech Researchers Develop Robot Pollinator for Indoor Farms

Researchers at Georgia Tech have developed a robot pollinator that uses computer vision and 3D modeling to automate flower pollination in indoor farms.

By Laura Bennett | Edited by Kseniia Klichova
A prototype robot pollinator developed at Georgia Tech uses computer vision to determine flower orientation before performing targeted pollination. Photo: Georgia Tech Research Institute

Researchers at Georgia Tech have developed a robotic system designed to automate pollination inside indoor farms, addressing one of the most labor-intensive challenges in vertical agriculture.

The prototype, created by engineers at the Georgia Tech Research Institute (GTRI) and the George W. Woodruff School of Mechanical Engineering, uses computer vision and robotic manipulation to pollinate flowering plants without human intervention.

As indoor farming expands in urban environments, automating pollination has become a critical bottleneck in scaling production.

Pollination without Bees

Indoor farms offer several advantages over traditional agriculture, including year-round production, reduced water use, and minimal pesticide reliance. However, enclosed growing environments prevent natural pollinators such as bees from accessing crops.

For many flowering plants grown indoors – including strawberries and tomatoes – farmers must manually transfer pollen using brushes or vibrating tools. The process is repetitive and time-consuming, limiting scalability.

The Georgia Tech team’s robot is designed to pollinate plants that contain both male and female reproductive structures within the same flower. These plants require pollen transfer within a single bloom rather than cross-pollination between separate flowers.

By automating this step, researchers aim to reduce labor demands and increase consistency in crop yields.

Teaching a Robot to Understand Flower Orientation

One of the central technical challenges was enabling the robot to recognize the “pose” of each flower – its orientation, symmetry, and position relative to the stem.

Accurate pose detection is critical because pollen must be delivered precisely to the reproductive structures at the front of the flower. Even small alignment errors can reduce pollination effectiveness.

To solve this, the team developed a computer vision pipeline that reconstructs flowers in 3D from multiple camera images. The 3D model is then converted into depth-enhanced 2D representations that can be processed by object detection algorithms.

The researchers used a real-time object detection system known as YOLO (You Only Look Once) to classify flower features in a single processing pass. By converting 3D data into structured 2D inputs, they leveraged the abundance of training resources available for 2D computer vision systems.

The approach enabled the robot to estimate flower orientation with sufficient precision to approach and manipulate the stem correctly.
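The 3D-to-2D conversion step above can be sketched as follows: a reconstructed point cloud is projected onto an image plane, with each pixel storing a normalized depth value, so that a 2D detector can exploit 3D structure. This is a simplified orthographic illustration, not the GTRI pipeline; the function and values are hypothetical, and a trained detector (such as a YOLO model) would run on the resulting images.

```python
def depth_image(points, width, height, z_near=0.1, z_far=2.0):
    """Flatten a 3D point cloud into a 2D depth map: each pixel stores
    normalized depth (nearer = brighter) for a 2D object detector."""
    img = [[0.0] * width for _ in range(height)]
    for x, y, z in points:
        # Simple orthographic projection of normalized (x, y) onto pixels.
        u = int(x * (width - 1))
        v = int(y * (height - 1))
        if 0 <= u < width and 0 <= v < height and z_near <= z <= z_far:
            depth = 1.0 - (z - z_near) / (z_far - z_near)
            img[v][u] = max(img[v][u], depth)  # keep the nearest surface
    return img

# Two reconstructed flower points: one near the camera, one farther back.
cloud = [(0.5, 0.5, 0.2), (0.9, 0.1, 1.9)]
img = depth_image(cloud, width=10, height=10)
```

Encoding depth into image channels like this is what lets the team reuse the large body of pretrained 2D detection models and training data rather than building a 3D detector from scratch.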

From Detection to Physical Interaction

Once the robot identifies the flower’s pose, it grips the stem and applies controlled vibration to dislodge and distribute pollen within the bloom.

Unlike simple mechanical vibration tools, the system integrates perception, positioning, and actuation into a single workflow. This coordination is essential in dense vertical farming environments where flowers vary in size, spacing, and orientation.

The prototype was built in Georgia Tech’s Safe Robotics Lab and remains in testing.

Adding Microscopic Feedback

Beyond basic pollination, the system includes an inspection capability that allows it to evaluate pollination success. The robot can perform close-up imaging of flower structures to assess whether pollen has been effectively transferred.

This feedback loop is a notable feature, as most manual pollination methods offer no immediate verification of success.

The research team has documented its technical approach in a paper accepted to the 2025 International Conference on Robotics and Automation (ICRA).

Automation Expands in Controlled Agriculture

Indoor farming is often promoted as a solution to urban food supply challenges and climate variability. However, high labor costs and operational complexity have slowed widespread adoption.

Automating tasks such as pollination could help reduce those barriers. Robotics in agriculture has traditionally focused on harvesting and monitoring, but pollination represents a more delicate and technically demanding process.

The Georgia Tech prototype demonstrates how advances in AI perception and robotic control can be applied to biological systems.

While the system remains in early development, it illustrates how robotics may increasingly support food production in controlled environments – where precision, repeatability, and data-driven feedback are essential for scaling output.

News, Robots & Robotics, Science & Tech

Revobots Launches All-Weather Autonomous Patrol Robot for Outdoor Security

Revobots has introduced TASKBOT SCOUT XT, an all-weather autonomous patrol robot designed for outdoor enforcement and campus monitoring under a Robots-as-a-Service model.

By Daniel Krauss | Edited by Kseniia Klichova
Revobots’ TASKBOT SCOUT XT is designed for outdoor patrol, featuring an all-wheel-drive chassis and weather-resistant enclosure. Photo: Campus Innovation

Revobots has introduced an all-weather version of its autonomous patrol robot, expanding its security robotics platform beyond indoor facilities and into outdoor environments.

The new system, called TASKBOT SCOUT XT, is engineered for exterior enforcement and monitoring tasks across campuses, parking lots, and mixed-use spaces. The Phoenix-based company says the robot is designed to address one of the longstanding limitations of autonomous patrol systems: reliable operation in unpredictable weather and uneven terrain.

The launch reflects growing demand for robotics solutions that can supplement security staffing in environments where labor shortages and operational costs continue to rise.

Hardware Upgrades for Outdoor Deployment

SCOUT XT builds on Revobots’ indoor patrol platform but incorporates significant hardware modifications to withstand environmental exposure.

The robot features an IP65-rated enclosure designed to protect against dust and water ingress. Its extended-wheelbase, all-wheel-drive chassis is intended to provide stability across uneven pavement, gravel, and surface transitions.

Outdoor-calibrated vision systems allow the robot to operate in variable lighting conditions, including bright daylight and low-light evening environments. Longer-range perception capabilities are designed to accommodate open spaces with fewer visual landmarks than indoor corridors.

All-terrain wheels further support navigation across cracked pavement, curb transitions, and mixed surfaces common in parking facilities and campus grounds.

Autonomous Operation with Human Oversight

SCOUT XT operates on Revobots’ existing backend infrastructure, including its Robots-as-a-Service subscription model and REVO Pilot human-in-the-loop oversight system.

By default, the robot navigates autonomously, using onboard AI to conduct patrol routes and monitor designated areas. When conditions exceed predefined thresholds – such as ambiguous detections or unusual environmental scenarios – the system can escalate to human supervisors for intervention.
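The escalation logic described above amounts to a routing decision: events the onboard AI handles confidently proceed autonomously, while low-confidence or unusual events are handed to a human supervisor. The sketch below illustrates that pattern in general terms; it is not Revobots' REVO Pilot implementation, and the field names and threshold are hypothetical.

```python
def route_event(event, confidence_threshold=0.8):
    """Decide whether a patrol detection is handled autonomously
    or escalated to a human supervisor for review."""
    if event["confidence"] < confidence_threshold or event.get("novel_scene"):
        return "escalate_to_supervisor"
    return "handle_autonomously"

decisions = [
    route_event({"confidence": 0.95}),                      # clear detection
    route_event({"confidence": 0.55}),                      # ambiguous detection
    route_event({"confidence": 0.90, "novel_scene": True}), # unusual environment
]
```

Keeping the escalation criteria explicit and auditable is one reason hybrid autonomy is favored in security deployments, where every intervention decision may need to be justified after the fact.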

This hybrid autonomy model is increasingly common in commercial robotics deployments, particularly in security applications where accountability and reliability are critical.

Campus Deployment Highlights Practical Use Case

Revobots said SCOUT XT recently completed pilot testing at Xavier University in Cincinnati. During the trial, the robot supported automated license plate recognition enforcement across multiple campus parking areas.

The deployment was designed to expand monitoring coverage without increasing staffing levels, a key consideration for educational institutions and other organizations managing large facilities.

Integration with existing campus infrastructure was supported through collaboration with Campus Innovation and its C-Park platform.

The university pilot demonstrates how outdoor patrol robots can supplement traditional security operations, particularly in structured environments such as campuses, business parks, and residential communities.

Expanding the Scope of Security Robotics

Autonomous security robots have typically been deployed indoors, where environmental variables are more predictable. Extending patrol capabilities outdoors introduces challenges including weather exposure, uneven terrain, and dynamic lighting.

By adapting its existing platform rather than building an entirely new system, Revobots is pursuing incremental expansion of its task-adaptive robotics model.

The broader security robotics market is evolving toward service-based deployment models, where customers subscribe to robotics coverage rather than purchase hardware outright. This approach lowers upfront costs and allows providers to maintain centralized oversight and software updates.

As robotics companies seek commercially viable applications, outdoor patrol represents a practical step toward broader real-world autonomy.

While fully autonomous security operations remain a long-term ambition, platforms like SCOUT XT illustrate how robotics companies are addressing specific operational gaps – expanding coverage, improving consistency, and reducing reliance on human patrol staffing in large, open environments.

Automation, News, Robots & Robotics