Texas Instruments and NVIDIA Partner to Accelerate Physical AI and Robotics

Texas Instruments and NVIDIA are expanding their collaboration to accelerate robots and other physical AI systems by combining advanced sensing, power electronics, and AI computing platforms.

By Daniel Krauss
Texas Instruments and NVIDIA are combining sensing, power electronics, and AI computing platforms to accelerate the development of robots and other physical AI systems. Photo: Texas Instruments

Texas Instruments and NVIDIA have expanded their collaboration to accelerate the development of robots and other machines powered by physical AI. The initiative brings together Texas Instruments’ sensing and power technologies with NVIDIA’s AI computing platforms to support the next generation of autonomous systems operating in the physical world.

The partnership reflects a broader industry shift as artificial intelligence moves beyond data centers and digital applications into machines capable of sensing, reasoning, and acting in real environments. From industrial robots and autonomous vehicles to intelligent infrastructure, physical AI systems depend on tight integration between sensors, control electronics, and high-performance AI processors.

Bridging Sensing Hardware With AI Computing

Physical AI systems rely on a continuous loop of perception, decision-making, and action. Texas Instruments supplies many of the semiconductor components that allow machines to capture real-world signals, manage energy, and control motion with high precision.

Through the expanded collaboration, TI’s sensing, power management, and real-time control technologies will work alongside NVIDIA’s AI computing platforms used in robotics and autonomy.

“Our collaboration with NVIDIA will help engineers accelerate the development of autonomous machines by combining TI’s expertise in sensing and power with NVIDIA’s AI computing platforms,” said Amichai Ron, senior vice president of embedded processing at Texas Instruments.

By linking sensing hardware with high-performance computing, the companies aim to simplify the architecture required to build intelligent robots and autonomous machines that must operate safely in unpredictable environments.

Building Infrastructure for the Physical AI Era

The collaboration also underscores the growing importance of technology infrastructure designed specifically for machines interacting with the real world. Physical AI systems require hardware and software capable of interpreting sensor data, navigating complex environments, and executing precise mechanical actions in real time.

NVIDIA has been investing heavily in platforms that support robotics and autonomy, positioning physical AI as a major growth area across industries.

“Physical AI will enable a new generation of intelligent machines that can perceive, reason and act in the real world,” said Jensen Huang, founder and CEO of NVIDIA.

Together, NVIDIA’s computing platforms and Texas Instruments’ sensing and control technologies form a foundational stack for companies developing robots, autonomous vehicles, and industrial automation systems. As robotics moves into more dynamic real-world environments, such integrated ecosystems are expected to play a central role in scaling physical AI deployment across industries.

Surgeon in London Removes Prostate via Robot 1,500 Miles Away in Gibraltar

A surgeon in London remotely controlled a robotic system to remove a prostate from a patient in Gibraltar, demonstrating how teleoperated robotics could expand access to specialized surgical care.

By Laura Bennett
A robotic surgical system performs a prostate removal in Gibraltar while the surgeon operates remotely from London, demonstrating how robotics and high-speed networks enable long-distance medical procedures. Photo: MicroPort / LinkedIn

A surgeon in London has successfully removed a patient’s prostate using a robotic system located roughly 1,500 miles away in Gibraltar, highlighting the growing potential of long-distance robotic surgery, reports The Guardian.

The procedure involved a 62-year-old patient, Paul Buxton, who underwent a robotic prostatectomy at St Bernard’s Hospital in Gibraltar while the operation was conducted remotely from London’s Harley Street district. The surgery was performed by Prof. Prokar Dasgupta, a leading urologist and head of the robotic centre of excellence at The London Clinic.

Using a specialized surgical console, Dasgupta controlled the Toumai Robotic System developed by Shanghai-based MicroPort MedBot. The robot, equipped with four articulated arms and a high-definition 3D camera, executed the delicate procedure inside the operating theatre while transmitting real-time visual feedback to the surgeon.

A Milestone in Remote Robotic Surgery

The operation relied on a high-speed fibre optic connection linking London and Gibraltar, supported by a backup 5G network to ensure continuity in case of connectivity issues. According to the medical team, the system maintained a delay of only 60 milliseconds between the surgeon’s commands and the robot’s movements.

“We operated on an NHS patient in Gibraltar from the London Clinic 2,400km away using a robot with a 3D HD camera with four arms,” Dasgupta said. “The robot is completely controlled from a console using high-speed lines with a time delay of only 0.06 seconds.”
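The 0.06-second figure can be sanity-checked with a back-of-envelope calculation. The sketch below is illustrative only (the constants are standard physics approximations, not figures from the surgical team): light travels at roughly two-thirds of c in optical fibre, so raw propagation accounts for only a fraction of the reported delay, leaving the rest for switching, encoding, and the robot's control loop.

```python
# Back-of-envelope check: how much of the reported 60 ms
# command-to-motion delay is raw fibre propagation over 2,400 km?
SPEED_IN_FIBER_KM_S = 200_000  # light travels ~2/3 c in optical fibre
DISTANCE_KM = 2_400            # London to Gibraltar, per the surgical team

def one_way_propagation_ms(distance_km: float) -> float:
    """Minimum one-way signal travel time over fibre, in milliseconds."""
    return distance_km / SPEED_IN_FIBER_KM_S * 1000

prop = one_way_propagation_ms(DISTANCE_KM)
print(f"one-way propagation: {prop:.0f} ms")      # ~12 ms
print(f"share of 60 ms budget: {prop / 60:.0%}")  # ~20%
```

On this estimate, physical signal travel consumes only about a fifth of the latency budget, which is consistent with the team attributing the remainder to network and processing overhead.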

Medical staff were present in the operating room in Gibraltar throughout the procedure, prepared to intervene if the connection failed or complications arose. The surgery was completed successfully, and Buxton reported feeling “fantastic” within days of the operation.

For the patient, who has lived in Gibraltar for four decades, the alternative would likely have involved travelling to the United Kingdom for treatment and spending weeks waiting for surgery.

“If I hadn’t gone for the telesurgery in Gibraltar, I would have had to fly to London and go on the NHS waiting list,” Buxton said, adding that taking part in the procedure felt like being involved in “medical history.”

Rapid Growth of the Toumai Surgical Robot

The Toumai system used in the operation is part of a rapidly expanding global surgical robotics platform. According to its developer, Toumai has surpassed 200 commercial orders worldwide across nearly 50 countries and regions, with close to 130 systems already installed in hospitals.

Adoption has accelerated quickly, doubling from just over 100 orders in late 2025 to more than 200 within a few months. Growth has been particularly strong in emerging healthcare markets such as India and Brazil, while hospitals in developed markets including Spain and Australia are also expanding deployments.

The platform has supported thousands of procedures across urology, thoracic surgery, general surgery, gynecology, and head and neck operations. Nearly 800 remote robotic surgeries have already been performed in more than 20 countries, all reportedly completed successfully.

Expanding Access to Specialist Care

Remote robotic surgery has long been viewed as a way to connect expert surgeons with patients in regions where specialized care is difficult to access. Instead of transporting patients long distances, the technology allows experienced doctors to perform procedures remotely while local medical teams provide on-site support.

Dasgupta said the potential humanitarian impact could be significant, particularly for patients in remote locations or smaller healthcare systems that lack highly specialized surgeons. “I think it is very, very exciting,” he said. “The humanitarian benefit is going to be significant.”

The success of the Gibraltar procedure comes as hospitals and technology providers worldwide continue experimenting with telesurgery, aided by improvements in robotics, fiber-optic networks, and low-latency communication systems. Dasgupta is expected to repeat the remote procedure soon while broadcasting it to thousands of surgeons attending a major European urology conference.

While remote surgery still requires rigorous safeguards and reliable connectivity, the milestone operation suggests that robotics may increasingly enable expert medical care to reach patients regardless of distance.

MWC 2026: HONOR Wins More than 70 Awards for AI Devices and Robotics

HONOR captured more than 70 media awards at Mobile World Congress 2026, highlighting its push into AI-powered devices, robotics concepts, and next-generation foldables.

By Daniel Krauss
HONOR showcases its latest AI-powered devices and robotics concepts at Mobile World Congress 2026, where the company received more than 70 industry awards for innovation. Photo: HONOR

At Mobile World Congress 2026 in Barcelona, Chinese technology company HONOR received more than 70 media and industry awards for its latest devices and AI initiatives, underscoring the company’s growing focus on artificial intelligence, robotics-inspired hardware, and next-generation foldable smartphones.

The recognition highlighted HONOR’s broader strategy around “Augmented Human Intelligence,” a vision aimed at integrating AI capabilities deeply into consumer electronics.

The company showcased several new products and concepts at the event, including the Magic V6 foldable smartphone, the MagicPad 4 tablet, the MagicBook Pro 14 laptop, and experimental AI-driven hardware concepts such as its “Robot Phone.”

Robotics Concepts And Embodied AI

One of the most discussed demonstrations at the event was HONOR’s Robot Phone, a concept device designed to showcase how AI-powered hardware might physically interact with users in future devices.

The prototype combines robotic-style movement with AI-powered imaging and sensing capabilities. According to reports from major media outlets including Bloomberg and Reuters, the device represents an early experiment in embodied AI, where computing systems combine perception, reasoning, and physical interaction.

A video demonstration released during the event showed how the device can visually recognize objects and move to capture photos automatically. Coverage from Bloomberg, Reuters, and CNBC highlighted the concept as part of a wider shift among technology companies exploring physical AI devices.

Technology publications also discussed the device’s design approach. Engadget described it as resembling a compact personal robot integrated into a smartphone form factor, while GadgetMatch suggested it offered a preview of devices that behave more like intelligent assistants than traditional electronics.

Magic V6 Highlights Foldable Innovation

Alongside its experimental concepts, HONOR also received strong recognition for its flagship Magic V6 foldable smartphone, which reviewers widely described as one of the most advanced foldable devices introduced at the event.

Technology outlets including TechRadar, Android Authority, and Trusted Reviews praised the device for its thin design, durability improvements, and advanced battery technology.

The smartphone incorporates a silicon-carbon battery, a newer battery chemistry that improves energy density while allowing thinner device designs. According to reports from Stuff and GSMArena, the device also runs on Qualcomm’s Snapdragon 8 Elite Gen 5 processor.

HONOR’s battery innovation also received the Best Disruptive Device Innovation award at the Global Mobile Awards, presented during MWC. The winners were announced by the GSMA during the event’s annual ceremony, according to the official MWC GLOMO awards announcement.

Expanding AI Device Ecosystem

Beyond smartphones, HONOR used the Barcelona event to expand its broader AI device ecosystem.

The MagicPad 4 tablet and MagicBook Pro 14 laptop were presented as productivity-focused devices designed to integrate tightly with AI services and cross-device workflows. Reviewers highlighted the tablet’s lightweight design and performance improvements, while the laptop emphasized AI-assisted computing features.

Several outlets, including TechRadar and TechAdvisor, placed HONOR devices on their “Best of MWC 2026” lists, reflecting strong reception from technology journalists covering the show.

As smartphone makers increasingly compete on AI capabilities, HONOR’s presentation at MWC suggests the company is positioning itself not just as a smartphone manufacturer but as a broader AI hardware platform provider, experimenting with robotics-inspired designs and intelligent device ecosystems.


China Establishes First National Standards for Humanoid Robots

China has introduced its first national standard system for humanoid robotics, aiming to unify technical specifications and accelerate commercial deployment across industries.

By Laura Bennett | Edited by Kseniia Klichova
Officials and industry experts gather in Beijing to unveil China’s first national standard system for humanoid robotics, aimed at accelerating commercialization and ensuring safety alignment.

China has formally introduced its first national standard system for humanoid robotics, marking a coordinated effort to structure one of the country’s fastest-growing technology sectors.

The framework was unveiled at the Humanoid Robots and Embodied Intelligence Standardization meeting in Beijing. It establishes unified technical guidelines intended to streamline development, reduce fragmentation, and accelerate the transition from pilot projects to commercial deployment.

The move signals that policymakers view humanoid robotics not as an experimental field, but as an emerging industrial category requiring formal governance.

Six Pillars for Industrial Alignment

The standard system is organized around six core pillars: foundational and common standards, neuromorphic and intelligent computing, limbs and key components, full-system integration, application scenarios, and safety and ethics.

Together, these categories define technical specifications, interface protocols, and evaluation benchmarks. Committee experts involved in the initiative said the goal is to reduce coordination friction between suppliers, lower production costs, and shorten iteration cycles across the value chain.

By clarifying interfaces and performance metrics, the framework is designed to enable interoperability between hardware platforms, software systems, and embodied AI models. It also embeds safety and ethical considerations into early-stage development, reflecting regulatory awareness as robots move into workplaces and homes.

From Prototypes to Scaled Deployment

According to China’s Ministry of Industry and Information Technology, 2024 marked the country’s first year of humanoid robot mass production. More than 140 domestic companies released over 330 models, with deployments expanding into manufacturing, household services, healthcare, and elderly care.

Until now, much of that growth has occurred in a relatively fragmented environment, with companies developing proprietary architectures and evaluation criteria. National standards are expected to impose structure on a rapidly expanding ecosystem.

The framework could also serve a strategic function. As Chinese firms compete globally in embodied AI and humanoid robotics, standardized technical benchmarks may strengthen export readiness and ecosystem coordination.

While many humanoid deployments remain in early stages, the introduction of national standards suggests the industry is entering a new phase, where commercialization and regulatory alignment advance in parallel.


University of Southampton Develops Adaptive Robot Fin for Underwater Stability

Researchers at the University of Southampton have developed a flexible robotic fin with embedded electronic skin that automatically adapts to changing water currents, improving underwater robot stability and efficiency.

By Daniel Krauss | Edited by Kseniia Klichova
The adaptive robotic fin developed at the University of Southampton integrates electronic skin and hydraulic actuation to automatically counteract flow disturbances in underwater environments. Photo: University of Southampton

Autonomous underwater vehicles are built to withstand unpredictable ocean conditions, but their rigid fins often require significant energy to counteract sudden currents and turbulence. Researchers at the University of Southampton are proposing a different approach: fins that sense water flow and adjust their shape in real time.

The team has developed a flexible robotic fin embedded with electronic skin capable of detecting subtle changes in water movement. The system automatically modifies the fin’s stiffness and curvature to stabilize underwater robots while reducing energy consumption.

The research, published in npj under the title “Harnessing proprioception in aquatic soft wings enables hybrid passive-active disturbance rejection,” reflects a broader push toward soft robotics and adaptive control in marine environments.

Inspired by Biological Sensing

The design draws from biological proprioception mechanisms observed in birds and fish. Birds detect airflow changes through sensory feedback in their feathers, while fish rely on lateral line systems and fin rays to perceive water disturbances.

To replicate similar sensing capabilities, the Southampton engineers embedded flexible liquid metal wiring inside a silicone fin. When water flow deforms the fin, the integrated electronic skin registers changes in electrical resistance. These signals are transmitted to a hydraulic system inside the robot’s body, which adjusts internal pressure through connected hoses to alter the fin’s shape.

Rather than relying solely on active propulsion corrections, the system combines passive flexibility with active hydraulic adjustment.
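The hybrid passive-active principle described above can be sketched as a simple proportional loop. Everything in this sketch is hypothetical, assumed for illustration rather than taken from the Southampton paper: the class names, the baseline resistance, and the gain are invented, and the real system maps e-skin signals to hydraulic pressure through its own calibration.

```python
# Hypothetical sketch of a hybrid passive-active fin controller.
# All names and constants are illustrative, not from the published work.
from dataclasses import dataclass

@dataclass
class EskinReading:
    resistance_ohm: float  # liquid-metal trace resistance rises as the fin bends

BASELINE_OHM = 10.0  # assumed resistance of the undeformed fin
GAIN = 0.5           # assumed map from resistance change to pressure command

def pressure_command(reading: EskinReading) -> float:
    """Counteract sensed deformation with a proportional hydraulic adjustment.

    The silicone fin absorbs small disturbances passively; only the
    residual deformation registered by the e-skin is corrected actively.
    """
    deformation = reading.resistance_ohm - BASELINE_OHM
    return -GAIN * deformation  # push against the sensed bend

# A gust-like disturbance raises trace resistance; the controller
# responds with negative pressure to flatten the fin back out.
print(pressure_command(EskinReading(resistance_ohm=12.0)))  # -1.0
```

The design choice this illustrates is the division of labor: passive compliance handles high-frequency turbulence for free, so the active loop only needs to correct slow residual drift, which is where the reported energy savings come from.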

Reducing Energy Use in Turbulent Waters

Rigid AUVs typically expend substantial energy to maintain orientation when struck by waves or shifting currents. According to the researchers, the adaptive fin significantly improves disturbance rejection.

In controlled tests, the fin reduced unwanted buoyancy effects caused by sudden water flow by 87 percent compared with a similar vehicle using rigid fins. The robot demonstrated improved self-stabilization and maneuverability while consuming less energy to maintain position.

The findings suggest potential advantages for underwater inspection, environmental monitoring, and defense applications where energy efficiency and stability are critical.

Technical Constraints Remain

Despite promising results, integration challenges remain. Scaling the flexible system to larger vehicles and embedding it into rigid hull designs could complicate deployment. Long-term durability of the electronic skin and hydraulic components in harsh marine environments also requires further validation.

The researchers note that more robust actuators and structural refinements may help address these constraints.

The project illustrates how bio-inspired sensing and soft robotics are reshaping underwater vehicle design. As offshore energy, marine research, and subsea infrastructure monitoring expand, adaptive control systems such as this may become increasingly relevant to improving endurance and operational stability in dynamic ocean conditions.


MWC 2026 Marks Shift From AI Apps to AI Native Hardware

Mobile World Congress 2026 highlighted a decisive shift as AI moved beyond apps and into physical devices, from humanoid robots and AI glasses to smartphones with mechanical motion systems.

By Rachel Whitman | Edited by Kseniia Klichova
Humanoid robots, AI glasses and AI-integrated smartphones on display at MWC 2026 reflect a broader industry shift toward AI-native hardware design. Photo: MWC

Mobile World Congress 2026 underscored a structural change in the AI industry: artificial intelligence is no longer confined to apps running on smartphones. It is beginning to reshape the hardware itself.

Across the exhibition floor in Barcelona, companies presented humanoid robots controlled entirely by voice, AI glasses positioned as daily computing devices, and smartphones equipped with mechanical camera systems that physically move. The theme was consistent: large AI models are evolving from software layers into defining elements of device architecture.

Smartphone Makers Enter Robotics

Several Chinese smartphone manufacturers used MWC to demonstrate ambitions beyond handsets.

Honor unveiled its first humanoid robot during its global launch event, showcasing AI-driven motion control and multimodal interaction. The demonstration included acrobatic movements and coordinated choreography, signaling technical progress in embodied control systems.

Xiaomi, which introduced its CyberOne humanoid in 2022, did not display a robot on the show floor but reported new milestones. According to the company, its humanoid systems have begun operating in automotive factories, performing tasks such as self-tapping nut installation and material transport. Chairman Lei Jun said large-scale deployment in Xiaomi’s factories could occur within five years.

The move into robotics comes as smartphone growth slows. IDC estimates that China’s smartphone shipments reached roughly 284 million units in 2025, a slight year-on-year decline. For manufacturers with in-house chips, operating systems, and AI models, robotics represents an adjacent growth market built on overlapping technologies.

Lu Weibing, president of Xiaomi’s mobile division, has argued that investments in proprietary silicon, operating systems, and foundational AI are interconnected and transferable to robotics platforms.

Other technology firms are also advancing embodied systems. At MWC, iFlytek demonstrated a humanoid guide robot powered by upgraded multimodal voice interaction, eliminating the need for handheld remote controls. China Mobile presented an unmanned restaurant concept in which embodied robots collaborated on ordering, food preparation, and delivery.

These deployments suggest that large models are increasingly acting as real-time control interfaces rather than conversational add-ons.

AI Glasses and the Search for Monetization

While AI apps saw a surge in daily active users during China’s Spring Festival promotions, retention and revenue models remain uncertain. Several internet companies are now shifting attention toward AI hardware.

Alibaba’s Qwen brand introduced its first AI glasses at MWC, embedding large language models into wearable devices capable of translation, transcription, photography, and object recognition. The devices are positioned for both consumer and professional use.

IDC forecasts that global smart glasses shipments will exceed 23 million units by 2026, including nearly 5 million units in China. Compared with heavily subsidized AI apps, glasses offer a direct hardware revenue stream and clearer monetization path.

iFlytek also debuted lightweight AI glasses weighing approximately 40 grams, emphasizing multimodal recording and translation capabilities.

Redefining the Smartphone Form

AI integration is also altering the smartphone itself.

ZTE showcased AI-powered devices that embed assistants directly into the system layer, enabling cross-application control via natural language. Rather than functioning as standalone apps, these AI agents are integrated into core operating system workflows.

Honor introduced a more experimental concept: a “Robot Phone” featuring a motorized multi-axis gimbal paired with a 200-megapixel sensor. The device can physically rotate and track users during video calls, combining AI vision with mechanical motion.

The common thread across categories is the shift from AI-enabled hardware to AI-defined hardware. Large models are beginning to influence device structure, interaction methods, and mechanical design.

MWC 2026 did not present a single dominant form factor. Instead, it revealed a competitive search for the most natural interface between AI systems and the physical world. Whether that interface proves to be humanoid robots, wearable glasses, or reengineered smartphones remains unsettled. What is clear is that AI is no longer just inside devices. It is beginning to shape what those devices become.
