DEEPX, a South Korean AI semiconductor company specializing in ultra-low-power inference chips, has announced a strategic partnership with Hyundai Motor Group’s Robotics LAB to co-develop a next-generation AI computing platform for robotic systems. The two organizations have been working together on low-power edge AI technology for robotics over the past three years, and the new agreement formalizes that collaboration into a joint architecture program.
The partnership targets a specific technical problem: running large-scale generative AI models in real time on robotic hardware, without relying on cloud connectivity or data center-level power consumption.
The Technical Challenge
Modern robotics AI is increasingly built around Vision-Language-Action (VLA) and Vision-Language Model (VLM) architectures, systems that allow robots to interpret camera input, process natural language instructions, and make autonomous decisions in real time. These models are computationally intensive and have historically required significant power and connectivity to run, which limits their deployment in mobile, battery-powered, or field-based robotic systems.
The collaboration will focus on four areas: ultra-low-power AI semiconductor architecture, AI computing hardware systems for robotics, a physical AI software stack, and robotics application AI libraries. The goal is a cohesive computing platform that can support VLA and VLM models at the edge, on the robot itself, rather than offloading inference to external infrastructure.
At the center of the technical effort is DEEPX's DX-M2, a next-generation chip the company describes as a "Physical GenAI semiconductor," designed specifically to run large-scale AI models in ultra-low-power environments for robotics, autonomous mobile systems, and industrial automation applications.
Why On-Device Inference Matters
The shift toward on-device AI computation in robotics has direct implications for deployment viability. Robots operating in warehouses, factories, or outdoor environments cannot always maintain low-latency cloud connections, and the power budgets of mobile platforms place hard limits on the compute hardware they can carry. A chip efficient enough to run generative AI models locally addresses both constraints: it removes the dependency on connectivity and fits within the power envelope of a battery-powered platform.
“The AI industry is rapidly shifting from data center-centric models to a Physical AI era,” said Lokwon Kim, CEO of DEEPX. “Ultra-low-power computing capable of running AI in real-world systems will become the core infrastructure.”
Hyundai Motor Group’s Robotics LAB frames the partnership as part of a broader strategy to build a proprietary technology ecosystem for robots that operate alongside people. “In the era of Physical AI, robots are becoming the closest point of contact between AI technology and people,” said Dong Jin Hyun, Vice President and Head of Robotics LAB at Hyundai Motor Group.
Market Context
The physical AI semiconductor market is projected to reach approximately $123 billion by 2030, with robotics and humanoid systems identified as the primary demand drivers. The segment is attracting attention from both established chipmakers and specialized startups, as the requirement for on-device AI inference in physical systems creates demand that general-purpose data center chips are not optimized to meet.
DEEPX and Hyundai have not disclosed a product timeline or the specific robotic platforms the DX-M2 is intended to power. The partnership agreement covers joint architecture development, suggesting the platform is still in the design phase rather than approaching commercial deployment.