Tesla Signals Shift From Model S and X Toward Optimus Robots

Tesla is signaling a strategic pivot away from its Model S and Model X vehicles as it prioritizes investment in autonomous driving systems and its Optimus humanoid robot program.

By Laura Bennett
Tesla-built industrial robots operate on an automated vehicle assembly line, illustrating the company’s long-standing investment in factory automation and AI-driven manufacturing. Photo: Steve Jurvetson / Flickr

Tesla is signaling a major strategic realignment, with reports indicating the company is preparing to wind down production of its Model S and Model X vehicles while accelerating investment in autonomous driving and humanoid robotics. The move underscores Tesla’s growing emphasis on physical AI as a core pillar of its future business.

The Model S and Model X, once central to Tesla’s brand and early success, now represent a shrinking share of the company’s overall vehicle sales. As production volumes concentrate around the Model 3 and Model Y, Tesla appears increasingly willing to deprioritize its higher-end legacy models in favor of technologies with longer-term growth potential.

From Premium EVs to Physical AI

Tesla’s pivot aligns with repeated statements from Elon Musk, who has described humanoid robots as potentially more valuable than the company’s car business over time. Musk has argued that autonomous labor, enabled by general-purpose robots, could unlock economic value far beyond vehicle manufacturing.

The company’s Optimus humanoid robot is designed to perform repetitive and physically demanding tasks in factories, warehouses, and eventually service environments. Tesla has already begun testing Optimus inside its own facilities, using controlled environments to train robots while extracting immediate operational value.

Musk has framed this approach as a natural extension of Tesla’s autonomy stack. The same AI systems used for full self-driving vehicles — including vision-based perception, neural networks, and custom AI chips — are being adapted for humanoid robotics.

Billions Directed Toward Autonomy and Robots

Tesla is expected to pour billions of dollars into autonomous vehicles and humanoid robotics over the coming years. Reports suggest combined spending on self-driving technology, AI infrastructure, and robotics could reach as much as $20 billion, reflecting the scale of the company’s ambitions.

These investments include expanded data center capacity, custom silicon development, and large-scale AI training. For Tesla, the goal is to create systems that can reason, navigate, and act reliably in complex real-world environments – whether on roads or factory floors.

While timelines remain aggressive, Musk has said that Optimus could reach limited commercial deployment within the next few years, initially focused on internal use cases before broader rollout.

A Broader Industry Shift

Tesla’s evolving priorities mirror a wider trend across the technology and automotive sectors. As electric vehicles mature and competition intensifies, manufacturers are searching for new growth engines beyond car sales. Robotics and physical AI have emerged as leading candidates, promising scalable applications across manufacturing, logistics, healthcare, and consumer services.

Ending or scaling back Model S and X production would mark the end of an era for Tesla’s original premium lineup. But it would also reinforce Musk’s long-held view that Tesla is not ultimately a car company, but an AI and robotics company that happens to build vehicles.

Whether Optimus can deliver on that vision remains an open question. Still, Tesla’s willingness to shift resources away from iconic vehicles highlights how seriously the company is betting on a robot-driven future.

Davos 2026: Elon Musk Predicts Robot-Dominated Future in World Economic Forum Debut

Elon Musk told World Economic Forum attendees that humanoid robots could outnumber humans and enter mass markets as early as 2027, reshaping labor, economics, and daily life.

By Daniel Krauss

Elon Musk used his first appearance at the World Economic Forum to deliver some of his boldest predictions yet about the future of robotics, artificial intelligence, and human labor. Speaking to business and political leaders in Davos, Musk said humanoid robots could become widespread within the next few years and eventually surpass humans in number.

“Humanoid robots are coming very fast,” Musk said, predicting that commercially viable systems could reach the market by 2027. He described a future in which robots perform much of the physical work currently done by humans, fundamentally altering economic structures and labor markets worldwide.

Musk framed robotics as the next major technological wave after AI software, arguing that intelligence without a physical body is only part of the equation. Once AI systems can move, manipulate objects, and operate autonomously in the real world, he said, their economic impact will expand dramatically.

Robots, Labor, and Economic Disruption

According to Musk, the widespread adoption of humanoid robots could eliminate many forms of manual labor while creating entirely new economic models. He suggested that abundance driven by robotics could eventually reduce the cost of goods and services to near zero, forcing societies to rethink employment and income distribution.

“Once you have robots that can do everything humans can do physically, the output of the economy becomes enormous,” Musk said. He added that this transition could make traditional notions of scarcity less relevant, though it would also require careful management to avoid social disruption.

Musk acknowledged that such a shift would raise serious questions about jobs and inequality, but argued that technological progress has historically created more opportunity than it has destroyed. He pointed to automation in manufacturing and agriculture as precedents, though he conceded that humanoid robots would operate at a far broader scale.

Humanoid Robots by 2027

Musk’s timeline places humanoid robots closer to mass deployment than many industry analysts currently expect. He reiterated that Tesla is working toward commercializing its Optimus humanoid robot, initially targeting factory and industrial environments before expanding to broader applications.

He said early versions of humanoid robots would likely focus on repetitive and physically demanding tasks, helping to address labor shortages in manufacturing, logistics, and construction. Over time, Musk suggested, these systems could move into service roles and even household environments.

While Musk has previously set aggressive timelines that slipped, his Davos remarks reflect growing confidence across the technology sector that physical AI is approaching a tipping point. Advances in perception, motion control, and large-scale AI models are making robots more adaptable and economically viable.

Beyond Robotics

Beyond robotics, Musk touched on broader themes including artificial general intelligence, space exploration, and long-term human survival. He reiterated his belief that AI development must be managed carefully to ensure it benefits humanity, while expressing optimism that technological progress can solve many global challenges.

Despite skepticism from some attendees, Musk’s appearance underscored why robotics and physical AI have become central topics at Davos. As governments and corporations grapple with slowing productivity growth and demographic shifts, robots are increasingly viewed as both an opportunity and a disruption.

Musk concluded by suggesting that the question is no longer whether robots will become ubiquitous, but how societies choose to integrate them. If his predictions hold, the next decade could redefine not just technology, but the structure of human work itself.


OpenAI Quietly Expands Robotics Lab, Focuses on Data and Robotic Arms

OpenAI has quietly built a growing robotics lab with around 100 employees, focusing on data collection using robotic arms rather than developing a full humanoid robot.

By Rachel Whitman
Robotic arms used in OpenAI’s internal robotics lab perform everyday manipulation tasks as the company experiments with physical AI and real-world data collection. Photo: Simon Kadula / Unsplash

OpenAI has quietly returned to hands-on robotics, building an internal lab dedicated to experimenting with physical AI systems, as first reported by Business Insider. According to people familiar with the work, the lab was launched in early 2025 and has since grown to around 100 employees, signaling a renewed commitment to understanding how artificial intelligence can operate in the physical world.

The effort marks a notable shift for OpenAI, which previously stepped back from robotics after winding down an earlier program in 2020. That earlier initiative produced a high-profile robotic hand capable of solving a Rubik’s Cube, but leadership ultimately chose to refocus on large-scale AI models. While OpenAI exited direct robot development at the time, it continued to back the sector financially, investing in companies such as Figure, 1X, and Physical Intelligence.

By late 2024, internal discussions reportedly resurfaced around whether OpenAI should once again build robotic systems in-house. Those conversations have now translated into an operational lab, although the current work remains deliberately narrow in scope.

A Data-First Approach to Robotics

Rather than developing a full humanoid robot, OpenAI’s lab is focused on gathering high-quality training data. Inside the facility, robotic arms perform repetitive household-style tasks such as placing bread into a toaster, folding laundry, or rearranging objects. These systems run for extended periods, generating large volumes of interaction data that engineers use to refine perception, control, and decision-making models.

The emphasis reflects a broader challenge in robotics: unlike language models, robots do not benefit from vast, readily available datasets. Physical interaction data must be created deliberately, often through slow and expensive real-world experimentation. OpenAI’s approach suggests it is prioritizing the foundations of physical intelligence before committing to a specific robot form factor.

Engineers reportedly evaluate whether each training cycle leads to measurable improvements, adjusting models and task setups accordingly. The process is incremental, but it allows OpenAI to study how general-purpose AI systems learn from repeated physical interaction.
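The reporting does not describe OpenAI’s internal tooling, but the loop it sketches (run a task, log the interaction data, check whether a training cycle measurably helped) can be illustrated in miniature. The Python sketch below is purely hypothetical: the task names, Episode structure, and success_rate() probe are stand-ins, not OpenAI code.

```python
import random
from dataclasses import dataclass, field

@dataclass
class Episode:
    task: str
    observations: list = field(default_factory=list)
    actions: list = field(default_factory=list)
    success: bool = False

def run_task(task: str) -> Episode:
    """Hypothetical stand-in for one robotic-arm trial (e.g. loading a toaster)."""
    ep = Episode(task=task)
    for _ in range(10):                    # fixed-length rollout, for illustration
        obs = random.random()              # placeholder sensor reading
        ep.observations.append(obs)
        ep.actions.append(obs * 0.5)       # placeholder controller output
    ep.success = random.random() > 0.3     # placeholder outcome check
    return ep

def success_rate(tasks: list[str], trials: int = 20) -> float:
    """Hypothetical probe used to decide whether a training cycle helped."""
    runs = [run_task(random.choice(tasks)) for _ in range(trials)]
    return sum(ep.success for ep in runs) / trials

tasks = ["place_bread_in_toaster", "fold_laundry", "rearrange_objects"]

baseline = success_rate(tasks)
dataset = [run_task(random.choice(tasks)) for _ in range(100)]  # long-running collection
# ... fine-tune the control model on `dataset` here, then re-probe ...
after = success_rate(tasks)
print(f"baseline={baseline:.2f} after={after:.2f} keep_cycle={after > baseline}")
```

The point of the loop is the comparison at the end: data collection only continues along paths that produce measurable gains.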

From Robots to “Brains” for Robots

It remains unclear whether OpenAI ultimately intends to build its own humanoid robot. People close to the project suggest that the company may instead aim to develop general-purpose control and reasoning models that could power robots built by partners.

That strategy would be consistent with OpenAI’s broader role in the AI ecosystem, where it supplies foundational models rather than finished products. A robotics-focused foundation model could potentially be deployed across multiple platforms, from industrial arms to mobile manipulators and humanoid systems.

The company’s existing investments in robotics startups also point toward a collaborative rather than vertically integrated approach. By supplying the intelligence layer, OpenAI could influence the future of robotics without competing directly with hardware manufacturers.

Robotics Returns to the AI Agenda

OpenAI’s renewed interest in robotics comes amid a wider industry push toward physical AI. Advances in large language models, vision systems, and reinforcement learning have revived optimism that robots can move beyond tightly scripted tasks and operate more flexibly in real environments.

Still, the company appears cautious. While reports suggest OpenAI has discussed opening a second robotics lab, no public timeline exists for commercial products or humanoid robots. For now, the focus remains on experimentation, data, and learning how intelligence transfers from screens into the physical world.

Whether OpenAI ends up building robots, powering them, or both, its quiet return to robotics underscores a growing consensus across the AI sector: true general intelligence will not be confined to software alone.

Airbus Orders Six-Figure Humanoid Robots From UBTech for Aircraft Manufacturing

Airbus has placed a six-figure order for humanoid robots from UBTech, marking one of the largest industrial deployments of humanoid robotics in aerospace manufacturing to date.

By Daniel Krauss
UBTech humanoid robots are being prepared for industrial deployment as Airbus expands the use of physical AI inside aircraft manufacturing facilities. Photo: UBTech / X

Airbus has taken a significant step toward automation in aircraft production by placing a six-figure order for humanoid robots from UBTech Robotics. The purchase represents one of the largest commercial commitments yet for humanoid robots in heavy manufacturing and highlights how physical AI is beginning to move from pilot projects into real industrial workflows.

The robots are expected to be deployed across Airbus facilities to support repetitive and physically demanding tasks involved in aircraft assembly. While Airbus has long used industrial automation, the move toward humanoid robots reflects a shift toward more flexible systems that can operate in environments originally designed for human workers.

UBTech’s shares surged following reports of the deal, underlining growing investor confidence that humanoid robotics is transitioning from experimentation to scalable industrial use.

From Industrial Automation to Humanoid Labor

Unlike traditional industrial robots that are fixed in place and optimized for a narrow set of motions, humanoid robots are designed to navigate complex workspaces, manipulate tools, and interact with equipment built for human hands. For aerospace manufacturing, where production lines involve tight spaces, variable tasks, and frequent reconfiguration, this flexibility is increasingly valuable.

Airbus has been evaluating humanoid robots as a way to address labor shortages, improve ergonomics, and increase consistency in assembly operations. The robots are expected to assist with material handling, inspection, and other repetitive processes that can strain human workers over long shifts.

The order places Airbus among a small but growing group of global manufacturers experimenting with humanoid robotics at meaningful scale, alongside automotive and logistics operators.

UBTech’s Industrial Push

UBTech, best known for its humanoid robot platforms designed for research and service applications, has been expanding aggressively into industrial markets. The company’s latest humanoid systems are built to operate autonomously in factory environments, combining computer vision, motion planning, and AI-driven manipulation.

The Airbus deal signals growing confidence in UBTech’s ability to meet industrial reliability and safety requirements, which remain a major barrier for humanoid robots operating alongside human workers. Aerospace manufacturing, in particular, demands high precision, repeatability, and compliance with strict safety standards.

For UBTech, the agreement represents a major validation of its strategy to position humanoid robots as a practical workforce augmentation tool rather than a futuristic novelty.

A Broader Shift Toward Physical AI

The Airbus order reflects a wider trend across global manufacturing, where companies are exploring physical AI systems that can reason, adapt, and act in real-world environments. Unlike conventional automation, humanoid robots promise to reduce the need for costly factory redesigns by fitting into existing workflows.

As labor markets tighten and production complexity increases, manufacturers are increasingly willing to test new forms of automation that offer both flexibility and scalability. Humanoid robots, while still early in their adoption curve, are emerging as a potential bridge between human labor and fully automated systems.

For Airbus, the deployment is expected to begin incrementally, with performance data guiding future expansion. If successful, humanoid robots could become a permanent fixture on aircraft assembly lines, reshaping how aerospace manufacturing is performed.

The deal underscores a turning point for the humanoid robotics industry: major industrial players are no longer just experimenting. They are beginning to place real orders.

Asus Halts New Smartphone Development to Pivot Toward AI and Robotics

Asus is winding down new smartphone development as it redirects resources toward AI computing, robotics, and physical AI systems for industrial and enterprise markets.

By Daniel Krauss
Asus showcases AI and robotics technologies as the company shifts its strategy away from new smartphone development toward physical AI and intelligent systems. Photo: Liudmyla Shalimova / Pexels

Asus is scaling back new smartphone development as part of a broader strategic shift toward artificial intelligence, robotics, and advanced computing systems. The move reflects changing priorities inside the Taiwanese technology company, which is reallocating engineering talent and capital away from consumer handsets toward higher-growth segments tied to physical AI and intelligent automation.

Asus will continue to sell and support existing smartphone models in select markets, but it no longer plans to invest heavily in new flagship phone platforms. The decision follows years of intense competition in the global smartphone market, where margins have tightened and growth has slowed, particularly outside of Apple and Samsung’s ecosystems.

Company executives have indicated that future innovation efforts will focus on AI infrastructure, edge computing, robotics platforms, and intelligent devices designed for enterprise and industrial use cases.

From Consumer Phones to Physical AI

Asus has been steadily expanding its footprint in AI hardware, including servers optimized for accelerated computing, edge AI platforms, and embedded systems used in robotics and automation. These systems are increasingly deployed in factories, logistics centers, healthcare environments, and smart infrastructure projects.

The company has also invested in robotics-related research, including autonomous mobile systems, AI vision platforms, and human-machine interfaces. Rather than building consumer-facing robots, Asus is positioning itself as an enabling technology provider, supplying the compute, sensing, and control systems that power physical AI applications.

This transition mirrors a broader industry shift, where growth is increasingly concentrated in AI-driven systems that operate in the physical world. Robotics, autonomous machines, and intelligent infrastructure require high-performance, energy-efficient computing platforms, an area where Asus believes it can compete more effectively than in consumer smartphones.

Robotics and Edge AI as Growth Drivers

Asus’ robotics strategy centers on edge intelligence, where AI models run directly on devices rather than relying on cloud infrastructure. This approach is critical for robots and autonomous systems that must operate with low latency, high reliability, and strong data privacy guarantees.

The company’s hardware portfolio now includes AI-ready industrial PCs, robotic controllers, and edge servers designed to support computer vision, motion planning, and real-time decision-making. These systems are being adopted across manufacturing, smart cities, and healthcare automation.

By stepping back from new smartphone development, Asus frees up resources to deepen partnerships in robotics ecosystems and accelerate product cycles in AI-driven markets. Industry analysts view this as a pragmatic move, given the capital-intensive nature of smartphone development and the uncertain returns in a saturated market.

A Broader Industry Realignment

Asus’ pivot comes amid a wider reassessment of consumer electronics strategies across the technology sector. As smartphones mature, many manufacturers are looking beyond handsets for long-term growth, turning instead to AI infrastructure, robotics, and intelligent systems that can scale across industries.

Physical AI, which combines perception, reasoning, and action in real-world environments, is emerging as a central theme in this transition. Robotics platforms require continuous upgrades in compute performance, sensing accuracy, and software integration, creating recurring demand for specialized hardware and systems.

For Asus, the shift represents a move from volume-driven consumer markets toward fewer, higher-value deployments. While smartphones once defined the company’s consumer identity, its future growth is increasingly tied to the machines, factories, and autonomous systems that will shape the next phase of industrial digitization.

The decision underscores a growing consensus in the technology industry: the next major wave of innovation will not be defined by screens in pockets, but by intelligent machines operating alongside humans in the physical world.

Elon Musk Says Tesla’s Robots Will Surpass Its EV Business

Elon Musk says Tesla’s humanoid robots could eventually become more valuable than its electric vehicle business, positioning robotics as the company’s long-term growth engine.

By Laura Bennett
Tesla’s Optimus humanoid robot is presented as part of the company’s long-term vision for physical AI and autonomous labor. Photo: Tesla Optimus / X

Elon Musk has once again pushed Tesla’s ambitions beyond cars, predicting that the company’s humanoid robots could ultimately outgrow its electric vehicle business. Speaking in recent public remarks and investor discussions, Musk framed robotics not as a side project, but as a future pillar that could redefine Tesla’s identity over the coming decade.

Tesla’s humanoid robot, known as Optimus, is designed to perform repetitive and physically demanding tasks in factories, warehouses, and eventually homes. Musk has repeatedly argued that the long-term economic value of autonomous labor far exceeds that of vehicle manufacturing, particularly as global labor shortages intensify and wages rise across industrial economies.

While Tesla remains one of the world’s most valuable automakers, Musk suggested that robots could unlock an entirely new market measured in trillions of dollars. Unlike cars, which are constrained by consumer purchasing power and replacement cycles, general-purpose robots could be deployed continuously across manufacturing, logistics, healthcare, and domestic services.

Robots as Tesla’s Next Growth Engine

Tesla’s robotics program draws heavily from the same core technologies that power its vehicles. The Optimus platform uses Tesla’s full self-driving neural networks, computer vision systems, and custom AI chips, allowing the company to reuse years of autonomy research. Musk has emphasized that this software-first approach gives Tesla a structural advantage over robotics startups that must build perception, planning, and control systems from scratch.

Optimus is expected to operate initially inside Tesla’s own factories, handling material transport and basic assembly tasks. These controlled environments allow the company to train robots at scale while generating real operational value. Musk has indicated that internal deployments could begin ramping before broader commercial availability.

Over time, Tesla envisions Optimus evolving into a general-purpose worker capable of understanding instructions, navigating complex spaces, and manipulating objects with human-like dexterity. If successful, this would place Tesla among a small group of companies attempting to commercialize humanoid robots at scale.

Physical AI Beyond Autonomous Vehicles

Musk’s comments reflect a broader industry shift toward what many executives now call physical AI — systems that can perceive, reason, and act in the real world. Unlike digital AI products, physical AI must meet far higher safety, reliability, and cost constraints, especially when operating alongside humans.

Tesla’s strategy mirrors developments across the robotics sector, where companies are racing to combine large-scale AI models with real-world embodiment. Musk argues that once robots reach sufficient intelligence and reliability, manufacturing capacity becomes the primary constraint, not demand.

He has suggested that a mature robotics business could eventually dwarf Tesla’s vehicle revenues, even if EV sales continue to grow. In Musk’s framing, cars may become just one application of a much larger AI and robotics platform.

Skepticism and Execution Risk

Despite the bold vision, significant challenges remain. Humanoid robots must operate safely in unpredictable environments, manipulate a vast range of objects, and perform tasks reliably over long periods. Battery life, actuator durability, and cost-efficient manufacturing all remain open questions.

Analysts also note that Tesla has a history of ambitious timelines that often slip. While Optimus prototypes have demonstrated walking, object handling, and basic autonomy, large-scale commercial deployment is still unproven.

Nevertheless, Musk’s prediction underscores Tesla’s long-term direction. Rather than viewing robotics as experimental, Tesla is positioning humanoid robots as a central business line that could redefine how work is performed across the global economy.

If Tesla succeeds, the company best known for electric cars could ultimately be remembered for something far more transformative: machines that replace human labor at scale.

NEURA Robotics and Bosch Join Forces to Scale German-Made Humanoid Robotics

NEURA Robotics and Bosch have formed a strategic partnership to industrialize humanoid robots and physical AI in Germany, combining real-world data collection with large-scale manufacturing expertise.

By Daniel Krauss
NEURA Robotics and Bosch executives mark the launch of a strategic partnership to industrialize humanoid robotics and physical AI technologies in Germany. Photo: NEURA Robotics

NEURA Robotics and Bosch have announced a strategic partnership aimed at accelerating the industrial deployment of humanoid robots and physical AI technologies developed in Germany. The collaboration brings together NEURA’s fast-moving robotics platform and Bosch’s manufacturing scale, signaling a coordinated European push into one of the most competitive emerging technology markets.

“NEURA aims to position Europe as the global leader in one of the most significant future markets, humanoid robotics,” said David Reger, founder and CEO of NEURA Robotics. “Our mission is to set the global benchmark for physical AI and humanoid robotics, establishing a European alternative to the major platform players in the U.S. and China. The partnership with Bosch is a powerful signal that Germany and Europe are investing in next-generation technologies developed independently.”

Reger said access to real-world physical training data remains the largest constraint in robotics development. “Physical training data is the biggest challenge in robotics; no one has it,” he said. “At NEURA, we have turned this challenge into our competitive advantage, and now, with Bosch, we have the opportunity to capture, structure, and leverage real-world data.”

The partnership is positioned around a shared objective: moving humanoid robots from experimental systems into reliable, scalable tools for real-world work environments. Both companies describe humanoid robotics as a technological shift comparable in impact to the rise of the personal computer or smartphone.

Building the Data Foundation for Humanoid Robots

A central pillar of the collaboration is the joint collection of real-world physical data inside Bosch facilities. Using advanced sensor suits, the partners will capture human motion, task execution, and environmental interaction data during everyday industrial work. This type of physical training data is scarce but critical for teaching humanoid robots how to move, manipulate objects, and operate safely alongside people.

By grounding robot learning in real workplace conditions rather than purely simulated environments, NEURA and Bosch aim to accelerate deployment timelines and improve reliability. The data will feed directly into NEURA’s AI models, enabling faster learning cycles and more adaptable robotic behavior across diverse tasks.

In parallel, the companies will co-develop AI-based core software, functional robotics modules, and intuitive user interfaces designed for industrial use. This software collaboration is intended to bridge perception, reasoning, and physical action into a cohesive operating layer for humanoid robots.

From Scale-Up Innovation to Industrial Production

Bosch will play a key role in supporting NEURA’s transition from development to large-scale production. This includes optimizing manufacturing workflows, scaling embedded software, and potentially supplying robotic components such as motors and actuators. The partnership also leaves room for Bosch to support final assembly and motor production for future humanoid platforms.

NEURA enters the collaboration with a reported order book exceeding one billion euros and is actively expanding its production capacity. The company’s industrial scaling effort is led by executives with deep experience in Bosch’s own manufacturing systems, reinforcing the operational alignment between the two organizations.

The focus on industrialization reflects a broader industry shift. Customers are increasingly demanding robots that can deliver consistent performance, meet safety requirements, and integrate into existing facilities without extensive redesign.

An Open Ecosystem for Physical AI

At the software level, NEURA is advancing an open robotics ecosystem known as the Neuraverse. The concept centers on connected humanoid robots that share skills, data, and learned behaviors across a distributed network. Improvements made by one robot can propagate across the fleet through software updates, creating a continuous feedback loop between deployment and development.

Combined with Bosch’s manufacturing and systems expertise, this approach is designed to accelerate innovation while maintaining industrial reliability. Rather than closed, application-specific robots, the partners are betting on adaptable, general-purpose systems that improve over time.

The partnership underscores a broader ambition to establish a European alternative in humanoid robotics, at a time when major efforts are concentrated in the United States and China. By pairing real-world data acquisition with scalable production, NEURA and Bosch are positioning Germany as a central hub for the next phase of physical AI.

Skild AI Raises $1.4B, Reaching $14B Valuation in Physical AI Bet

Skild AI raised nearly $1.4 billion in a funding round led by SoftBank, valuing the robotics AI company at more than $14 billion as it scales a unified foundation model for robots.

By Rachel Whitman

Skild AI has raised close to $1.4 billion in new funding, pushing the Pittsburgh-based robotics AI company to a valuation exceeding $14 billion. The round was led by SoftBank Group and included participation from NVIDIA Ventures, Macquarie Capital, Jeff Bezos, and a wide range of strategic and institutional investors, underscoring growing confidence in foundational AI models for the physical world.

The financing positions Skild AI as one of the most highly valued companies in the emerging Physical AI sector. Rather than building robots themselves, the company is focused on developing a general-purpose robotics foundation model, known as the Skild Brain, designed to operate across virtually any robotic body and task.

Building a General Brain for Robots

At the core of Skild AI’s strategy is the idea of an omni-bodied intelligence. Unlike traditional robotics systems that are tightly coupled to specific hardware, the Skild Brain is designed to control robots without prior knowledge of their exact physical form. The model can operate humanoids, quadrupeds, mobile manipulators, tabletop arms, and other machines that can move.

Skild AI says this approach allows robots to perform a wide range of activities, from everyday household tasks such as cleaning, loading a dishwasher, or cooking simple meals, to physically demanding work like navigating unstable terrain or handling heavy payloads. The company’s long-term vision is that any machine capable of motion could eventually be operated by the same underlying AI brain.

A central challenge in robotics AI is the lack of large-scale, standardized training data. Unlike language models, which can be trained on vast amounts of text, robots do not have an equivalent “internet of robotics.” Skild AI addresses this gap by pre-training its model on a combination of human video data and large-scale physics-based simulations, allowing it to learn general physical behavior without being tied to a single robot design.

Adaptation Without Retraining

One of Skild AI’s key technical claims is that the Skild Brain can adapt in real time to unexpected changes without retraining or fine-tuning. This includes scenarios such as damaged limbs, jammed wheels, increased payloads, or being deployed on an entirely new robotic body.

According to the company, this adaptability is driven by in-context learning. When the model encounters a new environment or embodiment where its initial actions fail, it adjusts its behavior based on live experience. Skild AI describes this as a major departure from conventional robotics approaches, which often require extensive retraining for each new scenario.
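Skild AI has not published the Skild Brain’s architecture, so the following is only a schematic of the in-context idea: a policy that conditions on a rolling window of its own recent observations and actions, so that a damaged limb or an unfamiliar body changes its inputs (and therefore its behavior) without any weight update. All names, dynamics, and numbers below are hypothetical.

```python
import random
from collections import deque

class InContextPolicy:
    """Toy policy whose behavior shifts with its recent interaction history,
    not with weight updates: a schematic of in-context adaptation."""

    def __init__(self, history_len: int = 8):
        self.history = deque(maxlen=history_len)  # rolling window of recent errors
        self.gain = 1.0                           # nominal actuation gain

    def act(self, obs: float, target: float) -> float:
        # If recent actions undershot (e.g. a weakened limb), scale effort up;
        # if they overshot, scale it back down. No retraining involved.
        if self.history:
            avg_err = sum(self.history) / len(self.history)
            self.gain = max(0.2, min(5.0, self.gain * (1.0 + 0.5 * avg_err)))
        return self.gain * (target - obs)

    def record(self, achieved: float, target: float) -> None:
        self.history.append((target - achieved) / max(abs(target), 1e-6))

# Deploy the same policy on a "body" it has never seen: actuation here is
# only 30% effective, as if an actuator were damaged (hypothetical dynamics).
policy, state, target = InContextPolicy(), 0.0, 1.0
for _ in range(20):
    action = policy.act(state, target)
    state += 0.3 * action                 # weakened, never-seen dynamics
    policy.record(state, target)
print(f"reached {state:.3f} of target {target}; adapted gain = {policy.gain:.2f}")
```

The same mechanism generalizes: because adaptation lives in the inputs rather than the weights, one model can, in principle, be dropped onto a new embodiment and converge on workable behavior from live experience.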

“The Skild Brain can control robots it has never trained on, adapting in real time to extreme changes in form or environments,” said Deepak Pathak, CEO and co-founder of Skild AI. He added that forcing the model to adapt rather than memorize is critical to building intelligence that works reliably in the real world.

Rapid Growth and Commercial Focus

Skild AI is also reporting rapid commercial traction. The company said it grew from zero to approximately $30 million in revenue within a few months in 2025 and is deploying its technology across multiple enterprise settings. Current use cases include security and facility inspection, warehouse operations, manufacturing, data centers, construction, and delivery tasks.

While consumer robotics remains a long-term goal, Skild AI is prioritizing enterprise and industrial deployments, where robots can be rolled out at scale and generate continuous data to improve the model. The company believes this creates a reinforcing data flywheel, allowing the Skild Brain to improve with every deployment regardless of hardware type or task.

Abhinav Gupta, co-founder and president of Skild AI, said this generality is essential for building intelligent systems that can operate safely and dynamically alongside humans. He described omni-bodied learning as a foundational requirement for bringing advanced AI into everyday physical environments.

Investors Signal Strategic Importance

The breadth of the investor group reflects both commercial and strategic interest in Physical AI. In addition to financial investors, the round included strategic backers such as Samsung, LG, Schneider, CommonSpirit, and Salesforce, pointing to potential applications across manufacturing, healthcare, and enterprise automation.

SoftBank Investment Advisers described Skild AI as foundational infrastructure for the future of robotics, while other investors emphasized the long-term economic and strategic significance of solving intelligence for the physical world.

Founded in 2023, Skild AI operates across Pittsburgh, the San Francisco Bay Area, and Bengaluru. With its latest funding, the company plans to scale training of its foundation model and expand deployments, aiming to establish a shared intelligence layer for robots across industries as Physical AI moves closer to mainstream adoption.


Wing and Walmart Expand Drone Delivery to 150 Stores in Coast-to-Coast Push

Wing and Walmart will add drone delivery to 150 more stores, aiming to reach over 40 million Americans and build the largest residential drone delivery network in the U.S.

By Rachel Whitman
A Wing autonomous delivery drone prepares to transport a small package from a Walmart location as part of the companies’ expanding drone delivery program. Photo: Wing

Wing and Walmart are significantly expanding their drone delivery partnership, announcing plans to add drone service to 150 additional Walmart stores across the United States over the next year. The move is designed to transform drone delivery from a regional convenience into a nationwide retail option, ultimately reaching more than 40 million Americans.

The expansion builds on years of testing and commercial operations in select markets, particularly in the Dallas-Fort Worth area and Metro Atlanta. In those regions, drone delivery has moved beyond novelty status and become a routine part of shopping behavior for many customers. According to the companies, usage has accelerated sharply, with deliveries tripling over the past six months and a core group of customers placing multiple orders per week.

By 2027, Wing and Walmart expect to operate more than 270 drone delivery locations across the country, forming what they describe as the largest residential drone delivery network in the world.

From Regional Pilots to National Coverage

The next phase of expansion will introduce drone delivery to major metropolitan areas including Los Angeles, St. Louis, Cincinnati, and Miami. These additions build on previously announced rollouts in cities such as Houston, Orlando, Tampa, and Charlotte. Operations in Houston are scheduled to begin in mid-January, marking one of the largest single-market launches to date.

The companies said the goal is not simply geographic growth, but consistency and scale. Drone delivery is being integrated directly into Walmart’s existing store operations, allowing orders to be fulfilled from local inventory rather than specialized distribution centers. This approach shortens delivery times and reduces friction for customers placing last-minute or urgent orders.

Wing’s drones are primarily used for lightweight items, including groceries, household essentials, and over-the-counter medications. Orders are delivered in minutes, often faster than traditional same-day or curbside options.

Why Drone Delivery Is Gaining Traction

Executives from both companies argue that the growing adoption reflects a shift in consumer expectations rather than a short-term experiment. Walmart sees drone delivery as a way to address immediate needs, particularly for time-sensitive purchases that do not justify a full shopping trip.

“Drone delivery plays an important role in our ability to deliver what customers want, exactly when they want it,” said Greg Cathey, Walmart’s senior vice president of digital fulfillment transformation. He pointed to strong adoption in existing markets as evidence that customers are willing to embrace the technology when it is reliable and easy to use.

Wing, which is owned by Alphabet, has spent years refining its aircraft, navigation systems, and air traffic coordination to support dense residential operations. The company emphasizes that its drones are designed to operate autonomously while meeting strict safety and noise standards, a key factor in securing regulatory approvals and community acceptance.

The Economics of Ultra-Fast Delivery

The scale of the expansion highlights a broader shift in retail logistics. While drone delivery has often been viewed as expensive or experimental, Wing and Walmart argue that high utilization changes the equation. In markets where demand is strong, drones can complete many short trips per hour, lowering the cost per delivery and reducing reliance on human drivers.
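Neither company has published unit costs, but the utilization argument is simple arithmetic: with a fixed hourly operating cost, cost per delivery falls in proportion to deliveries completed per hour. The figures in this sketch are hypothetical placeholders, not Wing or Walmart data.

```python
# Hypothetical illustration of the utilization argument; not Wing/Walmart figures.
hourly_operating_cost = 30.0   # assumed all-in cost per drone-hour, in dollars

for deliveries_per_hour in (1, 3, 6, 10):
    print(f"{deliveries_per_hour:>2} deliveries/hour -> "
          f"${hourly_operating_cost / deliveries_per_hour:.2f} per delivery")
# At one trip per hour the service looks experimental; at ten, the same
# aircraft undercuts sending a human driver on a dedicated trip.
```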

“We believe even the smallest package deserves the speed and reliability of a great delivery service,” said Adam Woodworth, chief executive of Wing. He said working with Walmart has allowed the company to demonstrate that drone delivery can operate as part of everyday retail, not just as a premium or niche offering.

Industry analysts note that Walmart’s national footprint gives the partnership a structural advantage. Thousands of stores located close to residential neighborhoods make it easier to launch drone services without building new infrastructure. If successful, the model could pressure other retailers to accelerate their own investments in autonomous delivery.

A Glimpse of Retail’s Next Phase

The coast-to-coast rollout signals growing confidence that drone delivery can move beyond pilot programs and into mainstream commerce. While regulatory hurdles and airspace coordination remain challenges, the scale of this expansion suggests that large retailers now see drones as a practical complement to trucks, vans, and gig-economy drivers.

For consumers, the promise is simple: faster access to everyday essentials. For the retail industry, the partnership represents a test of whether autonomous delivery can reliably operate at national scale.

As Wing and Walmart extend their network from Los Angeles to Miami, the question may no longer be whether drone delivery works, but how quickly it becomes an expected part of shopping in American cities.

CES 2026 Puts Physical AI and Robots at the Center of Tech’s Next Wave

CES 2026 marked a turning point for Physical AI, as humanoids, autonomous machines, and AI-driven platforms moved from experiments to scalable, real-world systems.

By Daniel Krauss
Robots and autonomous machines take center stage on the CES 2026 show floor as Physical AI becomes a defining theme of the event. Photo: Consumer Technology Association (CTA)

CES 2026 made one thing clear: artificial intelligence is no longer just software. This year’s show was defined by Physical AI, a category where intelligence is embedded directly into machines that move, lift, drive, and operate in the real world. From humanoid robots and autonomous trucks to construction equipment and household helpers, robotics shifted from the fringes of CES to its core narrative.

What distinguished CES 2026 from previous years was not the novelty of robots, but their readiness. Across industries, companies emphasized scale, safety, and integration, signaling that Physical AI is moving beyond pilots and into sustained deployment.

From Vision to Deployment Across Industries

In construction, Doosan Bobcat showcased how AI and autonomy are reshaping worksites. Its RX3 autonomous concept loader illustrated how compact equipment can run quietly on electric power in modular configurations, while AI-driven systems like Jobsite Companion and collision avoidance highlighted how intelligence is being embedded directly into machines operators already use.

Logistics and transportation saw similar momentum. Kodiak AI and Bosch presented a production-grade autonomous trucking platform designed for scale, emphasizing redundant hardware and automotive-grade components. The message was clear: autonomy is no longer confined to test routes but is being engineered for industrial reliability.

At the software and platform level, Mobileye’s acquisition of Mentee Robotics underscored how autonomy stacks developed for vehicles are converging with humanoid robotics. The deal positioned Physical AI as a shared foundation across cars and robots, built on perception, planning, and safety systems that can operate in human environments.

Humanoids Grow Up

Humanoid robots were among the most visible symbols of CES 2026’s shift toward Physical AI. Boston Dynamics, working again with Google, revealed a product-ready version of Atlas, highlighting industrial-grade specifications, multiple control modes, and AI-driven autonomy. The focus was less on acrobatics and more on repeatable, real-world work.

In the home, LG Electronics introduced its CLOiD home robot as part of a broader “Zero Labor Home” vision. Rather than a standalone gadget, CLOiD was positioned as a mobile AI hub capable of coordinating appliances, understanding routines, and performing household tasks through vision-based Physical AI.

These examples reflected a broader trend: humanoids are no longer pitched as distant futures, but as platforms designed to fit into existing environments, whether warehouses, factories, or homes.

Platforms and Chips Power the Physical AI Stack

Underpinning many of these robots were platform providers positioning themselves as the infrastructure layer for Physical AI. Qualcomm used CES to introduce its Dragonwing robotics platform and IQ10 processor, framing them as the “brain of the robot” for everything from service robots to full-size humanoids. The emphasis on power efficiency, edge AI, and safety-grade performance highlighted how critical compute platforms are becoming as robots scale.

Across the show floor, the same themes repeated: end-to-end stacks, simulation-first development, and software-defined robotics. Companies increasingly described robots as evolving systems, improved through data and deployment rather than fixed-function machines.

A Turning Point for CES and Robotics

CES has long been known for bold prototypes, but CES 2026 felt more grounded. The conversation shifted from what robots might do someday to how they are being deployed today. Partnerships, acquisitions, and production-ready platforms dominated announcements, reflecting a more mature phase of the robotics cycle.

Physical AI now sits at the intersection of several forces: advances in foundation models, cheaper and more capable hardware, and rising demand for automation across labor-constrained industries. CES 2026 captured that convergence in real time.

If earlier CES editions introduced robots as curiosities, CES 2026 presented them as infrastructure. As Physical AI moves from show floors into jobsites, roads, and homes, this year’s event may be remembered as the moment robotics stopped being a sideshow and became one of technology’s main stages.

Omdia Ranks AGIBOT as Global Leader in Humanoid Robot Shipments

Market data from Omdia shows AGIBOT leading global humanoid robot shipments in 2025, highlighting a shift from prototypes to early commercial deployment in physical AI.

By Daniel Krauss Edited by Kseniia Klichova
Market data from Omdia highlights AGIBOT’s rise as the leading global supplier of humanoid robots, signaling early commercial momentum in embodied AI. Photo: AGIBOT

According to a new market assessment from Omdia, Shanghai-based AGIBOT ranked first worldwide in humanoid robot shipments in 2025. The ranking reflects delivered units rather than pilot announcements or experimental deployments, marking an important milestone as humanoid robotics moves beyond research labs and controlled demonstrations.

Omdia’s “Market Radar: General-Purpose Embodied Intelligent Robots 2026” frames humanoid robots as an emerging product category entering its earliest phase of commercialization. Instead of highly customized machines built for single tasks, the market is beginning to favor standardized humanoid platforms designed to operate across multiple environments such as logistics hubs, public spaces, educational settings, and light industrial facilities.

The report emphasizes that shipment volume is becoming a critical indicator of progress in physical AI. Each deployed humanoid robot not only performs tasks but also generates real-world interaction data, accelerating learning cycles and improving system robustness over time.

From Research Prototypes to Scaled Shipments

AGIBOT’s lead in shipments reflects a strategy centered on controlled scalability rather than headline-grabbing prototypes. The company has focused on deploying humanoid platforms that balance mobility, stability, and cost efficiency, prioritizing environments where robots can operate alongside humans under predictable conditions.

Analysts note that early shipment leadership does not guarantee long-term dominance, but it creates structural advantages. Real-world data remains one of the scarcest resources in humanoid robotics, and companies with active deployments gain faster feedback on perception, manipulation, navigation, and human-robot interaction.

Omdia also highlights that competition in the humanoid sector remains intense. Companies in the United States, Europe, and Japan are investing heavily in software-centric approaches, while automotive and electronics manufacturers increasingly view humanoid robots as an extension of their embodied AI strategies.

Humanoid Robotics Moves Toward Commercial Reality

The report situates humanoid robots within a broader transition toward general-purpose embodied intelligence – machines designed to function in spaces originally built for people. Rather than replacing traditional industrial robots, humanoids are positioned to complement existing automation by handling tasks that require flexibility, mobility, and contextual awareness.

Omdia identifies Asia, particularly China, as the most aggressive region in early humanoid commercialization, driven by manufacturing scale, integrated supply chains, and sustained investment in robotics infrastructure. At the same time, demand is expected to broaden globally as enterprises explore service, healthcare, and customer-facing applications.

While the humanoid robotics market remains in its formative stage, shipment data suggests a clear shift is underway. Leadership is increasingly defined by deployed systems rather than conceptual ambition, signaling that physical AI is beginning its transition from promise to presence.

CES 2026: Caterpillar and NVIDIA Push Physical AI Into Heavy Industry

Caterpillar and NVIDIA deepened their partnership at CES 2026, outlining how Physical AI will transform construction, mining, manufacturing, and industrial supply chains.

By Rachel Whitman

CES 2026 marked another milestone in the rise of Physical AI, with Caterpillar and NVIDIA unveiling an expanded collaboration aimed at reshaping heavy industry. The partnership signals how artificial intelligence is moving beyond digital workflows and into the machines, factories, and jobsites that power the global economy.

“As AI moves beyond data to reshape the physical world, it is unlocking new opportunities for innovation – from job sites and factory floors to offices,” said Joe Creed, CEO of Caterpillar. “Caterpillar is committed to solving our customers’ toughest challenges by leading with advanced technology in our machines and every aspect of business. Our collaboration with NVIDIA is accelerating that progress like never before.”

For Caterpillar, the collaboration is about embedding intelligence directly into iron. For NVIDIA, it extends its AI platforms into some of the most demanding physical environments on earth – construction zones, mines, and industrial plants – where reliability, safety, and scale matter more than novelty.

Machines Built for the AI Era

At the core of the partnership is NVIDIA’s Jetson Thor platform, which Caterpillar plans to deploy across construction, mining, and power-generation equipment. Running advanced AI models at the edge allows Cat machines to process massive volumes of sensor data in real time, enabling smarter decision-making in unpredictable environments.

This shift lays the groundwork for AI-assisted and autonomous operations at scale. Caterpillar described future machines as part of a “digital nervous system” for jobsites, where fleets continuously analyze conditions, adapt to terrain, and optimize productivity. In-cab AI features will also play a growing role, providing operators with real-time coaching, safety alerts, and performance insights tailored to specific tasks and environments.

Rather than replacing operators, Caterpillar is positioning AI as an augmentation layer – one that helps crews work faster, safer, and with greater confidence as jobsites become more complex.

Debuting the Cat AI Assistant

One of the most visible announcements at CES 2026 was the debut of the Cat AI Assistant. Designed as a proactive digital partner, the assistant integrates voice-based interaction directly into Caterpillar’s onboard and digital systems. Built using NVIDIA’s Riva speech models, it delivers natural, conversational responses while drawing on Caterpillar’s own equipment and maintenance data.

In practical terms, this means operators and fleet managers can ask questions about machine health, parts, troubleshooting, or maintenance schedules and receive context-aware guidance instantly. Inside the cab, voice activation can adjust settings, guide diagnostics, and connect users to the right tools without interrupting work.
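Caterpillar has not detailed the assistant’s internals; schematically, though, such a system chains speech recognition, retrieval over equipment and maintenance records, and grounded response generation. The sketch below is a hypothetical pipeline with placeholder functions; it does not reproduce NVIDIA’s actual Riva APIs or any Caterpillar data.

```python
def transcribe(audio: bytes) -> str:
    """Placeholder for in-cab speech recognition (the production system uses Riva)."""
    return "what is the hydraulic oil temperature on this machine"

def retrieve_context(query: str) -> list[str]:
    """Placeholder lookup against equipment telemetry and maintenance records."""
    return ["hydraulic oil temperature: 82 C (normal operating range 60-90 C)"]

def generate_reply(query: str, context: list[str]) -> str:
    """Placeholder response generation, grounded in the retrieved records."""
    return f"Current reading: {context[0]}. No action needed."

def assistant_turn(audio: bytes) -> str:
    query = transcribe(audio)              # speech in
    context = retrieve_context(query)      # ground the answer in fleet data
    return generate_reply(query, context)  # text out, then handed to speech synthesis

print(assistant_turn(b"<mic frame>"))
```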

The assistant reflects a broader trend at CES 2026: Physical AI systems are increasingly conversational, intuitive, and embedded directly into workflows rather than accessed through separate dashboards.

NVIDIA AI Factory and the Reinvention of Industrial Operations

Beyond the jobsite, Caterpillar is leveraging NVIDIA AI Factory to transform manufacturing and supply chain operations. AI Factory provides the accelerated computing infrastructure, software frameworks, and AI libraries needed to train, deploy, and continuously improve large-scale industrial AI systems.

Caterpillar is using this infrastructure to automate and optimize core manufacturing processes such as production forecasting, scheduling, and quality control. By running these workloads on AI Factory, Caterpillar can process vast datasets faster, adapt to changing demand, and improve resilience across its global production network.

A major component of this effort is the creation of physically accurate digital twins of Caterpillar factories using NVIDIA Omniverse and OpenUSD technologies. These digital environments allow teams to simulate factory layouts, test production changes, and optimize workflows before implementing them in the real world — reducing downtime, risk, and cost.

Physical AI Moves From Concept to Infrastructure

The Caterpillar–NVIDIA collaboration fits squarely into the broader narrative of CES 2026, where Physical AI emerged as a unifying theme across robotics, autonomy, logistics, and heavy industry. From autonomous construction equipment to AI-driven factories, intelligence is becoming embedded directly into physical systems.

By combining Caterpillar’s century-long experience in industrial machinery with NVIDIA’s AI platforms and AI Factory infrastructure, the two companies are signaling that Physical AI is no longer experimental. It is becoming foundational infrastructure for how industries build, move, and power the world.

As Caterpillar CEO Joe Creed noted, AI is no longer just analyzing data – it is actively reshaping how work gets done. In heavy industry, that transformation is now moving at full speed.


Humanoid Builds HMND 01 Alpha in 7 Months Using NVIDIA Robotics Stack

London-based startup Humanoid moved from concept to a functional alpha prototype of its HMND 01 robot in seven months, compressing a development cycle that typically takes up to two years.

By Laura Bennett

London-based robotics startup Humanoid has compressed the traditional hardware development timeline by moving from concept to a functional alpha prototype of its HMND 01 system in just seven months.

The milestone stands in contrast to the typical 18 to 24 months required to develop comparable humanoid or industrial robotic platforms, highlighting how simulation-first development and edge AI are reshaping robotics engineering.

The HMND 01 Alpha program includes two robot variants: a wheeled platform designed for near-term industrial deployment and a bipedal system intended primarily for research and future service or household applications.

Both platforms are currently undergoing field tests and proof-of-concept demonstrations, including a recent industrial evaluation with automotive supplier Schaeffler.

At the center of Humanoid’s accelerated development cycle is a tightly integrated software and hardware stack built on NVIDIA robotics technologies.

Edge Compute and Foundation Models at the Core

The HMND 01 Alpha robots use NVIDIA Jetson Thor as their primary edge computing platform. By consolidating compute, sensing, and control onto a single high-performance system, Humanoid simplified its internal architecture and wiring while improving manufacturability and field serviceability.

Jetson Thor allows the robots to run large robotic foundation models directly on-device rather than relying on cloud processing. This enables real-time execution of vision-language-action models that support perception, reasoning, and task execution in dynamic environments.
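In outline, that amounts to a tight perceive-reason-act loop running entirely on the robot. The sketch below is a generic rendering of such a loop under assumed interfaces (VLAModel, Robot); Humanoid's actual stack is proprietary.

```python
# Outline of an on-device vision-language-action control loop of the kind
# Jetson Thor makes possible. VLAModel and Robot are hypothetical stand-ins;
# the point is that perception, reasoning, and actuation all run locally,
# with no round trip to a cloud service.
import time

class VLAModel:
    def act(self, image, instruction: str):
        """Map a camera frame plus a language goal to a motor command.
        A real model would be a neural network running on the GPU."""
        return {"joint_velocities": [0.0] * 7}

class Robot:
    def capture(self):            # latest camera frame
        return object()
    def apply(self, command):     # send command to motor controllers
        pass

def control_loop(model: VLAModel, robot: Robot, goal: str, hz: float = 30.0):
    period = 1.0 / hz
    for _ in range(3):  # a few iterations for illustration; real loops run continuously
        start = time.monotonic()
        command = model.act(robot.capture(), goal)   # all inference on-device
        robot.apply(command)
        time.sleep(max(0.0, period - (time.monotonic() - start)))

control_loop(VLAModel(), Robot(), "pick up the blue tote")
```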

Humanoid reported that training these models using NVIDIA’s AI infrastructure has reduced post-training processing times to just a few hours. This faster turnaround significantly shortens the loop between data collection, model refinement, and deployment on physical robots, allowing the company to iterate at software speed rather than hardware speed.

Simulation-First Development and Hardware Optimization

Humanoid’s workflow is built around a simulation-to-reality pipeline using NVIDIA Isaac Lab and Isaac Sim. Engineers use Isaac Lab to train reinforcement learning policies for locomotion and manipulation, while Isaac Sim provides a high-fidelity environment for testing navigation, perception, and full-body control.

Through a custom hardware-in-the-loop validation system, Humanoid created digital twins that mirror the software interfaces of the physical robots. This allows middleware, control logic, teleoperation, and SLAM systems to be tested virtually before deployment on real hardware. According to the company, new control policies can be trained from scratch and deployed onto physical robots within roughly 24 hours.
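One way to read "digital twins that mirror the software interfaces of the physical robots" is that simulated and real hardware expose the same API, so the control stack cannot tell which one it is driving. The generic sketch below illustrates that pattern; it is not Humanoid's actual middleware.

```python
# Generic illustration of sim/real interface mirroring for
# hardware-in-the-loop testing: the controller is written once against
# RobotInterface and can be pointed at either backend.
from abc import ABC, abstractmethod

class RobotInterface(ABC):
    @abstractmethod
    def read_joint_positions(self) -> list: ...
    @abstractmethod
    def send_joint_targets(self, targets: list) -> None: ...

class SimulatedRobot(RobotInterface):
    """Backed by a physics simulator (e.g. an Isaac Sim scene)."""
    def __init__(self):
        self._q = [0.0] * 7
    def read_joint_positions(self):
        return list(self._q)
    def send_joint_targets(self, targets):
        self._q = list(targets)

class PhysicalRobot(RobotInterface):
    """Would talk to real motor drivers over the robot's fieldbus."""
    def read_joint_positions(self):
        raise NotImplementedError("hardware only")
    def send_joint_targets(self, targets):
        raise NotImplementedError("hardware only")

def step_controller(robot: RobotInterface, target: list):
    """Identical control code runs against sim or hardware."""
    q = robot.read_joint_positions()
    robot.send_joint_targets([qi + 0.1 * (ti - qi) for qi, ti in zip(q, target)])

sim = SimulatedRobot()
step_controller(sim, [0.5] * 7)
print(sim.read_joint_positions())
```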

Simulation also plays a direct role in mechanical engineering decisions. During development of the bipedal robot, Humanoid evaluated six different leg configurations in simulation, analyzing torque requirements, joint stability, and mass distribution before committing to physical prototypes.

Engineers also optimized actuator selection, sensor placement, and camera positioning using simulated perception data, reducing the risk of blind spots and interference in industrial settings.
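A heavily simplified version of this kind of design-space sweep is sketched below: candidate configurations are scored on torque feasibility, stability margin, and mass. The candidates, numbers, and weights are all invented; the real evaluation runs in physics simulation, not on a spreadsheet of summary figures.

```python
# Toy version of a leg-configuration sweep. All candidates, figures, and
# scoring weights are invented for illustration.
candidates = {
    # name: (peak knee torque N*m, static stability margin m, leg mass kg)
    "4-bar knee":     (210.0, 0.11, 9.5),
    "direct drive":   (260.0, 0.13, 11.0),
    "belt reduction": (190.0, 0.09, 8.2),
}

def score(torque, margin, mass, torque_limit=250.0):
    if torque > torque_limit:   # actuator cannot deliver it: reject outright
        return float("-inf")
    return 2.0 * margin - 0.01 * mass - 0.001 * torque

best = max(candidates, key=lambda k: score(*candidates[k]))
for name, params in candidates.items():
    print(f"{name:14s} score={score(*params):+.3f}")
print("selected:", best)
```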

These physics-based simulations contributed to the robots’ performance during early industrial trials and helped avoid costly redesigns later in the development cycle.

Toward Software-Defined Robotics Standards

Humanoid views HMND 01 as part of a broader shift toward software-defined robotics. The company is working with NVIDIA to move away from legacy industrial communication standards and toward modern networking architectures designed for AI-enabled robots.

“NVIDIA’s open robotics development platform helps the industry move past legacy industrial communication standards and make the most of modern networking capabilities,” said Jarad Cannon, chief technology officer of Humanoid.

He added that the company is collaborating on a new robotics networking system built on Jetson Thor and the Holoscan Sensor Bridge, with the goal of enabling more flexible and scalable robot architectures.

Founded in 2024 by Artem Sokolov, Humanoid has grown to more than 200 engineers and researchers across offices in London, Boston, and Vancouver. The company reports 20,500 pre-orders, six completed proof-of-concept projects, and three active pilot programs.

While the bipedal HMND 01 remains focused on research and long-term service robotics, the wheeled variant is positioned for near-term industrial use. Humanoid’s strategy emphasizes early deployment in operational environments to gather real-world data and continuously refine its software-defined architecture, signaling a shift in how humanoid and industrial robots are developed and brought to market.


CES 2026: Doosan Bobcat Unveils RX3 Autonomous Loader and AI Jobsite Tech

Doosan Bobcat introduced the RX3 autonomous concept loader and a suite of AI-powered jobsite technologies at CES 2026, signaling a shift toward smart, electrified construction equipment.

By Daniel Krauss Published: | Updated:
Scott Park, CEO of Doosan Bobcat, presents the company’s next-generation construction equipment vision at CES 2026 Media Day on January 5. Photo: Doosan Bobcat

Doosan Bobcat has unveiled a new generation of autonomous and AI-enabled construction technologies at CES 2026, headlined by the RX3 autonomous concept loader and a growing ecosystem of intelligent jobsite systems. The announcements reflect the company’s push to integrate autonomy, electrification, and artificial intelligence into compact construction equipment designed for real-world deployment.

Presented during CES Media Day in Las Vegas, the technologies are part of what Bobcat describes as a “Smart Construction Jobsite,” where machines assist operators, reduce complexity, and improve safety and productivity. While several systems remain in concept or prototype form, the company emphasized that many are moving steadily toward commercialization.

RX3 Autonomous Concept Loader

The Bobcat RogueX3, or RX3, represents the third generation of Bobcat’s autonomous loader concept. The electric-powered machine is designed to match the size and footprint of existing manned Bobcat equipment, allowing it to operate on current jobsites without major workflow changes. It uses tracked mobility to provide traction across uneven or challenging surfaces while operating quietly and without emissions.

A key feature of the RX3 is its modular design. The platform can be configured with or without a cab, equipped with wheels or tracks, and paired with different lift arms depending on the task. Bobcat said the concept could ultimately support multiple powertrains, including electric, diesel, hybrid, and hydrogen, offering flexibility as energy infrastructure evolves.

“For nearly 70 years, Bobcat has led the compact equipment industry by solving real problems for real people,” said Scott Park, vice chairman and CEO of Doosan Bobcat. “As jobsites become more complex, we’re responding with intelligent systems that help people accomplish more, faster, and smarter.”

Bobcat is also working with Agtonomy as a technology partner, using its perception and fleet management software to enable autonomous and semi-autonomous operation in agricultural and construction contexts.

AI Comes Into the Cab

Alongside the RX3, Doosan Bobcat introduced the Bobcat Jobsite Companion, described as the compact equipment industry’s first AI voice control system. Powered by a proprietary large language model running entirely onboard, the system allows operators to manage more than 50 machine functions using natural voice commands.

Operators can adjust attachment settings, engine speed, lighting, and other functions without taking their hands off the controls. Because the system does not rely on cloud connectivity, it can respond in real time even on remote or connectivity-limited jobsites.
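Architecturally, fully onboard voice control implies that transcription, intent matching, and dispatch all happen on the machine. The sketch below reduces that to a keyword matcher over a small command registry; Bobcat's actual model and its roughly 50 functions are proprietary, so everything here is a stand-in.

```python
# Minimal sketch of offline voice command dispatch: everything below could
# run on an in-cab computer with no network. The command registry and the
# keyword matcher stand in for Bobcat's proprietary onboard LLM.
MACHINE_FUNCTIONS = {
    "raise_lift_arm":   lambda: print("lift arm raising"),
    "engine_high_idle": lambda: print("engine to high idle"),
    "work_lights_on":   lambda: print("work lights on"),
}

INTENT_KEYWORDS = {
    "raise_lift_arm":   ("raise", "lift", "arm"),
    "engine_high_idle": ("engine", "idle"),
    "work_lights_on":   ("lights",),
}

def dispatch(transcript: str) -> str:
    """Pick the registered function whose keywords best match the utterance.
    A production system would use an on-device language model instead."""
    words = set(transcript.lower().split())
    best = max(INTENT_KEYWORDS,
               key=lambda f: len(words & set(INTENT_KEYWORDS[f])))
    if not words & set(INTENT_KEYWORDS[best]):
        return "no matching command"
    MACHINE_FUNCTIONS[best]()
    return best

print(dispatch("turn the work lights on"))
```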

“Jobsite Companion lowers the barrier to entry for new operators while helping experienced professionals work faster and more precisely,” said Joel Honeyman, vice president of global innovation at Doosan Bobcat.

Bobcat also announced Service.AI, an AI-powered support platform designed for dealers and technicians. The system provides instant access to diagnostics, repair manuals, service histories, and troubleshooting guidance, aiming to reduce downtime and speed up maintenance.

Safety, Displays, and Energy Systems

Doosan Bobcat showcased several additional technologies that support its smart jobsite vision. A radar-based collision warning and avoidance system uses imaging radar to monitor surroundings and can automatically slow or stop a machine to prevent accidents.
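The decision layer of such a system can be pictured as a time-to-collision policy over radar range and closing speed, as in the sketch below. The thresholds are illustrative only and bear no relation to Bobcat's certified safety parameters.

```python
# Hedged sketch of the decision layer in a radar-based collision avoidance
# system: given range and closing speed from imaging radar, pick an action.
# All thresholds are invented for illustration.
def avoidance_action(range_m: float, closing_speed_mps: float) -> str:
    if closing_speed_mps <= 0:             # object static or moving away
        return "normal"
    time_to_collision = range_m / closing_speed_mps
    if time_to_collision < 1.0:
        return "stop"                      # emergency stop
    if time_to_collision < 3.0:
        return "slow"                      # cut travel speed, warn operator
    return "warn"                          # audible/visual alert only

for r, v in [(12.0, 1.5), (4.0, 2.0), (1.5, 2.0)]:
    print(f"range={r:4.1f} m, closing={v:.1f} m/s -> {avoidance_action(r, v)}")
```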

The company also revealed an advanced display concept using transparent MicroLED screens integrated into cab windows. These displays overlay 360-degree camera views, machine performance data, alerts, and asset tracking directly into the operator’s field of vision.

Powering these systems is the Bobcat Standard Unit Pack, or BSUP, a modular and rugged battery system designed for harsh construction environments. The fast-charging packs are scalable across Bobcat’s equipment lineup and are intended to support broader electrification efforts, including potential use by other manufacturers.

Toward a Smarter Jobsite

Doosan Bobcat said the technologies unveiled at CES 2026 form an integrated ecosystem rather than isolated features. By combining AI, autonomy, electrification, and connectivity, the company aims to redefine how compact equipment is operated and supported.

“We’ll combine AI, autonomy, electrification, and connectivity to create new jobsite standards,” Park said during the Media Day presentation.

While the RX3 and several systems remain concept-stage, Bobcat’s messaging at CES emphasized near-term impact rather than distant vision. The company framed these developments as practical steps toward safer, more productive jobsites where intelligent machines actively support human workers.

CES 2026: Mobileye to Acquire Mentee Robotics for $900M to Accelerate Physical AI Push

Mobileye agreed to acquire humanoid robotics startup Mentee Robotics for $900 million, expanding its autonomy technology from vehicles into Physical AI and general-purpose humanoid robots.

By Rachel Whitman Published: | Updated:
A Mentee Robotics humanoid robot demonstrates autonomous box handling during a Physical AI showcase at CES 2026. Photo: Mentee Robotics

Mobileye has agreed to acquire Mentee Robotics in a $900 million transaction that marks a major strategic shift beyond autonomous driving and into humanoid robotics and Physical AI. Announced during CES 2026 in Las Vegas, the deal positions Mobileye to apply its autonomy technology to machines designed to work directly alongside humans in physical environments.

The acquisition combines Mobileye’s large-scale perception, planning, and safety systems with Mentee’s vertically integrated humanoid robot platform. Together, the companies aim to build general-purpose robots capable of understanding context, inferring intent, and executing tasks safely and autonomously in real-world settings such as factories, warehouses, and industrial facilities.

From Vehicle Autonomy to Embodied Intelligence

Mobileye’s core business has been built around vision-based autonomy for vehicles, with systems designed to interpret complex scenes, predict behavior, and make safety-critical decisions. Those same challenges increasingly define humanoid robotics, where machines must navigate spaces built for people while interacting with objects, equipment, and coworkers.

The company said the acquisition represents a decisive move toward Physical AI, a class of systems that not only perceive the world but also act within it reliably and at scale. Mobileye’s autonomy stack has evolved beyond navigation toward context-aware and intent-aware reasoning, providing a foundation for robots that can operate productively without constant supervision.

The move also reflects Mobileye’s effort to diversify as competition intensifies in autonomous driving and commercialization timelines extend. By expanding into humanoid robotics, the company gains exposure to a parallel, growing market where autonomy software may become the primary differentiator.

Mentee’s Humanoid Platform and Learning Approach

Founded four years ago, Mentee Robotics has developed a third-generation humanoid robot designed for scalable deployment rather than laboratory demonstrations. The platform is vertically integrated, with in-house development of hardware, embedded systems, and AI software.

Mentee’s approach emphasizes rapid learning and adaptability. Its robots are trained primarily in simulation, reducing reliance on large-scale real-world data collection and minimizing the gap between simulated and physical performance. The system is designed to acquire new skills through limited human demonstrations and intent cues, rather than continuous teleoperation.
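In the abstract, learning from limited demonstrations means fitting a policy to a small set of recorded state-action pairs. The toy example below does exactly that with a linear policy and synthetic data; Mentee's actual pipeline, which couples large-scale simulation training with demonstration conditioning, is far more sophisticated.

```python
# Toy behavior cloning from a handful of demonstrations, illustrating the
# "limited human demonstrations" idea in the abstract. It fits a linear
# policy (action = W @ state) to synthetic expert data.
import numpy as np

rng = np.random.default_rng(0)
true_W = np.array([[0.5, -0.2], [0.1, 0.8]])      # unknown "expert" mapping

# A few demonstrated state-action pairs (what teleoperation would record).
states = rng.normal(size=(8, 2))
actions = states @ true_W.T

W_hat, *_ = np.linalg.lstsq(states, actions, rcond=None)  # fit the policy

new_state = np.array([0.3, -1.0])
print("predicted action:", new_state @ W_hat)
print("expert action:   ", true_W @ new_state)
```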

This learning framework enables autonomous, end-to-end task execution, including locomotion, navigation, and safe manipulation of rigid objects. In demonstrations, Mentee robots have shown the ability to perform multi-step material handling tasks with stability and accuracy, supporting the company’s focus on real-world utility.

Deal Structure and Commercial Roadmap

Under the terms of the agreement, Mobileye will pay $900 million for Mentee Robotics, consisting of approximately $612 million in cash and up to 26.2 million shares of Mobileye Class A stock, subject to adjustments. The transaction is expected to close in the first quarter of 2026, pending customary approvals.

Mentee will operate as an independent unit within Mobileye, allowing continuity while gaining access to Mobileye’s AI infrastructure and production expertise. First customer proof-of-concept deployments are planned for 2026, with autonomous operation as a core requirement. Series production and broader commercialization are targeted for 2028.

Mobileye said the acquisition will modestly increase operating expenses in 2026 but is aligned with its long-term growth strategy.

CES 2026 and the Rise of Physical AI

Physical AI emerged as a central theme at CES 2026, with humanoid robots, service robots, and embodied AI systems moving beyond concept stages. The Mobileye–Mentee announcement underscored how autonomy is becoming a shared foundation across vehicles and robots, rather than a domain-specific technology.

Mobileye highlighted strong momentum in its core automotive business, citing a $24.5 billion revenue pipeline over the next eight years. Company executives framed the acquisition as a way to extend that success into a second transformative market without abandoning its safety-first philosophy.

“Today marks a new chapter for robotics and automotive AI,” said Mobileye President and CEO Amnon Shashua. “By combining Mentee’s breakthroughs in humanoid robotics with Mobileye’s expertise in autonomy and productization, we have an opportunity to lead Physical AI at a global scale.”

Mentee CEO Lior Wolf said the partnership accelerates the company’s mission to deliver safe, cost-effective humanoid robots capable of meaningful work in human environments.

As CES 2026 made clear, the race to define Physical AI is accelerating. With this acquisition, Mobileye signals that the next phase of autonomy may unfold not just on roads, but across factories, warehouses, and workplaces worldwide.