MIT’s Aerial Microrobot Matches Insect Speed with AI-Controlled Flight

MIT engineers developed an AI-controlled aerial microrobot capable of flying with speed and agility comparable to insects, marking a major advance in micro-scale robotics.

By Daniel Krauss

Engineers at MIT have demonstrated an aerial microrobot capable of flying with speed and agility approaching that of real insects, overcoming a long-standing limitation in micro-scale robotics. The insect-sized robot can execute aggressive maneuvers, including repeated midair somersaults, while maintaining stability even in windy conditions. The breakthrough was enabled by a new artificial intelligence-based control system that dramatically improves flight performance.

Tiny flying robots have long been viewed as promising tools for applications such as search-and-rescue operations, where they could navigate through narrow gaps and unstable environments inaccessible to larger drones. Until now, however, aerial microrobots have been constrained to slow, smooth flight paths due to limitations in control systems and onboard computation.

AI-Based Control Unlocks Agile Flight

The MIT team designed a two-part, AI-driven control framework that balances high performance with real-time efficiency. Compared to previous versions of the robot, the new system increased flight speed by approximately 450 percent and acceleration by about 250 percent. In testing, the robot completed 10 consecutive somersaults in just 11 seconds while staying within a few centimeters of its intended trajectory.

“We want to be able to use these robots in scenarios that more traditional quadcopter robots would have trouble flying into, but that insects could navigate,” said Kevin Chen, an associate professor of electrical engineering and computer science at MIT and co-senior author of the study. “With our bioinspired control framework, the flight performance of our robot is comparable to insects in terms of speed, acceleration, and pitching angle.”

The microrobot itself is roughly the size of a microcassette and weighs less than a paperclip. It uses flexible artificial muscles to rapidly flap oversized wings, generating lift and enabling sharp directional changes. While recent hardware improvements made more aggressive flight possible, earlier versions of the robot relied on manually tuned controllers that limited overall performance.

From Predictive Planning to Real-Time Action

To overcome these constraints, Chen’s team collaborated with researchers led by Jonathan P. How in MIT’s Department of Aeronautics and Astronautics. Together, they developed a two-step control strategy that combines predictive planning with machine learning.

The first component is a model-predictive controller that uses a mathematical model of the robot’s dynamics to plan optimal flight maneuvers. This controller can account for uncertainties in aerodynamics, external disturbances such as wind, and physical limits on force and torque. While powerful, it is too computationally demanding to run continuously in real time.

To address this, the researchers used the predictive controller to train a deep-learning-based policy through imitation learning. The resulting AI model captures the behavior of the high-performance planner in a compact form that can run extremely fast, allowing the robot to respond instantly to changing conditions.
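The general pattern described here is known as behavior cloning: an expensive expert controller is queried offline to generate state-action demonstrations, and a compact policy is fit to imitate it so that decisions can be made quickly at run time. The toy sketch below illustrates the idea only; it is not the MIT team's code, and a simple hand-tuned linear feedback law (with made-up gains) stands in for the model-predictive controller.

```python
import numpy as np

rng = np.random.default_rng(0)

def expert_controller(state):
    """Stand-in for the model-predictive controller: accurate but,
    in a real system, too slow to run at every control step.
    Here it is a stabilizing feedback law on a 2-D state [pos, vel]."""
    K = np.array([2.0, 1.5])      # illustrative hand-tuned gains
    return -K @ state             # control input u

# 1) Collect demonstrations: query the expert across random states.
states = rng.uniform(-1.0, 1.0, size=(500, 2))
actions = np.array([expert_controller(s) for s in states])

# 2) Behavior cloning: fit a compact policy (here, linear least squares)
#    that maps state -> action, mimicking the expert's decisions.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

def learned_policy(state):
    """Fast imitation policy, cheap enough for real-time control."""
    return state @ W

# Because the expert is itself linear in this toy example, the cloned
# policy reproduces it almost exactly.
test_state = np.array([0.3, -0.2])
print(expert_controller(test_state), learned_policy(test_state))
```

In the actual study the learned policy is a deep neural network rather than a linear map, which lets it capture the nonlinear maneuvers the planner produces while remaining fast enough to react to disturbances in real time.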

“If small errors creep in and you try to repeat a flip multiple times, the robot will crash,” How said. “We need robust flight control.”

Toward Real-World Deployment

The robot’s agility allowed it to demonstrate flight behaviors commonly seen in insects, including rapid pitch-and-stop maneuvers known as saccades. These movements help insects stabilize their vision and orient themselves in space, and similar capabilities could support future onboard sensing.

“This bio-mimicking flight behavior could help us when we start putting cameras and sensors on board the robot,” Chen said. “This work signals a paradigm shift. It shows that we can build control architectures that are both high-performing and computationally efficient, even at insect scale.”

The research team plans to focus next on adding onboard sensors and autonomy so the robots can operate outside laboratory motion-capture environments. Coordinated flight among multiple microrobots and collision avoidance are also areas of active investigation.

The study was published in the journal Science Advances and highlights how advanced AI control methods are enabling microrobots to move beyond experimental demonstrations toward practical, real-world use.


Rivian-Founded Mind Robotics Secures $500 Million for Industrial AI

Mind Robotics, a startup spun out of electric vehicle maker Rivian, has raised $500 million to develop AI-powered industrial robots designed for more adaptable factory automation.

By Daniel Krauss | Edited by Kseniia Klichova
RJ Scaringe, founder and CEO of Rivian and chairman of Mind Robotics, the industrial AI robotics startup that recently raised $500 million to develop next-generation factory automation systems. Photo: Rivian

A robotics startup spun out of electric vehicle manufacturer Rivian has raised $500 million to build a new generation of industrial robots powered by artificial intelligence.

The company, called Mind Robotics, announced the Series A round this week, bringing its total funding to approximately $615 million only months after its launch. The investment values the company at around $2 billion and was co-led by venture firms Accel and Andreessen Horowitz.

Mind Robotics was created in late 2025 by Rivian founder and CEO RJ Scaringe, who now serves as chairman of the robotics startup.

The company’s goal is to address one of the biggest limitations in modern factory automation: the difficulty robots have performing tasks that require dexterity, adaptability, and real-world reasoning.

A Different Approach to Industrial Robotics

Industrial robots have been used in manufacturing for decades, but most systems remain limited to highly structured tasks such as welding, assembly, or material handling.

These machines perform best when working with predictable objects and fixed production lines.

Mind Robotics is attempting to develop robots capable of operating in more dynamic manufacturing environments where parts vary, conditions change, and tasks require human-like manipulation.

The startup plans to build AI systems that allow robots to interpret their surroundings and adapt their movements in real time.

Unlike many robotics startups that are focusing on humanoid machines, Mind Robotics is taking a more traditional approach to hardware design.

Scaringe has suggested that the company’s focus is on practical factory automation rather than building robots designed to resemble humans.

Training Robots with Factory Data

One advantage the startup brings to the robotics industry is access to manufacturing data from Rivian’s electric vehicle factories.

These facilities provide a real-world environment where robotic systems can be trained and tested on production tasks.

The company aims to use this data to develop AI models that help robots understand physical interactions and perform tasks requiring precision and adaptability.

According to Mind Robotics, much of the value generated inside factories today still depends on human workers performing tasks that traditional automation cannot easily replicate.

By combining robotics hardware with AI models capable of learning from real-world data, the company hopes to automate a broader range of manufacturing activities.

A Growing Investment Wave in Physical AI

The large funding round reflects growing investor interest in robotics companies building AI-driven physical systems.

Over the past year, venture capital firms have increasingly backed startups focused on what many researchers call physical AI – systems that combine machine learning with robots operating in the real world.

Mind Robotics is part of a broader shift toward integrating artificial intelligence directly into industrial automation.

Scaringe has said the company expects to deploy significant numbers of its robots within factories before the end of the year, suggesting an aggressive timeline for moving from research to deployment.

Ties to Rivian’s Technology Ecosystem

Although Mind Robotics operates as an independent company, its relationship with Rivian could extend beyond manufacturing data.

Rivian has recently developed custom semiconductor chips designed to run autonomous driving software inside its vehicles.

Those processors could potentially be used to power robotics systems as well, creating a shared technology foundation between the two companies.

The spinout is also part of a broader pattern emerging at Rivian, which has begun launching new technology ventures alongside its core automotive business.

In 2025, the company also created another startup focused on electric mobility platforms for small cargo vehicles and e-bikes.

Together, these efforts suggest that Rivian is positioning itself not only as a vehicle manufacturer but as a broader developer of robotics and AI technologies.

For Mind Robotics, the next challenge will be proving that AI-powered robots can deliver tangible productivity gains on real factory floors.

Zoox Expands Robotaxi Testing to Phoenix and Dallas

Zoox is expanding testing of its autonomous driving system to Phoenix and Dallas while preparing to deploy its purpose-built robotaxi and integrate its service with the Uber platform.

By Rachel Whitman | Edited by Kseniia Klichova
Zoox’s purpose-built robotaxi is designed for autonomous ride-hailing, featuring a bidirectional vehicle layout and face-to-face seating for passengers. Photo: Zoox

Amazon-owned autonomous vehicle company Zoox is expanding its robotaxi testing program to Phoenix, Arizona, and Dallas, Texas, as the company continues building toward commercial deployment of its purpose-built autonomous vehicles.

The expansion will introduce Zoox’s autonomous driving technology into two additional urban environments while also supporting the launch of new operational infrastructure, including fleet depots and a new operations center in Scottsdale, Arizona.

With these additions, Zoox now operates testing fleets across ten major U.S. markets, reflecting a broader effort by autonomous vehicle developers to gather real-world data across diverse driving conditions.

Testing in New Environments

The first phase of Zoox’s rollout in Phoenix and Dallas will involve a small number of retrofitted SUVs used for mapping and early testing.

These vehicles will initially be driven manually as engineers map city streets and gather environmental data. Autonomous testing will follow, with safety drivers remaining behind the wheel to intervene if necessary.

Once the company completes this phase, Zoox plans to deploy its purpose-built robotaxi vehicles in both cities.

Each location presents unique testing conditions. Phoenix offers an opportunity to evaluate sensor performance and vehicle durability in extreme heat and dusty environments, particularly on high-speed roads common in the region.

Dallas, meanwhile, provides a complex road network and more variable weather patterns, helping engineers refine how the autonomous system handles diverse driving scenarios.

A Partnership with Uber

At the same time, Zoox is expanding its distribution strategy through a new partnership with Uber.

Under a multi-year agreement, Zoox robotaxis will be integrated into Uber’s ride-hailing platform, allowing users to request autonomous rides through the Uber app in selected cities.

The first integration is expected to begin in Las Vegas later this year, followed by Los Angeles in 2027.

Zoox will continue offering rides through its own mobile application as well, effectively operating on both its proprietary platform and Uber’s global network.

The partnership reflects Uber’s strategy of collaborating with autonomous vehicle developers rather than building its own driverless technology.

Uber previously ran an in-house autonomous vehicle program but shut it down after a fatal crash in 2018 and later sold the division. Since then, the company has shifted toward forming partnerships with technology developers.

Building the Infrastructure for Autonomous Fleets

Supporting Zoox’s growing robotaxi program is a network of facilities known as Fusion Centers.

The company is opening a third such facility in Scottsdale, Arizona, joining existing centers in Las Vegas and the San Francisco Bay Area.

These facilities function as operational command centers, coordinating autonomous fleets through teleoperations, mission control, and rider support systems.

Fusion Centers allow human operators to assist vehicles in complex scenarios, manage fleet operations, and provide customer service for passengers.

Since launching its early robotaxi service in Las Vegas and testing programs in San Francisco, Zoox says its vehicles have completed more than one million autonomous miles and transported over 300,000 passengers.

The company’s robotaxi design differs from traditional vehicles. The fully autonomous platform eliminates the steering wheel and pedals, replacing them with a bidirectional cabin featuring face-to-face seating intended to encourage social interaction among riders.

The Growing Robotaxi Race

Zoox’s expansion highlights the intensifying competition among companies seeking to deploy autonomous ride-hailing services.

Developers such as Waymo, Cruise, and several emerging startups are all testing driverless vehicles across multiple U.S. cities, racing to demonstrate safe and scalable operations.

For Zoox, the strategy combines purpose-built vehicles, extensive real-world testing, and partnerships with major mobility platforms.

As autonomous driving technology moves from pilot programs toward commercial deployment, cities like Phoenix and Dallas are becoming critical testing grounds for the next phase of driverless transportation.


iRobot Expands Roomba Mini Launch to Europe and the U.K.

iRobot has introduced its compact Roomba Mini robot vacuum to Europe and the U.K., marking the company’s first product rollout since emerging from Chapter 11 restructuring earlier this year.

By Laura Bennett | Edited by Kseniia Klichova
The compact Roomba Mini robot vacuum navigates tight spaces using lidar-based mapping as iRobot expands the device to European markets. Photo: iRobot

iRobot has begun rolling out its smallest robotic vacuum cleaner, the Roomba Mini, across Europe and the United Kingdom, marking the company’s first major product launch since emerging from bankruptcy earlier this year.

The compact robot, which combines vacuuming and mopping capabilities, was previously introduced in Japan and is now being positioned as a cleaning device designed specifically for smaller homes and apartments.

The expansion comes as iRobot attempts to regain momentum following a pre-packaged Chapter 11 restructuring completed in January and a change in ownership that placed the company under the control of its longtime manufacturing partner, Picea.

A Smaller Robot for Smaller Homes

The Roomba Mini is designed to address a practical challenge that has long affected robot vacuums: reaching tight spaces.

According to iRobot, the new model’s compact footprint allows it to navigate narrow corners and areas that are often inaccessible to standard-sized robotic vacuums or traditional upright cleaners.

The robot uses a lidar-based navigation system called ClearView, enabling it to map its surroundings, avoid obstacles, and detect rugs while operating in mopping mode.

Users can control the device through the Roomba Home mobile application, voice assistants, or directly through onboard controls. The system can also operate without a Wi-Fi connection, allowing basic cleaning functionality even when offline.

An AutoEmpty Dock collects debris into an AllergenLock bag capable of holding several months’ worth of dust and dirt, reducing the need for frequent maintenance.

Early Demand Signals in Japan

The Roomba Mini first launched in Japan in February, where iRobot reported strong early demand.

According to company representatives, the black version of the device sold out within the first week of availability.

While the robot was initially designed with compact Japanese homes in mind, iRobot executives say the same characteristics make it suitable for European living spaces, which often feature tighter layouts than homes in North America.

The robot is now available through iRobot’s European online store with a retail price of approximately €399 in the European Union and £379 in the United Kingdom.

A Strategic Moment for iRobot

The European launch arrives at a pivotal moment for the company.

iRobot, once the dominant name in robotic vacuum cleaners, has faced increasing competition in recent years from a growing number of consumer robotics companies offering lower-cost devices with advanced features.

The company’s financial challenges culminated in a bankruptcy restructuring earlier this year. As part of that process, iRobot was acquired by Picea, a firm that previously served as both a manufacturing partner and lender to the company.

Executives say the Roomba Mini was developed before the acquisition and that the ownership transition did not influence the product’s design or release timeline.

However, under Picea’s ownership, iRobot may benefit from expanded manufacturing capabilities and distribution networks across Asia and other global markets.

The Next Phase of Consumer Robotics

The launch of the Roomba Mini also reflects a broader shift within the consumer robotics market.

As robotic vacuum technology matures, manufacturers are increasingly focusing on specialized designs that address specific living environments rather than relying on a single universal product.

Smaller robots capable of navigating dense household layouts may become particularly relevant in urban markets where apartments dominate the housing landscape.

For iRobot, the success of such products could help determine whether the company can maintain its position in an increasingly crowded consumer robotics industry.


Poland’s First Humanoid Influencer Draws Brands and Millions of Views

A humanoid robot named Edward Warchocki has rapidly gained attention in Poland’s influencer market, attracting hundreds of thousands of social media followers and landing commercial brand partnerships.

By Daniel Krauss | Edited by Kseniia Klichova
Edward Warchocki, a humanoid robot operating on the streets of Warsaw and Poznań, interacts with the public as part of a marketing experiment exploring robots as social media influencers. Photo: Project creators

A humanoid robot has entered Poland’s influencer economy, quickly drawing attention from brands, social media audiences, and technology observers.

The robot, named Edward Warchocki, has appeared on streets in Warsaw and Poznań interacting with passers-by while building a growing presence on social media platforms. Within roughly two weeks of its debut, the project amassed tens of thousands of followers on Instagram and more than 100,000 on TikTok, while videos featuring the robot reportedly generated hundreds of millions of views across platforms.

The experiment places a physical humanoid robot into a market previously dominated by human personalities and digital avatars.

For companies experimenting with new forms of marketing, the project represents an early test of how embodied AI could reshape influencer culture.

From Lab Experiment to Commercial Platform

Edward Warchocki began as a technology experiment initiated by entrepreneur Radosław Grzelaczyk and supported by artificial intelligence developer Bartosz Idzik, who created the system that powers the robot’s conversational behavior.

Unlike scripted promotional mascots, the robot is designed to interact dynamically with people in real-world environments.

According to the project’s creators, the system uses a combination of proprietary AI tools and existing technologies to generate responses during live interactions. The goal was to create a robot that could adapt to conversations rather than simply repeat preprogrammed lines.

Public reactions have ranged from curiosity to enthusiasm. Videos show people approaching the robot on city streets to shake hands, ask questions, and record short interactions for social media.

These spontaneous encounters have become a key part of the robot’s appeal.

Brands Experiment with a New Type of Influencer

The project has already begun attracting commercial interest.

Edward’s first advertising collaboration reportedly involved promoting a luxury watch valued at approximately 80,000 złoty, marking a symbolic entry into the influencer marketing industry.

For brands, robots introduce a different set of characteristics than human creators.

A humanoid influencer cannot become involved in personal scandals, take breaks, or deviate from a brand’s messaging strategy. The creators of the project argue that this level of control makes robots attractive marketing ambassadors for companies seeking predictable campaigns.

Some analysts also point to engagement metrics as a key factor. Early data from projects involving robotic or virtual influencers suggests that novelty and public curiosity can generate unusually high engagement rates compared with traditional creators.

Physical Presence in a Digital Industry

The rise of virtual influencers is not new. Computer-generated personalities such as Lil Miquela in the United States or Rozy in South Korea have already built large audiences and signed partnerships with major global brands.

However, those figures exist entirely in digital form.

Edward represents a different model: a physical robot capable of interacting with people face to face.

This physical presence creates a type of engagement that purely digital characters cannot replicate. Passers-by can approach the robot, speak with it, and record the interaction in real time.

In practice, the robot becomes both an influencer and a live attraction.

For social media creators and marketers, such encounters provide content that can spread quickly across platforms.

A New Category of Embodied Media

The project reflects a broader trend in robotics where machines are increasingly designed to operate in social environments rather than purely industrial settings.

Humanoid robots have traditionally been developed for research or automation applications. But advances in conversational AI and robotics hardware are opening the possibility of robots that function as public-facing personalities.

If the experiment succeeds commercially, it could mark the emergence of a new category within influencer marketing: embodied influencers.

Analysts estimate that the global virtual influencer market could reach nearly $16 billion by 2026. Robots capable of appearing in both online content and physical events may represent the next stage of that market’s evolution.

For now, Edward Warchocki remains an experiment.

But its rapid rise suggests that the intersection of robotics, artificial intelligence, and digital media may be creating a new kind of celebrity – one built from code, hardware, and algorithms rather than human charisma.


Tesla Teases More Human-Like Hands for Next-Generation Optimus Robot

Tesla has teased a new generation of humanoid robot hands for Optimus, suggesting the company is focusing on improved dexterity as it develops its next iteration of the humanoid platform.

By Laura Bennett | Edited by Kseniia Klichova
A close-up teaser image from Tesla’s AI team shows a new generation of humanoid robot hands for Optimus designed to resemble human anatomy and improve dexterity. Photo: Tesla AI

Tesla has offered a new glimpse into the next stage of development for its humanoid robot Optimus, teasing a redesigned set of robotic hands that appear significantly more human-like than previous prototypes.

The teaser image, shared by Tesla’s AI team on Chinese social media platform Weibo, shows a pair of robotic hands with finger proportions and articulation that closely resemble those of a human hand. The image quickly circulated across the robotics community, fueling speculation that the company is preparing a new iteration of the robot’s manipulation system.

Although Tesla has not released technical specifications, the design suggests the company is focusing heavily on improving dexterity – widely considered one of the most difficult challenges in humanoid robotics.

Why Robotic Hands Matter

In robotics research, the ability to manipulate objects with human-level precision remains a major technical hurdle.

Industrial robots have long been capable of gripping and moving objects, but most rely on specialized end-effectors designed for specific tasks. Humanoid robots, by contrast, must interact with a wide range of tools, devices, and environments originally designed for human hands.

This requirement makes hand design one of the most complex engineering problems in humanoid robotics. Achieving fine motor control requires a combination of compact actuators, high-resolution sensors, and sophisticated control software capable of coordinating dozens of joints simultaneously.

If Optimus is intended to perform tasks in factories, warehouses, or eventually homes, improved hand dexterity will likely be essential.

The teaser image suggests Tesla may be moving toward a more anatomically inspired design, potentially enabling the robot to handle objects with greater precision.

A Key Step Toward Tesla’s Robotics Ambitions

Elon Musk has repeatedly described Optimus as one of Tesla’s most important long-term initiatives, potentially exceeding the impact of the company’s electric vehicles.

Musk has suggested that humanoid robots could eventually perform a wide range of tasks across industries, from manufacturing and logistics to household assistance.

In public comments, he has also framed the project in more ambitious terms, describing Optimus as a potential “Von Neumann machine” – a theoretical self-replicating system capable of building copies of itself using available materials.

While such concepts remain far from practical reality, they reflect the scale of Musk’s long-term vision for the project.

For now, however, Tesla’s progress will likely depend on solving more immediate engineering problems.

Among those, robotic hands remain one of the most critical components determining whether humanoid robots can move beyond demonstrations and into real-world work environments.

The new teaser suggests Tesla is continuing to refine that capability as it works toward the next generation of its Optimus platform.