A new artificial intelligence startup founded by prominent AI researcher Yann LeCun has raised more than $1 billion to develop machines capable of reasoning about the physical world, signaling a growing push to move beyond today’s dominant large language model approach.
The company, called Advanced Machine Intelligence (AMI), is focused on building AI systems and robotic platforms designed around what researchers describe as “common sense” understanding: the ability to reason about objects, environments, and cause-and-effect relationships.
The funding round raised approximately $1.03 billion at a pre-money valuation of $3.5 billion and was co-led by investors including Cathay Innovation, Greycroft, Hiro Capital, HV Capital, and Bezos Expeditions.
The investment highlights increasing interest in AI systems that go beyond language generation and move toward embodied intelligence capable of interacting with the real world.
Challenging the Dominance of Language Models
LeCun, who previously served as chief AI scientist at Meta and helped establish the company’s influential FAIR research lab, has long argued that current AI systems built primarily around predicting the next word in a sentence are fundamentally limited.
While large language models have made rapid progress in text generation and coding tasks, LeCun has said they lack deeper reasoning capabilities and an understanding of physical reality.
The new startup aims to develop alternative architectures based on so-called world models: systems designed to build internal representations of how the physical world works.
Such models could allow machines to plan actions, anticipate outcomes, and adapt to changing environments.
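In rough terms, a world model is a transition function that predicts the next state of an environment given the current state and a candidate action; planning then means rolling that function forward to compare imagined outcomes before acting. The toy sketch below illustrates only the general shape of the idea, with hand-coded dynamics and a simple random-shooting planner; every name in it is illustrative, not drawn from AMI's actual systems, where the dynamics would be learned from sensory data rather than written by hand.

```python
import random

def world_model(state, action):
    """Toy dynamics: a point on a line nudged by the chosen action.
    In a real system this prediction would come from a learned model."""
    return state + action

def plan(state, goal, horizon=5, candidates=200):
    """Random-shooting planner: imagine candidate action sequences with
    the world model and keep the one whose predicted final state lands
    closest to the goal. No real actions are taken during the search."""
    best_seq, best_dist = None, float("inf")
    for _ in range(candidates):
        seq = [random.choice([-1, 0, 1]) for _ in range(horizon)]
        s = state
        for a in seq:              # roll the model forward in imagination
            s = world_model(s, a)
        dist = abs(goal - s)       # anticipated outcome vs. desired outcome
        if dist < best_dist:
            best_seq, best_dist = seq, dist
    return best_seq

print(plan(state=0, goal=3))
```

The point of the sketch is the separation of concerns: the model answers "what would happen if," and the planner searches over imagined futures, which is what lets such a system anticipate outcomes rather than merely react.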
In interviews discussing the company’s vision, LeCun has suggested that these capabilities will be essential for robots and autonomous systems that must operate safely in complex environments.
From Software Intelligence to Physical Machines
AMI's initial focus is on AI software platforms, but robotics is expected to be one of its long-term targets.
Machines that interact with the physical world must interpret sensory data, understand spatial relationships, and predict how objects will behave. These tasks require a type of reasoning that goes beyond text-based pattern recognition.
LeCun has argued that robots intended for everyday environments, such as homes or workplaces, will require a significant level of common sense to function reliably.
That includes understanding concepts such as gravity, object permanence, and the likely consequences of physical actions.
Domestic robots, for example, must recognize not only objects but also how those objects can be used or manipulated in different contexts.
Potential Applications Across Multiple Industries
The company’s technology is expected to target a broad range of industries where AI systems must interact with real-world environments.
Potential applications include manufacturing, automotive engineering, aerospace systems, biomedical research, and pharmaceutical development.
In addition to industrial uses, LeCun has indicated that consumer applications could emerge as well. One possibility under discussion is integrating the technology into wearable devices such as smart glasses.
LeCun said he has been in conversations with Meta about potential future applications of the technology in products like the company’s Ray-Ban smart glasses.
Such devices could eventually act as intelligent assistants capable of understanding physical surroundings rather than simply processing spoken commands.
A New Direction in AI Research
The launch of AMI reflects a growing debate within the AI community about the future direction of the field.
Large language models have dominated AI development in recent years, attracting massive investment and widespread adoption across industries. However, some researchers argue that these systems alone cannot produce broadly capable artificial intelligence.
LeCun has been one of the most prominent voices advocating alternative approaches based on learning world models and predictive representations of reality.
If successful, those methods could enable machines to reason more effectively about complex environments and ultimately control robots and other physical systems.
The $1 billion investment suggests that at least some investors are willing to back that alternative vision.
As robotics and embodied AI become central to the next phase of artificial intelligence development, systems capable of understanding and interacting with the physical world may become an increasingly important frontier for the industry.