Introducing X1: the world’s first multirobot system to integrate a humanoid robot with a transforming drone that can launch off the humanoid’s back and, later, drive away.
The new multimodal system is one product of a three-year collaboration between Caltech’s Center for Autonomous Systems and Technologies (CAST) and the Technology Innovation Institute (TII) in Abu Dhabi, United Arab Emirates. The robotic system demonstrates the kind of innovative, forward-thinking project made possible by the collaborators’ combined global expertise in autonomous systems, artificial intelligence, robotics, and propulsion systems.
“Right now, robots can fly, robots can drive, and robots can walk. Those are all great in certain scenarios,” says Aaron Ames, the director and Booth-Kresa Leadership Chair of CAST and the Bren Professor of Mechanical and Civil Engineering, Control and Dynamical Systems, and Aerospace at Caltech. “But how do we take those different locomotion modalities and put them together into a single package, so we can reap the benefits of all of these while mitigating the downfalls that each of them has?”
Testing the capability of the X1 system, the team recently conducted a demonstration on Caltech’s campus. The demo was based on the following premise: Imagine that there is an emergency somewhere on campus, creating the need to quickly get autonomous agents to the scene. For the test, the team modified an off-the-shelf Unitree G1 humanoid such that it could carry M4, Caltech’s multimodal robot that can both fly and drive, as if it were a backpack.
The demo started with the humanoid in Gates–Thomas Laboratory. It walked through Sherman Fairchild Library and went outside to an elevated spot where it could safely deploy M4. The humanoid then bent forward at the waist, allowing M4 to launch in its drone mode. M4 then landed and transformed into driving mode to efficiently continue on wheels toward its destination.
Before reaching that destination, however, M4 encountered the Turtle Pond, so it switched back to drone mode, quickly flew over the obstacle, and made its way to the site of the “emergency” near Caltech Hall. The humanoid and a second M4 eventually met up with the first responder.
“The challenge is how to bring different robots to work together so, basically, they become one system providing different functionalities. With this collaboration, we found the perfect match to solve this,” says Mory Gharib, Ph.D., the Hans W. Liepmann Professor of Aeronautics and Medical Engineering at Caltech and CAST’s founding director.
Gharib’s group, which originally built the M4 robot, focuses on building flying and driving robots as well as advanced control systems. The Ames lab, for its part, brings expertise in locomotion and developing algorithms for the safe use of humanoid robots. Meanwhile, TII brings a wealth of knowledge about autonomy and sensing with robotic systems in urban environments. A Northeastern University team led by engineer Alireza Ramezani assists in the area of morphing robot design.
“The overall collaboration atmosphere was great. We had different researchers with different skill sets looking at really challenging robotics problems spanning from perception and sensor data fusion to locomotion modeling and controls, to hardware design,” says Ramezani, an associate professor at Northeastern.
When TII engineers visited Caltech in July 2025, the partners built a new version of M4 that takes advantage of Saluki, a secure flight controller and computer technology developed by TII for onboard computing. In a future phase of work, the collaboration aims to give the entire system sensors, model-based algorithms, and machine learning-driven autonomy to navigate and adapt to its surroundings in real time.
“We install different kinds of sensors—lidar, cameras, range finders—and we combine all these data to understand where the robot is, and the robot understands where it is in order to go from one point to another,” says Claudio Tortorici, director of TII. “So, we bring the capability of the robots to move around with autonomy.”
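The combining of sensor streams Tortorici describes is a state-estimation problem, commonly illustrated with a Kalman filter. The sketch below is a deliberately simplified one-dimensional version: it fuses drifting odometry with absolute range-finder fixes so the robot can track where it is. The `kalman_fuse` helper and all noise values are illustrative assumptions, not part of the actual X1 or Saluki software.

```python
# Minimal 1-D localization sketch: fuse noisy odometry (prediction)
# with absolute range-finder fixes (correction) via a Kalman filter.
# Noise variances q and r are made-up illustrative values.

def kalman_fuse(odometry, range_fixes, q=0.05, r=0.5):
    """Estimate position along a line.

    odometry    -- measured displacement at each time step (drifts)
    range_fixes -- absolute position measurement at each time step
    q           -- process noise variance (odometry uncertainty)
    r           -- measurement noise variance (range finder)
    """
    x, p = 0.0, 1.0              # position estimate and its variance
    estimates = []
    for dx, z in zip(odometry, range_fixes):
        # Predict: dead-reckon with odometry; uncertainty grows.
        x += dx
        p += q
        # Correct: blend in the absolute fix, weighted by confidence.
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)
        p *= (1.0 - k)
        estimates.append(x)
    return estimates

if __name__ == "__main__":
    odo = [1.0, 1.1, 0.9, 1.0]       # drifting odometry steps
    fixes = [1.0, 2.0, 3.0, 4.0]     # absolute fixes near the truth
    print([round(e, 2) for e in kalman_fuse(odo, fixes)])
```

A real system fuses many sensors in three dimensions, but the prediction/correction loop is the same idea scaled up.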
Ames explains that even more was on display in the demo than meets the eye. For example, he says, the humanoid robot did more than simply walk around campus. Currently, the majority of humanoid robots are trained on data originally captured from human movements to achieve a particular action, such as walking or kicking, with that motion then scaled to the robot. If all goes well, the robot can imitate that action repeatedly.
But, Ames argues, “If we want to really deploy robots in complicated scenarios in the real world, we need to be able to generate these actions without necessarily having human references.”
His group instead builds mathematical models that describe the physics of locomotion more broadly. When these models are fused with machine learning techniques, they imbue robots with more general abilities to navigate any situation they might encounter.
“The robot learns to walk as the physics dictate,” Ames says. “So X1 can walk; it can walk on different terrain types; it can walk up and down stairs, and importantly, it can walk with things like M4 on its back.”
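One textbook example of letting the physics dictate the motion is the linear inverted pendulum (LIP), a simplified model widely used in humanoid walking research: the model itself tells the robot where to place its next foot, with no recorded human motion as a reference. The sketch below is illustrative only, with assumed parameters; it is not the controller running on X1.

```python
import math

# Linear inverted pendulum (LIP) sketch: the center of mass (CoM) is
# pushed away from the stance foot, x'' = omega^2 * (x - foot).
# Height and speed values are illustrative assumptions.

G = 9.81                       # gravity, m/s^2
Z0 = 0.8                       # assumed constant CoM height, m
OMEGA = math.sqrt(G / Z0)      # pendulum natural frequency, 1/s

def lip_flow(x, v, foot, t):
    """Closed-form LIP solution: CoM position and velocity after time t."""
    c, s = math.cosh(OMEGA * t), math.sinh(OMEGA * t)
    dx = x - foot
    return foot + dx * c + (v / OMEGA) * s, dx * OMEGA * s + v * c

def capture_point(x, v):
    """Foothold that brings the pendulum to rest -- derived from the
    physics model, not from a human demonstration."""
    return x + v / OMEGA

if __name__ == "__main__":
    x, v = 0.0, 0.5                    # CoM at origin, moving 0.5 m/s
    foot = capture_point(x, v)         # step onto the capture point
    x, v = lip_flow(x, v, foot, t=2.0)
    print(f"residual speed after 2 s: {v:.4f} m/s")  # decays toward zero
```

Because the foothold comes from the model's own dynamics, the same reasoning transfers to new terrain or extra payloads (like M4 on the robot's back) by updating the model, rather than by collecting new human motion data.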
An overarching goal of the collaboration is to make such autonomous systems safer and more reliable.
“I believe we are at a stage where people are starting to accept these robots,” Tortorici says. “In order to have robots all around us, we need these robots to be reliable.”
That is ongoing work for the team. “We’re thinking about safety-critical control, making sure we can trust our systems, making sure they’re secure,” Ames says. “We have multiple projects that extend beyond this one that study all these different facets of autonomy, and these problems are really big. By having these different projects and facets of our collaboration, we are able to take on these much bigger problems and really move autonomy forward in a substantial and concerted way.”
Citation:
Robot ‘backpack’ drone launches, drives and flies to tackle emergencies (2025, October 14)
retrieved 15 October 2025 from