GW Engineering Professor Works to Secure Autonomous Aircraft from Bad Actors
You may not have seen them yet, but the flying robots have entered American airspace. In parts of Arkansas, California, Texas, Virginia and elsewhere, autonomous drones have been cleared to deliver packages, prescriptions and meals to consumers’ doors. Even more ambitious projects are in the works worldwide, like autonomous electric vertical take-off and landing (eVTOL) aircraft, which rise and descend vertically like helicopters and could carry commuters or tourists. The companies building such technologies say these innovations will speed up package delivery and passenger travel times, alleviate ground traffic and save on human labor.
But like any new technology, these large and small autonomous aircraft have vulnerabilities—some known, some not yet discovered—that could leave them open either to malfunction or to exploitation by bad actors.
Peng Wei, an associate professor of mechanical and aerospace engineering at the George Washington University School of Engineering and Applied Science, is the lead principal investigator of a three-year project that could help close these security gaps by investigating how to safeguard autonomous aircraft flying in high-density urban airspace from cyberattacks that could disrupt safe operations. Three multi-institutional teams will receive a total of $18 million in University Leadership Initiative (ULI) funding from NASA, with Wei’s team receiving $6 million over the project’s three years.
The goal, Wei said, is not only to research secure aircraft autonomy but also to train the next generation of aerospace engineers at the undergraduate and graduate levels. Students who work on the project will, by the time they enter the U.S. aviation workforce, have the depth and breadth of knowledge to address emerging real-world challenges in aircraft autonomy and cybersecurity.
“Working with NASA is very exciting because we as an academic institution have the opportunity to train student talent, especially at the graduate level, for the broader U.S. aerospace industry,” Wei said. “I cannot wait to see the students from across our collaborating institutions become great researchers and contributors to this industry.”
The project also offers an opportunity for mechanical and aerospace engineering students to become deeply versed in cybersecurity and artificial intelligence (AI). Certain functions in aircraft autonomy rely on machine learning and neural networks, which allow these craft to achieve complex goals such as autonomous landing and autonomous separation assurance (maintaining safe distances from other aircraft and their flight paths).
But as these neural-network-embedded functions grow more sophisticated, they are becoming harder to verify, safeguard and certify, Wei said. He and his team will focus on enhancing the resilience of the selected AI functions under malicious attacks.
Wei’s team includes researchers from GW as well as collaborators from Vanderbilt University, Purdue University, Tennessee State University, University of California-Irvine, University of Texas at Austin, Collins Aerospace and Northern Virginia Community College. The team is also supported by a large group of government, industry and academic partners including the Federal Aviation Administration (FAA), MITRE, Honeywell Aerospace, Boeing, Aurora Flight Sciences, Skygrid, Joby Aviation, Xwing, Reliable Robotics, JHU Applied Physics Lab, MIT Lincoln Lab, Northrop Grumman Aeronautics, George Mason University, Embry-Riddle Aeronautical University, RTCA, and ASTM. Together, they’ll leverage their vast range of expertise to build their own sub-scale aircraft prototypes, figure out how to hack them and patch the vulnerabilities that made those hacks possible.
“Urban air traffic systems are really uncharted territory for systems security researchers and practitioners,” said team member Vijay Gupta, professor of electrical and computer engineering at Purdue University. “Challenges, constraints and solutions from ‘similar’ fields such as urban road transportation networks or commercial airline networks do not port to these very dynamic and heterogeneous systems. This project will take a holistic view of the problem and lay the foundations for trustworthy and secure implementations of urban air traffic systems.”
“This project is an exciting opportunity to develop holistic security solutions for next-generation advanced air mobility applications, spanning from low-level systems development to high-level attack detection and mitigation,” said team member Bryan Ward, assistant professor of computer science and of electrical and computer engineering at Vanderbilt University. “This work will inform how next-generation advanced air mobility applications are designed and implemented to ensure system-wide security.”
Wei is particularly looking forward to the flight demonstration phase of the project, he said. He’s excited to see how exactly a supposedly secure aircraft or small delivery drone can be attacked by an outside actor. In a safety-critical field such as aviation and aerial robotics, flight test evidence about what doesn’t work is just as important as evidence about what does.
“What I'm working on in AI safety, aircraft autonomy and security goes naturally into GW’s bigger AI efforts with my other colleagues,” Wei said. “GW is establishing our leadership and becoming a powerhouse in trustworthy AI.”
This article was published by GW Today.