If we as a society are going to trust technology, we need to trust that it won’t mishandle a situation.

“We hardly trust human teenagers behind the wheel, and they take classes and practice until they are ready to pass a driving test,” said computer engineering graduate student Joey Stamenkovich. “How can we instill trust for things that run by themselves?”

With Associate Professor Cameron Patterson and fellow graduate student Lakshman Maalolan, Stamenkovich is working to boost reliability by developing drones that police themselves. The rules are enforced by hardware already present on many high-end drones, which can override the existing controller if the drone violates them.

Current FAA regulations cap drone altitude at 400 feet, forbid unauthorized drone activity within five miles of an airport, and limit drone operation to the operator’s line of sight, “which makes things like patrolling a pipeline or home delivery difficult without a chase vehicle,” said Patterson.

In commercial aviation, redundancy is built into every function—there is contingency management and backups for everything, including the pilot, explained Patterson. “We are going to need something equivalent for drones if they’re flying outside of the operator’s line of sight.” 

The project, sponsored by the National Science Foundation’s Center for Unmanned Aircraft Systems and NAVAIR, aims to develop drone-constraining technology that would convince the FAA to relax these restrictions, said Patterson.

Student flier

When learning to fly an airplane, pilots spend many hours flying with an instructor, who monitors the situation and can take control of the craft at any moment. Similarly, the team’s hardware acts as a virtual “instructor” that can force a compromised drone to stop and hover, take evasive action, land, or return home.

Patterson and his team are using field-programmable gate arrays (FPGAs), which can monitor all sensory input—drone position, direction, and velocity—and all application software outputs, which include control commands to climb, descend, or move in a specific direction.

Encoded with safety and perimeter rules, the hardware-implemented monitors are supplied with GPS coordinates every second. 

“If, for example, the drone is approaching a boundary and not slowing sufficiently, the monitors will trigger, take control of the drone, and do something appropriate, like force it to hover or land,” said Patterson.

The safety monitors can also be customized to enforce pitch-and-bank envelope protection, speed limits, or other operator-defined rules.
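As a rough illustration of the kind of rule these monitors enforce, here is a minimal Python sketch with hypothetical names and thresholds; in the actual system the checks are synthesized into FPGA hardware that runs alongside, and isolated from, the flight software:

```python
from dataclasses import dataclass
from enum import Enum


class Override(Enum):
    NONE = 0          # application software keeps control
    HOVER = 1         # force the drone to stop and hover in place
    LAND = 2          # force an immediate landing
    RETURN_HOME = 3   # send the drone back to its launch point


@dataclass
class State:
    distance_to_boundary_m: float   # from the latest GPS fix (updated each second)
    closing_speed_mps: float        # ground speed toward the perimeter
    altitude_ft: float


# Hypothetical thresholds, for illustration only
MAX_ALTITUDE_FT = 400.0     # current FAA ceiling
MAX_SPEED_MPS = 20.0        # example of an operator-defined speed limit
STOPPING_MARGIN_S = 5.0     # seconds of travel the drone needs to stop safely


def monitor(state: State) -> Override:
    """Decide whether the monitor should override the flight controller."""
    # Perimeter rule: approaching the boundary without slowing sufficiently
    if (state.closing_speed_mps > 0 and
            state.distance_to_boundary_m < state.closing_speed_mps * STOPPING_MARGIN_S):
        return Override.HOVER
    # Altitude rule: do not exceed the regulatory ceiling
    if state.altitude_ft > MAX_ALTITUDE_FT:
        return Override.LAND
    # Speed rule: an operator-defined limit
    if state.closing_speed_mps > MAX_SPEED_MPS:
        return Override.HOVER
    return Override.NONE
```

Because each rule reduces to a simple comparison over the latest position, heading, and velocity data, it maps naturally onto a small, always-on hardware circuit.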

Implementation 

“The way we implement these monitors is really the key,” said Patterson. “They are isolated from the software, so even if the software crashes or gets hacked, the monitors will still be effective.”

To prevent misunderstandings that can arise from informal specifications, desired system behavior is captured with linear temporal logic formulas. Each formula is synthesized into a hardware monitor that operates concurrently with all the other synthesized monitors. As a further measure, the monitor implementations are formally analyzed to make sure they are consistent with the original specifications.
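For flavor, a perimeter rule of the kind described above might be written as a linear temporal logic formula along these lines (a hypothetical example, not one of the project’s actual specifications):

```latex
% "Always: if the drone is near the boundary and not decelerating,
%  then at the next step the monitor has taken control."
\[
  \mathbf{G}\bigl(\mathit{near\_boundary} \land \lnot\mathit{decelerating}
    \rightarrow \mathbf{X}\,\mathit{override}\bigr)
\]
```

Here G means “always” and X means “at the next step”; formulas of this shape can be compiled into small state machines, which is what makes synthesizing them directly into hardware monitors practical.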

“More than an end product”

“A holy grail of computing is establishing system correctness with a mathematical degree of certainty,” said Patterson. “We generally can’t do that because we implement things that are too complicated—with too many software layers, it becomes infeasible.”

But by implementing isolated monitors that can be synthesized from mathematical expressions and then analyzed, Patterson’s team achieves high confidence that the monitors are correct. In other words: These monitors will be trustworthy.

“It’s more than an end product,” said Maalolan, a computer engineering graduate student. “The process is something that will help us make sure that whatever we develop is correct—it’s a last line of defense during runtime and needs to be highly reliable.”

This ability to enforce and verify behavior is what Patterson considers the main contribution of this work: “We found a way to apply formal methods to existing, complex embedded systems that were not designed with formal verification in mind.” 

Drones in the classroom

The self-policing drone technology incorporates all aspects of computer engineering: algorithms, computer architecture, software, hardware, and mathematically rigorous verification. In Computational Engineering (ECE 2514), which will be offered for the first time in Fall 2019, students will program a virtual drone to familiarize themselves with the basic concepts of computer engineering. Students will use simulation tools to develop simple algorithms and data structures to control virtual drones in georeferenced environments. Student teams will also have the opportunity to conduct actual drone flights to collect data and see how their models work in reality.
