Beyond flying drones: Collaborative mission-driven autonomy for military applications
Data analytics, machine learning, artificial intelligence (AI), and autonomy. You can’t go anywhere without hearing about these fascinating technologies solving some of the world’s most challenging problems. Autonomy in particular offers great potential for some military applications, but it is also among the most challenging to actually achieve. While machine learning and AI have made rapid progress in computer vision, speech recognition, natural language processing, and robotic control, the community still has a long way to go to achieve true autonomy.
One step on this path is to extract better task-oriented features from high-dimensional, unstructured data across multiple data streams composed of different sensors and data types. Leidos is determined to do just that for airborne collaborative autonomy applications. We’re working to reduce operator workload and improve operational sensing capability across intelligence, surveillance, and reconnaissance (ISR), electronic warfare (EW), and strike applications.
Data streams can comprise many different types of data, including LIDAR, radio frequency (RF), and radar imagery. Each of these data types offers distinct features that, in theory, can be combined in new ways to improve classification, rather than relying on a single sensor modality to produce a complete classification. One of the anticipated promises of deep learning is to provide advanced perception, prediction, and detection capabilities by incorporating and optimizing feature extraction over multiple heterogeneous sensing modalities. However, research in this area is made even more challenging by the current lack of simultaneously collected, labeled data across multiple sensor modalities.
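To make the idea of combining features across modalities concrete, here is a minimal sketch of "late fusion": each modality gets its own feature extractor, and the per-modality features are concatenated into one joint vector for a downstream classifier. The extractors below (`lidar_features`, `rf_features`, `radar_features`) are hypothetical stand-ins invented for illustration, not any real processing chain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature extractors -- simple stand-ins for
# real LIDAR / RF / radar processing chains.
def lidar_features(point_cloud):
    # Summary statistics of an (N x 3) point cloud: per-axis mean and std.
    return np.concatenate([point_cloud.mean(axis=0), point_cloud.std(axis=0)])

def rf_features(iq_samples):
    # Coarse power spectrum of complex I/Q samples; keep a few bins.
    spectrum = np.abs(np.fft.fft(iq_samples))
    return spectrum[:8]

def radar_features(image):
    # Downsampled, flattened radar image patch.
    return image[::4, ::4].ravel()

def fused_feature_vector(point_cloud, iq_samples, image):
    # Late fusion: concatenate per-modality features into one vector,
    # which a single classifier can then consume.
    return np.concatenate([
        lidar_features(point_cloud),
        rf_features(iq_samples),
        radar_features(image),
    ])

# Toy inputs standing in for real sensor data.
pc = rng.normal(size=(100, 3))
iq = rng.normal(size=64) + 1j * rng.normal(size=64)
img = rng.normal(size=(16, 16))

x = fused_feature_vector(pc, iq, img)
print(x.shape)  # (30,): 6 LIDAR + 8 RF + 16 radar features
```

In practice, deep multimodal networks learn these extractors jointly rather than hand-crafting them, which is exactly why simultaneously collected, labeled multi-sensor data matters.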
Improving classification is a huge achievement in and of itself, but Leidos is taking it a step further. Autonomy requires reasoning and decision-making AI that uses these multiple data types to determine how to accomplish tasks, and even which tasks should be accomplished, within the context of the mission. True autonomous capability requires a focus on AI that can reason about data and information that may be unexpected or never seen before. This is especially important in military applications, where the scarcity of data, particularly labeled data, makes it difficult to train some of these advanced machine learning techniques.
Unlike the commercial world, where companies like Google and Facebook can leverage vast amounts of available data (pictures, video, text, etc.) on which to train, pictures of military objects are not as abundant, and in some cases are not available at all. Leidos is researching the use of synthetic data to augment the limited available data through programs with AFRL and DARPA, so that the benefits of the latest machine learning techniques can still be realized. To make the problem even more challenging, an autonomous system has to go beyond recognition and be able to make decisions, potentially changing tasks or actions in real time while also taking into account the capabilities of other available systems. This requires collaborative autonomy and hybrid AI approaches (combining ML and non-ML techniques) that can even account for human teammates and their behaviors.
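The augmentation idea can be sketched very simply: start from a small pool of real labeled samples and grow the training set with synthetically generated variants. The `synthesize` function below is a deliberately naive stand-in (jittering real feature vectors with noise) for a real synthetic-data pipeline, which might instead render imagery from 3D models or use generative models; all names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# A small set of real labeled samples (features, label) -- stand-ins for
# the scarce real imagery of military objects.
real_x = rng.normal(loc=2.0, size=(10, 5))
real_y = np.ones(10, dtype=int)

def synthesize(samples, n_new, noise_scale=0.1, rng=rng):
    """Generate synthetic samples by jittering randomly chosen real ones.

    A simple placeholder for a real synthetic-data pipeline (e.g.,
    rendering from 3D models or sampling a generative model).
    """
    idx = rng.integers(0, len(samples), size=n_new)
    jitter = rng.normal(scale=noise_scale, size=(n_new, samples.shape[1]))
    return samples[idx] + jitter

# Augment 10 real samples with 50 synthetic ones before training.
synth_x = synthesize(real_x, n_new=50)
aug_x = np.vstack([real_x, synth_x])
aug_y = np.concatenate([real_y, np.ones(50, dtype=int)])
print(aug_x.shape)  # (60, 5)
```

The key design question in real programs is fidelity: synthetic samples must be realistic enough that a model trained on them transfers to real sensor data.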
We need to move autonomy beyond flying drones to dynamic, mission-driven autonomy to truly realize the claimed benefits to the warfighter. This requires a deep understanding of the data sensors provide and its relationship to mission objectives. Leidos is working to bring together some of the most advanced sensors and sensor processing in the world and apply our collaborative autonomy architecture and reasoning AI to ultimately make decisions that affect the outcome of the mission. These decisions can then inform where a drone (or other vehicle) needs to go to accomplish a specific mission task, or even recommend a course of action to a human. This innovative approach requires architecting systems with decentralized decision-making and sharing information between functional subsystems. That’s the key to realizing mission autonomy. We are also aggressively attacking the challenge of applying this same approach across multiple domains (air, space, sea, etc.) to enable even broader next-generation military capabilities.
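To illustrate what decentralized decision-making with shared information can look like in the simplest possible terms, here is a toy distance-based auction for assigning mission tasks to vehicles: each vehicle computes its own bids locally, the bids are shared, and every vehicle applies the same winner-selection rule, so all reach the same assignment without a central planner. This is a generic textbook-style sketch, not a description of Leidos's actual collaborative autonomy architecture.

```python
import numpy as np

rng = np.random.default_rng(2)

# Positions of 3 vehicles and 4 mission tasks in a 2-D operating area.
vehicles = rng.uniform(0, 100, size=(3, 2))
tasks = rng.uniform(0, 100, size=(4, 2))

def local_bids(vehicle_pos, tasks):
    """Each vehicle independently computes its cost (here, distance) per task."""
    return np.linalg.norm(tasks - vehicle_pos, axis=1)

# Step 1: bids are computed locally on each vehicle -- no central planner.
bids = np.array([local_bids(v, tasks) for v in vehicles])  # shape (3, 4)

# Step 2: vehicles broadcast their bids to each other; each one then
# independently applies the same deterministic rule (lowest bid wins a
# task), so all vehicles converge on an identical assignment.
assignment = bids.argmin(axis=0)  # task index -> winning vehicle index
print(assignment)
```

Real systems layer much more on top of this (conflict resolution, lossy communications, dynamic retasking), but the core pattern of local computation plus shared state and a common decision rule is the same.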
It’s an exciting time at Leidos, where we’re bringing together incredible capabilities and people across sensing, open architectures, platform integration, and collaborative autonomy to tackle big challenges for the Air Force. Leidos has a rich history of developing cutting-edge sensor and autonomy capabilities for organizations such as AFRL, DARPA, and ONR. We are now working to bring these capabilities together to create vehicle-agnostic system solutions and realize new capabilities through our collaborative autonomy.
While the Midwest may not be the first area of the country that comes to mind when you think about autonomy, Leidos aims to change that perspective in the area of airborne autonomous sensing in Dayton, Ohio. Learn more about our work in Dayton and browse current opportunities.