Collaborative mission-driven autonomy
Data analytics, machine learning (ML), artificial intelligence (AI), and autonomy. You can't go anywhere without hearing about these fascinating technologies solving some of the world's most challenging problems. Autonomy, in particular, offers great potential for many military applications but is among the most challenging to actually develop and field. Leidos designs flexible, open architectures that enable rapid incorporation of emerging best-of-breed autonomy solutions to suit any applicable mission need. By combining our expertise in designing and building AI and ML systems with our knowledge of the underlying physics of many domains and missions, we are able to take a disciplined, science-based approach to developing effective collaborative autonomy solutions.
One step on this path is to investigate better task-oriented features extracted from high-dimensional, unstructured data across multiple streams composed of different sensors and data types, such as LIDAR, radio frequency (RF), and radar imagery. Each of these data types offers opportunities to exploit features and behaviors in new ways that, in theory, can improve mission decisions beyond what any single sensor modality allows. By applying our collaborative autonomy framework and techniques, we can provide advanced perception, prediction, and detection capabilities by incorporating, and optimizing over, multiple heterogeneous sensing modalities. However, research in this area is extremely challenging because simultaneous collections of data across multiple sensor modalities are currently scarce.
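To make the multi-modal idea concrete, here is a minimal, purely illustrative sketch (not a Leidos implementation) of late fusion: each modality is run through its own feature extractor, and the normalized, weighted feature vectors are concatenated into a single representation for downstream decision-making. The modality names, dimensions, weights, and the stand-in random-projection extractor are all assumptions for the example.

```python
import numpy as np

def extract_features(raw, dim):
    """Stand-in for a modality-specific feature extractor (in practice,
    e.g., a learned CNN embedding); here just a fixed random projection."""
    rng = np.random.default_rng(0)
    w = rng.standard_normal((raw.shape[-1], dim))
    return raw @ w

def fuse(modalities, weights):
    """Weighted late fusion: L2-normalize each modality's feature
    vector, scale by a per-modality weight, then concatenate."""
    parts = []
    for name, feats in modalities.items():
        norm = feats / (np.linalg.norm(feats) + 1e-9)
        parts.append(weights[name] * norm)
    return np.concatenate(parts)

# Simulated raw returns from three notional sensor types
lidar = np.ones(64)
rf = np.ones(32)
radar = np.ones(48)

fused = fuse(
    {"lidar": extract_features(lidar, 16),
     "rf": extract_features(rf, 16),
     "radar": extract_features(radar, 16)},
    weights={"lidar": 0.5, "rf": 0.2, "radar": 0.3},
)
print(fused.shape)  # (48,)
```

In a real system the weights themselves could be learned, or fusion could happen earlier (at the raw-data level) or later (at the decision level); the trade-off between these fusion stages is one of the questions the scarce multi-modal collections make hard to study.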
Improving multi-modality sensing approaches is a huge achievement in and of itself, but Leidos is taking it a step further. Autonomy requires reasoning and decision-making AI to determine how to accomplish tasks, and even what tasks should be accomplished within the context of the mission. True autonomous capability requires focus on the AI that reasons about data and information that may be unexpected or never seen before, especially in military applications.
Unlike the commercial world, where companies like Google and Facebook can train on vast amounts of available data (pictures, video, text, etc.), the military domain offers far fewer pictures of objects of interest, and in some cases none at all. Through programs with AFRL and DARPA, Leidos is researching the use of synthetic data to augment the limited available data and realize the benefits of the latest ML techniques. To make the problem even more challenging, an autonomous system has to go beyond recognition and be able to make a decision, potentially changing tasks or actions in real time while taking other available systems' capabilities into account. This requires collaborative autonomy and hybrid AI approaches (ML and non-ML techniques) that can also account for human teammates and their behaviors.
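The synthetic-augmentation idea can be sketched in a few lines. This is a hedged illustration, not any program's actual pipeline: a small set of "real" labeled images is combined with labeled samples from a stand-in synthetic renderer, and each sample's provenance is tracked so training can down-weight synthetic data or apply domain-adaptation techniques to bridge the sim-to-real gap. The shapes, class count, mixing ratio, and 0.5 synthetic weight are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Scarce real collections: 20 labeled 32x32 images, 3 classes
real_images = rng.random((20, 32, 32))
real_labels = rng.integers(0, 3, size=20)

def render_synthetic(n):
    """Stand-in for a simulator/renderer (e.g., a game-engine or
    CAD-model pipeline) that produces labeled synthetic imagery."""
    return rng.random((n, 32, 32)), rng.integers(0, 3, size=n)

# Generate enough synthetic data to reach a target training-set size
synth_images, synth_labels = render_synthetic(180)

images = np.concatenate([real_images, synth_images])
labels = np.concatenate([real_labels, synth_labels])

# Track provenance so the training loop can down-weight synthetic
# samples rather than trusting them as much as real collections
is_synthetic = np.concatenate([np.zeros(20, bool), np.ones(180, bool)])
sample_weight = np.where(is_synthetic, 0.5, 1.0)

print(images.shape, sample_weight.sum())  # (200, 32, 32) 110.0
```

The interesting research questions live in `render_synthetic` and in how the real/synthetic gap is closed, not in the bookkeeping shown here.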
To truly realize the promised benefits to the warfighter, we need to move autonomy beyond flying drones to dynamic, mission-driven autonomy. This requires a deep understanding of the data sensors provide and its relationship to mission objectives. Leidos is working to bring together some of the most advanced sensors and sensor processing in the world and to apply our collaborative autonomy architecture and reasoning AI to make decisions that affect the outcome of the mission. These decisions can then inform where a drone (or other vehicle) needs to go to accomplish a specific mission task, or even recommend a course of action to a human. This innovative approach requires architecting systems with decentralized decision-making and information sharing among the functional subsystems; that is the key to realizing mission autonomy. We are also aggressively attacking the challenge of applying this same approach across multiple domains (air, space, sea, etc.) to enable even broader next-generation military capabilities.
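One common way to realize decentralized decision-making of this kind is market-based task allocation, where each vehicle bids on mission tasks using its own locally computed cost and no central planner is required. The sketch below shows a single-round greedy auction under stated assumptions: the agents, task locations, and the distance-as-cost bid function are invented for illustration and are not a description of Leidos's architecture.

```python
from math import hypot

# Notional vehicles and mission tasks at 2-D positions
agents = {"uav1": (0, 0), "uav2": (10, 0), "usv1": (5, 8)}
tasks = {"survey_A": (1, 1), "relay_B": (9, 2), "track_C": (5, 9)}

def bid(agent_pos, task_pos):
    """Each agent's locally computed cost: distance to the task.
    A real bid would fold in fuel, sensor fit, threat exposure, etc."""
    return hypot(agent_pos[0] - task_pos[0], agent_pos[1] - task_pos[1])

assignment = {}
free_agents = dict(agents)
for task, tpos in tasks.items():
    # Every free agent broadcasts a bid; the lowest bid wins the task.
    winner = min(free_agents, key=lambda a: bid(free_agents[a], tpos))
    assignment[task] = winner
    free_agents.pop(winner)

print(assignment)
# {'survey_A': 'uav1', 'relay_B': 'uav2', 'track_C': 'usv1'}
```

Because each agent only needs its own position and the broadcast bids, no single node holds the whole plan, and the auction can rerun whenever tasks or vehicle availability change in real time.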
It's an exciting time at Leidos, where we're bringing together incredible capabilities and people across multiple areas of sensing, open architectures, platform integration, agile software development, and collaborative autonomy to tackle big challenges for the Air Force. Leidos has a rich history of developing cutting-edge sensor and autonomy capabilities for organizations such as AFRL, DARPA, and ONR, and is now working to bring these capabilities together to create vehicle-agnostic system solutions that accomplish some of the most challenging missions in the most challenging environments.