Information advantage: It's not what you do...
…It’s the way that you do it. How we procure and maintain the systems that secure information advantage from the growing volume of data in both military and civilian environments will dictate whether we can stay ahead of the game in situations that demand speed and agility. The solution? Integration and iteration.
- The threats we face in policing, national security, and defence change rapidly; adversaries quickly dismantle operations when confronted, and shift between domains.
- We are getting better at maintaining an information advantage that allows us to monitor and adapt to these evolving threats using vast and growing data sources.
- To maintain this edge, systems must move from ‘just-so’ scoping and domain-exclusive procurement to more agile, feature-based approaches that will leverage the expertise of both domain specialists and integration experts.
What can the Maginot Line teach us about developing, maintaining, and exploiting information advantage? In the 1930s, France spent about 1% of its GDP (the equivalent of the UK spending £28bn today) building thousands of bunkers and blockhouses to blunt a future German invasion.
As a point solution, it was an effective design to tackle one conceivable threat. But it failed to integrate with other elements of France’s military or diplomatic efforts. To this day, historians argue endlessly over whether the Line itself was a smart move; regardless, France surrendered six weeks after the German invasion of Belgium. (As an example of the old saying that “armies waste their time preparing to fight the last war”, it’s pretty effective.)
A modern equivalent is investment in bespoke, siloed information collection and analysis systems by military, national security, and civilian agencies in data-rich environments. We need an advantage over increasingly nebulous foes, within a diverse range of environments. But all too often, we spend time perfecting a brief for systems that risk ‘doing a Maginot.’ The ‘red lines’ we draw when setting mission or capability parameters end up hardening into gaps that fatally undermine actionable information advantage.
Looking for network effects
The need to build more flexible, open information systems has been growing since the end of the Cold War. In 1980, the focus was almost exclusively the Communist bloc. Intelligence and other resources focused on a visible, usually physical, threat. By 2000, that singular focus had vanished. More and more diffuse threats were emerging. In 2001, 9/11 forced us to realise that any failure to attain an information advantage around potential, rather than visible, threats would be a catastrophic failure of policy.
The problem of connecting different sources of information to gain insights into multi-domain threats affects every kind of agency, both military and civilian. Delivering effective, affordable solutions to secure information advantage in situations that evolve and reshape quickly is a demanding challenge, with political and practical dimensions.
The solution is to adapt information gathering, management, and analysis systems to become more agile and less rigid. (It’s no surprise that the UK government’s big strategic defence paper is called the Integrated Review of Security, Defence, Development, and Foreign Policy, which states: “the UK Armed Forces will become a threat-focused integrated force with a continued shift in thinking across land, sea, air, space, and cyber domains.”)
Rather than designing systems that are hard-wired for one task or domain, we need highly networked, increasingly open (in the software sense) approaches. Cross-pollination between domains has huge value, not just in identifying and tackling emerging threats, but also in generating robust insights into everything from trade to health, border security to professional accreditations.
Integrate to automate
The network effects of more integrated information systems are not easy to deliver – even in the age of robust, secure, and ubiquitous cloud computing. We need confidence in the value of an ‘integration layer’ so that procurement professionals can start specifying systems that are capable of feeding into a ‘single view’ from the outset.
Automation and artificial intelligence (AI) technologies are already helping draw together systems to make this single view more achievable. “Stovepipes [siloed systems restricted to one task] don’t scale, so we will work… to integrate and focus common architectures, AI standards, data-sharing strategies, educational norms, and best practice for AI implementation,” said Lt. Gen. Michael S. Groen, director of the Joint Artificial Intelligence Center at the Pentagon, in November.
A universal data schema and interoperability between systems are a promising start. But we need two other key elements in place to deliver a more future-proof solution.
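To make the idea of a shared schema concrete, here is a minimal sketch in Python. All field names, feed names, and record formats are hypothetical, invented for illustration: the point is simply that once two siloed feeds are mapped into one common schema, they can be merged into a single, ordered view.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical shared schema: field names and sources are illustrative only.
@dataclass(frozen=True)
class SightingEvent:
    source: str          # originating system identifier
    timestamp: datetime  # always UTC, so feeds can be merged and ordered
    lat: float
    lon: float
    label: str           # free-text classification from the source system

def from_vehicle_feed(row: dict) -> SightingEvent:
    """Map one (hypothetical) vehicle-monitoring record into the shared schema."""
    return SightingEvent(
        source="vehicle-monitor",
        timestamp=datetime.fromtimestamp(row["epoch"], tz=timezone.utc),
        lat=row["latitude"],
        lon=row["longitude"],
        label=row["vehicle_type"],
    )

def from_net_feed(row: dict) -> SightingEvent:
    """Map one (hypothetical) network-analysis record into the same schema."""
    return SightingEvent(
        source="net-analysis",
        timestamp=datetime.fromisoformat(row["seen_at"]),
        lat=row["geo"]["lat"],
        lon=row["geo"]["lon"],
        label=row["category"],
    )

# Two siloed feeds become one time-ordered stream for the 'single view'.
events = sorted(
    [from_vehicle_feed({"epoch": 1700000000, "latitude": 51.5, "longitude": -0.1,
                        "vehicle_type": "truck"}),
     from_net_feed({"seen_at": "2023-11-14T22:10:00+00:00",
                    "geo": {"lat": 48.9, "lon": 2.3}, "category": "anomaly"})],
    key=lambda e: e.timestamp,
)
```

In a real deployment the schema would live in a versioned registry and the mapping functions at the edge of each contributing system, but the principle is the same: the integration layer only ever sees `SightingEvent`s, never the quirks of individual feeds.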
The first is engineered uncertainty. Traditional procurement begins with a detailed breakdown of mission requirements in a given environment. When there’s no need for network effects, this makes sense. The design for a new main battle tank, for example, needs to be specified in some detail, then procured precisely according to those red lines.
But for an automated system monitoring vehicle movements, or analysing internet traffic, those hard red lines can be counter-productive. There might be a broad overall mission, but the aim should be a ‘minimum viable product’ (MVP) of immediate deliverables, rather than spending months or even years specifying every last feature or component. As new use cases crop up, that MVP can be adapted to maintain information advantage.
This will help make any system more adaptable to changing circumstances, but will also speed up development. Technology advances so quickly that a ground-up, single-supplier security system will often be nearly obsolete by the time it has been delivered.
Having an expert integrator interpreting the evolving needs of different agencies, then delivering workable solutions for those needs (rather than a feature set that procurement thinks might work in the future) is a much more flexible approach.
An iterative approach
The second element is iterative upgrades. In the private sector, the desire for proprietary systems has all but vanished in favour of software and platforms that are open, with APIs to support add-ons and new applications. When a new capability emerges, or a new mission becomes evident, systems capable of adapting or being augmented are ideal.
The need here is for compatibility and auditability. There must be a resolute belief that adding new capabilities won’t break existing functionality, that data in existing schema will continue to be useful, and that standards will be maintained – whether for robustness, security, accuracy, or ethics.
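A minimal sketch of this add-on model, in Python, assuming a simple in-process plugin registry (a real platform would expose the same extension point over a network API). Every name here is hypothetical; the point is that new capabilities plug in alongside existing ones without modifying the platform core.

```python
from typing import Callable, Dict

# Hypothetical extension point: analysers take a record and return an enrichment.
AnalyserFn = Callable[[dict], dict]
_analysers: Dict[str, AnalyserFn] = {}

def register_analyser(name: str, fn: AnalyserFn) -> None:
    """Add a new capability without modifying the platform core."""
    _analysers[name] = fn

def run_all(record: dict) -> dict:
    """Existing behaviour is unchanged as analysers are added: each one
    contributes its own enrichment, keyed by its name."""
    return {name: fn(record) for name, fn in _analysers.items()}

# A new mission emerges: plug in a new analyser instead of re-procuring.
register_analyser("word_count", lambda rec: {"words": len(rec["text"].split())})
register_analyser("flagged", lambda rec: {"hit": "urgent" in rec["text"].lower()})

result = run_all({"text": "Urgent convoy movement reported"})
```

Because each analyser’s output is namespaced under its registration key, adding a new one cannot clobber an existing enrichment – a small illustration of the compatibility guarantee the text describes.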
We move from an industrial age of systems to an information age of platforms – enabled at every level by a digital backbone into which all sensors, effectors, and deciders will be plugged. We’ve already been working on these more diffuse approaches – such as those for the US DoD’s C4ISR (Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance) objectives, where we’re incorporating distributed sensing and surveillance as part of a more flexible approach to delivering information advantage.
We know this will require us to embrace combinations of information-centric technologies. Predicting these combinations will undoubtedly be challenging – not least for the people charged with procuring them. But an open approach solves many of the problems.
Agility – commercially and operationally
The simplest way to see this in practical terms is that agile foes and unpredictable situations demand not just agility from the systems designed to reveal information about them – we need agility in how those systems are designed and procured, too.
Commercial agility from suppliers is the other side of the coin. Ideal partners understand the need to layer new information sources into existing platforms. Interoperability between different suppliers’ systems is a key attribute of true commercial agility.
The Leidos-developed Command and Control Incident Management Emergency Response Application (C2IMERA) is a great example. Installed at various US Air Force bases, it allows teams to dynamically add new information sources to their base installation. In one case, real-time Web Map Service (WMS) layers from external servers run by the National Oceanic and Atmospheric Administration’s (NOAA) geospatial services were layered onto the Common Operating Picture (COP) dashboard, helping local commanders see exactly how shifting hurricane tracks might affect operations. That’s agile operationally. But it was powered by commercial agility: our software development teams went from concept to in-field capability in less than a month.
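WMS is an open OGC standard, which is what makes this kind of layering possible: any dashboard that speaks WMS can pull a map tile from any WMS server with a single HTTP request. The sketch below builds a standard WMS 1.3.0 GetMap request in Python; the endpoint and layer name are placeholders, not real NOAA or C2IMERA values.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url: str, layer: str, bbox: tuple, size=(1024, 768)) -> str:
    """Build an OGC WMS 1.3.0 GetMap request URL for one overlay layer."""
    params = {
        "service": "WMS",
        "version": "1.3.0",
        "request": "GetMap",
        "layers": layer,
        "crs": "EPSG:4326",
        # Note: WMS 1.3.0 with EPSG:4326 uses lat/lon axis order in the bbox.
        "bbox": ",".join(str(v) for v in bbox),
        "width": size[0],
        "height": size[1],
        "format": "image/png",
        "transparent": "TRUE",  # so the layer composites over the base map
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical hurricane-track overlay pulled into a COP-style dashboard.
url = wms_getmap_url(
    "https://example.noaa.gov/wms",   # placeholder endpoint
    "hurricane_tracks",               # placeholder layer name
    (20.0, -100.0, 35.0, -75.0),      # Gulf of Mexico region, lat/lon degrees
)
```

The dashboard simply fetches the returned URL as a transparent PNG and draws it over the base map – no bespoke integration work per data source, which is why a new feed can be added in days rather than procurement cycles.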
The model exists
Are wider security and defence establishments ready to embrace not just cloud, but all the other ‘open’ tech that will help to enable the creation of these kinds of ‘apps’ feeding into an integration layer? There’s certainly precedent.
For example, the MoD’s Defence Information Infrastructure programme included provision of a new IT system capable of exploiting the power of cloud computing, along with Microsoft Office 365, while securely delivering agility and mobility to service personnel.
The technology is robust. AWS and Azure, for example, routinely host critical national infrastructure in the cloud. It means MoD or Home Office expertise can be focused on the generation of information advantage, not specifying and building information infrastructure.
With the right integration partner, they can focus on building out incremental gains rather than spending time issuing 330-line point specifications for new systems. By focusing on the ‘business’ and looking for ways to deliver new outcomes or capabilities, we can help them to quickly develop solutions and maintain an information advantage.
Opening the door to next-gen tech
This model is already contributing to the AI projects we’re delivering for the US DoD, too. “We deployed 20 different machine learning microservices on one of our programs, including computer vision, natural language processing, and video analysis, into a complex data processing pipeline that processes petabytes of heterogeneous data,” explains Ron Keesing, VP and director, artificial intelligence & machine learning at Leidos. “We did it all within 18 months, using a highly scalable and fast model production and deployment capability. The DoD has since reused many of these microservices for other programs.”
In addition to allowing new sensor or data sources to be added seamlessly, more autonomous and modular systems will also make it easier for other vendors to contribute cutting-edge technology. Such systems could link different capabilities into a seamless, adaptive sensor network, ensuring the technology – and with it the information advantage – can be upgraded more quickly and at less cost.
A secure, agile, data-driven approach to procurement is working in the MoD, too, thanks to the Logistics Commodity & Services Transformation (LCST) programme we’ve been working on. It’s proof that logistics benefits massively from its own form of information advantage.
In other words, we can create an integration advantage that makes information advantage easier and quicker to secure. But it’s also a great example of true digital transformation, not just digitising existing processes. By using data more creatively, adapting policies, and designing around pragmatic outputs, an organisation becomes more resilient and can meet fresh challenges both more robustly and more quickly.
Just building a digital version of old approaches to information advantage simply creates Maginot Lines, where siloed capabilities are updated, but will still become obsolete – fast. Scoping and procuring systems to gather, analyse, and share data in ways that will evolve quickly and reliably is our route out of the 1930s – and into the 2030s.