Agentic AI Aims to Cut Down Emergency Response Time When Disasters Strike
Leidos and NVIDIA are developing autonomous agents, collectively called C2AI, to help speed life-saving tasks and decisions in emergency management command and control
Three Points to Remember
- Disaster response presents high-stakes, manually intensive and mentally demanding scenarios for responders and management teams.
- C2AI enhances situational awareness, automates processes toward faster decision-making, and optimizes response plans and courses of action.
- Leidos and NVIDIA continue to speed C2AI’s development toward operational fielding as a cutting-edge tool for life-saving capabilities in disaster management.
A loud explosion suddenly sends a routine day into chaos. Within seconds, 911 calls flood in and first responders jump into action. Amid the scramble, a team of AI agents bands together to rapidly pinpoint where an underground gas pipeline has blown, assist in assessing injuries, and create courses of action, for human approval, to send to police, fire and EMS dispatchers.
This could be disaster response in the near future with command-and-control AI, or C2AI. Serving as nimble eyes and ears for responders and emergency management leaders when stress levels are high and mental and physical capacity is taxed, C2AI will help automate flows of crucial information while crunching multiple streams of incoming data into situational insights that lead to more effective decisions.
Leidos and NVIDIA are developing C2AI in a partnership. Through a network of task-specific autonomous agents, C2AI is designed to turn emergency response from a process requiring intense thinking and coordination into a more efficient and automated one rooted in collaboration between humans and AI. Personnel can affirm or tailor the artificial intelligence’s recommended courses of action that are based on “doctrinally sound” FEMA or agency-specific principles, procedures and practices.
“Command and control in disaster management can take time when seconds matter, getting information from 911 operators to the incident commander and determining an action plan,” says Corey Hendricks, VP and chief engineer of Leidos’ Commercial and International Sector. “We’ve created an agentic workflow and removed a lot of the friction points in 911 and emergency management systems.”
AI agents take the cognitive load off emergency response personnel
After the gas explosion, operators at 911 answering points quickly become inundated by distress calls. They tend to panicked callers while juggling monitors and radios, but rather than forcing them to compile and relay information by hand to get aid dispatched, an AI agent transcribes the calls to ease their multitasking burden.
When even a split-second loss of attention could put people in harm’s way, the operators stay focused on callers. An AI orchestration agent routes the transcripts to a third agent that parses them to create incident reports. The agent displays the reports back to the operators through chat, with incident locations, injury numbers and their severity.
While this is happening, the AI agent that transcribes incoming calls also listens to first-responder and EMS radios and passes additional crucial details to the orchestration agent, which is coordinating with still other agents that flag and tag events from municipal video camera feeds. As a building partially collapses in the wake of the blast, these visual alerts agents notify the operators through chat as well.
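The routing described above can be pictured as a small publish-and-route loop: a transcription agent publishes call transcripts, an orchestration agent routes them by topic, and a parsing agent turns each transcript into a structured incident report. This is a minimal sketch only; the agent names, topics and report fields are hypothetical stand-ins, not Leidos' or NVIDIA's actual C2AI interfaces.

```python
from dataclasses import dataclass

# Hypothetical agent pipeline sketch -- not the real C2AI API.

@dataclass
class IncidentReport:
    location: str
    injuries: int
    severity: str

class ParserAgent:
    """Turns a raw 911 transcript into a structured incident report."""
    def handle(self, transcript: dict) -> IncidentReport:
        return IncidentReport(
            location=transcript["location"],
            injuries=transcript["injuries"],
            severity=transcript["severity"],
        )

class Orchestrator:
    """Routes messages between task-specific agents by topic."""
    def __init__(self):
        self.routes = {}

    def register(self, topic: str, agent) -> None:
        self.routes.setdefault(topic, []).append(agent)

    def publish(self, topic: str, message: dict) -> list:
        # Fan the message out to every agent subscribed to this topic.
        return [agent.handle(message) for agent in self.routes.get(topic, [])]

orch = Orchestrator()
orch.register("911_transcript", ParserAgent())
reports = orch.publish(
    "911_transcript",
    {"location": "5th & Main", "injuries": 3, "severity": "critical"},
)
```

In this shape, adding a new capability (say, a radio-monitoring agent) is just another `register` call, which mirrors the article's description of task-specific agents coordinated by an orchestrator.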
“There are cameras located all across most cities, so imagine if you have AI agents monitoring all those camera feeds 24/7,” Hendricks says. “And instead of static image classification, they can do event detection.”
With C2AI stepping in, machine vision can detect what humans might not. And the artificial intelligence can build out the situational awareness picture by correlating what the visual alerts agents pick up on various camera feeds, helping keep track of rapidly changing conditions.
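One way to picture that correlation step, assuming detections arrive as timestamped events per camera, is to merge same-type events seen close together in time into a single incident. The event schema and time window below are illustrative assumptions, not C2AI's actual logic.

```python
from collections import defaultdict

# Illustrative correlation of camera-feed event detections.
# Events of the same type within `window_s` seconds are treated
# as one incident seen by multiple cameras.

def correlate(events, window_s=30):
    incidents = defaultdict(list)  # event type -> list of incident groups
    for ev in sorted(events, key=lambda e: e["t"]):
        groups = incidents[ev["type"]]
        if groups and ev["t"] - groups[-1][-1]["t"] <= window_s:
            groups[-1].append(ev)   # same incident, another camera/angle
        else:
            groups.append([ev])     # a new, distinct incident
    return incidents

events = [
    {"camera": "cam-12", "type": "structural_collapse", "t": 100},
    {"camera": "cam-14", "type": "structural_collapse", "t": 110},
    {"camera": "cam-03", "type": "smoke", "t": 400},
]
grouped = correlate(events)
```

Here the two collapse detections, ten seconds apart on different cameras, fold into one incident, while the later smoke detection stays separate.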
Working C2AI seamlessly into disaster response and action planning
Like a virtual, supplementary workforce, the C2AI agents work autonomously and in concert, but always under human supervision. Leidos built the agents using NVIDIA’s foundational speech AI, LLMs and vision-language models to enable natural back-and-forth communication both with each other and with humans.
“The intent is to augment operators and for the AI to start building the course of action,” Hendricks says. “We want to help operations center personnel, as well. They are looking at seven or eight screens simultaneously, so making those screens turn red, orange or yellow is a straightforward way to get their attention.”
At the emergency operations center, where the incident commander needs to size up the crisis situation quickly, the visual alerts agents directly patch in what they detect on cameras and color-code the events based on severity. Trained to pick up cues like smoke, structural damage and vehicle crashes, one of the agents displays on-screen: “A building collapse is happening, with debris scattered around. The building has a large hole in its side, but there are no visible signs of fire or individuals exhibiting signs of distress.”
But within seconds, embers appear and the C2AI’s EMS planning agent adjusts the recommended response plan to dispatch a fire truck and requests, “Affirm plan or provide input for revision.” The experienced incident commander, anticipating that a fire will grow, orders an additional fire engine and a ladder truck. Moments later, a new log from the alerts agent comes in, describing the scene: “A fire is spreading rapidly with intense heat and smoke visible through five floors of the building.”
Firefighters en route see the update on their Team Awareness Kit, or TAK, a communications and collaboration application co-developed by the Air Force Research Laboratory and Leidos and currently fielded to 70,000 DOD, DHS, state and local users. The information helps all emergency vehicle units be better prepared when they arrive on the scene.
Designing C2AI to run at the tactical edge
“TAK jumped out at us when we thought about how to integrate C2AI naturally into emergency management workflows because that’s where first responders and personnel are already operating,” Hendricks says.
TAK features location tracking, live video feeds, incident maps and chat functions. With C2AI, descriptive and visual map updates, 911 transcripts and action approvals from commanders are designed to flow seamlessly into TAK, where all first responders can see them, adding robustness to the common operating picture.
Meanwhile, the EMS planning agent continuously receives streaming updates on available vehicle units and the conditions of the injured. The agent consults established doctrine, keeps its recommendations optimized, and sends them up the chain to the incident commander, who returns confirmations or further instructions on what to dispatch to ground zero. The result is an efficient command-and-control loop that helps save lives.
As the emergency management scenario depicts, C2AI stands on a framework of modular and scalable agent building blocks. The agents don’t require knowledge training or retraining since they rely on real-time retrieval of specified documentation when making their decisions, and this allows them to adapt quickly to changes in principles, procedures and practices.
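The retrieval idea can be sketched simply: an agent looks up the current doctrine text at decision time instead of encoding it in model weights, so an updated procedure takes effect the moment the document changes. The doctrine store and lookup below are stand-in assumptions, not the real C2AI retrieval pipeline.

```python
# Hypothetical doctrine store; in practice this would be FEMA or
# agency-specific documentation behind a retrieval system.
DOCTRINE = {
    "structure_fire": "Dispatch one engine per reported floor involved; "
                      "request a ladder truck for structures over three floors.",
    "gas_leak": "Evacuate a 100-meter radius and stage units upwind.",
}

def retrieve(event_type: str) -> str:
    """Fetch the current doctrine passage for an event type at decision time."""
    return DOCTRINE.get(event_type, "No doctrine found; escalate to commander.")

def recommend(event_type: str) -> str:
    # The recommendation cites live doctrine, so editing DOCTRINE is
    # enough to change future recommendations -- no retraining step.
    return f"Recommended action ({event_type}): {retrieve(event_type)}"
```

Because nothing is baked into the agent itself, swapping a 100-meter evacuation radius for a 200-meter one changes every subsequent recommendation immediately, which is the adaptability the article describes.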
Leidos and NVIDIA are developing C2AI as lightweight microservices that don’t require large computing infrastructures to operate. That means C2AI will be designed to run on servers at the tactical edge for tasks that rely on data processing with almost no delay time. The approach uses NVIDIA’s established AI models that are packaged in containers with the necessary computing resources for streamlined deployment. With the microservices architecture, agencies would be able to add C2AI in pieces and at a pace that fits their operational needs.
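A minimal sketch of that microservice shape, under the assumption that each agent is exposed as a small HTTP service other components can call; the endpoint, payload fields and in-process server here are illustrative, not C2AI's deployment details.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical agent-as-microservice: accepts a 911 call summary
# as JSON and returns a structured incident report.

class TranscriptAgentHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        call = json.loads(self.rfile.read(length))
        report = {
            "summary": f"Incident at {call['location']}",
            "severity": call["severity"],
        }
        body = json.dumps(report).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Start the agent service on an ephemeral port in a background thread.
server = HTTPServer(("127.0.0.1", 0), TranscriptAgentHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Another component (e.g. an orchestrator) calls the agent over HTTP.
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/",
    data=json.dumps({"location": "5th & Main", "severity": "critical"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())
server.shutdown()
```

Because each agent sits behind its own small service boundary like this, an agency could stand up one agent at a time, which matches the piecewise-adoption point above.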
Leveraging NVIDIA’s AI hardware, software and Metropolis smart-cities platform has expedited C2AI’s development so far, Hendricks says, and will continue to accelerate it toward fielding and operational use.
C2AI grew out of a partnership the two companies established earlier this year to operationalize artificial intelligence capabilities faster for government agencies, combining Leidos’ mission expertise and integration experience with NVIDIA’s AI technologies.
“With C2AI, we are focused on applying agentic AI to solve complex problems and augment decision-making in high-consequence situations,” Hendricks says. “In disaster scenarios, let’s reduce the time to critical care.”