What is artificial intelligence?
The dream of artificial intelligence (AI) is ancient and enduring. Popular culture from Isaac Asimov to Stanley Kubrick and George Lucas has reflected our fascination with the possibility of smart machines. Even Greek mythology features inanimate creations bestowed with human likeness. But our interest in AI is not necessarily contingent upon our belief in it. Historically, AI has been an idea more closely linked to science fiction than to viable business or military strategy. Many people view the idea of truly intelligent machines as implausible or, at best, the vision of a distant future.
In the past decade, however, AI technology and its real-world applications have advanced dramatically. A key driver has been the explosive growth of data. In 1965, a few years after the integrated circuit was invented, Intel co-founder Gordon Moore predicted that the number of transistors on a chip would double roughly every year (a rate he later revised to every two years), a forecast known as Moore's Law. We are now observing a similar trend: many experts believe the world's data will double in volume every year or less.
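The power of this kind of repeated doubling is easy to understate. A minimal sketch, using purely illustrative figures rather than measured ones, shows how quickly an annual doubling rate compounds:

```python
# Illustrative sketch of compounding growth under a constant doubling
# rate, as described by Moore's Law and the data-growth trend above.
# The starting quantity and horizon here are hypothetical.
def after_doublings(initial, periods):
    """Return the size of a quantity after a given number of doubling periods."""
    return initial * 2 ** periods

# One unit of data today, doubling every year:
print(after_doublings(1, 10))  # after a decade: 1024 -- three orders of magnitude
```

Ten doublings multiply the starting quantity by more than a thousand, which is why systems designed for yesterday's data volumes fall behind so quickly.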
Computing systems engineered over previous decades are largely helpless to catalog and exploit this deluge of "unstructured data," an umbrella term that includes emails, text documents, tweets, photographs, audio and video content, and any other data that lives outside the orderly confines of a spreadsheet. Every minute, internet users generate roughly 400 hours of video, 18 million text messages, and 187 million emails. This data is not only overwhelming but also largely unusable. Unless a person categorizes the information for a computer to understand (or analyzes it themselves), unstructured data is by nature incoherent to computers and, crucially, not actionable. The vast majority of data we store is unstructured or, as some would describe it, invisible.
Blindness comes at a cost. Countless valuable insights are hidden in our data, but our computing systems must learn how to uncover them. The ability to make these insights visible, many believe, will change our fortunes in virtually every global challenge, from cancer research to counterterrorism. AI systems are uniquely suited to this task because they can grasp the idiosyncrasies of sight and language and apply that understanding to analyze unstructured data.
How? Reasoning. Reasoning is the defining characteristic of human intellect. Our brains do it early, intuitively, and with incredible speed. Take natural language processing, for example. The inability of computers to apply reasoning has always been the crucial barrier preventing humans and machines from conversing fluidly. Human wordplay follows no rigid structure; natural language is endlessly inventive. To comprehend the written or spoken word, you must not only understand basic rules of grammar but also rely on reasoning to decode analogy, slang, dialect, tone, sarcasm, and other nuances. Unlike the highly advanced computers encased in the human cranium, machines have neither evolved over eons nor been trained to operate in this way. Breakthroughs centered on a computer's ability to reason are not exhibitions of brute memory and computing force, but leaps forward in replicating the human brain. To undervalue this distinction is to misunderstand the significance of AI.
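The gap between literal pattern matching and genuine reasoning can be seen in a toy example. The sketch below, using hypothetical keyword lists and sentences, labels sentiment by counting "positive" and "negative" words; it handles a plain statement but is blind to sarcasm, which requires exactly the contextual reasoning described above:

```python
# A toy keyword-based sentiment check (hypothetical word lists), showing
# why literal matching falls short of reasoning about language.
POSITIVE = {"great", "love", "wonderful"}
NEGATIVE = {"terrible", "hate", "awful"}

def naive_sentiment(sentence):
    """Label a sentence by counting positive vs. negative keywords."""
    words = {w.strip(".,!").lower() for w in sentence.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(naive_sentiment("I love this product."))
# -> "positive" (correct)

print(naive_sentiment("Oh great, another delay. I love waiting."))
# -> also "positive" -- the sarcasm is invisible to keyword matching
```

No amount of extra keywords fixes the second case; the sentence's true sentiment lives in context and tone, not in the words themselves.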
"AI is a class of smart machines that exhibit human traits like logical thinking, intuitive leaps, emotional intelligence, and empathy."
Keith Johnson, CTO -- Defense & Intelligence
Keith Johnson, Chief Technology Officer for the Leidos Defense and Intelligence business, views AI as a broad domain of machine techniques characteristic of human intelligence. He describes AI as a generic term for "smart machines that exhibit human traits like logical thinking, intuitive leaps, emotional intelligence, and empathy." Implementation tools such as machine learning, deep learning, and convolutional neural networks are each specialized fields, but all are generally considered part of the computational toolset under the umbrella of AI.
As long as insight dictates success, AI will continue to dominate the world of emerging technology because of its power to gain insight from data, bolstering efficiency, prediction, and effective decision-making. IDC predicts spending on AI will grow into a $47 billion industry by 2020, and goes so far as to refer to AI and its related fields as the "Fourth Industrial Revolution." At its core, AI is about making our data visible and uncovering insights to help tackle the world's most important problems. Data scientists at Leidos are continually developing expertise in this domain through internal research and development and centers of excellence.
Read more about how Leidos is helping its customers accelerate adoption of AI.