OSINT overdose: Intelligence agencies seek new ways to manage surge of open-source intel

on May 13, 2024 at 1:13 PM
Pentagon grapples with growth of artificial intelligence. (Graphic by Breaking Defense, original brain graphic via Getty)

WASHINGTON — AI-powered machine translation, big data analytics, and now large language models are sucking up data from social media, smartphones and other open sources to generate unprecedented amounts of open-source intelligence. That means the 18 agencies of the Intelligence Community need new contracting and technical approaches to tap the rising power of OSINT without being overwhelmed by it, IC officials said last week.

“It’s amazing what’s there. It also scares me,” said Randall Nixon, director of the Open Source Enterprise at the CIA, which leads the IC as “functional manager” for OSINT. “The next intelligence failure could easily be an OSINT failure, because there’s so much out there.”

It’s often overwhelming for analysts just to pull together the OSINT they already have access to and put it in context of classified information, said Casey Blackburn, assistant director for emerging technology at the Office of the Director of National Intelligence. “We need to integrate that open source into the environments where our people work,” he said. “As long as analysts … have to separate their attention between multiple different terminals of unintegrated information, we will never take full advantage of open source.”

“We can do it securely,” Blackburn emphasized. It’s not technology that’s the problem, he said: “It’s acquisition and process and something of policy that’s getting in the way right now.”

“We have to change our model, our approach, our way of obtaining information, our way of purchasing information,” agreed Jason Barrett, the IC-wide open source intelligence executive at ODNI. “It’s not up to the commercial sector at this point to come to us. It’s up to the government to start to change how we do our business.”

The problem is two-fold, Barrett, Blackburn, Nixon and other experts explained Wednesday at the Special Competitive Studies Project’s second annual Ash Carter Exchange. There are too many IC elements trying to solve the problem independently — not just the 18 agencies, but individual analysts using OSINT ad hoc without proper training — and vastly too many OSINT providers vying for those IC contracts.

“It’s not like in the old days,” said Kristin Wood, who served over a quarter-century in the CIA before entering the private sector. “In the old days … you could go and set up relationships with companies and do a one-on-one contract. [Now] there’s two or three or four or 500 companies that have value to bring to the open-source space.”

That explosion has been powered, in large part, by technology. Once upon a time, the term “open source intelligence” might have evoked a plane-spotter watching an airbase runway through binoculars, or a lone linguist listening to foreign radio broadcasts, but today, OSINT is big business and high tech. One of the first companies to catch the wave, Recorded Future, was founded in 2009 based on three insights, said co-founder Staffan Truvé: the proliferation of smartphones to gather data, breakthroughs in deep-learning algorithms to sift that data, and the rise of cloud computing to run those algorithms affordably.

Today, Recorded Future claims over 1,700 clients worldwide, including over 30 governments and more than 200 of the Forbes Global 500 companies. Private-sector demand is increasingly important for OSINT companies, and that creates a complication for government agencies, because today’s OSINT pricing schemes are often punitive for federal customers. Charging for each user with access to a company’s OSINT products, for instance, may work for a private firm with 10, 100, or even 1,000 employees, but the cost scales up astronomically if government agencies want to share OSINT across the entire Intelligence Community, let alone the Department of Defense.

“The private sector likes to sell by license,” Nixon said. “That immediately makes it nearly impossible for us, because of the large scale of users that we have.”

“We have to have new ways of doing those contracts,” he continued. At best, “purchase it once and share” across the entire IC; at minimum, write contracts with easy options for agencies to share data as needed with each other.

“We can get to a more streamlined acquisition process,” Barrett agreed. “We really do need a mindset shift when it comes to how we are acquiring it [OSINT], so that we’re not taking 18 different approaches and then having to set up the contracts 18 different ways.”

Neither official offered further details on how the IC would reform OSINT contracting. The IC’s recently released OSINT Strategy [PDF] doesn’t offer specifics, either, but Barrett made it clear that getting open source right is a high priority.

“We’re reaching an inflection point,” he said. “Commercial and publicly available data is so powerful, and it’s so foundational to what we are trying to accomplish.”

“What is important,” Barrett went on, “is to really focus on where can we get the greatest return on investment … not to duplicate what’s already being built, probably faster and better, outside the walls, [but] how can we bring that in or leverage those capabilities in the commercial private sector.”
