Is a more collaborative approach the answer to tackling global disinformation?


Written by Suzanne Smalley

Disinformation is flooding online platforms to such an extent that no single tech company, content moderator or researcher can prevent every malicious post from spreading quickly and causing real-world consequences.

The problem is so overwhelming and complex that social media platforms, government officials and other stakeholders must pool their efforts, Graphika CEO John Kelly told CyberScoop in an interview on Thursday.

That is why his social media analysis company is developing a blueprint for a software-based, multi-stakeholder threat center that would track, share and analyze disinformation at scale alongside journalists, government officials, academics and other experts.

The virtual threat center remains theoretical until funding comes through, Graphika officials said, though they have discussed the concept with a variety of potential partners.

Well respected for its work identifying strategic influence campaigns online, Graphika says it uses artificial intelligence to map and analyze online communities and identify coordinated information operations. It recently worked with Facebook and other social media platforms to uncover a sprawling, nearly five-year-old, Russian-language, pro-U.S. influence operation.

Kelly said he recognizes the urgent need for a broader shared community to pursue the threats that disinformation poses to society at large.

The center proposed by Graphika would act as a one-stop shop for tracking and cataloging information operations, a model Kelly says is critical given that many experts currently work in isolation on separate pieces of the problem.

In that respect, the Graphika approach resembles existing Information Sharing and Analysis Centers, or ISACs, which provide critical threat intelligence to specific industries.
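
Graphika has not published a technical specification for the hub, but ISAC-style sharing generally revolves around a common record format that members can exchange and enrich. The Python sketch below shows what a minimal, hypothetical threat-signal record might look like; the class name, fields and example values are illustrative assumptions, not an actual Graphika or ISAC schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

# Hypothetical record a hub participant might share with the center.
# Field names and values are illustrative assumptions only.
@dataclass
class ThreatSignal:
    campaign_id: str          # internal identifier for the suspected operation
    platforms: list           # platforms where coordinated activity was observed
    behavior: str             # e.g. "coordinated amplification", "fake persona network"
    first_seen: str           # ISO-8601 timestamp of first observation
    confidence: float         # analyst confidence, 0.0-1.0
    attribution: str = "unattributed"
    evidence_urls: list = field(default_factory=list)

signal = ThreatSignal(
    campaign_id="op-2022-0042",
    platforms=["facebook", "telegram"],
    behavior="coordinated amplification",
    first_seen=datetime.now(timezone.utc).isoformat(),
    confidence=0.7,
)

# Serialize for exchange with other hub members, ISAC-style.
print(json.dumps(asdict(signal), indent=2))
```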

“It’s not just that we want to teach everyone how to be like a little Graphika,” Kelly said. “We want to provide technology tools, data sharing and things like that as a layer and a model that then other people can come in and bring something that they can develop a very specific expertise in… and then it’s valuable for everyone.”

Independent disinformation experts said a Graphika-style threat hub could be a crucial tool in combating a growing problem.

“It is clear that a more connected and resourced response to countering disinformation is needed globally,” said Kevin Sheives, associate director of the International Forum for Democratic Studies at the National Endowment for Democracy. He added that it is crucial for civil society leaders to get a “clearer picture of their own information ecosystems and particularly the impact of their own interventions”.

The Graphika model could bridge the gap between “research and practice” by improving the relationship between social media platforms and civil society researchers and disinformation experts, Sheives said.

“Facebook in particular requires data transparency and a true partnership between these two communities to improve content moderation,” Sheives said.

Graphika has a long history of working with social media platforms to root out disinformation, a track record that could be leveraged by others involved in launching and running the threat hub. Kelly said many sophisticated intelligence operations emerge across platforms, and because tracking these threats is costly and time-consuming, platforms have historically welcomed the assistance of Graphika’s experts.

In August, Graphika announced findings from a joint investigation with Stanford University and the platforms that uncovered a web of pro-Western information operations. It has uncovered several other key influence operations in recent years, including a Russian government-linked campaign known as Secondary Infektion that spanned six years and two continents.

(Graphika’s model for a multi-stakeholder threat center to detect coordinated online disinformation campaigns.)

Kelly said social media companies know they need help tackling the “sophisticated problem” of threat actors misleadingly coordinating activity on platforms. He said that in his experience, platforms welcome assistance in tracking and exposing coordinated internet manipulation campaigns.

“It’s a higher-class problem,” Kelly said. “What a center like this can do is discover this higher-level coordinated manipulation… authenticate it, validate it, find out as much as possible about it [and] convert this into the signals that the platforms need to take action and conduct the investigation on their own terms.”

Many social media platforms don’t have the budget to pursue coordinated disinformation campaigns, Kelly said. Facebook and Google are well funded, he noted, but Telegram, Discord and even Twitter have far less money to devote to the problem.

The multi-stakeholder threat hub could also be a place where civil society leaders come together to align research goals, agree on a common vocabulary and share examples of information operations in real time, Graphika chief technologist Jennifer Mathieu said when she presented the model at the annual DEF CON hacker conference in August.

Graphika envisions the center working with social media platforms to gain access to more data than is currently available through their APIs, Mathieu said. That access would have to comply with the platforms’ terms of service to protect users, she said, but could expand “as we build trust across the community.”

The threat center would seek only threat signals, historical archives of known, attributed disinformation campaigns and evidence of platform tampering, rather than private user data, which Kelly says is “lawfully protected” for privacy reasons.
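
Neither Kelly nor Mathieu has described how that data-minimization rule would be enforced in practice, but the kind of filter involved is straightforward to sketch. The hypothetical Python example below accepts only the three categories named above and strips fields that could identify individual users; the category names and field names are assumptions made for illustration.

```python
# Hypothetical filter illustrating the data-minimization rule described above:
# the hub would ingest threat signals, archives of attributed campaigns and
# evidence of tampering, while dropping anything resembling private user data.
# Category and field names are illustrative assumptions, not a real schema.

ALLOWED_CATEGORIES = {"threat_signal", "attributed_campaign_archive", "tampering_evidence"}
PRIVATE_FIELDS = {"user_id", "email", "phone", "ip_address", "direct_messages"}

def sanitize(record):
    """Return a copy of the record with private fields removed,
    or None if the record's category is out of scope for the hub."""
    if record.get("category") not in ALLOWED_CATEGORIES:
        return None
    return {key: value for key, value in record.items() if key not in PRIVATE_FIELDS}

submission = {
    "category": "threat_signal",
    "campaign_id": "op-2022-0042",
    "behavior": "coordinated amplification",
    "user_id": "1234567",  # would be stripped before sharing
}
print(sanitize(submission))
```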

To investigate threats in near real time, the center’s work would be automated and likely supported by human analysts. Graphika hopes to include both large and small platforms in the effort.
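	
The balance between automation and human review is also easy to sketch in outline, even though Graphika has not detailed its design. The hypothetical Python example below scores incoming signals automatically and routes anything above a threshold to an analyst review queue; the scoring heuristic, threshold and field names are all illustrative assumptions.

```python
from queue import Queue

# Minimal sketch of the "automation plus human analysts" workflow described
# above: incoming signals are scored automatically, and anything above a
# threshold is queued for analyst review. All values are assumptions.

REVIEW_THRESHOLD = 0.6
analyst_queue: Queue = Queue()

def score(signal):
    """Toy heuristic: activity spread across more platforms and higher
    reported coordination raises a signal's priority."""
    platform_spread = min(len(signal.get("platforms", [])) / 5, 1.0)
    coordination = signal.get("coordination_score", 0.0)
    return 0.5 * platform_spread + 0.5 * coordination

def triage(signal):
    if score(signal) >= REVIEW_THRESHOLD:
        analyst_queue.put(signal)  # escalate to a human analyst

triage({
    "campaign_id": "op-2022-0042",
    "platforms": ["facebook", "telegram", "twitter"],
    "coordination_score": 0.8,
})
print(f"Signals awaiting analyst review: {analyst_queue.qsize()}")
```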

“A number of people from different interest groups have been talking about this for many years,” Mathieu told the DEF CON audience. “The challenge is funding.”
