New Web tool tracks Russian “influence ops” on Twitter


The top “Russian Twitter Trolls” on Hamilton 68 this morning are RT & Sputnik News reports on the latest Wikileaks dump… — Editor

Hamilton 68 tracks Russian state news and Twitter trolls, shows propaganda trends.

The Alliance for Securing Democracy, a bipartisan project backed by the German Marshall Fund of the United States (GMF), has launched a Web tool to keep tabs on Russia’s ongoing efforts to influence public opinion in the United States and abroad. Called Hamilton 68—named for Federalist No. 68, the essay in which Alexander Hamilton discussed how to prevent foreign meddling and influence in America’s electoral process—the Web dashboard tracks 600 Twitter accounts “linked to Russian influence activities online.” That’s according to a blog post by the Alliance’s senior fellow and director Laura Rosenberger and non-resident fellow J.M. Berger.

Russia’s use of Twitter and other social media in the run-up to the 2016 US presidential election (as well as in France, Germany, and Poland) as part of “influence operations” has been well documented. In a New York Times Magazine article in 2015, Adrian Chen exposed a “troll factory” operating on behalf of the Russian government. The “Internet Research Agency” conducted trial runs well before the election, spreading a hoax about a fictitious accident at a Louisiana chemical plant. And influence operations have continued since the election in the US, promoting stories from both official Russian government media sources and sites like InfoWars.

Tapping into Twitter badness

In an e-mail exchange with Ars, Alliance fellow Berger said that he first started taking an interest in social media analytics while researching terrorism and extremism. “I had developed tools for analyzing social networks in that space, and I published large-scale studies in 2013 (on white nationalism) and in 2015 (on ISIS), which I did with Jonathon Morgan, one of the collaborators on the dashboard,” Berger said. “So I’ve been looking at various kinds of manipulative activities on social media for some time. I’ve worked with Jonathon and my colleagues Andrew Weisburd and Clint Watts on these issues, including state-sponsored influence campaigns, which we’ve been tracking for the last couple of years.”

During that research, Berger related, he and his collaborators “identified and monitored a lot of networks that we could credibly connect to Russian influence operations in aggregate.” While specifically tagging any single account as being part of an influence campaign “is problematic,” Berger said, “we can see when groups are synchronized. We would load these networks up in Tweetdeck to monitor them, but that is a very difficult way to assess trends and digest content. So we began tossing the idea of a dashboard around.”

Watts took point on the project and brought it to the German Marshall Fund.

The network of 600 accounts was “identified over a fairly long period of time using an app that I have developed with a couple different coders and which has gotten a lot more sophisticated since I started working with Jonathon,” Berger said. The analysis worked in three stages:

  • First, accounts that promoted disinformation campaigns spread through overt outlets (such as RT and Sputnik) were identified, and accounts that were determined to be more than casual re-tweeters of dezinformatsiya were tagged for tracking.
  • The second group was derived from analysis of the social networks of users who openly professed their pro-Russian stance and tweeted primarily messages in line with Russian policies and themes. Followers of those accounts who posted similar content were identified as being part of the influence operation.
  • Finally, the team identified “bot” and semi-automated accounts (referred to as “cyborgs”) used to “boost the signal” of the other accounts they had identified.
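
The three-stage filtering described above could be sketched roughly as follows. This is a hypothetical illustration only: the field names, thresholds, and scoring are invented for clarity, and the Alliance has not published its selection criteria in this detail.

```python
# Hypothetical sketch of the three-stage account selection described above.
# All thresholds and account fields are illustrative, not the Alliance's
# actual (unpublished) criteria.

def select_tracked_accounts(accounts, min_overt_retweets=50, min_theme_overlap=0.8):
    tracked = set()

    # Stage 1: accounts that are more than casual amplifiers of overt
    # outlets such as RT and Sputnik.
    for acct in accounts:
        if acct["overt_outlet_retweets"] >= min_overt_retweets:
            tracked.add(acct["handle"])

    # Stage 2: followers of openly pro-Russian seed accounts whose own
    # content closely matches Russian policy themes.
    for acct in accounts:
        if acct["follows_pro_russia_seed"] and acct["theme_overlap"] >= min_theme_overlap:
            tracked.add(acct["handle"])

    # Stage 3: bots and semi-automated "cyborg" accounts that boost the
    # signal of accounts already identified in stages 1 and 2.
    already = tracked.copy()
    for acct in accounts:
        if acct["is_automated"] and already & set(acct["amplifies"]):
            tracked.add(acct["handle"])

    return tracked
```

Note that stage 3 depends on the output of the first two stages, which matches Berger’s description of tagging accounts “in aggregate” rather than judging any single account in isolation.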

The 600-account pool is currently as much as Berger thinks is practical for a near-realtime dashboard. “For now, we’re sticking with the 600, except to replace accounts that are suspended,” he said. “If we feel we need to make a change, we’ll announce it transparently.”

Berger noted that they could have “stood up 6,000 almost as easily, but the analysis would be less close to real-time.” Additionally, while they could have identified that many accounts with about 95-percent accuracy, the 600 selected accounts were identified with a 98-percent confidence rate, he explained. “Ninety-five percent is good enough for almost any professional purpose, but given the public roll-out, we wanted to have as little noise as possible in the system. With 600, we could also manually vet the accounts, which would be difficult with a larger set.”

There’s about a five-minute delay in the dashboard’s analysis, as it mines data from the accounts using Twitter’s standard APIs. “We have no privileged access,” Berger said. “If we wanted to scale up to handle bigger sets, we would probably need to switch to GNIP,” the social media aggregation API service acquired by Twitter in 2014…
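
A polling pipeline of this kind might look something like the sketch below. The structure is an assumption on my part, not Hamilton 68’s actual code: `fetch_timeline` stands in for a call to Twitter’s standard v1.1 `statuses/user_timeline` endpoint, and the five-minute interval mirrors the dashboard’s stated lag.

```python
import time
from collections import Counter

# Hypothetical sketch of a near-realtime polling pipeline. Hamilton 68's
# actual implementation is not public; tweet dicts here follow the shape
# returned by Twitter's standard v1.1 statuses/user_timeline endpoint.

def top_hashtags(tweets, n=10):
    """Count hashtag occurrences across a batch of tweet objects."""
    counts = Counter()
    for tweet in tweets:
        for tag in tweet.get("entities", {}).get("hashtags", []):
            counts[tag["text"].lower()] += 1
    return counts.most_common(n)

def poll_accounts(fetch_timeline, handles, interval=300):
    """Poll each tracked account's timeline roughly every `interval`
    seconds (~5 minutes) and yield trending hashtags per cycle."""
    while True:
        batch = []
        for handle in handles:
            # fetch_timeline would wrap GET statuses/user_timeline,
            # staying within standard (non-privileged) rate limits.
            batch.extend(fetch_timeline(handle))
        yield top_hashtags(batch)
        time.sleep(interval)
```

Polling 600 accounts through the standard rate-limited endpoints is feasible; scaling to thousands is what would push the project toward a firehose-style service like Gnip, as Berger notes.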

This article (New Web tool tracks Russian “influence ops” on Twitter) was originally published on Ars Technica and syndicated by The Event Chronicle
