If a brand today wanted to promote a new product, it would order its social media team to tailor posts that resonate with its audience, buy targeted ads to reach impressionable eyeballs, and closely monitor the performance of its messaging to ensure it reaches, and influences, as many viewers as possible.
If a Russian troll farm wanted to disrupt an American election and amplify discord in an open society, it would apparently do the exact same things.
That’s one of the key takeaways from an indictment filed Friday by special counsel Robert S. Mueller III against 13 Russians and three Russian companies that allegedly sought to interfere with the 2016 presidential election and aid the Trump campaign.
The indictment describes textbook usage of American tech platforms, shedding light on an alleged covert campaign that aimed to undermine the political process by exploiting features — not bugs — of social media services.
The ease with which anyone can spread ideas on social media is both its selling point and its flaw, the alleged Russian conspiracy shows.
“While platforms like Facebook and Twitter are allowing Americans to communicate and share ideas in ways unimaginable just a decade ago, we’re also learning that we each bear some responsibility for exercising good judgment and a healthy amount of skepticism when it comes to the things we read and share on social media,” Sen. Mark R. Warner (D-Va.), the ranking Democrat on the Senate Intelligence Committee, said in a statement Friday.
Consider the way the Moscow-based Internet Research Agency allegedly approached its targets. The company established departments for data analysis, search engine optimization and what amounts to an IT help desk, the indictment alleges.
More than 80 staff members were assigned to a group called the “translator project,” which the indictment says aimed to influence Americans on YouTube, Facebook, Instagram and Twitter. And they apparently used techniques to find unwitting allies, or influencers, that wouldn’t be out of place for any corporate social media team.
“In order to gauge the performance of various groups on social media sites, the organization tracked certain metrics like the group’s size, the frequency of content placed by the group, and the level of audience engagement with that content, such as the average number of comments or responses to a post,” the indictment says.
The group tried to influence all sides of the political spectrum to intensify the conflict, the indictment says. And they apparently knew what times of day to post new content to drive up engagement.
Thousands of dollars were spent each month buying targeted social media ads with messages such as, “Hillary Clinton Doesn’t Deserve the Black Vote” and “Hillary is a satan, and her crimes and lies had proved how evil she is.”
It didn’t take much to circumvent the social media platforms’ disclosure rules to buy ads or maintain accounts. The Russians used Social Security numbers stolen from real Americans and acquired fake driver’s licenses, the charges say.
This also allowed them to create fictitious American personas to advance their alleged campaign.
And in a bid to create viral content, they organized rallies such as one in Washington in 2016 in which they recruited a U.S. person to hold a sign depicting Hillary Clinton and a quote attributed to her that said, “I think Sharia Law will be a powerful new direction of freedom.”
Bret Schafer, an analyst at the Alliance for Securing Democracy, which tracks Russian influence networks, said the indictment shows that any bad actor with means can game social media for nefarious purposes.
“They have adopted many of the techniques that anyone involved in social media marketing knows how to do,” he said. “On any given day, we’ll see them tweeting with the hashtags ‘Monday Motivation’ or ‘Wisdom Wednesday.’ They’re using that to drive eyeballs to their content. This is not complicated stuff. Most millennials running a start-up know how to do this.”
The detailed indictment is another blow for Facebook and Twitter, which have spent more than a year trying to contain a crisis they initially downplayed.
In a statement Friday, Facebook emphasized that it had already acknowledged the Russian attack. The company said it was cooperating with law enforcement and was doubling the number of people working on security to 20,000.
“We know we have more to do to prevent against future attacks,” said Joel Kaplan, vice president of global policy for Facebook. “We’re committed to staying ahead of this kind of deceptive and malevolent activity going forward.”
Twitter, which declined to comment for this article, has been working to detect and prevent more bot accounts since the election. The company announced last month that it had contacted 1.4 million users in the United States to alert them they had engaged with Twitter accounts linked to the Internet Research Agency.
Karen North, a social media expert who teaches at USC’s Annenberg School for Communication and Journalism, said the indictment will again intensify the debate over what responsibility social networks have for the content they allow on their platforms.
“It’s easy to say it’s unethical given the circumstances, but do you want Facebook or Twitter to be deciding unilaterally where to draw the line?” she said. “Do you want them to decide what kind of content or conversation you’re allowed to have with your audience? And do we want them to make different decisions based on people’s nationality or country of origin?”
The alleged Russian interference also highlights how much has changed since the early days of social media. Back then, platforms were seen as ushering in a new age of openness and human connection. Anyone, vetted or not, had a chance of going viral and reaching previously unimaginable audiences. Now, a decade later, societies are waking up to the sobering realization that that freedom can be exploited.
“At the dawn of the digital age, we had such beautiful visions of using these platforms for education and entertainment and access, and we’re doing all of that, but nobody knew the downsides that would come with it,” said Shum Preston, director of national advocacy at Common Sense Kids Action, an independent policy group that focuses on child health and safety.
“Facebook has gone from educational platform to being an ignorance engine or a disinformation engine,” Preston said.
UPDATES:
2:40 p.m.: This article was updated to include comments from Karen North, a social media expert who teaches at USC’s Annenberg School for Communication and Journalism, and Shum Preston, director of national advocacy at Common Sense Kids Action.