Author’s Note: This article originally appeared on AdWeek’s website, but was subsequently taken down. We believe every marketer will benefit from this analysis of Russian social media ads, and therefore, we’re publishing it ourselves. Any questions, comments, or feedback? Tweet us your thoughts at @LadderDigital.
We’ve all heard reports about how Russian operatives created and distributed ads and posts on Facebook between June 2015 and August 2017.
The idea was to split the millions of Americans who saw these fabricated stories along party lines, thereby deepening existing hardline, ideological stances around controversial social issues. While it is unknown whether these ads and posts changed any votes, they are being taken seriously as an attempt to sway voter opinion.
In November of 2017, a House Intelligence Committee hearing revealed how these operatives leveraged Facebook’s advertising tools and free posts to target specific groups of people, segmented by political leaning, age, location, interests, and other attributes. In some instances, the ads even attempted to suppress voting by distributing false information about where, when, and how viewers could cast their votes.
The use of digital marketing techniques in politics isn’t new: the Obama campaign was a power user of A/B testing, and people close to the Trump and Brexit campaigns cited data-driven Facebook ads as key to their success. What was unprecedented was the transparency into the techniques used that resulted from the investigation.
As voters and citizens, it’s important to educate ourselves about exactly what strategies were used so that we can make more informed decisions next time we see political ads in our newsfeed. As Facebook moves towards full transparency for political ads (and it might not stop there), it is becoming more useful to understand how Facebook ads work.
So, why were Facebook and Instagram ads such a potentially effective avenue for Russian operatives to sway U.S. voters?
As the most popular social media platform among internet users, Facebook is the perfect platform for creating ads and other content that speaks directly to a specific interest group or community concerned about a specific hot-button issue affecting them.
According to a recent Pew Research Center study, 72 percent of American adults use Facebook, and 70 percent of those people log into it daily.
Usage isn’t limited to just the young (as older people may think) or just the old (as some younger people say). Eighty-two percent of 18- to 29-year-olds use it, as do 64 percent of 50- to 64-year-olds. In contrast, Pew found that only 28 percent of U.S. adults use Instagram, and among adults between the ages of 18 and 29, only 55 percent do.
Were the ads effective? In September 2017, Facebook disclosed to members of Congress that it found more than 3,000 Russian-backed ads bought by 470 accounts, worth approximately $100,000. Although investigators estimate that about 11.4 million Americans saw Russian-sponsored paid ads, several independent researchers claim that the biggest impact came from unpaid, organic Facebook posts, which reached up to 126 million Americans, according to The Washington Post.
In a broad sense, what was Russia’s strategy in placing Facebook ads?
As growth marketers, we understand that Facebook has to strike a delicate balance between providing engaging content from friends and posts from paid sponsors in order to avoid overwhelming users with ads — hence the relevance score, wherein Facebook uses data analytics to determine how relevant specific ads are to certain audiences.
Delivering ads only to people who care about their subject matter improves the experience for both users and businesses. Consequently, if users hate your ad or don’t engage with it, Facebook’s advertising algorithm will likely either charge you more or stop showing your ad altogether.
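To make that dynamic concrete, here is a minimal, purely illustrative sketch (in Python) of how an engagement-weighted auction might rank ads. The field names, weights, and numbers are invented for illustration; Facebook’s actual delivery system is proprietary and far more complex.

```python
# Hypothetical sketch of an engagement-weighted ad auction.
# This is NOT Facebook's actual algorithm; names and numbers are invented for illustration.

from dataclasses import dataclass

@dataclass
class Ad:
    name: str
    bid: float                   # advertiser's bid in dollars
    predicted_engagement: float  # estimated probability the user engages (0-1)
    relevance: float             # estimated relevance to this user (0-1)

def total_value(ad: Ad) -> float:
    """Rank ads by bid weighted by predicted engagement and relevance.

    A low-relevance ad must bid much more to win the same slot, which is
    why irrelevant ads cost more or stop being delivered at all."""
    return ad.bid * ad.predicted_engagement * ad.relevance

ads = [
    Ad("relevant local event", bid=1.00, predicted_engagement=0.08, relevance=0.9),
    Ad("irrelevant product",   bid=3.00, predicted_engagement=0.01, relevance=0.2),
]

winner = max(ads, key=total_value)
print(winner.name)  # the cheaper but more relevant ad wins the auction
```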
Therefore, businesses use Facebook’s advertising platform to find and reach the audiences most likely to engage with their messaging.
The same holds true for the Russian operatives behind the Facebook ad campaigns. They were able to find the users who were most likely to engage with the messaging of their targeted ads, and then spent approximately $100,000 on Facebook ads (plus almost $250,000 on Twitter ads, which rarely makes headlines). By creating fake accounts and groups to spread content to a broader audience, they successfully managed to reach — and arguably influence — hundreds of millions of Americans.
It’s important to note just how much the advertising landscape has changed to make this possible. Previously, to reach hundreds of millions of Americans, you had to know someone at a TV network. Now you can sign up for Facebook and charge it to your credit card (yes, even in rubles). The same removal of transaction costs that benefits small businesses has also meant an inability to police the advertisers signing up at a rate of 1 million every 7 months.
Now, let’s take a look at how and why using a social media site’s native ad platform to target those specific interest groups made these ads so effective.
This sponsored ad above, paid for by the Russian-controlled Facebook group “Heart of Texas,” first ran on October 26, 2016. It targeted Facebook users between 18 and 65+ years of age in Texas, and according to The Washington Post, it garnered 16,168 impressions and 2,342 clicks, roughly a 14.5 percent clickthrough rate!
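For reference, clickthrough rate is simply clicks divided by impressions. Here is the quick math behind that figure, using the numbers reported above:

```python
# Quick clickthrough-rate check for the "Heart of Texas" ad figures cited above.
impressions = 16_168
clicks = 2_342

ctr = clicks / impressions
print(f"CTR: {ctr:.1%}")  # -> CTR: 14.5%
```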
In this ad, which is labeled as an “Event,” the Facebook group is attempting to organize a statewide rally to protest “establishment robbers,” the “corrupt media,” and “the crimes committed by Killary Rotten Clinton.”
Before Facebook took it down, the group had acquired a quarter of a million followers. As Casey Michel noted in a special investigative piece for The Washington Post, this group specifically targeted Texas residents who supported the state’s secession from the United States. Members of the group also discussed identifying Texas as a “Christian state,” the need to “keep Texas Texan,” and a few conspiracy theories.
As Michel and other investigators found, the page became so influential, it was responsible for organizing several right-leaning protests, including an armed, white supremacist protest in Houston in May. In a section not shown in the screencap above, but published in full by Business Insider, the group also encouraged supporters to vote for Donald Trump to prevent “more refugees, mosques, and terrorist attacks. Banned guns. Continuing economic depression.” — as seen below:
Takeaway: This ad from the same Facebook group performed well because it spoke directly to issues that resonate with many Texans: illegal immigration, terrorism, the economy, distrust of the federal government, and the Second Amendment. Despite the admins’ obvious grammatical and spelling errors and their reliance on Texan stereotypes, the group was extremely effective at getting its message across by targeting a specific geographic location and posting messages tailored to that audience.
From a marketing point of view, this ad shows the benefits of localizing advertisements. Ad localization is the process of using local language in your ad copy. (Note: This isn’t the same as a word-for-word translation.) Localization works extremely well in improving ad conversions. In fact, research published in Harvard Business Review found that 72.4 percent of consumers said they would be more likely to buy a product with information in their own language. Moreover, localization allows brands to appeal to new groups of users and build trust and brand loyalty with them.
This ad is a chilling example of how Russian operatives were able to effectively target Americans based on their grievances. Per analytics data published in The Washington Post, it first ran on July 13, 2015, and targeted individual users on Facebook ages 18 to 65+ in Atlanta, Maryland, Virginia, St. Louis, and Ferguson, Missouri.
In this case, the ad asked viewers to join a Facebook group called “Black Matters,” a name that undoubtedly stems from and capitalizes on the original Black Lives Matter movement and organization. The ad copy is short and to the point — “Join us because we care. Black Matters!”
While the ad copy alone may not have been effective, it’s accompanied by images of three well-known cases of black men (one being a child, Tamir Rice) who were killed by law enforcement officials. This tactic is incredibly important and effective for several reasons.
First, the ad is targeting individuals who are more likely to engage with content related to civil rights and racial justice. Using imagery of three people whose deaths became rallying points for a modern-day civil rights movement (i.e., Black Lives Matter) therefore significantly increased user engagement.
In addition to targeting individuals interested in civil rights and racial justice, this ad targeted specific locations where fatal, police-involved killings and Black Lives Matter protests had taken place. It ran in Maryland, where Freddie Gray was killed in Baltimore in April 2015, and also in Ferguson, Missouri, where Michael Brown was killed in August 2014.
What’s more, the images of the three faces are seemingly making eye contact with the viewer. As AdEspresso notes, using a face in a Facebook ad draws people’s attention and activates a specific part of the brain programmed only to recognize faces. Wishpond also notes that images that make eye contact with a viewer are more likely to catch someone’s attention.
Finally, the ad provides a form of social proof by showing that 223,799 people like the “Black Matters” page, which signals to the viewer that this is a legitimate, active community. I believe this also helped improve user engagement, resulting in 55,561 clicks. Note that this figure isn’t something page admins can edit or fabricate; the operatives put real resources into building this community.
Takeaway: Russian operatives created a Facebook group designed specifically to target a specific community’s hot-button issue, and it worked. The campaign earned more than 780,000 impressions, 55,000+ clicks, and 223,000+ group members.
To best describe the effectiveness of interest targeting, imagine you’re at a networking event and you start a conversation with someone you’ve never met before. One way to start a genuine relationship with someone new is to identify a shared interest, whether it’s sports, music, social issues, etc.
Similarly, using interest targeting in Facebook ads, brands can reach out to and begin to connect with a cold audience that has shown interest in a certain topic.
In this case, an ad from a Facebook page that focuses on the subject of police brutality and the Black Lives Matter movement will have more success earning page likes from Facebook users who have expressed interest in civil rights and racial justice than, let’s say, someone who expressed interest in cooking shows.
This sponsored paid ad from another Russian-backed Facebook group — “LGBT United” — ran on May 11, 2016, targeting users between the ages of 14 and 65+ who were interested in LGBT rights, same-sex marriage, Hillary Clinton, and/or Bernie Sanders. It aimed to encourage viewers to join a counterprotest of the Westboro Baptist Church (the infamous anti-LGBT “religious” group known for carrying crude signs at protests) in a city in Kansas.
While this ad is clearly meant to target the LGBT community, what’s even more interesting is the age group this ad targets. Unlike other Russian-backed Facebook ads released by Congress that generally target an age group between 18 and 65+, this ad lowers the starting target age to 14.
Three different theories could explain this decision:
One possible explanation involves a 2013 Pew Research Center survey of LGBT Americans that found the median age at which lesbian, gay, and bisexual adults first felt like they may be something other than heterosexual or straight was 12 years old. The median age for those who now identify as lesbian, gay, bisexual, or transgender is 17 years old. As a result, the age targeting of this ad could have intentionally tried to rally younger adults who are actively involved in protesting on behalf of LGBT rights.
The second theory ties into one of the interests targeted: Bernie Sanders. According to The Economist, Bernie Sanders won 70 percent of the under-30 vote in the Democratic primaries and caucuses overall.
The final theory comes from the ad’s copy, which states that the purpose of the WBC’s protest is to “stand against tolerance in education system near LG.” Because this is likely addressing the curriculum students learn in school, it makes sense that Russian operatives would have wanted younger, high school-aged children to attend.
Takeaway: Considering that this ad targets a specific geographical location with a wide-ranging age group, earning more than 4,700 impressions and 240 clicks (roughly a 5 percent clickthrough rate) is a fairly successful result.
This is a reminder of the power and importance of creating user personas in marketing. The key to any successful strategy is clearly defining the audience and then understanding their needs, goals, motivations, and characteristics. It’s all about creating tailored messaging that speaks to them, answers their questions, and motivates them to take action.
User personas are archetypes that represent certain populations within your target market. They should be as specific as possible and modeled after real customers or target prospects. This will allow you to better visualize and conceptualize their roles and the challenges they face — and how you can help to solve them.
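If it helps, here is a minimal sketch of what a persona might look like as structured data. The fields and example values are illustrative assumptions, not a standard schema; the point is that a concrete, specific persona is easier to map to targeting options and tailored messaging.

```python
# Minimal sketch of a user persona as structured data.
# Fields and values are illustrative assumptions, not a standard schema.

from dataclasses import dataclass, field

@dataclass
class Persona:
    name: str
    age_range: tuple[int, int]
    locations: list[str]
    interests: list[str]
    pain_points: list[str]
    preferred_channels: list[str] = field(default_factory=list)

# Hypothetical persona loosely modeled on the targeting seen in the Texas ad above.
texas_homeowner = Persona(
    name="Small-town Texas homeowner",
    age_range=(35, 64),
    locations=["Texas"],
    interests=["Second Amendment", "border security", "local news"],
    pain_points=["distrust of national media", "economic uncertainty"],
    preferred_channels=["Facebook"],
)

print(texas_homeowner.interests)
```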
This is a boosted organic post specifically targeting Bernie Sanders supporters from a page called “Born Liberal.” For a cost of 500 rubles, it garnered 1,938 impressions and 222 clicks, even though it ran for only one day (June 8, 2016), according to Gizmodo.
There are many illuminating aspects of this ad to consider. First, it targeted voters aged 18 to 65+ who hold liberal views. From the page name alone, we can assume that, although this ad specifically targeted people interested in Bernie Sanders, they would be overwhelmingly Independents, Democrats, and liberal voters.
This is notable because, during the Democratic primaries (which took place between February 1 and June 14), Sanders clearly won the Independent vote (66.3 percent), but only won 35.5 percent of the vote from Democratic party members, compared to Clinton’s 63.7 percent.
Sanders won 49.9 percent of voters who self-identified as “very liberal,” essentially tying Clinton’s 49.8 percent. However, he won only 43 percent of voters who identified as “somewhat liberal,” according to an analysis published in the Wall Street Journal.
Also notable: this ad first ran the day after the last open primaries were held in Montana on June 7, 2016. The very last primary, held on June 14, was a closed one in Washington, DC. The timing, combined with the signal that Hillary Clinton would become the Democratic nominee (note the “Second Democratic option” in the caption), leads me to suspect that this ad is one of the earliest indicators of Russian operatives trying to undermine Clinton’s credibility and create a lasting rift between two factions of Democratic voters.
It’s interesting that this ad uses a quote from one of Bernie Sanders’ interviews on CNN in the ad copy. This closely resembles a marketing tactic we call “Influencer Proof Ad Copy,” which incorporates recognizable influencers (usually from the same industry or market) in the ad copy in order to encourage engagement and give instant credibility to the product or company — or, in this case, a Facebook page. Adding a Bernie quote helps legitimize the points made and opinions stated in the second paragraph for the reader.
Finally, although we can’t see the rest of the ad’s copy, no clear call to action is provided in the ad copy, in the ad’s image, or as a CTA button. This may simply indicate that this ad was created solely to sow division among Democratic voters.
Takeaway: For only 500 rubles (less than $9 USD), this ad was wildly successful at boosting audience engagement. It earned 124 reactions, 5 comments, 21 shares, 1,938 ad impressions and 222 clicks!
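To put that spend in context, here is the rough cost-per-click and cost-per-thousand-impressions math, using the approximate $9 figure above (the exact 2016 ruble exchange rate would shift these numbers slightly):

```python
# Rough cost-efficiency math for the "Born Liberal" boosted post, using the
# approximate $9 USD figure cited above for the 500-ruble spend.
spend_usd = 9.00        # approximate; the actual 2016 RUB/USD rate would shift this
impressions = 1_938
clicks = 222

cpc = spend_usd / clicks               # cost per click
cpm = spend_usd / impressions * 1_000  # cost per 1,000 impressions

print(f"CPC: ${cpc:.2f}")  # -> roughly $0.04 per click
print(f"CPM: ${cpm:.2f}")  # -> roughly $4.64 per 1,000 impressions
```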
It shows the value of boosted Facebook posts for increasing impressions and engagement. According to 2016 research from SocialFlow, organic reach dropped 42 percent between January and May 2016. That same year, Facebook updated its algorithm to prioritize content from a user’s friends and family over Pages. As a result, some publishers saw a 52 percent decline in organic reach on Facebook between January and July 2016. More recently, Facebook updated its algorithm on January 11, 2018, to prioritize posts from friends and family over content from brands, businesses, and media outlets.
As the decline in organic reach affects more and more publishers, boosting organic Facebook posts has become a viable solution for many marketers. Boosted posts are less robust than “regular” Facebook ads; they’re typically used to show your organic post to more people who already like your Page, thereby increasing the overall number of likes, shares, and clicks your post gets.
Therefore, marketers should consider integrating boosted posts into their social media marketing strategy as a way of increasing engagement and distributing content to their Facebook fans.
This ad is a boosted organic (paid) Facebook post created by the fake Christian Facebook page “Army of Jesus,” which first ran on October 19, 2016, to groups of people ages 18 to 65+ — the standard age range that most of these ads targeted. It depicts an arm wrestling match between Satan and Jesus Christ. The caption dubs Hillary “a Satan” and encourages viewers to vote for Donald Trump, “an honest man that cares deeply for this country,” despite his imperfections.
More specifically, it targeted users with interests in Christianity, God, Jesus, Bible, Faith, Conservatism, Laura Ingraham, Bill O’Reilly, Ron Paul, Andrew Breitbart, Rush Limbaugh, Michael Savage, and/or Mike Huckabee. According to Christianity Today, the Army of Jesus page had more than 217,000 likes — far more than other (legitimate) politically conservative, evangelical media outlets, such as “World” and “Charisma” magazines.
Surprisingly, although this ad targeted a wider range of interests, it did not perform very well compared to other ads, which could be attributed to a few factors.
For starters, Facebook prefers images with little to no text in order to give users a higher-quality experience on the platform, and therefore, established the “20% text rule.” I also ran the meme in the ad through Facebook’s Image Text Check tool, and it indicated that the image has way too much text, as seen in the screenshot image below.
Moreover, the interests targeted — God, Bible, and Faith, for example — are very broad, which likely lowered the number of overall post engagements.
Finally, the CTA in the image is disjointed from the CTA in the ad copy. The image encourages viewers to “Press ‘Like’ To Help Jesus Win,” while the ad copy implies that you should vote for a “president with godly morals.” The slight confusion this creates probably also affected the outcome.
Takeaway: As a result, this ad was displayed to only 71 people, yet it still generated a decent amount of engagement: 14 clicks, 97 reactions, 15 comments, and 29 shares.
I believe that this also closely ties into the effectiveness of Boosted Facebook posts, as mentioned in the previous section. However, I think it’s also important to note that this ad shows the power of effective copywriting, which uses strong religious symbolism and language to elicit emotions from an audience interested in religious, conservative figures and topics.
As an example, in the ad copy, Army of Jesus draws a moral equivalence between Hillary Clinton and “Satan,” explicitly calling her “evil” and implying that her “crimes and lies” prove she’s profoundly immoral. When coupled with imagery that shows Satan arm wrestling a virtuous religious figure (Jesus), the ad is powerful enough to stoke fear, distrust, and anger in viewers, and thus encourage them to take action (i.e., “Hit ‘Like’ To Help Jesus Win”).
There are two types of images that Russian operatives used in these ads. The first is meme-style images that are easy to understand and share with others, like the Satan-versus-Jesus arm wrestling graphic in the “Army of Jesus” post above.
Other ads used images with a more traditional campaign-style look and feel; these often appeared as promoted events, like the “Heart of Texas” rally ad analyzed earlier.
Several outlets covering the Russian ads story focus primarily on Facebook ads, but don’t pay much attention to some of the ads distributed on Twitter.
Members of the Senate Judiciary Subcommittee presented many of these Twitter ads in October 2017. According to The Washington Post, Twitter identified 2,752 accounts controlled by Russian operatives and more than 36,000 bots that tweeted 1.4 million times during the election.
Here’s a tweet sent from a fake account that encouraged viewers to vote via text message. In this case, the tweet isn’t a sponsored (paid) ad; it simply uses popular hashtags related to Hillary’s campaign in order to spread its message organically — specifically and deliberately misleading voters who support her.
Takeaway: When used effectively, social media hashtags are a powerful way to reach your target audience organically, but hashtags also attract bots and spam accounts. To spot a potentially fake Twitter account, be wary of accounts with a high follower/following count that receive little to no engagement on tweets, accounts with a low follower/following count but an unusually high number of likes, shares, and retweets, and accounts that cram more than four hashtags into every single tweet. A rough sketch of these checks follows below.
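To make those warning signs concrete, here is a rough heuristic sketch. The thresholds are arbitrary illustrations of the red flags described above, not a reliable bot detector.

```python
# Rough heuristic sketch of the fake-account warning signs described above.
# Thresholds are arbitrary illustrations, not a reliable bot detector.

from dataclasses import dataclass

@dataclass
class Account:
    followers: int
    following: int
    avg_likes_per_tweet: float
    avg_hashtags_per_tweet: float

def suspicion_flags(acct: Account) -> list[str]:
    flags = []
    # Large audience but almost no engagement on tweets
    if acct.followers > 10_000 and acct.avg_likes_per_tweet < 1:
        flags.append("high follower count, little engagement")
    # Tiny audience but unusually high engagement (often bought or botted)
    if acct.followers < 200 and acct.avg_likes_per_tweet > 500:
        flags.append("low follower count, unusually high engagement")
    # Hashtag stuffing in nearly every tweet
    if acct.avg_hashtags_per_tweet > 4:
        flags.append("heavy hashtag use")
    return flags

print(suspicion_flags(Account(followers=50_000, following=49_000,
                              avg_likes_per_tweet=0.2, avg_hashtags_per_tweet=6)))
```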
With narrow interest and/or geo-targeting, many of these Russian ads contained copy crafted to elicit a visceral emotion from the person viewing the ad. As a best practice, Wordstream notes, you must write to one person and one person alone: “This person, your target, is the one you need to woo and persuade. Just as though you were an in-person salesperson, you need to focus all your attention on this person and their needs.”
Many ads, especially those that exploited racial and religious tensions, utilized both coded and explicitly divisive language. For instance, immigrants are referred to as “illegals” and “rapists, drug dealers, human traffickers.” They also use simple language that is easy to understand, such as “Make America Great Again” or “America First.”
There are a few cases where obvious spelling and grammatical errors in the ad copy pretty much give its origins away. In the “Heart of Texas” ad, for example, the creators omitted indefinite articles (i.e., the word “a”), which do not exist in the Russian language.
Another factor contributing to the success of these Russian ads was Donald Trump himself. In several documented cases, Trump, one of his relatives, a member of his transition team, and/or a member of his administration retweeted, quoted, or endorsed content created and shared by fake Russian accounts, as seen below:
Other instances include:
As political advertising grows in prominence, and as productive conversation around controversial issues continues to erode, bad actors will keep trying to push specific viewpoints through misleading advertising and social media manipulation.
As this analysis shows, audiences respond to inflammatory rhetoric around key issues, and when such rhetoric is used to manipulate them, it can often be detected as well. That said, marketers have a responsibility not just to their employers, but also to their audience, to provide fair and truthful advertising.
Advertising platforms like Facebook and Google, on the other hand, need to learn that making money, money, and more money does not justify enabling shady practices and manipulative targeting. They need institutional checks and balances in place — especially during key events like elections or popular unrest — to prevent malicious interest groups from using their paid advertising tools to fan the flames.