How Facebook could swing an election

The social networking giant connects friend networks, coworkers, news, politics, neighborhoods and so much more. We’ve seen the platform connect people in beautiful ways we never could have imagined a decade ago, and create terrible moments showcasing human vapidity and callousness.

As Facebook consumes more and more of our social lives, it reaches even deeper into our political consciousness.

Facebook is no stranger to political engagement. President Obama used the platform to great effect in both 2008 and 2012, and it’s become just as crucial to the political world as traditional communications. But recently, Facebook announced increased efforts to turn out the vote, as The Times-Picayune reports:

In elections past, Facebook has allowed users to use tools that would display a notification showing the user had voted. But the social media website is stepping up its efforts to encourage voters to meet deadlines as part of the company’s push to boost civic engagement.

The post on Facebook, which will display at the top of the user’s news feed, will appear a week before the deadline to register to vote, which is Feb. 3.

Hooray civic engagement, right? Well, maybe.

First, let’s assume Facebook is encouraging individuals to register to vote regardless of their political leanings, innocently trying to increase voter participation. What could that do to the electorate? Increased turnout in New Hampshire, South Carolina, and Nevada played at least some role in supporting Donald Trump, compared to his more muted performance in Iowa. There’s no hard and fast rule as to whom increased turnout benefits. But there is also no doubt it could greatly change an election. Facebook, utilizing the power of its unique sphere of influence, could tip the scales in some areas.

In 2012, Facebook even published a study in which peer pressure was used to encourage certain individuals to go vote in the 2010 midterm elections, while a control group received no such social feedback. The study claims “the Facebook social message increased turnout directly by about 60,000 voters and indirectly through social contagion by another 280,000 voters, for a total of 340,000 additional votes.” This is hardly an insignificant total, and depending on the distribution and demographics, a test like this could even change the outcome of elections.

More voter participation is not a bad thing, but it’s not necessarily a neutral thing either. If Facebook works to increase voter turnout, and that benefits one candidate more than another, does Facebook’s choice to increase voter turnout mean it will effectively help elect certain representatives? What effect could that have not only on political outcomes, but on the behavior of politicians as Facebook lobbies for certain issues?

And this is of course assuming Facebook just wants to innocently increase voter turnout. What if they wanted to nudge the process a bit more?

Before you reach for your tinfoil hat: this isn’t as crazy as you might think, but you need to understand why and how your news feed is constructed.

Facebook makes money by keeping you on its website to show you advertisements. The cat and baby pictures, memes, and food pictures you see every day are served up by an algorithm that predicts what’s interesting to you, which keeps you interested in Facebook. It’s a fascinating tool that’s constantly being tweaked to separate the wheat from the chaff and bring you quality posts based on your actions and other people’s interactions with that content. If many people engage with a post, that tells Facebook it’s good, so more people see it. If you like a lot of posts with foodie hashtags, prepare to see more of what your friends are eating.
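The mechanics of that kind of ranking are easy to sketch. Here is a toy illustration of engagement-weighted feed ranking; every post, weight, and affinity score below is invented for illustration, and Facebook’s actual algorithm is proprietary and vastly more complex:

```python
# Toy sketch of engagement-based feed ranking. The weights and the
# topic-affinity boost are made up; this only illustrates the idea that
# engagement plus your own habits determine what rises to the top.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post, user_topic_affinity: dict) -> float:
    """Weight raw engagement, then boost topics this user interacts with."""
    raw = post.likes * 1.0 + post.comments * 2.0 + post.shares * 3.0
    # Topics the user hasn't engaged with get a neutral multiplier of 1.0.
    return raw * user_topic_affinity.get(post.topic, 1.0)

def rank_feed(posts, user_topic_affinity):
    """Most 'interesting' posts first, per the scoring above."""
    return sorted(posts,
                  key=lambda p: engagement_score(p, user_topic_affinity),
                  reverse=True)

posts = [
    Post("alice", "food", likes=50, comments=5, shares=1),
    Post("bob", "politics", likes=10, comments=20, shares=4),
    Post("carol", "cats", likes=150, comments=2, shares=0),
]
# A user who likes a lot of foodie posts has their affinity for "food"
# boosted, so food content climbs the feed even with fewer raw likes.
feed = rank_feed(posts, {"food": 3.0})
```

The point of the sketch is that nothing in such a system is inherently neutral: change one multiplier and a different kind of content wins.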

However, Facebook also consciously decides what content you see and don’t see, based not just on engagement but also on its own valuation of what “good” content is. It outright bans certain content, but it can also simply hide ideas and messages it doesn’t want to spread.

In January, Facebook announced a new campaign to weed out anything it deems hate speech against refugees in Europe. Ostensibly, the focus will be on calls for violence, but automated efforts could ensnare simple criticism of European immigration policies.

More intimately, Facebook already has the ability to judge the mood of a post and tweak people’s emotions. Adding more or less of a certain political side’s perspective is well within its reach. As reported in 2014, Facebook experimented with manipulating the emotions of users, trying to find the effect positive or negative posts had on users’ posting habits. It’s clear Facebook has the capability to change the presentation of political information.
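A crude version of that capability fits in a few lines. The word lists and scoring below are purely illustrative (real sentiment analysis is far more sophisticated), but they show how quietly suppressing one emotional tone shifts the mood of an entire feed:

```python
# Minimal sketch of mood-based feed filtering, in the spirit of the 2014
# emotional-contagion experiment. The naive word-list sentiment scoring
# here is invented for illustration only.
POSITIVE = {"great", "love", "happy", "win"}
NEGATIVE = {"sad", "angry", "terrible", "lose"}

def sentiment(text: str) -> int:
    """Positive words add one, negative words subtract one."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def filter_feed(posts, suppress: str = "negative"):
    """Quietly drop posts of one emotional tone, shifting the feed's mood."""
    if suppress == "negative":
        return [p for p in posts if sentiment(p) >= 0]
    return [p for p in posts if sentiment(p) <= 0]

feed = filter_feed(["I love this", "terrible news today", "what a great win"])
```

The user never sees a notice that anything was removed; the feed simply feels sunnier, or gloomier, than reality.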

In 2013, Facebook announced it would consider a post “high quality content” if it is “from a source you would trust.” Of course, Facebook could decide which sources are more trustworthy than others, and adjust the information you and other voters see accordingly. This protects users from satire sites masquerading as news, but it could also be used for more nefarious purposes.
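To see how the same dial cuts both ways, consider a hypothetical source-trust weighting. The domain names, trust scores, and visibility threshold below are all invented; Facebook’s real criteria are not public:

```python
# Hypothetical sketch of source-trust weighting. Every domain, score, and
# threshold here is made up; the point is that the mechanism which hides
# satire can, with different numbers, hide any outlet.
SOURCE_TRUST = {
    "established-paper.example": 0.9,
    "satire-site.example": 0.2,
    "partisan-blog.example": 0.4,
}

def quality_score(base_score: float, source: str) -> float:
    """Scale a post's base quality by how much the platform trusts its source."""
    # Unknown sources get a neutral weight of 0.5.
    return base_score * SOURCE_TRUST.get(source, 0.5)

def visible(base_score: float, source: str, threshold: float = 0.3) -> bool:
    """A post appears in feeds only if its weighted score clears the bar."""
    return quality_score(base_score, source) >= threshold
```

Whoever fills in the trust table decides, in effect, which news survives the filter; the code itself never has to mention politics at all.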

This wouldn’t need to be a large, obvious campaign either. In some districts, small nudges at the margins can be enough to push an election. Facebook’s treasure trove of data could provide just the means to do it. Most users (myself included) willingly hand over information about our birthdays, homes, interests, and friends; beyond that, each click, comment, post, and like tells Facebook just a bit more about us.

What’s more, Facebook tracks your movements across other websites, even when you are logged out. This wealth of data makes it easier for advertisers to find just the right message for just the right audience, but it could also be used to find audiences to nudge politically. Savvy political campaigns have been at this for years, but Facebook could get in on the act by showing or hiding information that forms the basis of a voter’s opinion.

It’s worth remembering that Facebook is not a neutral entity. The company, just like any other group of people, has a right to its perspective and a right to promote it. Beyond political contributions, funding, and policy groups, however, the consequences of manipulating the users of its platform could be huge.

This isn’t a new problem: traditional media outlets like cable news and newspapers have long been the gatekeepers of information and access. Now the gatekeepers live in Silicon Valley. There’s no easy solution or policy fix, but we should be aware of the influence that curators of information have.

As Noam Chomsky observed: “Technology is basically neutral. It’s kind of like a hammer. The hammer doesn’t care whether you use it to build a house, or whether a torturer uses it to crush somebody’s skull.”

But if you wouldn’t mind, please like, comment, and share this post to your heart’s content!

About the author:
Mike Morrison is the Director of Communications for American Majority, a non-partisan training institute whose mission is to identify and mold the next wave of liberty-minded new leaders, grassroots activists and community leaders. Follow him on Twitter @MikeKMorrison