Social Media Sockpuppets

Facebook Ads Purchased by a Russian Troll Farm

Algorithmic Intensification

Like any social media site, Facebook encourages engagement and participation. It does so through algorithms that prioritize content a user is likely to find interesting, a prediction based on that user’s prior interactions with the platform as well as the behavior of similar users in the network. Imagine an American Facebook user clicking on a sponsored ad for a group called “Being Patriotic.” The posts seem innocuous enough, so the user likes the page and then begins seeing more content from that group in their newsfeed. Over time they may start to see posts from “Being Patriotic” that stoke their pride in their country, which they like and perhaps share with friends. Now a pattern emerges. Facebook’s algorithm may start recommending other related groups to that user based on their pattern of clicking on content professing patriotism. Some of these groups may be more extreme, denouncing immigrants or the Black Lives Matter movement. The user may click on some of these and get drawn into those online communities, which may lead them to even more extreme content, such as the conspiracy theories of Infowars. I use this brief example to illustrate how social media algorithms often lead users to more extreme content.
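To make the mechanism concrete, here is a minimal, hypothetical sketch of the “users who liked what you liked also liked this” logic described above. The page names, users, and scoring are invented for illustration; this is not Facebook’s actual recommendation system, only a toy version of the general technique.

```python
from collections import Counter

# Toy data: which pages each user has liked. Apart from "Being Patriotic"
# (discussed above), the names are invented for this example.
likes = {
    "user_a": {"Being Patriotic", "Proud Eagles"},
    "user_b": {"Being Patriotic", "Secure Our Borders Now"},
    "user_c": {"Being Patriotic", "Secure Our Borders Now", "Deep State Watch"},
}

def recommend_pages(user, likes, top_n=3):
    """Rank pages the user hasn't liked yet by how much the users who
    like them overlap with this user's existing likes."""
    my_likes = likes[user]
    scores = Counter()
    for other, their_likes in likes.items():
        if other == user:
            continue
        overlap = len(my_likes & their_likes)   # shared interests with the other user
        for page in their_likes - my_likes:     # pages this user hasn't seen yet
            scores[page] += overlap             # more overlap -> stronger recommendation
    return [page for page, _ in scores.most_common(top_n)]

print(recommend_pages("user_a", likes))
# ['Secure Our Borders Now', 'Deep State Watch'] -- one click on a patriotic
# page pulls the recommendation neighborhood toward more charged pages.
```

Even in this simplified form, the feedback loop is visible: every new like reshapes the overlap scores, so the next round of recommendations is drawn from communities whose tastes already skew further in the same direction.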

Although this dataset analyzes Facebook ads and not YouTube videos, I think Zeynep Tufekci’s 2018 article “YouTube, the Great Radicalizer” is instructive for our purposes (Tufekci). In it, she describes watching videos of Trump rallies for research purposes and soon afterward receiving recommendations from YouTube for white supremacist rants and Holocaust denials. She found this to be true of virtually any topic she tested, political or not: videos on Bernie Sanders led to videos on left-wing conspiracies; videos on vegetarianism led to videos on veganism; videos on jogging led to videos on ultramarathons. Its algorithms may be proprietary, but YouTube appears to have concluded that people are drawn to content more extreme than what they initially viewed, or to incendiary content in general. This is not some malicious strategy on YouTube’s part; it is an outgrowth of Google, which owns YouTube, monetizing people’s attention. Tufekci calls this the “computational exploitation of a natural human desire to ‘look behind the curtain,’ to dig deeper into something that engages us.” She concludes her article with a metaphor of YouTube as a restaurant that serves us increasingly unhealthy junk food, reshaping our tastes so that we crave more of it, and then tells concerned citizens that it is merely serving people what they want.
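The drift Tufekci describes can be illustrated with another toy sketch, assuming (hypothetically) that expected watch time rises with a video’s intensity and that the recommender greedily serves whichever nearby item maximizes watch time. The labels and numbers below are invented; this is not YouTube’s recommender, only a caricature of the incentive.

```python
# Content items placed on an invented "intensity" scale, each with an
# assumed expected watch time in minutes (illustrative numbers only).
content = {
    0: ("Trump rally clip", 4.0),
    1: ("partisan commentary", 5.5),
    2: ("conspiracy 'explainer'", 7.0),
    3: ("white-supremacist rant", 8.5),
}

def next_level(current):
    """Greedily pick the adjacent intensity level with the highest
    expected watch time (the engagement-maximizing choice)."""
    candidates = [lvl for lvl in (current - 1, current, current + 1) if lvl in content]
    return max(candidates, key=lambda lvl: content[lvl][1])

level = 0
for step in range(4):
    print(f"step {step}: {content[level][0]}")
    level = next_level(level)
# Because engagement is assumed to increase with intensity, the greedy
# recommender walks step by step toward the most extreme item.
```

The point of the caricature is only that no malicious intent is required: a system that optimizes watch time, under that one assumption, escalates on its own.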

This metaphor can be transposed to describe any social media algorithm that seeks to monetize users’ attention and clicks by giving them more of what they have shown a propensity to like in the past. The troubling implication is that “if elections-related advertising is political propaganda, then ordinary advertising and promotions can be thought of as the propaganda of everyday life” (Jack). Therein lies the problem: Facebook makes its money off advertising and is not going to fundamentally alter its entire business model. Instead, it performs cognitive contortions to suggest that “targeted advertising works well enough that companies and candidates should pay Facebook to help them do it” while simultaneously claiming that it “doesn’t work well enough to require legal restrictions.”

And it is incorrect to assume that these platforms exist in isolation; Facebook, Instagram, Twitter, YouTube, and the rest occupy the same ecosystem and constantly direct traffic to one another. Even if the content of an IRA ad seems innocuous and does not radicalize you on its own, following it nudges you toward more extreme content, which may bring you into contact with other users who share conspiracy theory videos from YouTube, pulling you even deeper into the rabbit hole. The echo chamber effect is real on social media, especially with follows and subscriptions, “which let users curate their own homepages that consistently reaffirm a particular political worldview” (Kaiser and Rauchfleisch).