
Platforms of the Alt-Right

One of the alt-right's original platforms is Reddit. Reddit, the “front page of the Internet,” is anonymous by design: users post under pseudonyms rather than real names. The site is organized into communities called “subreddits,” in which users read, post, and comment on one another’s threads. While the niche topics users discuss on Reddit are virtually limitless, some of the most popular subreddits are political. r/politics, created in 2007, has 5.9 million members. r/Trump, created in 2009, has nearly 40k “Patriots,” as its members are called. r/BernieBros is a private community, so users can join only by invitation from its moderators. Because of this anonymity, Reddit became a breeding ground for aggressive debate and tacitly permitted hate groups to thrive.

Reddit has since taken action to curb hate speech in its communities. For example, when users click on a post in r/politics describing Donald Trump as the “worst president ever,” a message from the moderators appears: “As a reminder, this subreddit is for civil discussion. In general, be courteous to others. Debate/discuss/argue the merits of ideas, don’t attack people. Personal insults, shill or troll accusations, hate speech, any advocating or wishing death/physical harm, and other rule violations can result in a permanent ban.” Reddit’s shift from a largely unmoderated free-for-all to a closely monitored platform reflects its increasingly strict content policies against hate speech. Because of those stricter guidelines, though, many former Reddit users have moved to other platforms, such as 4chan, 8chan and more than a dozen new “alt-tech” sites, where they can troll the internet largely without rules.

The alt-right has also spread to more mainstream platforms, such as YouTube, home of the so-called Alternative Influence Network. A 2018 report studied 65 scholars, media pundits and internet celebrities who “use YouTube to promote a range of political positions, from mainstream versions of libertarianism and conservatism, all the way to overt white nationalism.” The existence of this network undercuts the idea that the alt-right exists only in the dark corners of the Internet, since YouTube reaches one of the widest audiences online. While most of the more “mainstream” members do not explicitly subscribe to alt-right or white nationalist ideals, the study’s most surprising finding is the overt racism and sexism that goes unchallenged on many of the members’ channels. Rebecca Lewis, the author of the report, points to the conversations these channels hold with their guests: “they have these conversations where really openly racist ideas are getting thrown around as if they are perfectly normal.”

Open white nationalists generally know how to stay within YouTube’s community guidelines, however, and that is how many of them survive on the platform. They rarely use overt racial slurs; instead, they “couch the language in a way that obscures its violent overtones,” Lewis says. Not only do they survive, they thrive. Using marketing tactics, they attract views by invoking terms such as “social justice,” “liberal” and “intersectionality” in their videos. Kevin Munger, another researcher studying this phenomenon of “YouTube radicalization,” warns: “for these far-right groups, the audience is treating it much more as interactive space. And this could lead to the creation of a community.”

Until YouTube does more to monitor far-right groups on its platform, those groups will continue to gain influence. Social media platforms like YouTube should adapt to ensure extremist hate groups are not growing on their services. While the line between opinion and hate can be fine, YouTube should take action against any language that promotes hatred or violence, even when that hatred or violence is disguised. As much as alt-right users like to claim otherwise, according to the American Library Association, “the First Amendment does not protect behavior that crosses the line into targeted harassment or threats, or that creates a pervasively hostile environment.” Thus, YouTube is allowed to, and should, remove users and groups that create “pervasively hostile environments” on its platform.
