'Censorship-free' platforms: Evaluating content moderation policies and practices of alternative social media

Nicole Buckley, Joseph S. Schafer

Abstract


Following the development and implementation of mainstream social media platforms’ election-related speech policies, a renewed wave of criticism emerged from the U.S. ideological right. Several months before the 2020 U.S. presidential election, conservative politicians, pundits, and self-described patriots alleged that their speech was being censored by “Big Tech.” As a result, right-leaning influencers, along with many of their followers, migrated to alternative online platforms to avoid moderation. Alternative social media platforms, such as Parler, BitChute, Gab, and Gettr, describe themselves as unmoderated hubs for “free speech,” signalling an invitation for users to voice everything from unpopular opinions to misinformation to hate speech. Yet when pushed by technology infrastructure platforms like Apple’s App Store and Google’s Play Store to address missing or substandard moderation practices, these “alt-tech” platforms were forced to create or adapt ad hoc, often minimalistic, content moderation policies.

Our research explores and evaluates these policies in comparison to those of mainstream platforms, and analyses how moderation policies interact with the ideological framework asserted at an alternative platform’s nascence. Our work provides necessary insight into the motivations behind one potential source of U.S. internet platform oversight. With few immediately available regulatory options, assessing the viability of alternatives is crucial. This is particularly true as severe legislative gridlock stalls meaningful reform to the federal law perhaps most capable of improving platforms’ moderation practices. Because private regulation appears to be the most immediate means of addressing new breeding grounds for mis- and disinformation, inquiry into alternative platforms’ adoption and enforcement of moderation policies is needed. Our paper concludes with questions for future research into the efficacy of alternative platforms’ policy implementation; it is imperative to distinguish legitimate moderation from mere shells constructed to retain profit in parallel with ideological posturing.


Keywords


Content Moderation; Social Media; Platform Policy; Section 230; Misinformation

Copyright (c) 2022 Nicole Buckley, Joseph S. Schafer

License URL: https://creativecommons.org/licenses/by/4.0/

for(e)dialogue

ISSN 2398-0532
