Taking Away Content Moderation Is A Dangerous Proposition
There has been much confusion between freedom of speech and the rights of private companies to create a safe digital community.
April 28, 2022 at 3:21 pm
After I founded the Georgia Youth Poll Worker Project to recruit poll workers and provide young people with accurate voter information online, I became the target of attacks by a far-right website. Though it was a terribly unpleasant experience, it did not change how much I value the internet as the amazing tool that it is, particularly since many of its major platforms provide services that I and thousands of others use to advance our work.
I do believe that there should be regulations on major platforms like Google, Facebook and Amazon Web Services, but we need to be careful and surgical when crafting them, because reforms that are too vague could have unintended and dangerous consequences. This is why I do not support the American Innovation and Choice Online Act.
The American Innovation and Choice Online Act (AICOA) is an antitrust proposal currently moving through Congress. The act seeks to regulate practices of major tech corporations that many see as anti-competitive. Of course, major tech companies should be regulated more than they are now, but one provision that gives me pause would make it “unlawful for a person operating a covered platform, in or affecting commerce, to engage in any conduct in connection with the operation of the covered platform that discriminates among similarly situated business users.”
If passed, this rule could hurt online platforms’ ability to moderate their content. Content moderation is the practice by which an online platform screens and monitors user-generated content against platform-specific rules and guidelines to determine whether that content should be published on the platform.
We have seen examples of the crucial need for this in recent national political campaigns. In June 2021, the editor of the technology industry blog Techdirt wrote about an attack on content moderation in the form of the Parler and Amazon debacle. In response to Parler serving as a platform for the planning and execution of the January 6th insurrection, as well as concerns about data and security, Amazon Web Services decided to stop hosting Parler (a move followed by the Apple App Store and Google Play Store). In response, Parler sued these platforms, citing unfair treatment, a case that was dismissed on the grounds that AWS has the right to moderate the content on its platform. If AICOA passes, some pages and apps could become dangerous for people of color, organizers and activists using these platforms to further their causes.
There has been much confusion between freedom of speech and the rights of private companies to create a safe digital community. America is one of the freest places for ideas in the world, and the government has done much to ensure people’s right to use public resources and platforms to exchange ideas, even if those ideas are hate-filled. That said, these protections do not extend to private hosting and social media platforms. Platforms have the ability to set terms of service in the same way a business has the ability to set dress codes and enforce mask mandates. The erosion of these protections would hurt not only internet users but also the ability of businesses to create safe environments.
Evan Malbrough is a voting access advocate and founder of the Georgia Youth Poll Worker Project. He serves as a fellow with the ACLU of Georgia and recently joined the board of the Andrew Goodman Foundation.