Throughout history, activism has been an integral part of the global socio-political narrative: civil discourse projects the voices of marginalized groups and ultimately cultivates progressive change. It follows that activism has evolved alongside the technological developments of contemporary society. With the widespread prevalence and success of social media, modern activists, pundits and politicians have found spaces online where they can broadcast their messages and viewpoints with the click of a button or the swipe of a screen. According to Stacey Steinberg, the senior legal skills professor at the University of Florida’s Levin College of Law, online activism “has both amplified the voices of marginalized individuals and shifted national conversations on important legal and social issues.”

While social media has been essential to the amplification of activism — both online and offline — it has also been a breeding ground for vitriolic hate and white supremacist content. Demagogues and others have used social media sites such as Facebook, YouTube and Twitter to garner followings that share their racist, anti-Semitic, homophobic, transphobic, anti-Muslim and white supremacist viewpoints. Alongside these demagogues’ increasing use of social media, the world has seen a rise in violent white supremacist attacks. For example, the gunmen who perpetrated the mass shootings in Charleston, South Carolina; Pittsburgh, Pennsylvania; and New Zealand were all active on hate sites before they transferred their hate from online engagement to offline violence. These cases indicate that hate spewed on social media can have real-life, deadly consequences.

With this correlation in mind, the Lawyers’ Committee, through its Stop Hate Project, has been advocating for Facebook and other platforms to revise and strengthen their policies regarding online hate. In March 2019, Facebook vowed the platform would begin banning content that promotes white nationalism and white separatism. YouTube followed shortly after with its own renewed policy that aims to remove and/or demonetize videos that promote hateful and white supremacist content. To continue challenging other social media corporations to make these changes to their platforms, the Lawyers’ Committee has partnered with Change the Terms, an organization that aims to reduce hate online, and has developed a list of recommended policies for corporations to adopt and implement.

Although these changes to Facebook’s and YouTube’s policies mark a step in the right direction, skeptics question whether the provisions will have any real positive impact on the current political climate. Prior to the change in policy, YouTube had not been actively enforcing its existing rules vowing to demonetize hateful and white supremacist content. Furthermore, YouTube’s recent blog post announcing the new policy gave little detail on how it will be enforced, suggesting to onlookers that the new rules may be ambitious euphemisms that will not solve the problem. Moreover, although Facebook took some steps to remove a few prominent demagogues and racists from its platform, the company did not detail a strategy for removing less prominent white supremacist influencers.

Although Twitter has taken steps to delegitimize some of its more hateful users, Twitter’s CEO, Jack Dorsey, has expressed reservations about how the site can move forward with taking a stronger stance against hateful activities and white supremacist content. More recently, Twitter’s executives have expressed worry that if the platform moves to remove white supremacist content, conservatives could be disproportionately affected. To move forward in a manner that is equitable for all users — regardless of their political affiliations — it is imperative that these social media sites create clear policies that prohibit hateful activities and that account for the nuanced and complex relationship that exists between marginalized identities and politically fueled agendas and language.

Yet it can be argued that contemporary social media sites were not originally intended to be the place for political debate and the cultivation of activist movements. Rather, out of need, activists, pundits and politicians transformed these platforms to gain a more prominent voice in an increasingly digitized society. This theory is supported by the general design of social media sites. For example, the algorithms of these platforms are built to maximize user engagement, which allows the most extreme users to gain the most traction. According to Jay Van Bavel, Associate Professor of Psychology & Neural Science at New York University and Director of the university’s Social Perception and Evaluation Lab, “political news stories that are laden with emotional content connect to our identities and are morally arousing”; these are the posts that typically compel users to engage the most, thereby ensuring that the most outrageous content creators garner the largest followings.

Once a user engages with hateful content, the algorithm exposes them to similar content and eventually traps them in what Van Bavel calls an “echo chamber.” Through his research, Van Bavel has concluded that low-quality, highly emotional content is more likely to go viral. Evidently, social media sites that use these algorithms are intrinsically “designed to reward bad apples” — those willing to push the envelope and share the most outrageous content. This dynamic can gradually lead users from seemingly normal content to progressively more extreme, hateful or conspiratorial content. Ultimately, although social media platforms such as Facebook and YouTube have worked to punish bad actors, they have not made any effort to systematically change their platforms.
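The engagement-maximizing dynamic described above can be made concrete with a deliberately simplified sketch. The field names, weights and example posts below are illustrative assumptions, not any platform’s actual ranking objective; the point is only that when a feed ranker’s objective weighs emotional arousal far more heavily than quality, outrage-driven content rises to the top.

```python
# Toy sketch of an engagement-maximizing feed ranker.
# All names, fields and weights are hypothetical illustrations,
# not any real platform's code or objective function.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    emotional_arousal: float  # 0..1: how emotionally/morally charged the post is
    quality: float            # 0..1: substantive quality of the content

def predicted_engagement(post: Post) -> float:
    # An objective that heavily weights arousal over quality: this is the
    # assumed trade-off that lets "bad apples" dominate the ranking.
    return 0.9 * post.emotional_arousal + 0.1 * post.quality

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement is shown first.
    return sorted(posts, key=predicted_engagement, reverse=True)

posts = [
    Post("measured_reporter", emotional_arousal=0.2, quality=0.9),
    Post("outrage_account", emotional_arousal=0.95, quality=0.1),
]
feed = rank_feed(posts)
print([p.author for p in feed])  # the low-quality, high-outrage post ranks first
```

Under these assumed weights, the low-quality but outrage-laden post outranks the higher-quality one, mirroring Van Bavel’s observation that highly emotional content is more likely to go viral.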

Although social media is merely one facet of an increasingly complex ecosystem of hateful activity online, social media executives need to take proactive steps to protect the safety of their users and to enforce their own policies fairly. Meanwhile, to protect equal opportunity for everyone in the modern public square and marketplace, Congress needs to extend brick-and-mortar civil rights laws to Internet platforms.

Melissa Denizard is a communications and Education Opportunities Project intern for the Lawyers’ Committee, a nonpartisan, nonprofit organization that was formed in 1963 at the request of President John F. Kennedy to involve the private bar in providing legal services to address racial discrimination. Denizard is also a budding activist, organizer, documentarian, and public speaker who attends Babson College.