Attorney General calls for changes to social media sites in wake of Buffalo shooting

The Discord website is among the sites being examined by Attorney General James.

ALBANY – New York State Attorney General Letitia James and Governor Kathy Hochul are calling for penalties for social media platforms following the Buffalo mass shooting in May of this year. Payton Gendron, who is accused of killing 10 people in a grocery store, live-streamed the attack.

According to a statement from the Office of the Attorney General, Gendron initially broadcast the live stream of the massacre on the livestreaming platform Twitch. The platform reportedly shut down the broadcast within two minutes.

The attorney general said, “Fringe online platforms, like 4chan, radicalized the shooter; livestreaming platforms, like Twitch, were weaponized to publicize and encourage copycat violent attacks; and a lack of oversight, transparency, and accountability of these platforms allowed hateful and extremist views to proliferate online, leading to radicalization and violence.”

Attorney General James and Governor Hochul are calling for federal and state reforms to combat online extremism and violence, including state legislation that would criminalize graphic images or videos created by a perpetrator of a homicide and penalize individuals who reshare or repost those same images or videos.

“The tragic shooting in Buffalo exposed the real dangers of unmoderated online platforms that have become breeding grounds for white supremacy,” said Attorney General James.

“The report confirms that several online platforms played an undeniable role in the racist attack, first by radicalizing the shooter as he consumed voluminous amounts of racist and violent content, helping him prepare for the attack, and finally allowing him to broadcast it,” according to the attorney general.

In response to the findings in the report, Attorney General James recommends a variety of reforms that are supported by Governor Hochul and would tackle online extremism and increase accountability of online platforms. These recommendations include: 


  • Create Liability for the Creation and Distribution of Videos of Homicides: New York and other states should pass legislation imposing criminal liability for the creation, by the perpetrator, of images or videos depicting a homicide. New York should explore establishing civil liability for anyone who transmits or distributes such images or videos beyond the perpetrator. In concert with appropriate revisions to Section 230, this liability would extend to online platforms, including social media and livestreaming platforms, that do not take reasonable steps to prevent such content from appearing.
  • Add Restrictions to Livestreaming: Livestreaming was used as a tool by the Buffalo shooter, as in previous hate-fueled attacks, to instantaneously document and broadcast his violent acts, to secure a measure of fame, and to radicalize others. Livestreaming on platforms should be subject to restrictions — including verification requirements and tape delays — tailored to identify first-person violence before it can be widely disseminated.
  • Reform Section 230: Currently, Section 230 of the federal Communications Decency Act protects online platforms from liability for third-party content that they host, regardless of those platforms’ moderation practices. Congress should rethink the ready availability of Section 230 as a complete defense for online platforms’ content moderation practices. Instead, the law should be reformed to require an online platform that wishes to retain Section 230’s protections to take reasonable steps to prevent unlawful violent criminal content from appearing on the platform. This proposal would change the default: instead of simply being able to assert protection under Section 230, an online platform would have the initial burden of establishing that its policies and practices were reasonably designed to address unlawful content.
  • Increase Transparency and Strengthen Moderation: Online platforms should provide better transparency into their content moderation policies and how those policies are applied in practice, including those that are aimed at addressing hateful, extremist, and racist content. They should also invest in improving industry-wide processes and procedures for reducing the prevalence of such content, including by expanding the types of content that can be analyzed for violations of their policies, improving detection technology, and providing even more efficient means to share information.
  • Call on Industry Service Providers to Do More: Online service providers, like domain registrars and hosting companies, stand in between fringe sites and users. These companies should take a closer look at the websites that repeatedly traffic in violent, hateful content, and refuse to service sites that perpetuate the cycle of white supremacist violence.
