Leveraging CDA 230 to Counter Online Extremism

Legal Perspectives on Tech Series

September 1, 2019


Amendments to section 230 that would require platforms to operate as “neutral public forums” would greatly undermine their efforts to keep hate speech and extremist content offline. Amendments to section 230 that would make platforms liable for illegal third-party content that they fail to detect and remove would fall hardest on platforms that are least able to bear the risk and cost of increased liability and uncertainty. For those who are concerned about economic concentration at the edge of the internet, it’s worth noting that Facebook, Twitter, and YouTube are relatively well-positioned to bear the increased costs of decreased immunity, whereas start-ups and smaller sites are not.

Any amendment to section 230 should focus narrowly on protections for users of megaplatforms whose lawful speech is affected by algorithmic enforcement of community guidelines. Amendments that condition immunity for the world’s largest platforms on transparency, explanation, and redress would protect users’ freedom of expression and help address concerns that platforms are enforcing their community guidelines unfairly.