Disputable Content and Democracy: Freedom of Expression in the Digital World

Authors: Downey, Margaret; Pritkin, Jasmine; Wilson, Isabel; Beyer, Jessica L.; Chung, Victoria; Gales, Christian; Glenn, Emily; Howard, Levi; Kelly, Tessa; Lay, Irene; Leonard, Avery; Maglaras, Sophie; Min, Gunhee; Nguyen, Thuy; Ryan, Jillian; Van Eenenaam, Elena; Wang, Vienna

Date: 2024-07-04
Year: 2023
Handle: http://hdl.handle.net/1773/51511

Abstract: Our report examines three major categories of disputable content, provides an overview of the current state of social media platform content moderation, and assesses content moderation legislation (or the lack thereof) in democratic nations to understand the strengths and shortcomings of regulatory approaches to moderating social media content. To accomplish this, we unpack categories of disputable content, with a particular focus on disinformation and misinformation, state-sponsored content, hate speech, and extremist content, and illustrate the real-world impacts of this content. We examine five social media platforms with significant global audiences (YouTube, Instagram, Facebook, TikTok, and Twitter) to understand their existing content moderation policies. Finally, we analyze government regulation of social media content that exemplifies different democratic approaches to this issue, drawing on the European Union, Germany, the United States, Taiwan, Brazil, and Australia.