Abstract
Transparency matters to people who experience moderation on online platforms, and much CSCW research has treated explanations as a primary means of enhancing moderation transparency. However, relatively little attention has been paid to unpacking what transparency entails in moderation design, especially for content creators. We interviewed 28 YouTubers to understand their moderation experiences and analyze the dimensions of moderation transparency. We identified four primary dimensions: participants desired the moderation system to present moderation decisions saliently, explain the decisions thoroughly, afford effective communication with users, and offer opportunities for repair and learning. We discuss how these four dimensions are mutually constitutive and conditioned in the context of creator moderation, where the target of governance mechanisms extends beyond content to creators' careers. We then elaborate on how a dynamic transparency perspective could value content creators' digital labor, how transparency design could support creators' learning, and what implications our findings hold for transparency design on other creator platforms.
"Defaulting to boilerplate answers, they didn't engage in a genuine conversation": Dimensions of Transparency Design in Creator Moderation