ABSTRACT
Recent HCI research has recognized an analogy between bureaucracy and algorithmic systems. With the platformization of content creators, video sharing platforms such as YouTube and TikTok practice creator moderation: an assemblage of algorithms that manage not only creators' content but also their income, visibility, identities, and more. However, how bureaucracy manifests in creator moderation is not yet fully understood. In this poster, we present an interview study with 28 YouTubers (i.e., video content creators) that analyzes the bureaucracy of creator moderation through their moderation experiences. We found that participants wrestled with bureaucracy in the form of multiple obstructions to re-examining moderation decisions, coercion to appease the different algorithms of creator moderation, and the platform's indifference to their labor. We discuss and contribute a conceptual understanding of how algorithmic and organizational bureaucracy intertwine in creator moderation, laying a solid foundation for our future work.