DOI: 10.1145/3500868.3559445
Extended Abstract · Public Access

“I am not a YouTuber who can make whatever video I want. I have to keep appeasing algorithms”: Bureaucracy of Creator Moderation on YouTube

Published: 08 November 2022

ABSTRACT

Recent HCI studies have recognized an analogy between bureaucracy and algorithmic systems. With the platformization of content creators, video-sharing platforms such as YouTube and TikTok practice creator moderation: an assemblage of algorithms that manage not only creators’ content but also their income, visibility, identities, and more. However, how bureaucracy manifests in creator moderation is not yet fully understood. In this poster, we present an interview study with 28 YouTubers (i.e., video content creators) that analyzes the bureaucracy of creator moderation through their moderation experiences. We found that participants wrestled with bureaucracy in three forms: multiple obstructions to re-examining moderation decisions, coercion to appease the different algorithms that constitute creator moderation, and the platform’s indifference to participants’ labor. We discuss and contribute a conceptual understanding of how algorithmic and organizational bureaucracy intertwine in creator moderation, laying the groundwork for our future study.


Published in

CSCW'22 Companion: Companion Publication of the 2022 Conference on Computer Supported Cooperative Work and Social Computing
November 2022, 318 pages
ISBN: 9781450391900
DOI: 10.1145/3500868

Copyright © 2022 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

• Published: 8 November 2022

Qualifiers

• extended-abstract
• Research
• Refereed limited

Acceptance Rates

Overall Acceptance Rate: 2,235 of 8,521 submissions, 26%