
Conceptualizing and Rethinking the Design of Cross-platform Creator Moderation

Published: 19 April 2023

Abstract

My doctoral research explores, through mixed methods, how content creators experience creator moderation across different platforms. Through qualitative methods such as semi-structured interviews, my prior work has examined the socioeconomic implications of creator moderation for creators, the fairness and bureaucracy challenges creators face, and what transparency design requires to address these challenges. My proposed future work will first quantitatively identify which algorithmic moderation designs affect creators' perceived transparency, fairness, and accountability of creator moderation across different platforms. Then, I will co-design with creators and commercial moderators from different platforms to rethink what and how creator moderation can take creators' interests into account. My dissertation aims to contribute to the HCI and CSCW fields by conceptualizing the notion of creator moderation and detailing design considerations for empowering creators.


1 RESEARCH SITUATION

I am a third-year, full-time Ph.D. candidate in the Informatics program at The Pennsylvania State University's College of Information Sciences and Technology, advised by Dr. Yubo Kou. I expect to complete my doctoral degree around May or summer 2024. I have not attended a doctoral consortium at any previous SIGCHI conference. The themes of my research are HCI, CSCW, and, specifically, creator moderation.


2 CONTEXT AND MOTIVATION

Recent years have witnessed the growth of the "creator economy," in which independent content creators rely on platforms such as YouTube, Twitch, Instagram, and TikTok to gain a fanbase and income and to develop content creation into a profession. As 50 million people identify themselves as creators [12] and two million have made content creation their full-time job [17], creators usually want to establish branding [13,32,38] that is competitive and distinguishable from other creators [20] in order to sustain their careers [11]. Against this backdrop, it is common for creators to work across platforms to diversify their income and content creation [17,36].

But being a creator does not mean absolute freedom of content creation, and monetizing content does not guarantee secure self-employment. That is because creators must go through creator moderation [29,30], where multiple governance mechanisms exercise control beyond creators' content or speech, reaching into their careers through income [7], visibility [6], and audience engagement. Oftentimes, creator moderation negatively impacts creators. For example, creators complained that Facebook cut their income compared to the amount they had typically earned from videos [8], and creators from racial and sexual minority groups complained that their advertising income was unfairly reduced on YouTube compared with other creators [1,2,34].

However, while HCI and CSCW researchers have gained substantial understanding of users' experiences with content moderation (e.g., [9,19,22,23]), we still lack systematic knowledge of creators' experiences with creator moderation. For example, researchers have uncovered various challenges users encounter under content moderation: (1) the opacity of moderation decision-making [15,18,35], (2) the unfairness of moderation decisions [22,40,41], (3) procedural obstacles to appealing moderation decisions [5,25], and more. But do creators encounter similar challenges in creator moderation systems? If so, how do they react to or handle these challenges?

In particular, as more creators now work on more than one platform, relatively little work has examined whether and how current moderation designs work for creators across platforms. Aiming for end-user advocacy and empowerment, HCI and CSCW researchers have started to design fairer, more transparent, and more contestable moderation mechanisms (e.g., [14,24,41,43]). But it remains unknown whether and how those designs fit creators' interests or workflows across platforms. Communication researchers have touched on cross-platform creators in terms of how their branding labor is shaped by particular platform affordances [13,27,32,38]. Still, from a design perspective, it is not fully understood whether and how platform designs, including creator moderation, take creators' interests and voices into account across platforms.

Informed by prior work from different disciplines, my doctoral research aims to answer three primary research questions:

RQ1: What does creator moderation entail?

RQ2: How do creators experience creator moderation?

RQ3: What creator moderation design can better work with creators across platforms?


3 WORK TO DATE

My work is underway but has yet to fully answer all RQs. Specifically, with four completed publications [28–31], I have conducted a series of case studies on YouTube to gain an initial understanding of creator moderation, because YouTube is currently the largest video-sharing platform and was among the first to allow users to monetize their content [21]. I then plan to conduct two more studies to better conceptualize creator moderation and its design across platforms such as YouTube, Twitch, Instagram, and TikTok, as media outlets have reported that, as of 2022, these four platforms host more creators than others in the US [17,26,37]. In the following two subsections, I detail how my published work and proposed studies address the research questions in sequence.

3.1 Study 1: Creator Moderation's Socioeconomic Implications

This study [28] explores what creator moderation entails and how creators interact with it (partially answering RQ1). Through a thematic analysis of online discussion data collected from r/youtube, one of the largest YouTube-related online communities, we identified that, beyond content moderation, creator moderation exerts socioeconomic impacts on creators. That is, creators on YouTube encounter algorithmic opacity in moderation decisions, as the platform largely implements algorithms (e.g., machine learning) in creator moderation, and such opacity rendered creators' video creation work precarious. Creators thus strove to cope with the precarity of creator careers by gaining and applying knowledge of creator moderation and by diversifying income through multiple crowdfunding platforms. This study lays solid ground for my future work of diving deeper into how creators go through and make sense of creator moderation design.
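The paper does not specify its data collection tooling; as a minimal sketch under assumed tooling, subreddit discussion data like this could be gathered and exported for open coding along the following lines. The PRAW library, the search query, and the credential placeholders are all illustrative assumptions, not the study's actual pipeline.

```python
# Hypothetical sketch (assumed tooling): collect moderation-related posts
# from r/youtube with PRAW and export them for thematic analysis.
import csv

import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # placeholder credentials
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="creator-moderation-study/0.1",
)

# An assumed query for creator moderation topics; the study's actual
# sampling strategy is not described at this level of detail.
query = "demonetized OR demonetization OR strike OR age-restricted"

rows = []
for submission in reddit.subreddit("youtube").search(query, limit=500):
    rows.append(
        {
            "id": submission.id,
            "created_utc": submission.created_utc,
            "title": submission.title,
            "body": submission.selftext,
            "score": submission.score,
        }
    )

# Export to CSV for open coding in a qualitative analysis tool.
with open("r_youtube_posts.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(
        f, fieldnames=["id", "created_utc", "title", "body", "score"]
    )
    writer.writeheader()
    writer.writerows(rows)
```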

3.2 Study 2: Fairness and Bureaucracy Challenges in Creator Moderation

Through 28 semi-structured interviews with creators who experienced moderation on YouTube, we examined how creators experience creator moderation, which oftentimes challenges their career development (partially answering RQ2). We found that creators formed fairness perceptions based on the different contexts of the moderation decisions they received [29] and encountered bureaucracy in the moderation system [30]. First, creators developed perceptions of unfairness when they encountered (1) unequal moderation treatment revealed through cross-comparisons; (2) decisions, processes, or system actions that were inconsistent with each other or with content policies; and (3) a lack of voice in multiple algorithmic visibility decision-making processes. Second, I found that creators on YouTube would first experience algorithmic bureaucracy, where YouTube's moderation system fails to adapt its decision-making to creators' novel and localized content, and then organizational bureaucracy, where creators are not in a privileged position to appeal moderation decisions. These two sets of findings deepen our understanding of creators' experiences with creator moderation and depict the typical phases of creator moderation: content rule articulation, rule enforcement or moderation decision-making, and moderation re-examination (e.g., appeal).

3.3 Study 3: Transparency Design of Creator Moderation

As prior researchers have largely viewed enhanced transparency as an approach to combating bureaucracy [33] and unfairness [3,42], we also recognize the importance of transparency in creator moderation. Drawing on the prior understanding of creator moderation phases, this study addresses what transparency design requires in different phases, given creators' moderation experiences on YouTube (partially answering RQ3). We found that creators hoped the moderation system would present moderation decisions saliently, explain moderation rationales thoroughly, afford effective communication with human agents hired by the platform, and offer learning opportunities for working better with creator moderation. This study shows that creator moderation needs to maintain dynamic moderation transparency that balances transparency efforts against negative moderation impacts in order to value creators' labor.
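As an illustration only, the four design requirements above can be read as fields a moderation notice would need to carry. Neither the study nor YouTube defines such a schema; every name below is a hypothetical assumption.

```python
# Illustrative sketch only: a hypothetical data structure encoding the four
# transparency requirements above. No platform or paper specifies this schema.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ModerationNotice:
    video_id: str
    decision: str                 # salient decision, e.g., "demonetized"
    affected_outcomes: List[str]  # e.g., ["ad revenue", "recommendation reach"]
    rationale: str                # which policy clause applied, and how
    policy_url: str               # link to the exact rule, not a generic FAQ
    human_contact: Optional[str]  # channel to a platform-hired human agent
    learning_resources: List[str] = field(default_factory=list)  # avoiding repeats
```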


4 FUTURE DISSERTATION WORK

In my future work, I will translate the conceptual understanding of creator moderation gained from my prior work into an understanding of how moderation design can empower creators across platforms (i.e., creators who self-identify as working on at least two platforms). Studies 4 and 5 aim to fully answer RQ1–RQ3.

4.1 Study 4: Improving Cross-platform Creator Moderation Design by Identifying Creators’ Interests

This study will identify which types of ex-ante, proactive algorithmic decisions related to creators' best interests affect creators' perceived transparency, fairness, and accountability of creator moderation across platforms. Prior research, including mine, has found that content creators are primarily concerned about their interests and careers in terms of content visibility or reach [4,10,29], income [7,28], and audience engagement [16,32] gained on platforms. Thus, I will design a controlled experiment with a series of scenario-based questions about ex-ante proactive algorithmic moderation involving these personal interests, asking creators about their perceptions of moderation and the rationales behind those perceptions. I will conduct both qualitative and statistical analyses of survey responses; results will include (1) who the creators preferring ex-ante proactive algorithmic moderation are, (2) how effectively such moderation can improve creators' perceived transparency, fairness, and accountability of creator moderation, and (3) what algorithmic moderation designs can work better for creators who hold different perspectives.
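Because this experiment is still being designed, the following analysis sketch is purely hypothetical: the column names, the Likert-scale measures, and the choice of a mixed-effects model (to account for each creator rating multiple scenarios) are all assumptions.

```python
# Hypothetical analysis sketch for the planned experiment; all column names
# and the model specification are assumptions, not the study's design.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per (participant, scenario) rating.
#   condition - moderation design shown, e.g., "ex_ante" vs. "ex_post"
#   platform  - platform framing of the scenario
#   fairness  - 7-point Likert rating of perceived fairness
df = pd.read_csv("survey_responses.csv")

# Mixed-effects model with a random intercept per participant, since each
# creator rates several scenarios (repeated measures). Parallel models would
# be fit for perceived transparency and accountability.
model = smf.mixedlm(
    "fairness ~ C(condition) + C(platform)",
    data=df,
    groups=df["participant_id"],
).fit()
print(model.summary())
```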

4.2 Study 5: Co-designing Cross-platform Creator Moderation with Creators and Commercial Moderators

In this study, I aim to conduct participatory design (PD) [39] workshops to systematically and holistically elicit a better understanding of what creator moderation means to creators and what creator moderation structures can better incorporate their voices and interests. The PD workshop procedure includes identifying creators and commercial moderators who meet certain inclusion criteria, preparing materials, exploring participants' work, a co-design activity, and an evaluation activity with focus group discussion. I expect that co-designing with creators and commercial moderators will empower both actors under platform governance to reconcile conflicts with platforms, pointing toward better creator moderation structures.


5 CURRENT AND EXPECTED CONTRIBUTIONS

My doctoral research aims to offer two primary contributions to the HCI and CSCW fields. First, it contributes the conceptualization of creator moderation and an account of how creator moderation is contextualized across different platforms. Second, as HCI research values end-user advocacy, it contributes design and policy implications that detail how to advocate for and empower content creators in cross-platform creator moderation.


ACKNOWLEDGMENTS

This work is partially supported by NSF grant no. 2006854. I am grateful to the 49 content creators who have participated in our work and shared their stories with us. Last, I sincerely thank my advisor, Dr. Yubo Kou, for his generous guidance and tremendous support.


Supplemental Material

Video poster presentation: 3544549.3577049-video.mp4 (67 MB)

References

  1. Julia Alexander. 2019. YouTube moderation bots punish videos tagged as 'gay' or 'lesbian,' study finds. The Verge. Retrieved from https://www.theverge.com/2019/9/30/20887614/youtube-moderation-lgbtq-demonetization-terms-words-nerd-city-investigation
  2. Julia Alexander. 2019. LGBTQ YouTubers are suing YouTube over alleged discrimination. The Verge. Retrieved from https://www.theverge.com/2019/8/14/20805283/lgbtq-youtuber-lawsuit-discrimination-alleged-video-recommendations-demonetization
  3. Mike Ananny and Kate Crawford. 2018. Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media Soc. 20, 3 (March 2018), 973–989. DOI: https://doi.org/10.1177/1461444816676645
  4. Carolina Are. 2021. The Shadowban Cycle: an autoethnography of pole dancing, nudity and censorship on Instagram. Fem. Media Stud. (2021). DOI: https://doi.org/10.1080/14680777.2021.1928259
  5. Anna Veronica Banchik. 2020. Disappearing acts: Content moderation and emergent practices to preserve at-risk human rights–related content. New Media Soc. (March 2020). DOI: https://doi.org/10.1177/1461444820912724
  6. Sophie Bishop. 2019. Managing visibility on YouTube through algorithmic gossip. New Media Soc. 21, 11–12 (November 2019), 2589–2606. DOI: https://doi.org/10.1177/1461444819854731
  7. Robyn Caplan and Tarleton Gillespie. 2020. Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy. Soc. Media + Soc. 6, 2 (2020). DOI: https://doi.org/10.1177/2056305120936636
  8. Ashley Carman. 2021. Facebook shorted video creators thousands of dollars in ad revenue. The Verge. Retrieved from https://www.theverge.com/2021/3/31/22358723/facebook-creators-video-revenue-estimate-tool-pages
  9. Eshwar Chandrasekharan, Umashanthi Pavalanathan, Anirudh Srinivasan, Adam Glynn, Jacob Eisenstein, and Eric Gilbert. 2017. You can't stay here: The efficacy of Reddit's 2015 ban examined through hate speech. Proc. ACM Human-Computer Interact. 1, CSCW (November 2017), 1–22. DOI: https://doi.org/10.1145/3134666
  10. Kelley Cotter. 2021. "Shadowbanning is not a thing": black box gaslighting and the power to independently know and credibly critique algorithms. Information, Commun. Soc. (2021). DOI: https://doi.org/10.1080/1369118X.2021.1994624
  11. Treyton Devore. 2021. How to Build a Sustainable Creator Business. Creatorbread. Retrieved from https://www.creatorbread.com/blog/how-to-build-a-sustainable-creator-business
  12. Gary Drenik. 2022. The Creator Economy Is Booming. Here's How Businesses Can Tap Into Its Potential. Forbes. Retrieved from https://www.forbes.com/sites/garydrenik/2022/08/23/the-creator-economy-is-booming-heres-how-businesses-can-tap-into-its-potential/?sh=305e02713d27
  13. Brooke Erin Duffy, Urszula Pruchniewska, and Leah Scolere. 2017. Platform-specific self-branding: Imagined affordances of the social media ecology. ACM Int. Conf. Proceeding Ser. Part F1296 (July 2017). DOI: https://doi.org/10.1145/3097286.3097291
  14. Jenny Fan and Amy X. Zhang. 2020. Digital Juries: A Civics-Oriented Approach to Platform Governance. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2020), Association for Computing Machinery, New York, NY, USA, 1–14. DOI: https://doi.org/10.1145/3313831.3376293
  15. Jessica L. Feuston, Alex S. Taylor, and Anne Marie Piper. 2020. Conformity of Eating Disorders through Content Moderation. Proc. ACM Human-Computer Interact. 4, CSCW1 (May 2020). DOI: https://doi.org/10.1145/3392845
  16. Anthony Fung, Milan Ismangil, Wei He, and Shule Cao. 2022. If I'm not Streaming, I'm not Earning: Audience Relations and Platform Time on Douyin. Online Media Glob. Commun. 1, 2 (June 2022), 369–386. DOI: https://doi.org/10.1515/OMGC-2022-0001
  17. Werner Geyser. 2022. The State of the Creator Economy | Definition, Growth & Market Size. Influencer Marketing Hub.
  18. Tarleton Gillespie. 2018. Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media. Yale University Press. Retrieved from https://www.degruyter.com/document/doi/10.12987/9780300235029/html
  19. Oliver L. Haimson, Daniel Delmonaco, Peipei Nie, and Andrea Wegner. 2021. Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas. Proc. ACM Human-Computer Interact. 5, CSCW2 (October 2021). DOI: https://doi.org/10.1145/3479610
  20. Indeed Editorial Team. 2022. 15 Ways To Make Money as a Content Creator. Indeed. Retrieved from https://www.indeed.com/career-advice/pay-salary/how-to-make-money-as-content-creator
  21. Nicholas Jackson. 2011. Infographic: The History of Video Advertising on YouTube. The Atlantic. Retrieved from https://www.theatlantic.com/technology/archive/2011/08/infographic-the-history-of-video-advertising-on-youtube/242836/
  22. Shagun Jhaver, Darren Scott Appling, Eric Gilbert, and Amy Bruckman. 2019. "Did you suspect the post would be removed?": Understanding user reactions to content removals on reddit. Proc. ACM Human-Computer Interact. 3, CSCW (November 2019), 1–33. DOI: https://doi.org/10.1145/3359294
  23. Shagun Jhaver, Amy Bruckman, and Eric Gilbert. 2019. Does transparency in moderation really matter?: User behavior after content removal explanations on reddit. Proc. ACM Human-Computer Interact. 3, CSCW (2019). DOI: https://doi.org/10.1145/3359252
  24. Shagun Jhaver, Quan Ze Chen, Detlef Knauss, and Amy Zhang. 2022. Designing Word Filter Tools for Creator-led Comment Moderation. Proc. 2022 CHI Conf. Hum. Factors Comput. Syst. (2022). DOI: https://doi.org/10.1145/3491102.3517505
  25. Prerna Juneja, Deepika Rama Subramanian, and Tanushree Mitra. 2020. Through the looking glass: Study of transparency in Reddit's moderation practices. Proc. ACM Human-Computer Interact. 4, GROUP (January 2020), 1–35. DOI: https://doi.org/10.1145/3375197
  26. John Koetsier. 2020. 2 Million Creators Make 6-Figure Incomes On YouTube, Instagram, Twitch Globally. Forbes. Retrieved from https://www.forbes.com/sites/johnkoetsier/2020/10/05/2-million-creators-make-6-figure-incomes-on-youtube-instagram-twitch-globally/?sh=1125dbb323be
  27. Andrew M. Ledbetter and Colten Meisner. 2021. Extending the personal branding affordances typology to parasocial interaction with public figures on social media: Social presence and media multiplexity as mediators. Comput. Human Behav. 115 (February 2021), 106610. DOI: https://doi.org/10.1016/J.CHB.2020.106610
  28. Renkai Ma and Yubo Kou. 2021. "How advertiser-friendly is my video?": YouTuber's Socioeconomic Interactions with Algorithmic Content Moderation. PACM Hum. Comput. Interact. 5, CSCW2 (2021), 1–26. DOI: https://doi.org/10.1145/3479573
  29. Renkai Ma and Yubo Kou. 2022. "I'm not sure what difference is between their content and mine, other than the person itself": A Study of Fairness Perception of Content Moderation on YouTube. Proc. ACM Human-Computer Interact. 6, CSCW2 (2022), 28. DOI: https://doi.org/10.1145/3555150
  30. Renkai Ma and Yubo Kou. 2022. "I am not a YouTuber who can make whatever video I want. I have to keep appeasing algorithms": Bureaucracy of Creator Moderation on YouTube. In Companion Computer Supported Cooperative Work and Social Computing (CSCW'22 Companion). DOI: https://doi.org/10.1145/3500868.3559445
  31. Renkai Ma and Yubo Kou. 2023. "Defaulting to boilerplate answers, they didn't engage in a genuine conversation": Dimensions of Transparency Design in Creator Moderation. Proc. ACM Human-Computer Interact. (2023).
  32. Colten Meisner and Andrew M. Ledbetter. 2020. Participatory branding on social media: The affordances of live streaming for creative labor. New Media Soc. 24, 5 (November 2020), 1179–1195. DOI: https://doi.org/10.1177/1461444820972392
  33. Jeremy Pope and Frank Vogl. 2000. Making Anticorruption Agencies More Effective. Finance Dev. 37, 002 (January 2000). DOI: https://doi.org/10.5089/9781451952827.022.A002
  34. Molly Priddy. 2017. Why Is YouTube Demonetizing LGBTQ Videos? Autostraddle. Retrieved from https://www.autostraddle.com/why-is-youtube-demonetizing-lgbtqia-videos-395058/
  35. Sarah T. Roberts. 2018. Digital detritus: "Error" and the logic of opacity in social media content moderation. First Monday 23, 3 (March 2018). DOI: https://doi.org/10.5210/fm.v23i3.8283
  36. Carla Rover. 2022. Mapping The Trends: From Influencers to Influence Marketing. a.list.
  37. Erica Santiago. 2022. Creator Economy: Everything Marketers Need to Know. HubSpot. Retrieved from https://blog.hubspot.com/marketing/creator-economy
  38. Leah Scolere, Urszula Pruchniewska, and Brooke Erin Duffy. 2018. Constructing the Platform-Specific Self-Brand: The Labor of Social Media Promotion. Soc. Media + Soc. 4, 3 (July 2018). DOI: https://doi.org/10.1177/2056305118784768
  39. Clay Spinuzzi. 2005. The methodology of participatory design. Tech. Commun. 52, 2 (2005), 163–174.
  40. Kristen Vaccaro, Christian Sandvig, and Karrie Karahalios. 2020. "At the End of the Day Facebook Does What It Wants": How Users Experience Contesting Algorithmic Content Moderation. In Proceedings of the ACM on Human-Computer Interaction, Association for Computing Machinery, 1–22. DOI: https://doi.org/10.1145/3415238
  41. Kristen Vaccaro, Ziang Xiao, Kevin Hamilton, and Karrie Karahalios. 2021. Contestability For Content Moderation. Proc. ACM Human-Computer Interact. 5, CSCW2 (October 2021), 28. DOI: https://doi.org/10.1145/3476059
  42. Michael Veale, Max Van Kleek, and Reuben Binns. 2018. Fairness and Accountability Design Needs for Algorithmic Support in High-Stakes Public Sector Decision-Making. Proc. 2018 CHI Conf. Hum. Factors Comput. Syst. (2018). DOI: https://doi.org/10.1145/3173574
  43. Austin P. Wright, Omar Shaikh, Haekyu Park, Will Epperson, Muhammed Ahmed, Stephane Pinel, Duen Horng (Polo) Chau, and Diyi Yang. 2021. RECAST: Enabling User Recourse and Interpretability of Toxicity Detection Models with Interactive Visualization. Proc. ACM Human-Computer Interact. 5, CSCW1 (April 2021), 1–26. DOI: https://doi.org/10.1145/3449280
