Abstract
This article challenges the dominant ‘black box’ metaphor in critical algorithm studies by proposing a phenomenological framework for understanding how social media algorithms manifest themselves in user experience. While the black box paradigm treats algorithms as opaque, self-contained entities that exist only ‘behind the scenes’, this article argues that algorithms are better understood as genetic phenomena that unfold temporally through user-platform interactions. Recent scholarship in critical algorithm studies has already identified various ways in which algorithms manifest in user experience: through affective responses, algorithmic self-reflexivity, disruptions of normal experience, points of contention, and folk theories. Yet, while these studies gesture toward a phenomenological understanding of algorithms, they do so without explicitly drawing on phenomenological theory. This article demonstrates how phenomenology, particularly a Husserlian genetic approach, can further conceptualize these already-documented algorithmic encounters. Moving beyond both the paradigm of artifacts and static phenomenological approaches, the analysis shows how algorithms emerge as inherently relational processes that co-constitute user experience over time. By reconceptualizing algorithms as genetic phenomena rather than black boxes, this article provides a theoretical framework for understanding how algorithmic awareness develops from pre-reflective affective encounters to explicit folk theories, while remaining inextricably linked to users’ self-understanding. This phenomenological framework contributes to a more nuanced understanding of algorithmic mediation in contemporary social media environments and opens new pathways for investigating digital technologies.
Notes
It must be added here that algorithms do not simply change along with events, as if events in the world (as datafied) pre-existed and the algorithm merely followed; rather, algorithms stand in far more complex relations to reality, as they can also change the course of events themselves. Bucher gives the example of how Google Flu Trends in 2011–2013, by recommending flu-related search terms through auto-completion, was itself ‘producing the conditions it was trying to merely describe and predict’ (Bucher, 2018, p. 28). There is an emerging literature on such instances of algorithmically mediated ‘self-fulfilling’ and ‘self-defeating’ prophecies (King & Mertens, 2023; cf. Mertens et al., 2022).
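To make the feedback dynamic described in this note more concrete, the following minimal sketch (in Python) models a tracker that suggests flu-related search terms in proportion to the flu activity it currently observes. The linear ‘nudge’ model and all parameter values are illustrative assumptions, not a reconstruction of Bucher’s example or of Google’s actual system.

```python
# Schematic sketch of a self-fulfilling prediction loop: a tracker that
# auto-completes flu-related terms in proportion to how prevalent flu
# currently *looks* ends up inflating the very signal it measures.
# All parameters below are illustrative assumptions.

BASE_RATE = 0.05   # assumed share of searches that would be flu-related anyway
NUDGE = 0.6        # assumed rate at which suggestions convert into extra searches

def observed_flu_searches(weeks: int, feedback: bool) -> list[float]:
    """Weekly flu-search rate as seen by the (naive) tracker."""
    suggestion_level = 0.0
    rates = []
    for _ in range(weeks):
        # Suggestions generate extra flu searches on top of the true baseline.
        rate = BASE_RATE + NUDGE * suggestion_level
        rates.append(round(rate, 3))
        # The tracker reads the inflated rate as 'more flu' and suggests accordingly.
        suggestion_level = rate if feedback else 0.0
    return rates

print(observed_flu_searches(6, feedback=False))  # stays at the true baseline of 0.05
print(observed_flu_searches(6, feedback=True))   # climbs toward 0.05 / (1 - 0.6) = 0.125
```

Under these assumptions the observed rate drifts from the true baseline of 0.05 toward 0.125 even though actual illness never changes, which is the sense in which such a system ‘produces the conditions it was trying to merely describe and predict’.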
References
Åhman, H., & Hedman, A. (2019). Frameworks for Studying Social Media Interaction: A Discussion on Phenomenology and Poststructuralism. In D. Lamas, F. Loizides, L. Nacke, H. Petrie, M. Winckler, & P. Zaphiris (Eds.), Human-Computer Interaction – INTERACT 2019 (Vol. 11748, pp. 701–718). Springer International Publishing. https://doi.org/10.1007/978-3-030-29387-1_41
Amoore, L. (2020). Cloud ethics: Algorithms and the attributes of ourselves and others. Duke University Press.
Ananny, M. (2016). Toward an ethics of algorithms: Convening, observation, probability, and timeliness. Science, Technology, & Human Values, 41(1), 93–117. https://doi.org/10.1177/0162243915606523
Anderson, B. R. O. (2016). Imagined communities: Reflections on the origin and spread of nationalism. Verso.
Arendt, H. (1981). The life of the mind. Harcourt.
Arendt, H. (2018). The Human Condition (Second edition). The University of Chicago Press.
Bengtsson, S., & Johansson, S. (2021). A phenomenology of news: Understanding news in digital culture. Journalism, 22(11), 2873–2889. https://doi.org/10.1177/1464884919901194
Benjamin, J. J. (2023). Machine horizons: Post-phenomenological AI studies [PhD, University of Twente]. https://doi.org/10.3990/1.9789036555357
Benjamin, J. J., Berger, A., Merrill, N., & Pierce, J. (2021). Machine Learning Uncertainty as a Design Material: A Post-Phenomenological Inquiry. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3411764.3445481
Bhandari, A., & Bimo, S. (2022). Why’s everyone on TikTok now? The Algorithmized Self and the future of self-making on Social Media. Social Media + Society, 8(1), 20563051221086241. https://doi.org/10.1177/20563051221086241
Bishop, C. (2006). Pattern Recognition and Machine Learning. Springer.
Brożek, B., Furman, M., Jakubiec, M., & Kucharzyk, B. (2024). The black box problem revisited. Real and imaginary challenges for automated legal decision making. Artificial Intelligence and Law, 32(2), 427–440. https://doi.org/10.1007/s10506-023-09356-9
Bucher, T. (2017). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information Communication & Society, 20(1), 30–44. https://doi.org/10.1080/1369118X.2016.1154086
Bucher, T. (2018). If… then: Algorithmic power and politics. Oxford University Press.
Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 205395171562251. https://doi.org/10.1177/2053951715622512
Butler, J. (2008). Giving an account of oneself. Fordham University Press.
Cerbone, D. R. (2010). Understanding phenomenology. Routledge.
Cheney-Lippold, J. (2017). We are data: Algorithms and the making of our digital selves. New York University Press.
Christin, A. (2020). The ethnographer and the algorithm: Beyond the black box. Theory and Society, 49(5–6), 897–918. https://doi.org/10.1007/s11186-020-09411-3
Chun, W. H. K. (2013). Programmed visions: Software and memory. MIT Press.
Coeckelbergh, M. (2014). The Moral Standing of machines: Towards a relational and non-cartesian Moral Hermeneutics. Philosophy & Technology, 27(1), 61–77. https://doi.org/10.1007/s13347-013-0133-8
Coeckelbergh, M. (2022). Digital Technologies, Temporality, and the politics of Co-existence. Springer. https://doi.org/10.1007/978-3-031-17982-2
Coeckelbergh, M., & Gunkel, D. J. (2023). ChatGPT: Deconstructing the debate and moving it forward. AI & SOCIETY. https://doi.org/10.1007/s00146-023-01710-4
Couldry, N., Fotopoulou, A., & Dickens, L. (2016). Real social analytics: A contribution towards a phenomenology of a digital world. The British Journal of Sociology, 67(1), 118–137. https://doi.org/10.1111/1468-4446.12183
Dahlgren, P. (2018). Public sphere participation online: The ambiguities of Affect. Les Enjeux De L’information Et de La Communication, N° 19/1(1), 5. https://doi.org/10.3917/enic.024.0005
Davis, J. L., & Graham, T. (2021). Emotional consequences and attention rewards: The social effects of ratings on Reddit. Information Communication & Society, 24(5), 649–666. https://doi.org/10.1080/1369118X.2021.1874476
DeVito, M. A., Gergle, D., & Birnholtz, J. (2017). ‘Algorithms ruin everything’: #RIPTwitter, Folk Theories, and Resistance to Algorithmic Change in Social Media. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 3163–3174. https://doi.org/10.1145/3025453.3025659
Diakopoulos, N. (2015). Algorithmic accountability: Journalistic investigation of computational power structures. Digital Journalism, 3(3), 398–415. https://doi.org/10.1080/21670811.2014.976411
Dobson, J. E. (2023). On reading and interpreting black box deep neural networks. International Journal of Digital Humanities, 5(2–3), 431–449. https://doi.org/10.1007/s42803-023-00075-w
Eslami, M., Karahalios, K., Sandvig, C., Vaccaro, K., Rickman, A., Hamilton, K., & Kirlik, A. (2016). First I ‘like’ it, then I hide it: Folk theories of Social feeds. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, 2371–2382. https://doi.org/10.1145/2858036.2858494
Eubanks, V. (2019). Automating inequality: How high-tech tools profile, police, and punish the poor. Macmillan.
Floridi, L. (2013). The Ethics of Information. Oxford University Press.
Flusser, V. (1993). Dinge Und Undinge: Phänomenologische Skizzen. Carl Hanser.
Gaboury, J. (2021). Image objects: An archaeology of computer graphics. The MIT Press.
Galloway, A. R. (2006). Gaming: Essays on algorithmic culture. University of Minnesota Press.
Geniusas, S. (2012). The origins of the Horizon in Husserl’s phenomenology. Springer Netherlands. https://doi.org/10.1007/978-94-007-4644-2
Gillespie, T. (2014). The Relevance of Algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media Technologies (pp. 167–194). The MIT Press. https://doi.org/10.7551/mitpress/9780262525374.003.0009
Gillespie, T. (2016). #Trendingistrending: When algorithms become culture. In R. Seyfert, & J. Roberge (Eds.), Algorithmic cultures: Essays on meaning, performance and new technologies (pp. 52–75). Routledge.
Gorban, A. N., & Tyukin, I. Y. (2018). Blessing of dimensionality: Mathematical foundations of the statistical physics of data. Philosophical Transactions of the Royal Society A: Mathematical Physical and Engineering Sciences, 376(2118), 20170237. https://doi.org/10.1098/rsta.2017.0237
Gordon, J. S. (2021). Artificial moral and legal personhood. AI & SOCIETY, 36(2), 457–471. https://doi.org/10.1007/s00146-020-01063-2
Gran, A. B., Booth, P., & Bucher, T. (2021). To be or not to be algorithm aware: A question of a new digital divide? Information Communication & Society, 24(12), 1779–1796. https://doi.org/10.1080/1369118X.2020.1736124
Grusin, R. (2010). Premediation: Affect and Mediality after 9/11. Palgrave Macmillan UK. https://doi.org/10.1057/9780230275270
Guidotti, R., Monreale, A., Ruggieri, S., Turini, F., Pedreschi, D., & Giannotti, F. (2018). A survey of methods for explaining Black Box models (Version 3). arXiv. https://doi.org/10.48550/ARXIV.1802.01933
Gunkel, D. J. (2018). Robot rights. MIT Press.
Gunkel, D. J. (2023). Person, thing, Robot: A Moral and Legal Ontology for the 21st Century and Beyond. MIT Press.
Han, B. C. (2022). Non-things: Upheaval in the lifeworld. Polity.
Harman, G. (2002). Tool-being: Heidegger and the metaphysics of objects. Open Court.
Heidegger, M. (1992). History of the concept of time: Prolegomena. Indiana University Press.
Heidegger, M. (2007). Being and Time (J. Macquarrie & E. Robinson, Trans.). Blackwell.
Hepp, A. (2020). Deep mediatization. Routledge.
Herzog, L. (2021). Old facts, New beginnings: Thinking with Arendt about Algorithmic decision-making. The Review of Politics, 83(4), 555–577. https://doi.org/10.1017/S0034670521000474
Husserl, E. (1991). Cartesianische Meditationen und Pariser Vorträge (S. Strasser, Ed.; 2. Aufl.). Kluwer.
Ihde, D. (1990). Technology and the lifeworld: From garden to earth. Indiana University Press.
Karizat, N., Delmonaco, D., Eslami, M., & Andalibi, N. (2021). Algorithmic folk theories and identity: How TikTok users co-produce knowledge of identity and engage in Algorithmic Resistance. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1–44. https://doi.org/10.1145/3476046
King, O. C., & Mertens, M. (2023). Self-fulfilling prophecy in practical and automated prediction. Ethical Theory and Moral Practice, 26(1), 127–152. https://doi.org/10.1007/s10677-022-10359-9
Kitchin, R. (2017). Thinking critically about and researching algorithms. Information Communication & Society, 20(1), 14–29. https://doi.org/10.1080/1369118X.2016.1154087
Kowalski, R. (1979). Algorithm = logic + control. Communications of the ACM, 22(7), 424–436. https://doi.org/10.1145/359131.359136
Lakoff, G., & Johnson, M. (2011). Metaphors we live by. University of Chicago Press.
Latour, B. (2000). Pandora’s hope: Essays on the reality of science studies. Harvard University Press.
Leerssen, P. (2020). The Soap Box as a Black Box: Regulating transparency in Social Media Recommender systems. European Journal of Law and Technology, 11(2), 1–51.
Loidolt, S. (2018). Phenomenology of plurality: Hannah Arendt on political intersubjectivity. Routledge, Taylor & Francis Group.
Matzner, T. (2023). Algorithms: Technology, culture, politics. Routledge.
McCarthy, J. (1974). Review of Artificial intelligence: A general survey, by James Lighthill. Artificial Intelligence, 5(3), 317–322. https://doi.org/10.1016/0004-3702(74)90016-2
McQuillan, D. (2018). People’s Councils for Ethical Machine Learning. Social Media + Society, SI: Ethics as Method, 1–10. https://doi.org/10.1177/2056305118768303
Merleau-Ponty, M. (2012). Phenomenology of perception. Routledge.
Mertens, M., King, O. C., Van Putten, M. J. A. M., & Boenink, M. (2022). Can we learn from hidden mistakes? Self-fulfilling prophecy and responsible neuroprognostic innovation. Journal of Medical Ethics, 48(11), 922–928. https://doi.org/10.1136/medethics-2020-106636
Miller, T. (2019). Explanation in artificial intelligence: Insights from the social sciences. Artificial Intelligence, 267, 1–38. https://doi.org/10.1016/j.artint.2018.07.007
Milton, A., Ajmani, L., DeVito, M. A., & Chancellor, S. (2023). I See Me Here: Mental Health Content, Community, and Algorithmic Curation on TikTok. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3544548.3581489
Moran, D. (2000). Introduction to Phenomenology. Routledge.
Munn, L. (2019). Approaching Algorithmic Power [PhD]. Western Sydney University.
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Penguin Books.
Papacharissi, Z. (2010). A private sphere: Democracy in a digital age. Polity.
Papacharissi, Z. (Ed.). (2011). A networked self: Identity, community and culture on social network sites. Routledge.
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Harvard University Press. http://www.jstor.org/stable/j.ctt13x0hch
Perel, M., & Elkin-Koren, N. (2017). Black box tinkering: Beyond disclosure in algorithmic enforcement. Florida Law Review, 69(1).
Rader, E., & Gray, R. (2015). Understanding user beliefs about algorithmic curation in the Facebook News Feed. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 173–182. https://doi.org/10.1145/2702123.2702174
Rai, A. (2020). Explainable AI: From black box to glass box. Journal of the Academy of Marketing Science, 48(1), 137–141. https://doi.org/10.1007/s11747-019-00710-5
Rosenberger, R., & Verbeek, P.-P. (2015). A field guide to postphenomenology. In R. Rosenberger & P.-P. Verbeek (Eds.), Postphenomenological investigations: Essays on human-technology relations (pp. 9–42). Lexington Books.
Schellewald, A. (2021). Communicative forms on TikTok: Perspectives from digital ethnography. International Journal of Communication, 15, 1437–1457.
Schellewald, A. (2023). Understanding the popularity and affordances of TikTok through user experiences. Media Culture & Society, 45(8), 1568–1582. https://doi.org/10.1177/01634437221144562
Schulz, C. (2023). A new algorithmic imaginary. Media Culture & Society, 45(3), 646–655. https://doi.org/10.1177/01634437221136014
Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 1–12. https://doi.org/10.1177/2053951717738104
Seaver, N. (2018). What should an Anthropology of algorithms do? Cultural Anthropology, 33(3), 375–385. https://doi.org/10.14506/ca33.3.04
Seaver, N. (2019). Knowing algorithms. In J. Vertesi, & D. Ribes (Eds.), digitalSTS: A Field Guide for Science & Technology studies (pp. 412–422). Princeton University Press.
Steinbock, A. J. (2017). Limit-phenomena and phenomenology in Husserl. Rowman & Littlefield International.
Tchir, T. (2017). Hannah Arendt’s Theory of Political Action. Springer International Publishing. https://doi.org/10.1007/978-3-319-53438-1
Verrycken, K., Cools, A., & Van Herck, W. (2013). Metaphors in modern and contemporary philosophy. University Press Antwerp.
Vial, S. (2019). Being and the screen: How the digital changes perception (P. Baudoin, Trans.). MIT Press.
Vilone, G., & Longo, L. (2021). Notions of explainability and evaluation approaches for explainable artificial intelligence. Information Fusion, 76, 89–106. https://doi.org/10.1016/j.inffus.2021.05.009
Von Hilgers, P. (2011). The history of the Black Box: The clash of a thing and its Concept. Cultural Politics, 7(1), 41–58. https://doi.org/10.2752/175174311X12861940861707
Wehrle, M. (2022). (Re)turning to Normality? A Bottom-Up Approach to Normativity. In S. Heinämaa, M. Hartimo, & I. Hirvonen (Eds.), Contemporary Phenomenologies of Normativity: Norms, Goals, and Values (pp. 199–218). Routledge. https://doi.org/10.4324/9781003179740
Wiltse, H. (2020). Mediating (Infra)structures: Technology, Media, Environment. In Y. Van Den Eede, S. O’Neal Irwin, & G. P. Wellner (Eds.), Postphenomenology and Media: Essays on human-media-world relations (pp. 3–25). Lexington Books.
Zahavi, D. (2008). Husserl’s phenomenology. Stanford University Press.
Zahavi, D. (2018). Phenomenology: The basics. Routledge.
Zuboff, S. (1988). In the age of the smart machine: The future of work and power. Basic Books.
Funding
The research conducted for this article was funded by the Research Foundation – Flanders (FWO). Grant number: 1119522 N.
Ethics declarations
Competing Interests
The author has no competing interests to declare.
About this article
Cite this article
Longo, A. How Do Social Media Algorithms Appear? A Phenomenological Response to the Black Box Metaphor. Minds & Machines 35, 15 (2025). https://doi.org/10.1007/s11023-025-09716-1