Research Article
DOI: https://doi.org/10.1145/3679318.3685501

In a Quasi-Social Relationship With ChatGPT. An Autoethnography on Engaging With Prompt-Engineered LLM Personas

Published: 13 October 2024

Abstract

As conversational AI like ChatGPT becomes more sophisticated, understanding the quasi-social relationships that emerge with it is crucial. Through analytical autoethnography, we explore the nuances of these relationships using two autobiographically designed ChatGPT personas created to address the first author's social needs: the Endless Enthusiast (always responding positively and encouragingly) and the Socratic Tutor (asking questions to stimulate critical thinking). After six weeks of interaction, we find that for a successful relationship, the non-human counterpart must be authentic about its machine nature and limitations. Using deception to appear more human-like makes the relationship fail. We thus suggest designing machine relationships as complementary to human-human relationships. For authentic interactions, humans should be in control, with the machine authentically assuming a role that "naturally" fits machines. Here, the unique qualities of machines in social interactions offer promising starting points for designing such roles.



    Published In

    NordiCHI '24: Proceedings of the 13th Nordic Conference on Human-Computer Interaction
    October 2024, 1236 pages
    ISBN: 9798400709661
    DOI: 10.1145/3679318

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. autobiographical research through design
    2. autoethnography
    3. human-AI interaction

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Federal Ministry of Education and Research

    Conference

    NordiCHI 2024

    Acceptance Rates

    Overall Acceptance Rate 379 of 1,572 submissions, 24%

