DOI: 10.1145/3544549.3573867
Extended Abstract

Surfacing AI Explainability in Enterprise Product Visual Design to Address User Tech Proficiency Differences

Published: 19 April 2023

Abstract

This case study investigates explainable artificial intelligence (AI) visualization in business applications. Design guidelines for human-AI interaction are broad and touch on a range of user experiences with AI; they are often not specific to enterprise scenarios involving late-stage end users with limited AI knowledge and experience. We present a three-phase study of the visual design of a machine learning (ML) algorithm's output: we conducted a user study on an existing design with limited visual AI explanation cues, ran a redesign workshop with design and data experts, and reassessed the design after systematically applying AI explanation guidelines. We show how users with varying levels of tech proficiency and AI/ML background interact with the designs, and how visual explanation cues improve understanding and support effective decision making for users with low AI/ML familiarity. This design process corroborated the applicability and impact of existing guidelines and surfaced specific design implications for AI explainability within enterprise design.

Supplementary Material

  • Supplemental Materials (3544549.3573867-supplemental-materials.zip)
  • MP4 File (3544549.3573867-talk-video.mp4): Pre-recorded video presentation



Published In

CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
April 2023, 3914 pages
ISBN: 9781450394222
DOI: 10.1145/3544549
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Artificial intelligence
  2. enterprise
  3. explainable AI
  4. practitioners
  5. user experience
  6. user perceptions

Qualifiers

  • Extended-abstract
  • Research
  • Refereed limited

Conference

CHI '23

Acceptance Rates

Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

