DOI: 10.1145/3290605.3300769
CHI Conference Proceedings
Research article

Critter: Augmenting Creative Work with Dynamic Checklists, Automated Quality Assurance, and Contextual Reviewer Feedback

Published: 02 May 2019

Abstract

Checklists and guidelines play an increasingly important role in complex tasks ranging from the cockpit to the operating theater; their role in creative tasks like design is less explored. In a needfinding study with expert web designers, we identified the challenges designers face in adhering to a checklist of design guidelines. We built Critter, which addresses these challenges with three components: Dynamic Checklists that progressively disclose guideline complexity through a self-pruning hierarchical view, AutoQA to automate common quality assurance checks, and guideline-specific reviewer feedback that highlights mistakes as they appear. In an observational study, we found that the more engaged a designer was with Critter, the fewer mistakes they made in following design guidelines. Designers rated the AutoQA and contextual feedback experience highly: a majority rated AutoQA as excellent and felt that it increased the quality of their work. Designers also provided feedback on the tradeoffs of the hierarchical Dynamic Checklists. Finally, we discuss broader implications for supporting complex creative tasks.




Published In

CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
May 2019, 9077 pages
ISBN: 9781450359702
DOI: 10.1145/3290605

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. checklists
    2. creative work
    3. feedback
    4. quality assurance

    Qualifiers

    • Research-article

Conference

CHI '19

    Acceptance Rates

CHI '19 Paper Acceptance Rate: 703 of 2,958 submissions, 24%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


Article Metrics

• Downloads (Last 12 months): 32
• Downloads (Last 6 weeks): 4
Reflects downloads up to 04 Jan 2025


    Cited By

• AINeedsPlanner: A Workbook to Support Effective Collaboration Between AI Experts and Clients. Proceedings of the 2024 ACM Designing Interactive Systems Conference (2024), 728-742. DOI: 10.1145/3643834.3661577
• When to Give Feedback: Exploring Tradeoffs in the Timing of Design Feedback. Proceedings of the 16th Conference on Creativity & Cognition (2024), 292-310. DOI: 10.1145/3635636.3656183
• Transitioning Cognitive Aids into Decision Support Platforms: Requirements and Design Guidelines. ACM Transactions on Computer-Human Interaction 30, 3 (2023), 1-28. DOI: 10.1145/3582431
• Rsourcer: Scaling Feedback on Research Drafts. Intelligent Information Systems (2023), 61-68. DOI: 10.1007/978-3-031-34674-3_8
• Tech Worker Perspectives on Considering the Interpersonal Implications of Communication Technologies. Proceedings of the ACM on Human-Computer Interaction 7, GROUP (2022), 1-22. DOI: 10.1145/3567566
• Interpretable Directed Diversity: Leveraging Model Explanations for Iterative Crowd Ideation. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (2022), 1-28. DOI: 10.1145/3491102.3517551
• Characterizing Practices, Limitations, and Opportunities Related to Text Information Extraction Workflows: A Human-in-the-loop Perspective. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (2022), 1-15. DOI: 10.1145/3491102.3502068
• CReBot. International Journal of Human-Computer Studies 167 (2022). DOI: 10.1016/j.ijhcs.2022.102898
• Shöwn: Adaptive Conceptual Guidance Aids Example Use in Creative Tasks. Proceedings of the 2021 ACM Designing Interactive Systems Conference (2021), 1834-1845. DOI: 10.1145/3461778.3462072
• Plan Early, Revise More. Proceedings of the ACM on Human-Computer Interaction 5, CSCW1 (2021), 1-22. DOI: 10.1145/3449098
