DOI: 10.1145/3607865.3616448
Keynote

Fairness for Affective and Wellbeing Computing

Published: 29 October 2023

ABSTRACT

Datasets, algorithms, machine learning models and AI-powered tools used for perception, prediction and decision making constitute the core of affective and wellbeing computing. The majority of these are prone to data or algorithmic bias (e.g., along demographic attributes such as race, age and gender) that could have catastrophic consequences for members of society. Identifying, avoiding and mitigating such biases is therefore of the utmost importance for creating and deploying fair and unbiased affective and wellbeing computing systems. This talk will present the Cambridge Affective Intelligence and Robotics (AFAR) Lab's (https://cambridge-afar.github.io/) research explorations in this area.

The first part of the talk will discuss the lack of publicly available datasets that are fairly distributed across the human population, and will present a systematic investigation of bias and fairness in facial expression recognition and mental health prediction by comparing various approaches on well-known benchmark datasets. The second part of the talk will question whether counterfactuals can provide a solution to data imbalance, and will introduce an attempt to achieve fairer prediction models for facial expression recognition, while noting the limitations of counterfactual approaches employed at the pre-processing, in-processing and post-processing stages to mitigate bias.
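As a toy illustration of the pre-processing stage mentioned above (a sketch under assumptions, not the implementation evaluated in the talk), counterfactual augmentation can be as simple as adding, for each training sample, a copy with the sensitive attribute flipped, so that the attribute-label correlation present in an imbalanced dataset disappears:

```python
# Sketch under assumptions (not the approach evaluated in the talk):
# counterfactual augmentation at the pre-processing stage. Each training
# sample gets a duplicate with the sensitive attribute flipped, removing
# the attribute-label correlation that an imbalanced dataset would
# otherwise let a model exploit.

def demographic_parity_gap(predictions, groups):
    """|P(y_hat = 1 | g = 0) - P(y_hat = 1 | g = 1)| for binary groups."""
    rates = []
    for g in (0, 1):
        preds = [p for p, grp in zip(predictions, groups) if grp == g]
        rates.append(sum(preds) / len(preds))
    return abs(rates[0] - rates[1])

def counterfactual_augment(samples):
    """Duplicate each (features, group, label) sample with the group flipped."""
    return list(samples) + [(x, 1 - g, y) for x, g, y in samples]

# Toy dataset in which the positive label only occurs for group 0.
data = [((0.9,), 0, 1), ((0.8,), 0, 1), ((0.2,), 1, 0), ((0.3,), 1, 0)]
balanced = counterfactual_augment(data)

# A "predictor" that just reads off the label shows the effect: perfect
# correlation with the group before augmentation, none after it.
gap_before = demographic_parity_gap([y for _, g, y in data],
                                    [g for _, g, _ in data])
gap_after = demographic_parity_gap([y for _, g, y in balanced],
                                   [g for _, g, _ in balanced])
```

Here `demographic_parity_gap`, the toy data, and the flip-only counterfactual are all illustrative assumptions; realistic counterfactuals for facial data require generative editing of images rather than flipping a stored attribute, which is one source of the limitations noted in the talk.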

The majority of ML methods aiming to mitigate bias focus on balancing data distributions or on adapting the learning algorithm to the imbalances. The third and final part of the talk will introduce our work demonstrating how continual learning (CL) approaches are well suited to mitigating bias by balancing learning with respect to different attributes such as race and gender, without compromising recognition accuracy. Throughout, the talk will also outline recommendations for achieving greater fairness in affective and wellbeing computing, while emphasising the need for such models to be deployed and tested in real-world settings and applications, for example robotic wellbeing coaching via physical robots.
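A minimal, hypothetical sketch of the continual learning idea (toy learner and data of my own; not the lab's actual architecture): each demographic group is treated as a separate "domain", and a rehearsal buffer replays earlier groups while the model adapts to a new one, so learning stays balanced across attributes instead of drifting toward the most recently seen group:

```python
# Hypothetical sketch, not the system from the talk: domain-incremental
# learning where each demographic group arrives as its own "domain" and a
# rehearsal buffer replays earlier groups, keeping per-group performance
# balanced rather than overwritten by the latest domain.

class RehearsalLearner:
    """A perceptron-style binary classifier trained one domain at a time."""

    def __init__(self, n_features, lr=0.1):
        self.w = [0.0] * n_features
        self.b = 0.0
        self.lr = lr
        self.buffer = []  # samples retained from earlier domains

    def predict(self, x):
        score = sum(wi * xi for wi, xi in zip(self.w, x)) + self.b
        return 1 if score > 0 else 0

    def _update(self, x, y):
        error = y - self.predict(x)  # 0 when already correct
        for i, xi in enumerate(x):
            self.w[i] += self.lr * error * xi
        self.b += self.lr * error

    def learn_domain(self, samples, epochs=5):
        for _ in range(epochs):
            for x, y in samples:          # adapt to the new domain ...
                self._update(x, y)
            for x, y in self.buffer:      # ... while replaying old ones
                self._update(x, y)
        self.buffer.extend(samples)       # retain this domain for later

# Two toy "groups" share the same concept (label = first feature > 0.5);
# the second feature only encodes group membership.
model = RehearsalLearner(n_features=2)
model.learn_domain([([1.0, 0.0], 1), ([0.0, 0.0], 0)])  # group 0
model.learn_domain([([1.0, 1.0], 1), ([0.0, 1.0], 0)])  # group 1
```

After sequential training, the model classifies both groups correctly and the learned weight on the group feature stays at zero, which is the balanced-across-attributes behaviour the CL framing targets; real CL systems replace the perceptron with a deep network and manage the buffer more carefully.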


Published in

MRAC '23: Proceedings of the 1st International Workshop on Multimodal and Responsible Affective Computing
October 2023, 88 pages
ISBN: 9798400702884
DOI: 10.1145/3607865

Copyright © 2023 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher: Association for Computing Machinery, New York, NY, United States
