Authors:
Christophe Hurter¹; Augustin Degas¹; Arnaud Guibert¹; Maelan Poyer¹; Nicolas Durand¹; Alexandre Veyrie¹; Ana Ferreira²; Nicola Cavagnetto²; Stefano Bonelli²; Mobyen Ahmed³; Waleed Jmoona³; Shaibal Barua³; Shahina Begum³; Giulia Cartocci⁴; Gianluca Di Flumeri⁴; Gianluca Borghini⁴; Fabio Babiloni⁴ and Pietro Aricó⁴
Affiliations:
¹ Ecole Nationale de l’Aviation Civile, ENAC, University of Toulouse, France
² Deep Blue, Rome, Italy
³ Artificial Intelligence and Intelligent Systems Research Group, School of Innovation, Design and Engineering, Mälardalen University, Västerås, Sweden
⁴ Department of Molecular Medicine, Sapienza University of Rome, Rome, Italy
Keyword(s):
Artificial Intelligence, eXplainable Artificial Intelligence, User-Centric XAI, Conflict Detection and Resolution, Air Traffic Management.
Abstract:
Artificial Intelligence (AI) has recently made significant advances and is now pervasive across application domains. This holds true for Air Transportation as well, where AI is increasingly involved in decision-making processes. While these algorithms are designed to assist users in their daily tasks, they still face challenges of acceptance and trustworthiness. Users often doubt the decisions proposed by AI and, in some cases, may even oppose them, primarily because AI-generated decisions are often opaque, non-intuitive, and incompatible with human reasoning. Moreover, when AI is deployed in safety-critical contexts such as Air Traffic Management (ATM), the individual decisions generated by AI models must be highly reliable for human operators. Understanding the model’s behavior and providing explanations for its results are essential requirements in every life-critical domain. In this scope, this project aimed to enhance the transparency and explainability of AI algorithms within the Air Traffic Management domain. This article presents the results of the project’s validation, conducted on a Conflict Detection and Resolution task involving 21 air traffic controllers (10 experts and 11 students) in an En-Route position (i.e., high-altitude flight management). Through a controlled study incorporating three levels of explanation, we offer initial insights into the impact of providing additional explanations alongside a conflict resolution algorithm to improve decision-making. At a high level, our findings indicate that providing explanations is not always necessary, and our project sheds light on potential research directions for education and training purposes.