A method for team intention inference
Introduction
With the introduction of advanced automation technology, the reliability and safety of today's technological systems have been enhanced, and the workload of human operators has often been greatly reduced. On the other hand, these systems have become increasingly complicated and their behaviours increasingly invisible to human operators. Even though new types of man–machine interfaces have been proposed and implemented in real systems, there are cases in which human operators cannot understand the behaviours of these automated systems, because the basic function of man–machine interfaces is limited to information exchange and lacks the more conceptual and intentional aspects of communication that enable humans to manage cooperative work efficiently (Paris et al., 2000; Hutchins, 1995). The introduction of automation also raises a difficult problem in human–machine relations. As for the final authority in decision making, human-centred automation (Hollnagel, 1995) has been widely acknowledged, because it is difficult to anticipate every situation beforehand in system design and an automated system cannot take responsibility for accidents. However, humans do not always make optimal decisions because of the limitations of cognitive capability, and thus the possibility of human error cannot be eradicated (Reason, 1990). There may be no straightforward answer to the question of whether humans or machines should hold final authority. In fact, invisible behaviours of an automated system and a poor relationship between humans and machines can cause serious problems, and sometimes lead to a critical accident (e.g. Three Mile Island, the airplane crash at Nagoya airport).
In order to enhance the reliability and safety of highly automated systems, it is important to develop sophisticated means of man–machine communication through which humans and machines can understand the processes behind each other's behaviours and decisions, and establish cooperative relations so that they can perform required tasks complementarily.
Research has already been carried out on intent inferencing and plan recognition methods to make an interactive system serve as a more cooperative partner in the user's task through plan completion, error handling, information management, and so on (Goodman and Litman, 1992; Rubin et al., 1988; Hollnagel, 1992). This research, however, concerns only a single person's intention and does not address social or team situations in which the intentions of others bear on one's own intention. In other words, it focuses solely on man–machine communication. In large and complex artefacts such as power plants and aircraft, however, a team operates the system. We therefore have to deal with the team intention in cooperative activities, which is not the same as the mere summation of individual intentions. The present study aims to develop a method of team intention inference for sophisticated team–machine communication in process domains.
In Section 2, team intention is explained by individual intention and mutual belief based on philosophical arguments regarding cooperative activities. In Section 3, a team intention inference method is explained in detail. In Section 4, the implementation of our method for a plant simulator, dual reservoir system simulation (DURESS), is explicated. In Section 5, inference results of the proposed method applied to the log data of operation of a plant simulator operated by a two-person team are illustrated and discussed. Then conclusions are given in Section 6.
Section snippets
Team intention
In this section, we provide an explanation of team intention based on philosophical arguments regarding cooperative activities. Many discussions and analyses of various notions of intention can be found. Such analyses typically deal with the intention of a single person and do not seriously address team situations. Most conventional intention inference methods are likewise based on a single-person model. It is commonplace, however, that one's group or its members affect one's intention,
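As an illustration of how such a "we-intention" might be represented computationally, the sketch below encodes a commonly cited three-part structure from the philosophical literature: agent A intends A's part of the joint task, believes B will do B's part, and believes B believes A will do A's part. All class names, field names, and the Boolean encoding are our own illustrative assumptions, not the paper's notation.

```python
from dataclasses import dataclass

# Illustrative sketch: agent A's we-intention toward a joint task X,
# with B as the partner. The three Boolean components correspond to
# I_a(X_a), B_a(X_b), and B_a(B_b(X_a)) in the usual formalization.
@dataclass
class WeIntention:
    agent: str
    partner: str
    own_part: str                 # A's part of the joint task X
    partner_part: str             # B's part of X
    intends_own_part: bool        # I_a(X_a)
    believes_partner_acts: bool   # B_a(X_b)
    believes_mutual: bool         # B_a(B_b(X_a))

    def holds(self) -> bool:
        # All three components must hold for the we-intention to hold.
        return (self.intends_own_part
                and self.believes_partner_acts
                and self.believes_mutual)

def team_intention(wi_a: WeIntention, wi_b: WeIntention) -> bool:
    # A team intention exists only when both members hold a we-intention
    # toward complementary parts of the same joint task.
    return (wi_a.holds() and wi_b.holds()
            and wi_a.own_part == wi_b.partner_part
            and wi_a.partner_part == wi_b.own_part)
```

On this reading, a team intention is strictly stronger than the conjunction of two individual intentions: if either member's mutual belief fails, `team_intention` is false even though both still intend their own parts.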
Method for team intention inference
When an observer outside the team tries to understand team behaviours from the individual perspective, he or she infers the individual intentions and beliefs of the constituents to interpret collective behaviours, and then identifies a team intention by checking the consistency among the inference results for each constituent's mental components. Mirroring this human technique of inferring a team intention from a bottom-up viewpoint, we propose a method that identifies a set of intentions and
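The bottom-up procedure described above can be sketched as follows: score candidate intentions for each member from their observed actions, then keep only joint hypotheses that are mutually consistent. The hypothesis space, action names, and scoring function here are purely illustrative stand-ins for the paper's actual plan matching, not its implementation.

```python
from itertools import product

# Toy hypothesis space per member: (joint_task, member's part of it).
# Task and action names are invented for illustration.
CANDIDATES = {
    "A": [("fill_reservoir_1", "open_inlet_valve_A"),
          ("heat_reservoir_2", "switch_heater_on")],
    "B": [("fill_reservoir_1", "open_outlet_valve_B"),
          ("shut_down", "close_all_valves")],
}

def score(member: str, hypothesis: tuple, observed: dict) -> float:
    # Likelihood proxy: fraction of the member's observed actions that
    # the hypothesised part would explain (stand-in for a plan matcher).
    task, part = hypothesis
    actions = observed.get(member, [])
    if not actions:
        return 0.0
    return sum(1 for a in actions if a == part) / len(actions)

def infer_team_intention(observed: dict):
    # Enumerate joint hypotheses and keep the best pair that passes the
    # consistency check: both members pursue the same joint task.
    best, best_score = None, 0.0
    for ha, hb in product(CANDIDATES["A"], CANDIDATES["B"]):
        if ha[0] != hb[0]:          # cross-member consistency check
            continue
        s = score("A", ha, observed) * score("B", hb, observed)
        if s > best_score:
            best, best_score = ha[0], s
    return best
```

The consistency filter is what distinguishes this from summing two independent individual inferences: a hypothesis pair is rejected outright when the members' inferred plans do not belong to one joint task.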
Implementation
We used a thermal-hydraulic process simulation (DURESS) to generate decision plans and test data. In this section, the system architecture applied to this simulation is explicated.
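DURESS models a two-reservoir feedwater system. As a rough illustration of the kind of dynamics such a simulator exposes to operators, the sketch below implements a single reservoir's mass balance with a constant valve inflow and an output demand; the constants and flow model are our assumptions, not the simulator's actual equations.

```python
# Illustrative single-reservoir mass balance in the spirit of DURESS
# (DUal REservoir System Simulation); parameters are invented.
class Reservoir:
    def __init__(self, capacity: float, level: float = 0.0):
        self.capacity = capacity
        self.level = level

    def step(self, inflow: float, demand: float, dt: float = 1.0) -> float:
        # Outflow is limited by what the reservoir currently holds.
        outflow = min(demand, self.level / dt)
        self.level = min(self.capacity, self.level + (inflow - outflow) * dt)
        return outflow

def run(steps: int, valve: float, demand: float) -> float:
    # Drive the reservoir with a fixed valve setting and output demand;
    # the operators' goal is to keep the level in bounds while meeting
    # demand. Returns the final level.
    r = Reservoir(capacity=100.0, level=50.0)
    for _ in range(steps):
        r.step(inflow=valve, demand=demand)
    return r.level
```

With `valve` equal to `demand` the level is stationary; any mismatch drifts the level, which is what makes coordinated valve and demand management a genuinely joint task for a two-person crew.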
Validation
The proposed method was applied to the operation of a plant simulator (DURESS), in which two operators cooperatively controlled the system. The inference ability of the method was evaluated from two standpoints: we compared the results of the proposed method with (i) the actual team intention obtained by a post-scenario interview and (ii) a human observer's inference results. The performance of the proposed method was also compared with that of the mere summation of individual intention inferences.
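One simple way to quantify such comparisons is step-wise agreement between the inferred intention sequence and a reference sequence (interview-derived ground truth or a human observer's labels). The metric below is our illustrative choice, not necessarily the measure used in the study.

```python
# Illustrative evaluation metric: fraction of time steps on which the
# inferred team intention matches the reference label (None = no inference).
def agreement(inferred: list, reference: list) -> float:
    assert len(inferred) == len(reference)
    if not inferred:
        return 0.0
    hits = sum(1 for i, r in zip(inferred, reference) if i == r)
    return hits / len(inferred)
```

For example, `agreement(["fill", "fill", "heat"], ["fill", "heat", "heat"])` scores 2/3: the inferred sequence lags the reference by one step at the task transition.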
Conclusion
The major contribution of this study is a method for team intention inference in process domains. The underlying assumption is that the mechanism of intention formation behind team cooperative activities differs from that of the individual, and that methods for individual intention inference are not directly applicable to team cooperative activities. The proposed method is based on the mechanism of team intention and uses expectations of the other members as clues to infer a
References (24)
- The design of fault tolerant systems: prevention is better than cure. Reliability Engineering and System Safety (1992)
- et al. Learning plans for an intelligent assistant by observing user behavior. International Journal of Man–Machine Studies (1990)
- Shared cooperative activity. The Philosophical Review (1992)
- Plan Recognition in Natural Language (1993)
- Techniques for plan recognition. User Modeling and User-Adapted Interaction (2001)
- Institutions and intelligent systems
- Devaney, M., Ram, A., 1998. Needles in a haystack: plan recognition in large spatial domains involving multiple agents....
- Furuta, K., Sakai, T., Kondo, S., 1998. Heuristics for intention inferencing in plant operation. Proceedings of the 4th...
- et al. On the interaction between plan recognition and intelligent interfaces. User Modeling and User-Adapted Interaction (1992)
- Hatakeyama, N., Furuta, K., 2000. Bayesian network modeling of operator's state recognition process. Proceeding of the...
- Human Reliability Analysis
Cited by (25)
- Method to assess the adherence of internal logistics equipment to the concept of CPS for industry 4.0. International Journal of Production Economics (2020)
- Understanding behaviour in problem structuring methods interventions with activity theory. European Journal of Operational Research (2016)
- A method for conflict detection based on team intention inference. Interacting with Computers (2006)
- CASPER: Cognitive Architecture for Social Perception and Engagement in Robots. International Journal of Social Robotics (2024)
- Enhancing Robot Task Completion Through Environment and Task Inference: A Survey from the Mobile Robot Perspective. Journal of Intelligent and Robotic Systems: Theory and Applications (2022)