This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of the information provided to the user via training material (e.g., the user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is to augment and enhance the design of human-automation interfaces through formal verification.
INTRODUCTION
With the accelerated introduction of advanced automation into a variety of complex, human-operated systems, unexpected problems with overall system performance have been observed (Parasuraman, Sheridan, & Wickens, 2000; Wiener, 1989, 1993; Wiener, Chute, & Moses, 1999). Many of these problems have been attributed to deficiencies in communication and coordination between the human and the machine. They are especially acute in cases where the human and the machine share authority over the system's operation (Mellor, 1994). Notable examples of such systems are modern commercial aircraft with advanced flight management systems (Abbott, Slotte, & Stimson, 1996).
One important and well-documented class of advanced automation systems is the automatic flight control system (AFCS) of modern jetliners. In recent years, faulty pilot interaction with the AFCS has become a major concern in the civil transport industry. This problem has variously been described as lack of mode awareness, mode confusion, or automation surprises (Woods, Sarter, & Billings, 1997). Two main factors have frequently been cited, in accident and incident reports and in the scientific literature, as being responsible for such breakdowns: (a) The user has an inadequate "mental model" of the machine's behavior (Javaux & De Keyser, 1998; Sarter & Woods, 1995). (b) The interface between the user and the machine provides inadequate information about the status of the machine (Norman, 1990). Both factors may limit the user's ability...
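The second factor above, an interface that masks distinctions the operator needs, can be made concrete with a small sketch. The following is a hypothetical illustration, not the paper's formalism: a machine with three modes (the mode and event names are invented for this example) whose display renders two modes identically, even though those modes respond differently to the same operator action. A simple check exposes the ambiguity:

```python
# Hypothetical machine behavior: mode -> (operator event -> next mode).
# Mode and event names are illustrative, not from any real AFCS.
MACHINE = {
    "HOLD":      {"dial": "CAPTURE"},
    "CAPTURE":   {"dial": "OVERSHOOT"},
    "OVERSHOOT": {"dial": "HOLD"},
}

# Interface abstraction: the indication shown in each machine mode.
# CAPTURE and OVERSHOOT are displayed identically -- a candidate design error.
DISPLAY = {"HOLD": "HLD", "CAPTURE": "CAP", "OVERSHOOT": "CAP"}

def ambiguous_displays(machine, display):
    """Return indications behind which distinct machine modes
    react differently to the same operator event."""
    ambiguous = set()
    for m1 in machine:
        for m2 in machine:
            if m1 != m2 and display[m1] == display[m2]:
                for event, target in machine[m1].items():
                    if machine[m2].get(event) != target:
                        # Same display, same event, different outcome:
                        # the operator cannot predict the machine.
                        ambiguous.add(display[m1])
    return ambiguous

print(ambiguous_displays(MACHINE, DISPLAY))  # -> {'CAP'}
```

An operator who sees "CAP" cannot tell whether turning the dial will lead to OVERSHOOT or back to HOLD, which is the essence of a mode-confusion-prone design: the machine's behavior diverges behind an indication the interface does not resolve.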