Carl Westin

Abstract

Recent research has highlighted transparency as a critical quality of automation for improving task performance and facilitating appropriate calibration of operators' acceptance and trust. Additionally, the exploration of personalized decision aids for improving human-automation interaction has received recent attention in time-critical and highly automated work environments, such as ATC. This paper investigated the interaction between automation transparency and a personalized decision aid on air traffic controllers' performance and acceptance of that aid in a conflict detection and resolution task. Automation transparency (the degree of meta-information provided) and resolution advisory conformance (the degree of personalization) were varied in a real-time ATC simulation. A repeated measures design was used with two versions of interface representations (high and low transparency) and two versions of resolution advisories (conformal and nonconformal). While no statistically significant acceptance or performance effects were found, results indicated that participants used the two interfaces differently and preferred conformal advisories (i.e., advisories based on their own solution style) irrespective of transparency. In addition, post-simulation questionnaires revealed a strong preference for the high transparency condition, which partly reflected simulation results. This study concludes that increasing transparency involves providing more information, which can incur a cognitive cost in information processing that needs to be traded off against the expected benefits of affording more transparency.

5-1 Introduction

An inevitable result of introducing automation is that it distances the operator from the "hands-on" experience of conducting the task. The automation functions as a relay between the operator and task, with the intention to greatly enhance task execution and improve safety and efficiency.
With the introduction of advanced automation, however, comes the ability for the system to conduct work and solve problems differently (i.e., nonconformally) from the human [155]. Occasionally the operator's understanding of what the system is doing breaks down. This breakdown has been labeled automation surprise, leaving the operator "out-of-the-loop" and confused as the automation is not performing as expected, or is acting in an unanticipated and uncommanded way [186]. To counter such automation-induced pitfalls, it is essential that the automation can communicate effectively and facilitate understanding. Ironically, advanced automation often features high levels of opacity, as the system's complexity is hidden from the operator inside its "black box" [187]. Previous research has shown that opaque systems generally have a negative effect on acceptance and trust as operators question the automation (what is going on? what is the system doing?) [186, 188]. Thus the paradox: with increased automation, the human, and communication between human and automation, become more important, not less [15, 187]. The need for effective and clear communication is particularly important in
