Carl Westin

ABSTRACT

Cognitive engineering researchers have long studied the complexity and reliability of human-automation interaction. Historically, though, the area of human-automation decision-making compatibility has received less attention. Paradoxically, this could in the future become one of the most critical issues of all, as mismatches between human and automation problem-solving styles could threaten the adoption of automation. This paper presents the concept of strategic conformance as a potential key factor influencing initial acceptance of automation, specifically decision aiding systems capable of guiding decision and action. Here, strategic conformance represents the match in problem-solving style between decision aiding automation and the individual operator. The theoretical foundation builds on the compatibility construct found in technology acceptance theories such as the innovation diffusion and technology acceptance models. The paper concludes with a critical discussion of the limitations and drawbacks of strategic conformance. It is proposed that the construct would be most applicable at the introductory phase of new decision aiding automation, in helping to foster operators' initial acceptance of such automation.

2-1 Introduction

Since the advent of the microprocessor nearly 50 years ago, numerous work environments have come to rely increasingly on some form of computer automation. Although we have come to accept automation taking over routine and low-level tasks, there remains some resistance to automation of safety-critical functions, especially in work domains that mandate automation use and rely on well-educated, well-trained, and highly skilled professionals.51-54

Cognitive engineering (CE) researchers have studied automation use in relation to such underlying factors as situational awareness, trust, workload, risk, reliability, and level of automation.13,15,16,42,55 Findings suggest that: a) trust in automation develops over time as a result of prolonged experience,15 b) acceptance and operator performance decrease when the authority and autonomy of automation increase,8,56,57 and c) acceptance and operator performance benefit from automation actively involving the operator in the control and decision-making loops.58

CE researchers have, however, historically paid less attention to factors affecting the initial acceptance of new technology, that is, factors that may precede trust, reliability, and other constructs. Rejection of new technology can begin at first exposure, perhaps even before an operator has actually used that technology.59 Herein lies a potential paradox: an operator may only develop trust after using a system, yet may be unwilling to use a system he or she does not trust. For this reason, initial acceptance of advanced decision-making automation can play a critical role in its successful deployment.
