Carl Westin

7-4 Limitations and pitfalls of strategic conformance

not personalized. Similar to anthropomorphic features of automation that do not concern its performance, 48, 114, 287 conformal automation may influence acceptance inappropriately in relation to the automation's actual reliability. Operators' expectations of conformal automation may become inflated and unreasonably high, leading to automation bias and complacency, and to steep declines in trust and acceptance when the automation fails (in line with the perfect automation schema). Ironically, however, conformal automation may counter adverse effects of the perfect automation schema, automation bias, and complacency. Research indicates that the more human-like automation is, the more it is treated as a human (i.e., the media-equation effect). 48, 118, 124 Future research should explore whether similar effects can be found in relation to conformal automation.

Strategic conformal automation can, however, be considered controversial in that acceptance and trust may be influenced independently of how "good" the actual decision-making strategy is. That is, the perceived "good" behavior of automation can conceal poor performance. There is a risk that a skewed belief in the capabilities and qualities of the automation leads to unnaturally high expectations of it. 114, 183 Ideally, automation should be designed so that operators' attitudes toward the system reflect the system's actual reliability. Because the interaction with automation is influenced by the degree of conformance, the purpose of the automation and the desired response from the operator must be considered during design. With this in mind, the question of the degree to which the automation should be conformal to the human (whether in problem-solving or by appearance) is also an ethical one.
Is it ethically acceptable to develop conformal automation that influences a person's behavior, even if the system's behavior is designed to match that same person? Moreover, is it ethical to have a machine learn about a person and partly mimic that person? Perhaps most important are the ethical issues of privacy, related to the gathering and treatment of an individual's data. As a suggestion for future research, ethical guidelines from research on persuasive technologies 289 can perhaps be used to mitigate some of these concerns about conformal automation.

7-4-11 Why not fully automate?

Finally, it is worth questioning the contribution of strategic conformance given the development towards fully automated systems that, eventually, may allow designers to disregard the human way of solving problems entirely. Indeed, machines have matured cognitively and become increasingly smart. Abilities previously considered exclusively human are being automated at a rapid pace. Examples include combinations of sensors and algorithms providing the visually impaired with key aspects of sight (e.g., the OrCam system), and artificially intelligent decision aids supporting
