Carl Westin

References

144 Alberdi, E., Ayton, P., Povyakalo, A. A., and Strigini, L., "Automation bias and system design: A case study in a medical application," Proc. IEE MOD HFI DTC, London, UK, Nov. 16-17, 2005, pp. 53–60.

145 Goddard, K., Roudsari, A., and Wyatt, J. C., "Automation bias: A systematic review of frequency, effect mediators, and mitigators," J. Amer. Med. Inform. Assoc., Vol. 19, No. 1, 2012, pp. 121–127.

146 Rieh, S. Y. and Danielson, D. R., "Credibility: A multidisciplinary framework," Annual Rev. Info. Sci. & Technol., Vol. 41, No. 1, 2007, pp. 307–364.

147 Kelly, C., Boardman, M., Goillau, P., and Jeannot, E., "Guidelines for trust in future ATM systems: A literature review," Rep. HRS/HSP-005-GUI-01, EUROCONTROL, Brussels, Belgium, May 2003.

148 Wang, L., Jamieson, G. A., and Hollands, J. G., "Trust and reliance on an automated combat identification system," Hum. Factors, Vol. 51, No. 3, 2009, pp. 281–291.

149 Oleson, K. E., Hancock, P. A., Billings, D. R., and Schesser, C. D., "Trust in unmanned aerial systems: A synthetic, distributed trust model," Proc. 16th ISAP, Dayton, OH, May 2-5, 2011.

150 BFU, "Investigation report," Tech. Rep. AX001-1-2/02, German Federal Bureau of Aircraft Accidents Investigation, Braunschweig, Germany, May 2004.

151 NTSB, "Aircraft accident report: Descent below visual glidepath and impact with seawall," Tech. Rep. NTSB/AAR-14/01, National Transportation Safety Board, Washington, DC, Jun. 2014.

152 Lee, J. D., "Review of a pivotal human factors article: 'Humans and automation: Use, misuse, disuse, abuse'," Hum. Factors, Vol. 50, No. 3, June 2008, pp. 404–410.

153 Wiener, E. L., "Cockpit automation," Human Factors in Aviation, edited by E. L. Wiener and D. C. Nagel, Harcourt Press, San Diego, CA, 1988, pp. 433–461.

154 Szalma, J. L., "Individual differences in human-technology interaction: Incorporating variation in human characteristics into human factors and ergonomics research and design," Theor. Issues Ergon. Sci., Vol. 10, No. 5, 2009, pp. 381–397.

155 Westin, C., Borst, C., and Hilburn, B., "Strategic conformance: Overcoming acceptance issues of decision aiding automation?" IEEE Trans. Human-Mach. Syst., Vol. 46, No. 1, 2016, pp. 41–52.

156 Muir, B. M., "Trust in automation: Part I. Theoretical issues in the study of trust and human intervention in automated systems," Ergonomics, Vol. 37, No. 11, 1994, pp. 1905–1922.

157 Tseng, S. and Fogg, B. J., "Credibility and computing technology," Commun. ACM, Vol. 42, No. 5, 1999, pp. 39–44.

158 Earle, T. C., "Trust in risk management: A model-based review of empirical research," Risk Anal., Vol. 30, No. 4, 2010, pp. 541–574.

159 Kim, J. and Moon, J. Y., "Designing towards emotional usability in customer interfaces – trustworthiness of cyber-banking system interfaces," Interact. Comput., Vol. 10, No. 1, 1998, pp. 1–29.

160 Lewandowsky, S., Mundy, M., and Tan, G. P., "The dynamics of trust: Comparing humans to automation," J. Exp. Psychol.-Appl., Vol. 6, No. 2, 2000, pp. 104–123.

161 Dzindolet, M. T., Beck, H. P., Pierce, L. G., and Dawe, L. A., "A framework of automation use," Tech. Rep. ARL-TR-2412, Army Res. Lab., Aberdeen Proving Ground, MD, Mar. 2001.

162 Gao, J. and Lee, J. D., "Extending the decision field theory to model operators' reliance on automation in supervisory control situations," IEEE Trans. Syst., Man, Cybern. A, Syst., Humans, Vol. 36, No. 5, 2006, pp. 943–959.

163 Moray, N., Inagaki, T., and Itoh, M., "Adaptive automation, trust, and self-confidence in fault management of time-critical tasks," J. Exp. Psychol.-Appl., Vol. 6, No. 1, 2000, pp. 44–58.
