Understanding human management of automation errors

Pages 545-577 | Received 04 Sep 2012, Accepted 17 Jun 2013, Published online: 07 Aug 2013
