References
- Albayram, Y., Jensen, T., Khan, M. M. H., Buck, R., & Coman, E. (2019). Investigating the effect of system reliability, risk, and role on users’ emotions and attitudes toward a safety-critical drone system. International Journal of Human–Computer Interaction, 35(9), 761–772. https://doi.org/10.1080/10447318.2018.1491665
- American Psychological Association. (2017). Ethical principles of psychologists and code of conduct (2002, amended effective June 1, 2010, and January 1, 2017). American Psychological Association. https://www.apa.org/ethics/code/index.aspx
- Asan, O., & Carayon, P. (2017). Human factors of health information technology—challenges and opportunities. International Journal of Human–Computer Interaction, 33(4), 255–257. https://doi.org/10.1080/10447318.2017.1282755
- Bagheri, N., & Jamieson, G. A. (2004). Considering subjective trust and monitoring behavior in assessing automation-induced “complacency”. In D. A. Vincenzi, M. Mouloua, & P. A. Hancock (Eds.), Human performance, situation awareness, and automation. Current research and trends HPSAA II, volumes I and II (Vol. 1, pp. 54–59). Mahwah, NJ: Lawrence Erlbaum Associates Inc. https://apps.dtic.mil/sti/pdfs/ADA426023.pdf#page=64
- Bagheri, N., & Jamieson, G. A. (2004). The impact of context-related reliability on automation failure detection and scanning behaviour. In 2004 IEEE International Conference on Systems, Man and Cybernetics (Vol. 1, pp. 212–217). IEEE. https://doi.org/10.1109/ICSMC.2004.1398299
- Bailey, N. R., & Scerbo, M. W. (2007). Automation-induced complacency for monitoring highly reliable systems: The role of task complexity, system experience, and operator trust. Theoretical Issues in Ergonomics Science, 8(4), 321–348. https://doi.org/10.1080/14639220500535301
- Bainbridge, L. (1983). Ironies of automation. Automatica, 19, 775–779. https://doi.org/10.1016/0005-1098(83)90046-8
- Cegarra, J., & Hoc, J.-M. (2008). The role of algorithm and result comprehensibility of automated scheduling on complacency. Human Factors and Ergonomics in Manufacturing & Service Industries, 18(6), 603–620. https://doi.org/10.1002/hfm.20129
- Cegarra, J., Valéry, B., Avril, E., Calmettes, C., & Navarro, J. (2019). OpenMATB: A multi-attribute task battery promoting task customization, software extendability and experiment replicability. Manuscript submitted for publication.
- Chavaillaz, A., Wastell, D., & Sauer, J. (2016). System reliability, performance and trust in adaptable automation. Applied Ergonomics, 52, 333–342. https://doi.org/10.1016/j.apergo.2015.07.012
- Comstock, J. R., & Arnegard, R. J. (1992). The multi-attribute task battery for human operator workload and strategic behavior research (Tech. Memorandum 104174). NASA Langley Research Center.
- Crocoll, W. M., & Coury, B. G. (1990). Status or recommendation: Selecting the type of information for decision aiding. In Proceedings of the human factors society annual meeting (Vol. 34, No. 19, pp. 1524–1528). Los Angeles, CA: SAGE Publications.
- Cummings, M. L., & Mitchell, P. J. (2007). Operator scheduling strategies in supervisory control of multiple UAVs. Aerospace Science and Technology, 11(4), 339–348. https://doi.org/10.1016/j.ast.2006.10.007
- Ferraro, J. C., & Mouloua, M. (2020). Effects of automation reliability on error detection and attention to auditory stimuli in a multi-tasking environment. Applied Ergonomics, 91, 103303. https://doi.org/10.1016/j.apergo.2020.103303
- Gressgard, L. J., Hansen, K., & Iversen, F. (2013). Automation systems and work process safety: Assessing the significance of human and organizational factors in offshore drilling automation. Journal of Information Technology Management, 24(2), 47. http://jitm.ubalt.edu/XXIV-2/article4.pdf
- Hoc, J.-M. (2000). La relation homme-machine en situation dynamique [The human-machine relationship in dynamic situations]. Revue d'Intelligence Artificielle, 14(1–2), 55–71. http://jeanmichelhoc.free.fr/pdf/Hoc%202000b.pdf
- Itoh, M., & Inagaki, T. (2004). A microworld approach to identifying issues of human-automation systems design for supporting operator’s situation awareness. International Journal of Human–Computer Interaction, 17(1), 3–24. https://doi.org/10.1207/s15327590ijhc1701_2
- Jones, L. M. (2007). Effect of repeated function allocation and reliability on automation-induced monitoring inefficiency. Dissertation Abstracts International, 69, 1366. http://purl.fcla.edu/fcla/etd/CFE0001874
- Karpinsky, N. D., Chancey, E. T., Palmer, D. B., & Yamani, Y. (2018). Automation trust and attention allocation in multitasking workspace. Applied Ergonomics, 70, 194–201. https://doi.org/10.1016/j.apergo.2018.03.008
- Lorenz, B., Di Nocera, F., Röttger, S., & Parasuraman, R. (2002). Automated fault management in a simulated space flight microworld. Aviation, Space, and Environmental Medicine, 73(9), 886–897. https://europepmc.org/article/med/12234040
- Manzey, D., Reichenbach, J., & Onnasch, L. (2009). Human performance consequences of automated decision aids in states of fatigue. In Proceedings of the human factors and ergonomics society annual meeting (Vol. 53, No. 4, pp. 329–333). Los Angeles, CA: SAGE Publications.
- Meyer, J. (2004). Conceptual issues in the study of dynamic hazard warnings. Human Factors, 46(2), 196–204. https://doi.org/10.1518/hfes.46.2.196.37335
- Molloy, R., & Parasuraman, R. (1996). Monitoring an automated system for a single failure: Vigilance and task complexity effects. Human Factors, 38(2), 311–322. https://doi.org/10.1177/001872089606380211
- Moray, N., & Inagaki, T. (2000). Attention and complacency. Theoretical Issues in Ergonomics Science, 1(4), 354–365. https://doi.org/10.1080/14639220052399159
- Mosier, K. L., & Skitka, L. J. (1996). Human decision makers and automated decision aids: Made for each other? In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance: Theory and application (pp. 201–220). Erlbaum.
- Navarro, J. (2019). A state of science on highly automated driving. Theoretical Issues in Ergonomics Science, 20(3), 366–396. https://doi.org/10.1080/1463922X.2018.1439544
- Navarro, J., François, M., & Mars, F. (2016). Obstacle avoidance under automated steering: Impact on driving and gaze behaviours. Transportation Research Part F: Traffic Psychology and Behaviour, 43, 315–324. https://doi.org/10.1016/j.trf.2016.09.007
- Navarro, J., Heuveline, L., Avril, E., & Cegarra, J. (2018). Influence of human-machine interactions and task demand on automation selection and use. Ergonomics, 1–39.
- Navarro, J., Osiurak, F., Ovigue, M., Charrier, L., & Reynaud, E. (2019). Highly automated driving impact on drivers’ gaze behaviors during a car-following task. International Journal of Human–Computer Interaction, 35(11), 1008–1017. https://doi.org/10.1080/10447318.2018.1561788
- Oakley, B., Mouloua, M., & Hancock, P. (2003). Effects of reliability on human monitoring performance. In Proceedings of the human factors and ergonomics society annual meeting (Vol. 47, No. 1, pp. 188–190). Los Angeles, CA: SAGE Publications. https://doi.org/10.1177/154193120304700139
- Onnasch, L. (2015). Crossing the boundaries of automation—Function allocation and reliability. International Journal of Human-Computer Studies, 76, 12–21. https://doi.org/10.1016/j.ijhcs.2014.12.004
- Onnasch, L., Wickens, C. D., Li, H., & Manzey, D. (2014). Human performance consequences of stages and levels of automation: An integrated meta-analysis. Human Factors, 56(3), 476–488. https://doi.org/10.1177/0018720813501549
- Parasuraman, R., & Manzey, D. H. (2010). Complacency and bias in human use of automation: An attentional integration. Human Factors, 52(3), 381–410. https://doi.org/10.1177/0018720810376055
- Parasuraman, R., Molloy, R., & Singh, I. L. (1993). Performance consequences of automation-induced ‘complacency’. The International Journal of Aviation Psychology, 3(1), 1–23. https://doi.org/10.1207/s15327108ijap0301_1
- Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230–253. https://doi.org/10.1518/001872097778543886
- Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 30(3), 286–297. https://doi.org/10.1109/3468.844354
- Rovira, E., Cross, A., Leitch, E., & Bonaceto, C. (2014). Displaying contextual information reduces the costs of imperfect decision automation in rapid retasking of ISR assets. Human Factors, 56(6), 1036–1049. https://doi.org/10.1177/0018720813519675
- Rovira, E., McGarry, K., & Parasuraman, R. (2007). Effects of imperfect automation on decision making in a simulated command and control task. Human Factors, 49(1), 76–87. https://doi.org/10.1518/001872007779598082
- Rovira, E., Zinni, M., & Parasuraman, R. (2002). Effects of information and decision automation on multi-task performance. In Proceedings of the human factors and ergonomics society annual meeting (Vol. 46, No. 3, pp. 327–331). Los Angeles, CA: SAGE Publications.
- Santiago-Espada, Y., Myer, R. R., Latorella, K. A., & Comstock, J. R. (2011). The multi-attribute task battery II (MATB-II) software for human performance and workload research: A user’s guide. National Aeronautics and Space Administration, Langley Research Center.
- Sarter, N. B., Woods, D. D., & Billings, C. E. (1997). Automation surprises. In G. Salvendy (Ed.), Handbook of human factors and ergonomics (2nd ed., pp. 1926–1943). Wiley.
- Sarter, N. B., & Schroeder, B. (2001). Supporting decision making and action selection under time pressure and uncertainty: The case of in-flight icing. Human Factors, 43(4), 573–583. https://doi.org/10.1518/001872001775870403
- Singh, I. L., Molloy, R., & Parasuraman, R. (1993). Individual differences in monitoring failures of automation. The Journal of General Psychology, 120(3), 357–373. https://doi.org/10.1080/00221309.1993.9711153
- Singh, I. L., Sharma, H. O., & Parasuraman, R. (2000). Effects of training and automation reliability on monitoring performance in a flight simulation task. In Proceedings of the human factors and ergonomics society annual meeting (Vol. 44, No. 13, pp. 53–56). Los Angeles, CA: SAGE Publications.
- Stanton, N. A., Young, M. S., Walker, G. H., Turner, H., & Randle, S. (2001). Automating the driver’s control tasks. International Journal of Cognitive Ergonomics, 5(3), 221–236. https://doi.org/10.1207/S15327566IJCE0503_5
- Walker, G. H., Stanton, N. A., & Young, M. S. (2001). Where is computing driving cars? International Journal of Human–Computer Interaction, 13(2), 203–229. https://doi.org/10.1207/S15327590IJHC1302_7
- Wesley, D., & Dau, L. A. (2017). Complacency and automation bias in the Enbridge pipeline disaster. Ergonomics in Design, 25(1), 17–22. https://doi.org/10.1177/1064804616652269
- Wickens, C. D., Clegg, B. A., Vieane, A. Z., & Sebok, A. L. (2015). Complacency and automation bias in the use of imperfect automation. Human Factors, 57(5), 728–739. https://doi.org/10.1177/0018720815581940
- Wickens, C. D., & Dixon, S. R. (2007). The benefits of imperfect diagnostic automation: A synthesis of the literature. Theoretical Issues in Ergonomics Science, 8(3), 201–212. https://doi.org/10.1080/14639220500370105
- Wickens, C. D., & Hollands, J. G. (2000). Engineering psychology and human performance (3rd ed.). Prentice Hall.
- Wickens, C. D., Sebok, A., Li, H., Sarter, N., & Gacy, A. M. (2015). Using modeling and simulation to predict operator performance and automation-induced complacency with robotic automation: A case study and empirical validation. Human Factors, 57(6), 959–975. https://doi.org/10.1177/0018720814566454
- Wiener, E. L., & Curry, R. E. (1980). Flight-deck automation: Promises and problems. Ergonomics, 23(10), 995–1011. https://doi.org/10.1080/00140138008924809
- Zybak, S., Scerbo, M. W., & Ashdown, A. (2016). System reliability, trust, and complacency in fetal heart rate monitoring. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1), 1250–1254. https://doi.org/10.1177/1541931213601291