Enhancing component-specific trust with consumer automated systems through humanness design

Pages 291-302 | Received 29 Nov 2021, Accepted 06 May 2022, Published online: 27 May 2022
