
Team at Your Service: Investigating Functional Specificity for Trust Calibration in Automated Driving with Conversational Agents

Pages 3254-3267 | Received 30 Dec 2022, Accepted 08 May 2023, Published online: 29 Jun 2023

