
Sonification Use Cases in Highly Automated Vehicles: Designing and Evaluating Use Cases in Level 4 Automation

Pages 3122–3132 | Received 07 Aug 2022, Accepted 09 Feb 2023, Published online: 26 Feb 2023

References

  • Bergold, J., & Thomas, S. (2012). Participatory research methods: A methodological approach in motion. Historical Social Research, 37(4), 191–222. https://doi.org/10.17169/fqs-13.1.1801
  • Bittner, K., & Spence, I. (2003). Use case modeling. Addison-Wesley Professional.
  • Blattner, M. M., Sumikawa, D. A., & Greenberg, R. M. (1989). Earcons and icons: Their structure and common design principles. Human-Computer Interaction, 4(1), 11–44. https://doi.org/10.1207/s15327051hci0401_1
  • Bosch, E., Oehl, M., Jeon, M., Alvarez, I., Healey, J., Ju, W., & Jallais, C. (2018). Emotional GaRage: A workshop on in-car emotion recognition and regulation [Paper presentation]. Adjunct Proceedings – 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2018, Toronto, ON, Canada (pp. 44–49).
  • Brewster, S. A. (1994). Providing a structured method for integrating non-speech audio into human-computer interfaces [Doctoral dissertation, University of York]. http://www.dcs.gla.ac.uk/~stephen/papers/theses/Brewster_thesis_tagged.pdf
  • Brown, E., & Cairns, P. (2004). A grounded investigation of game immersion. CHI'04 Extended Abstracts on Human Factors in Computing Systems, 1297–1300. Association for Computing Machinery.
  • Choi, J. K., & Ji, Y. G. (2015). Investigating the importance of trust on adopting an autonomous vehicle. International Journal of Human-Computer Interaction, 31(10), 692–702. https://doi.org/10.1080/10447318.2015.1070549
  • Creswell, J. W., & Clark, V. L. P. (2017). Designing and conducting mixed methods research. SAGE Publications.
  • Du, N., Zhou, F., Pulver, E., Tilbury, D., Robert, L. P., Pradhan, A. K., & Yang, X. J. (2020). Predicting takeover performance in conditionally automated driving. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA. https://doi.org/10.1145/3334480.3382963
  • Endsley, M. R. (1988a). Design and evaluation for situation awareness enhancement. Proceedings of the Human Factors Society Annual Meeting. Sage Publications.
  • Endsley, M. R. (1988b). Situation awareness global assessment technique (SAGAT). Proceedings of the IEEE 1988 National Aerospace and Electronics Conference. IEEE.
  • Fakhrhosseini, S. M., Landry, S., Tan, Y. Y., Bhattarai, S., & Jeon, M. (2014). If you’re angry, turn the music on: Music can mitigate anger effects on driving performance [Paper presentation]. AutomotiveUI 2014 – 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, in Cooperation with ACM SIGCHI – Proceedings (September), Seattle, WA, USA.
  • Frison, A. K., Jeon, M., Pfleging, B., Alvarez, I., Riener, A., & Ju, W. (2017). Workshop on user-centered design for automated driving systems [Paper presentation]. AutomotiveUI 2017 – 9th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, Adjunct Proceedings, Oldenburg, Germany (October) (pp. 22–27).
  • Gable, T. M., & Walker, B. N. (2013). Georgia Tech simulator sickness screening protocol (Tech Report GT-PSYC-TR-2013-01). Georgia Tech School of Psychology.
  • Gang, N., Sibi, S., Michon, R., Mok, B., Chafe, C., & Ju, W. (2018). Don’t be alarmed: Sonifying autonomous vehicle perception to increase situation awareness. Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. Association for Computing Machinery.
  • Gaver, W. W. (1986). Auditory icons: Using sound in computer interfaces. Human-Computer Interaction, 2(2), 167–177. https://doi.org/10.1207/s15327051hci0202_3
  • Gold, C., Damböck, D., Lorenz, L., Bengler, K. (2013). Take over! How long does it take to get the driver back into the loop? Proceedings of the Human Factors and Ergonomics Society (pp. 1938–1942). Sage Publications.
  • Hancock, P. A., Kajaks, T., Caird, J. K., Chignell, M. H., Mizobuchi, S., Burns, P. C., Feng, J., Fernie, G. R., Lavallière, M., Noy, I. Y., Redelmeier, D. A., & Vrkljan, B. H. (2020). Challenges to human drivers in increasingly automated vehicles. Human Factors, 62(2), 310–328. https://doi.org/10.1177/0018720819900402
  • Haspiel, J., Du, N., Meyerson, J., Robert, L. P., Tilbury, D., Yang, X. J., & Pradhan, A. K. (2018). Explanations and expectations: Trust building in automated vehicles. ACM/IEEE International Conference on Human-Robot Interaction (pp. 119–120). Association for Computing Machinery.
  • Hester, M., Lee, K., & Dyre, B. P. (2017). “Driver take over”: A preliminary exploration of driver trust and performance in autonomous vehicles. Proceedings of the Human Factors and Ergonomics Society Annual Meeting (pp. 1969–1973).
  • Hoff, K. A., & Bashir, M. (2015). Trust in automation: Integrating empirical evidence on factors that influence trust. Human Factors, 57(3), 407–434. https://doi.org/10.1177/0018720814547570
  • Jeon, M. (2019). Multimodal displays for take-over in level 3 automated vehicles while playing a game. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA ’19). Association for Computing Machinery. https://doi.org/10.1145/3290607.3313056
  • Jeon, M., Roberts, J., Raman, P., Yim, J. B., & Walker, B. N. (2011). Participatory design process for an in-vehicle affect detection and regulation system for various drivers. ASSETS'11: Proceedings of the 13th International ACM SIGACCESS Conference on Computers and Accessibility (pp. 271–272).
  • Jian, J.-Y., Bisantz, A. M., & Drury, C. G. (2000). Foundations for an empirically determined scale of trust in automated systems. International Journal of Cognitive Ergonomics, 4(1), 53–71. https://doi.org/10.1207/S15327566IJCE0401_04
  • Köhn, T., Gottlieb, M., Schermann, M., & Krcmar, H. (2019). Improving take-over quality in automated driving by interrupting non-driving tasks. In Proceedings of the 24th International Conference on Intelligent User Interfaces (IUI ’19) 510–517. Association for Computing Machinery. https://doi.org/10.1145/3301275.3302323
  • Kun, A. L., Boll, S., & Schmidt, A. (2016). Shifting gears: User interfaces in the age of autonomous driving. IEEE Pervasive Computing, 15(1), 32–38. https://doi.org/10.1109/MPRV.2016.14
  • Landry, S., Jeon, M., FakhrHosseini, M., & Tascarella, D. (2016). Listen to your drive: An in-vehicle sonification prototyping tool for driver state and performance data. Adjunct Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’16) (pp. 21–26). Association for Computing Machinery.
  • Lee, S. C., Nadri, C., Sanghavi, H., & Jeon, M. (2022). Eliciting user needs and design requirements for user experience in fully automated vehicles. International Journal of Human–Computer Interaction, 38(3), 227–239. https://doi.org/10.1080/10447318.2021.1937875
  • Lee, S. C., Ko, S., Sanghavi, H., & Jeon, M. (2019). Autonomous driving with an agent: Speech style and embodiment [Paper presentation]. Adjunct Proceedings – 11th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2019 (September), Utrecht, Netherlands (pp. 209–214).
  • Lee, S. C., Nadri, C., Sanghavi, H., & Jeon, M. (2020). Exploring user needs and design requirements in fully automated vehicles [Paper presentation]. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA.
  • Litman, T. (2020). Autonomous vehicle implementation predictions: Implications for transport planning. Victoria Transport Policy Institute Report. TRID, Victoria, British Columbia, Canada. https://trid.trb.org/View/1678741
  • Liu, Y. C. (2001). Comparative study of the effects of auditory, visual and multimodality displays on drivers’ performance in advanced traveler information systems. Ergonomics, 44(4), 425–442. https://doi.org/10.1080/00140130010011369
  • Mauney, B. S., & Walker, B. N. (2004). Creating functional and livable soundscapes for peripheral monitoring of dynamic data. In S. Barrass, & P. Vickers (Eds.), Proceedings of ICAD 04. Tenth Meeting of the International Conference on Auditory Display. International Community for Auditory Display. http://hdl.handle.net/1853/50847
  • Meschtscherjakov, A., Krome, S., & McCall, R. (2015). 3rd Workshop on User Experience of Autonomous Driving. https://doi.org/10.13140/RG.2.1.2726.1928
  • Nadri, C., Chan Lee, S., Kekal, S., Li, Y., Li, X., Lautala, P., Nelson, D., & Jeon, M. (2021). Effects of auditory display types and acoustic variables on subjective driver assessment in a rail-crossing context. Transportation Research Record, 2675(9), 1457–1468. https://doi.org/10.1177/03611981211007838
  • National Highway Traffic Safety Administration. (2016). Human factors design guidance for driver-vehicle interfaces (US Department of Transportation, National Highway Traffic Safety Administration Report DOT HS 812 360). https://www.nhtsa.gov/sites/nhtsa.dot.gov/files/documents/812360_humanfactorsdesignguidance.pdf
  • Nees, M. A., & Walker, B. N. (2011). Auditory displays for in-vehicle technologies. Reviews of Human Factors and Ergonomics, 7(1), 58–99. https://doi.org/10.1177/1557234X11410396
  • On-Road Automated Driving (ORAD) Committee. (2014). Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems, SAE International (Ground Vehicle Standard J3016_201401). https://doi.org/10.4271/J3016_201401
  • Petermeijer, S., Bazilinskyy, P., Bengler, K., & de Winter, J. (2017). Take-over again: Investigating multimodal and directional TORs to get the driver back into the loop. Applied Ergonomics, 62, 204–215. https://doi.org/10.1016/j.apergo.2017.02.023
  • Pfadenhauer, M. (2009). At eye level: The expert interview—a talk between expert and quasi-expert. In A. Bogner, B. Littig, & W. Menz (Eds.), Interviewing experts (pp. 81–97). Springer.
  • Riener, A., Jeon, M. P., Alvarez, I., Pfleging, B., Mirnig, A., Tscheligi, M., & Chuang, L. (2016). 1st Workshop on ethically inspired user interfaces for automated driving. Adjunct Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. Association for Computing Machinery.
  • Šabić, E., Henning, D., & MacDonald, J. (2019). Adaptive auditory alerts for smart in-vehicle interfaces. Proceedings of the Human Factors and Ergonomics Society Annual Meeting (vol. 63, pp. 1545–1549). Sage Publications.
  • Sanghavi, H. (2020). Exploring the influence of anger on takeover performance in semi-automated vehicles (Publication Number vt_gsexam: 25723) [Master’s thesis, Virginia Tech]. https://vtechworks.lib.vt.edu/bitstream/handle/10919/98540/Sanghavi_HK_T_2020.pdf?sequence=1&isAllowed=y
  • von Sawitzky, T., Wintersberger, P., Riener, A., Gabbard, J. L. (2019). Increasing trust in fully automated driving: Route indication on an augmented reality head-up display. Proceedings of the 8th ACM International Symposium on Pervasive Displays. Association for Computing Machinery.
  • Wintersberger, P., Riener, A., & Frison, A.-K. (2016). Automated driving system, male, or female driver: Who’d you prefer? Comparative analysis of passengers’ mental conditions, emotional states & qualitative feedback. Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. Association for Computing Machinery. https://doi.org/10.1145/3003715.3005410
