Abstract
Communicating effectively with autonomous vehicles requires contextualized visual and auditory cues that ensure clear message delivery. Evaluating the user experience involves assessing which types of information drivers can safely react to without additional monitoring, and how that information is presented. Validating the visual and auditory cues supports the driver's course of action. This study investigates message types and preferred modalities for driver-to-driver communication via vehicle-to-everything (V2X) technology in advanced driver assistance systems, for both autonomous and manual driving, and proposes efficient ways of responding to event situations. Four conditions, a baseline and three message types carrying different information, were proposed to investigate the information drivers require. Results indicate that providing notifications during autonomous driving is more helpful and imposes less workload than during manual driving. Most notifications were highly visible and easy to recognize. Although behavioral messages enhanced usability in both autonomous and manual driving, providing advice together with behavioral messages is safer during autonomous driving. Because V2X notifications are highly pragmatic, designing their information around future events is vital.
Disclosure statement
No potential conflict of interest was reported by the author(s).
Additional information
Notes on contributors
Gayoung Ryu
Gayoung Ryu is a PhD candidate in the Department of Cognitive Science at Yonsei University, South Korea. Her research interests include usability, UX, HCI and human factors issues in autonomous vehicle environments.
Yeun Joo Lee
Yeun Joo Lee is a Master's student in the Department of Industrial Engineering at Yonsei University, South Korea. Her research focuses on improving user experience in XR, VR, and AR through computer science and HCI methodologies, exploring how to elevate the quality of future immersive technologies.
Yulim Kim
Yulim Kim is a Master’s student in the Department of Industrial Engineering at Yonsei University, South Korea. Her research interests include HCI, human factors, and ergonomics, with a focus on human-automation interaction.
Yong Gu Ji
Yong Gu Ji is a professor in the Department of Industrial Engineering at Yonsei University, where he directs the Interaction Design Laboratory. He received his PhD in Human Factors/HCI from Purdue University. His research interests include usability/UX in future mobility and autonomous vehicles.