Original Article

Multisensory integration effect of humanoid robot appearance and voice on users’ affective preference and visual attention

Pages 2387-2406 | Received 20 Jan 2022, Accepted 13 Sep 2022, Published online: 19 Sep 2022

ABSTRACT

Appearance and voice are essential factors shaping users' affective preferences for humanoid robots. However, little is known about how the appearance and voice of humanoid robots jointly influence users' affective preferences and visual attention. We conducted a mixed-design eye-tracking experiment to examine the multisensory integration effect of humanoid robot appearances and voices on users' affective preferences and visual attention. The results showed that combinations of affectively preferred voices and preferred appearances attracted higher affective preference and shorter average fixation durations, whereas combinations of non-preferred voices and preferred appearances captured lower affective preference and longer fixation durations. These results suggest that congruent combinations of preferred voices and appearances may produce a facilitation effect on users' affective preference and the depth of visual attention through audiovisual complementarity, whereas incongruent combinations of non-preferred voices and preferred appearances may produce an attenuation effect, resulting in lower affective preference and deeper retrieval of visual information. In addition, the head attracted the most visual attention regardless of voice condition. This paper deepens the understanding of the multisensory integration effect on users' affective preferences and visual attention and provides practical implications for designing humanoid robots that satisfy users' affective preferences.

Acknowledgments

We thank Vincent G. Duffy of Purdue University, who helped us proofread the language of this paper. We are also grateful to all participants in the experiment. The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by the National Natural Science Foundation of China [grant numbers 72071035 and 71771045].
