Multisensory integration effect of humanoid robot appearance and voice on users’ affective preference and visual attention

Pages 2387–2406 | Received 20 Jan 2022, Accepted 13 Sep 2022, Published online: 19 Sep 2022

References

  • Amso, D., S. Haas, and J. Markant. 2014. “An Eye-Tracking Investigation of Developmental Change in Bottom-Up Attention Orienting to Faces in Cluttered Natural Scenes.” PLoS ONE 9 (1): e85701. doi:10.1371/journal.pone.0085701.
  • Ansani, A., M. Marini, F. D’Errico, and I. Poggi. 2020. “How Soundtracks Shape What We See: Analyzing the Influence of Music on Visual Scenes Through Self-Assessment, Eye Tracking, and Pupillometry.” Frontiers in Psychology 11: 2242. doi:10.3389/fpsyg.2020.02242.
  • Bartneck, C., D. Kulic, E. Croft, and S. Zoghbi. 2009. “Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots.” International Journal of Social Robotics 1 (1): 71–81. doi:10.1007/s12369-008-0001-3.
  • Behe, B. K., M. Bae, P. T. Huddleston, and L. Sage. 2015. “The Effect of Involvement on Visual Attention and Product Choice.” Journal of Retailing and Consumer Services 24: 10–21. doi:10.1016/j.jretconser.2015.01.002.
  • Belin, P. 2006. “Voice Processing in Human and non-Human Primates.” Philosophical Transactions of the Royal Society B: Biological Sciences 361 (1476): 2091–2107. doi:10.1098/rstb.2006.1933.
  • Bennett, C. C., and S. Sabanovic. 2015. “The Effects of Culture and Context on Perceptions of Robotic Facial Expressions.” Interaction Studies. Social Behaviour and Communication in Biological and Artificial Systems 16 (2): 272–302. doi:10.1075/is.16.2.11ben.
  • Beran, T. N., A. Ramirez-Serrano, R. Kuzyk, M. Fior, and S. Nugent. 2011. “Understanding how Children Understand Robots: Perceived Animism in Child-Robot Interaction.” International Journal of Human-Computer Studies 69 (7–8): 539–550. doi:10.1016/j.ijhcs.2011.04.003.
  • Bernotat, J., F. Eyssel, and J. Sachse. 2021. “The (Fe)Male Robot: How Robot Body Shape Impacts First Impressions and Trust Towards Robots.” International Journal of Social Robotics 13 (3): 477–489. doi:10.1007/s12369-019-00562-7.
  • Berthoz, A., and I. Viaud-Delmon. 1999. “Multisensory Integration in Spatial Orientation.” Current Opinion in Neurobiology 9 (6): 708–712. doi:10.1016/S0959-4388(99)00041-0.
  • de Boer, M. J., D. Başkent, and F. W. Cornelissen. 2020. “Eyes on Emotion: Dynamic Gaze Allocation During Emotion Perception from Speech-Like Stimuli.” Multisensory Research 34 (1): 17–47. doi:10.1163/22134808-bja10029.
  • Bossi, F., C. Willemse, J. Cavazza, S. Marchesi, V. Murino, and A. Wykowska. 2020. “The Human Brain Reveals Resting State Activity Patterns That Are Predictive of Biases in Attitudes Toward Robots.” Science Robotics 5 (46): eabb6652. doi:10.1126/scirobotics.abb6652.
  • Broadbent, E., V. Kumar, X. Li, J. Sollers III, R. Q. Stafford, B. A. MacDonald, and D. M. Wegner. 2013. “Robots with Display Screens: A Robot with a More Humanlike Face Display Is Perceived to Have More Mind and a Better Personality.” PLoS ONE 8 (8): e72589. doi:10.1371/journal.pone.0072589.
  • Burleigh, T. J., J. R. Schoenherr, and G. L. Lacroix. 2013. “Does the Uncanny Valley Exist? An Empirical Test of the Relationship Between Eeriness and the Human Likeness of Digitally Created Faces.” Computers in Human Behavior 29 (3): 759–771. doi:10.1016/j.chb.2012.11.021.
  • Cabral, J. P., B. R. Cowan, K. Zibrek, and R. McDonnell. 2017. “The Influence of Synthetic Voice on the Evaluation of a Virtual Character.” In Proceedings of Interspeech 2017.
  • Carrasco, M. 2011. “Visual Attention: The Past 25 Years.” Vision Research 51 (13): 1484–1525. doi:10.1016/j.visres.2011.04.012.
  • Casado-Aranda, L.-A., L. N. Van der Laan, and J. Sanchez-Fernandez. 2018. “Neural Correlates of Gender Congruence in Audiovisual Commercials for Gender-Targeted Products: An fMRI Study.” Human Brain Mapping 39 (11): 4360–4372. doi:10.1002/hbm.24276.
  • Chang, R. C.-S., H.-P. Lu, and P. Yang. 2018. “Stereotypes or Golden Rules? Exploring Likable Voice Traits of Social Robots as Active Aging Companions for Tech-Savvy Baby Boomers in Taiwan.” Computers in Human Behavior 84: 194–210. doi:10.1016/j.chb.2018.02.025.
  • Cheetham, M. 2011. “The Human Likeness Dimension of the ‘Uncanny Valley Hypothesis’: Behavioral and Functional MRI Findings.” Frontiers in Human Neuroscience 5: 126. doi:10.3389/fnhum.2011.00126.
  • Chen, X., L. Han, Z. Pan, Y. Luo, and P. Wang. 2016. “Influence of Attention on Bimodal Integration During Emotional Change Decoding: ERP Evidence.” International Journal of Psychophysiology 106: 14–20. doi:10.1016/j.ijpsycho.2016.05.009.
  • Cohen, J. 1988. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates.
  • Corbetta, M., and G. L. Shulman. 2002. “Control of Goal-Directed and Stimulus-Driven Attention in the Brain.” Nature Reviews Neuroscience 3 (3): 201–215. doi:10.1038/nrn755.
  • Cornelio, P., C. Velasco, and M. Obrist. 2021. “Multisensory Integration as per Technological Advances: A Review.” Frontiers in Neuroscience 15: 652611. doi:10.3389/fnins.2021.652611.
  • Coronado, E., G. Venture, and N. Yamanobe. 2021. “Applying Kansei/Affective Engineering Methodologies in the Design of Social and Service Robots: A Systematic Review.” International Journal of Social Robotics 13 (5): 1161–1171. doi:10.1007/s12369-020-00709-x.
  • Costa, S., F. Soares, and C. Santos. 2013. “Facial Expressions and Gestures to Convey Emotions with a Humanoid Robot.” In Social Robotics, 542–551. Springer International Publishing. doi:10.1007/978-3-319-02675-6_54
  • Dou, X., C. F. Wu, K. C. Lin, S. Z. Gan, and T. M. Tseng. 2021. “Effects of Different Types of Social Robot Voices on Affective Evaluations in Different Application Fields.” International Journal of Social Robotics 13 (4): 615–628. doi:10.1007/s12369-020-00654-9.
  • Driver, J., and C. Spence. 1998. “Crossmodal Attention.” Current Opinion in Neurobiology 8 (2): 245–253. doi:10.1016/S0959-4388(98)80147-5.
  • Edwards, C., A. Edwards, B. Stoll, X. Lin, and N. Massey. 2019. “Evaluations of an Artificial Intelligence Instructor’s Voice: Social Identity Theory in Human-Robot Interactions.” Computers in Human Behavior 90: 357–362. doi:10.1016/j.chb.2018.08.027.
  • Eyssel, F., D. Kuchenbrandt, S. Bobinger, L. de Ruiter, and F. Hegel. 2012. “If You Sound Like Me, You Must Be More Human.” In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12).
  • Faul, F., E. Erdfelder, A.-G. Lang, and A. Buchner. 2007. “G*Power 3: A Flexible Statistical Power Analysis Program for the Social, Behavioral, and Biomedical Sciences.” Behavior Research Methods 39 (2): 175–191. doi:10.3758/BF03193146.
  • Ramos Gameiro, R., K. Kaspar, S. U. König, S. Nordholt, and P. König. 2017. “Exploration and Exploitation in Natural Viewing Behavior.” Scientific Reports 7 (1): 2311. doi:10.1038/s41598-017-02526-1.
  • Gan, Y., Y. Ji, S. Jiang, X. Liu, Z. Feng, Y. Li, and Y. Liu. 2021. “Integrating Aesthetic and Emotional Preferences in Social Robot Design: An Affective Design Approach with Kansei Engineering and Deep Convolutional Generative Adversarial Network.” International Journal of Industrial Ergonomics 83: 103128. doi:10.1016/j.ergon.2021.103128.
  • Gao, C., C. E. Weber, D. H. Wedell, and S. V. Shinkareva. 2020. “An fMRI Study of Affective Congruence Across Visual and Auditory Modalities.” Journal of Cognitive Neuroscience 32 (7): 1251–1262. doi:10.1162/jocn_a_01553.
  • Gerdes, A. B. M., G. W. Alpers, H. Braun, S. Kohler, U. Nowak, and L. Treiber. 2021. “Emotional Sounds Guide Visual Attention to Emotional Pictures: An Eye-Tracking Study with Audio-Visual Stimuli.” Emotion 21 (4): 679–692. doi:10.1037/emo0000729.
  • Guo, F., Y. Ding, W. Liu, C. Liu, and X. Zhang. 2016. “Can Eye-Tracking Data Be Measured to Assess Product Design? Visual Attention Mechanism Should Be Considered.” International Journal of Industrial Ergonomics 53: 229–235. doi:10.1016/j.ergon.2015.12.001.
  • Guo, F., M. M. Li, J. H. Chen, and V. G. Duffy. 2022. “Evaluating Users’ Preference for the Appearance of Humanoid Robots via Event-Related Potentials and Spectral Perturbations.” Behaviour & Information Technology 41 (7): 1381–1397. doi:10.1080/0144929X.2021.1876763.
  • Hansen, K., M. C. Steffens, T. Rakic, and H. Wiese. 2017. “When Appearance Does Not Match Accent: Neural Correlates of Ethnicity-Related Expectancy Violations.” Social Cognitive and Affective Neuroscience 12 (3): 507–515. doi:10.1093/scan/nsw148.
  • Hastie, H., K. Lohan, A. Deshmukh, F. Broz, and R. Aylett. 2017. “The Interaction Between Voice and Appearance in the Embodiment of a Robot Tutor.” In Social Robotics: ICSR 2017, edited by A. Kheddar, E. Yoshida, S. S. Ge, K. Suzuki, J. J. Cabibihan, F. Eyssel, and H. He, Vol. 10652, 64–74. doi:10.1007/978-3-319-70022-9_7.
  • Hong, A., N. Lunscher, T. Hu, Y. Tsuboi, X. Zhang, S. Franco Dos Reis Alves, G. Nejat, and B. Benhabib. 2021. “A Multimodal Emotional Human–Robot Interaction Architecture for Social Robots Engaged in Bidirectional Communication.” IEEE Transactions on Cybernetics 51 (12): 5954–5968. doi:10.1109/TCYB.2020.2974688.
  • Husic-Mehmedovic, M., I. Omeragic, Z. Batagelj, and T. Kolar. 2017. “Seeing Is Not Necessarily Liking: Advancing Research on Package Design with Eye-Tracking.” Journal of Business Research 80: 145–154. doi:10.1016/j.jbusres.2017.04.019.
  • Johansson-Pajala, R.-M., K. Thommes, J. A. Hoppe, O. Tuisku, L. Hennala, S. Pekkarinen, H. Melkas, and C. Gustafsson. 2020. “Care Robot Orientation: What, Who and How? Potential Users’ Perceptions.” International Journal of Social Robotics 12 (5): 1103–1117. doi:10.1007/s12369-020-00619-y.
  • Just, M. A., and P. A. Carpenter. 1976. “Eye Fixations and Cognitive Processes.” Cognitive Psychology 8 (4): 441–480. doi:10.1016/0010-0285(76)90015-3.
  • Kawasaki, M., and Y. Yamaguchi. 2012. “Individual Visual Working Memory Capacities and Related Brain Oscillatory Activities are Modulated by Color Preferences.” Frontiers in Human Neuroscience 6: 318. doi:10.3389/fnhum.2012.00318.
  • Klasen, M., Y.-H. Chen, and K. Mathiak. 2012. “Multisensory Emotions: Perception, Combination and Underlying Neural Processes.” Reviews in the Neurosciences 23 (4): 381–392. doi:10.1515/revneuro-2012-0040.
  • Klüber, K., and L. Onnasch. 2022. “Appearance Is Not Everything – Preferred Feature Combinations for Care Robots.” Computers in Human Behavior 128: 107128. doi:10.1016/j.chb.2021.107128.
  • Ko, S., X. Liu, J. Mamros, E. Lawson, H. Swaim, C. Yao, and M. Jeon. 2020. “The Effects of Robot Appearances, Voice Types, and Emotions on Emotion Perception Accuracy and Subjective Perception on Robots.” In Lecture Notes in Computer Science, 174–193. Springer International Publishing. doi:10.1007/978-3-030-60117-1_13.
  • Koelewijn, T., A. Bronkhorst, and J. Theeuwes. 2010. “Attention and the Multiple Stages of Multisensory Integration: A Review of Audiovisual Studies.” Acta Psychologica 134 (3): 372–384. doi:10.1016/j.actpsy.2010.03.010.
  • Kreifelts, B., T. Ethofer, W. Grodd, M. Erb, and D. Wildgruber. 2007. “Audiovisual Integration of Emotional Signals in Voice and Face: An Event-Related fMRI Study.” NeuroImage 37 (4): 1445–1456. doi:10.1016/j.neuroimage.2007.06.020.
  • Krucien, N., M. Ryan, and F. Hermens. 2017. “Visual Attention in Multi-Attributes Choices: What Can Eye-Tracking Tell Us?” Journal of Economic Behavior & Organization 135: 251–267. doi:10.1016/j.jebo.2017.01.018.
  • Kühne, K., M. H. Fischer, and Y. Zhou. 2020. “The Human Takes It All: Humanlike Synthesized Voices Are Perceived as Less Eerie and More Likable. Evidence from a Subjective Ratings Study.” Frontiers in Neurorobotics 14: 593732. doi:10.3389/fnbot.2020.593732.
  • Kuo, J.-Y., C.-H. Chen, S. Koyama, and D. Chang. 2021. “Investigating the Relationship Between Users’ eye Movements and Perceived Product Attributes in Design Concept Evaluation.” Applied Ergonomics 94: 103393. doi:10.1016/j.apergo.2021.103393.
  • Kwak, S. S., J. S. Kim, and J. J. Choi. 2017. “The Effects of Organism- Versus Object-Based Robot Design Approaches on the Consumer Acceptance of Domestic Robots.” International Journal of Social Robotics 9 (3): 359–377. doi:10.1007/s12369-016-0388-1.
  • Lee, I. E., C. V. Latchoumane, and J. Jeong. 2017. “The Effect of Calendula Officinalis on Oxidative Stress and Bone Loss in Experimental Periodontitis.” Frontiers in Physiology 8: 440. doi:10.3389/fphys.2017.00440.
  • Li, M., F. Guo, Z. Ren, and V. G. Duffy. 2022. “A Visual and Neural Evaluation of the Affective Impression on Humanoid Robot Appearances in Free Viewing.” International Journal of Industrial Ergonomics 88: 103159. doi:10.1016/j.ergon.2021.103159.
  • Liu, Y., F. Li, L. H. Tang, Z. Lan, J. Cui, O. Sourina, and C.-H. Chen. 2019. “Detection of Humanoid Robot Design Preferences Using EEG and Eye Tracker.” In 2019 International Conference on Cyberworlds (CW), Kyoto, Japan.
  • Martinez-Miranda, J., H. Perez-Espinosa, I. Espinosa-Curiel, H. Avila-George, and J. Rodriguez-Jacobo. 2018. “Age-based Differences in Preferences and Affective Reactions Towards a Robot’s Personality During Interaction.” Computers in Human Behavior 84: 245–257. doi:10.1016/j.chb.2018.02.039.
  • Matthews, G., P. A. Hancock, J. Lin, A. R. Panganiban, L. E. Reinerman-Jones, J. L. Szalma, and R. W. Wohleber. 2021. “Evolution and Revolution: Personality Research for the Coming World of Robots, Artificial Intelligence, and Autonomous Systems.” Personality and Individual Differences 169: 109969. doi:10.1016/j.paid.2020.109969.
  • Mesfin, G., N. Hussain, A. Covaci, and G. Ghinea. 2019. “Using Eye Tracking and Heart-Rate Activity to Examine Crossmodal Correspondences QoE in Mulsemedia.” ACM Transactions on Multimedia Computing, Communications, and Applications 15 (2): 1–22. doi:10.1145/3303080.
  • Millet, B., J. Chattah, and S. Ahn. 2021. “Soundtrack Design: The Impact of Music on Visual Attention and Affective Responses.” Applied Ergonomics 93: 103301. doi:10.1016/j.apergo.2020.103301.
  • Mori, M., K. F. MacDorman, and N. Kageki. 2012. “The Uncanny Valley [from the Field].” IEEE Robotics & Automation Magazine 19 (2): 98–100. doi:10.1109/MRA.2012.2192811.
  • Motoki, K., T. Saito, and T. Onuma. 2021. “Eye-tracking Research on Sensory and Consumer Science: A Review, Pitfalls and Future Directions.” Food Research International 145: 110389. doi:10.1016/j.foodres.2021.110389.
  • Nagamachi, M. 1995. “Kansei Engineering: A new Ergonomic Consumer-Oriented Technology for Product Development.” International Journal of Industrial Ergonomics 15 (1): 3–11. doi:10.1016/0169-8141(94)00052-5.
  • Naneva, S., M. Sarda Gou, T. L. Webb, and T. J. Prescott. 2020. “A Systematic Review of Attitudes, Anxiety, Acceptance, and Trust Towards Social Robots.” International Journal of Social Robotics 12 (6): 1179–1201. doi:10.1007/s12369-020-00659-4.
  • Nass, C. I., and S. Brave. 2005. Wired for Speech: How Voice Activates and Advances the Human-Computer Relationship. Cambridge, MA: MIT Press.
  • Nass, C., Y. Moon, B. J. Fogg, B. Reeves, and D. C. Dryer. 1995. “Can Computer Personalities be Human Personalities?” International Journal of Human-Computer Studies 43 (2): 223–239. doi:10.1006/ijhc.1995.1042.
  • Niculescu, A., B. van Dijk, A. Nijholt, H. Li, and S. L. See. 2013. “Making Social Robots More Attractive: The Effects of Voice Pitch, Humor and Empathy.” International Journal of Social Robotics 5 (2): 171–191. doi:10.1007/s12369-012-0171-x.
  • Oh, Y. H., and D. Y. Ju. 2020. “Age-Related Differences in Fixation Pattern on a Companion Robot.” Sensors 20 (13): 3807. doi:10.3390/s20133807.
  • Okafuji, Y., J. Baba, J. Nakanishi, I. Kuramoto, K. Ogawa, Y. Yoshikawa, and H. Ishiguro. 2020. “Can a Humanoid Robot Continue to Draw Attention in an Office Environment?” Advanced Robotics 34 (14): 931–946. doi:10.1080/01691864.2020.1769724.
  • Özcan, E., G. C. Cupchik, and H. N. J. Schifferstein. 2017. “Auditory and Visual Contributions to Affective Product Quality.” International Journal of Design 11 (1): 35–50.
  • Paetzel-Prüsmann, M., G. Perugia, and G. Castellano. 2021. “The Influence of Robot Personality on the Development of Uncanny Feelings.” Computers in Human Behavior 120: 106756. doi:10.1016/j.chb.2021.106756.
  • Pan, Z., X. Liu, Y. Luo, and X. Chen. 2017. “Emotional Intensity Modulates the Integration of Bimodal Angry Expressions: ERP Evidence.” Frontiers in Neuroscience 11: 349. doi:10.3389/fnins.2017.00349.
  • Park, D., S. Park, W. Kim, I. Rhiu, and M. H. Yun. 2019. “A Comparative Study on Subjective Feeling of Engine Acceleration Sound by Automobile Types.” International Journal of Industrial Ergonomics 74: 102843. doi:10.1016/j.ergon.2019.102843.
  • Piwek, L., F. Pollick, and K. Petrini. 2015. “Audiovisual Integration of Emotional Signals from Others’ Social Interactions.” Frontiers in Psychology 6: 611. doi:10.3389/fpsyg.2015.00611.
  • Rhie, Y. L., G. W. Kim, and M. H. Yun. 2019. “Exploring the Relationship Between Psychoacoustic and Affective Variables in a Shutter-Press Sound.” Human Factors and Ergonomics in Manufacturing & Service Industries 29 (4): 372–386. doi:10.1002/hfm.20794.
  • Robinson, F. A., O. Bown, and M. Velonaki. 2022. “Designing Sound for Social Robots: Candidate Design Principles.” International Journal of Social Robotics. doi:10.1007/s12369-022-00891-0.
  • Roesler, E., D. Manzey, and L. Onnasch. 2021. “A Meta-Analysis on the Effectiveness of Anthropomorphism in Human-Robot Interaction.” Science Robotics 6 (58): eabj5425. doi:10.1126/scirobotics.abj5425.
  • Rohe, T., A. C. Ehlis, and U. Noppeney. 2019. “The Neural Dynamics of Hierarchical Bayesian Causal Inference in Multisensory Perception.” Nature Communications 10 (1): 1907. doi:10.1038/s41467-019-09664-2.
  • Rosenthal-von der Puetten, A. M., and N. C. Kraemer. 2014. “How Design Characteristics of Robots Determine Evaluation and Uncanny Valley Related Responses.” Computers in Human Behavior 36: 422–439. doi:10.1016/j.chb.2014.03.066.
  • Sarigul, B., I. Saltik, B. Hokelek, and B. A. Urgen. 2020. “Does the Appearance of an Agent Affect How We Perceive His/Her Voice?” In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction.
  • Seaborn, K., N. P. Miyake, P. Pennefather, and M. Otake-Matsuura. 2022. “Voice in Human–Agent Interaction.” ACM Computing Surveys 54 (4): 1–43. doi:10.1145/3386867.
  • Shen, H., and J. Sengupta. 2014. “The Crossmodal Effect of Attention on Preferences: Facilitation Versus Impairment.” Journal of Consumer Research 40 (5): 885–903. doi:10.1086/673261.
  • Song, S., J. Baba, J. Nakanishi, Y. Yoshikawa, and H. Ishiguro. 2020. “Mind the Voice! Effect of Robot Voice Pitch, Robot Voice Gender, and User Gender on User Perception of Teleoperated Robots.” In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA. doi:10.1145/3334480.3382988.
  • Song, Y., and Y. Luximon. 2021. “The Face of Trust: The Effect of Robot Face Ratio on Consumer Preference.” Computers in Human Behavior 116: 106620. doi:10.1016/j.chb.2020.106620.
  • Song, Y., A. Luximon, and Y. Luximon. 2021. “The Effect of Facial Features on Facial Anthropomorphic Trustworthiness in Social Robots.” Applied Ergonomics 94: 103420. doi:10.1016/j.apergo.2021.103420.
  • Spatola, N., B. Kühnlenz, and G. Cheng. 2021. “Perception and Evaluation in Human–Robot Interaction: The Human–Robot Interaction Evaluation Scale (HRIES) – A Multicomponent Approach of Anthropomorphism.” International Journal of Social Robotics. doi:10.1007/s12369-020-00667-4.
  • Spatola, N., and O. A. Wudarczyk. 2021. “Ascribing Emotions to Robots: Explicit and Implicit Attribution of Emotions and Perceived Robot Anthropomorphism.” Computers in Human Behavior 124: 106934. doi:10.1016/j.chb.2021.106934.
  • Spence, C., and M. U. Shankar. 2010. “The Influence of Auditory Cues on the Perception of, and Responses to, Food and Drink.” Journal of Sensory Studies 25 (3): 406–430. doi:10.1111/j.1745-459X.2009.00267.x.
  • Stroessner, S. J., and J. Benitez. 2019. “The Social Perception of Humanoid and Non-Humanoid Robots: Effects of Gendered and Machinelike Features.” International Journal of Social Robotics 11 (2): 305–315. doi:10.1007/s12369-018-0502-7.
  • Tamagawa, R., C. I. Watson, I. H. Kuo, B. A. MacDonald, and E. Broadbent. 2011. “The Effects of Synthesized Voice Accents on User Perceptions of Robots.” International Journal of Social Robotics 3 (3): 253–262. doi:10.1007/s12369-011-0100-4.
  • Tatarian, K., R. Stower, D. Rudaz, M. Chamoux, A. Kappas, and M. Chetouani. 2022. “How Does Modality Matter? Investigating the Synthesis and Effects of Multi-Modal Robot Behavior on Social Intelligence.” International Journal of Social Robotics 14 (4): 893–911. doi:10.1007/s12369-021-00839-w.
  • Thepsoonthorn, C., K.-I. Ogawa, and Y. Miyake. 2021. “The Exploration of the Uncanny Valley from the Viewpoint of the Robot’s Nonverbal Behaviour.” International Journal of Social Robotics 13 (6): 1443–1455. doi:10.1007/s12369-020-00726-w.
  • Tsiourti, C., A. Weiss, K. Wac, and M. Vincze. 2019. “Multimodal Integration of Emotional Signals from Voice, Body, and Context: Effects of (In)Congruence on Emotion Recognition and Attitudes Towards Robots.” International Journal of Social Robotics 11 (4): 555–573. doi:10.1007/s12369-019-00524-z.
  • Unema, P. J. A., S. Pannasch, M. Joos, and B. M. Velichkovsky. 2005. “Time Course of Information Processing During Scene Perception: The Relationship Between Saccade Amplitude and Fixation Duration.” Visual Cognition 12 (3): 473–494. doi:10.1080/13506280444000409.
  • van der Laan, L. N., I. T. C. Hooge, D. T. D. de Ridder, M. A. Viergever, and P. A. M. Smeets. 2015. “Do You Like What You See? The Role of First Fixation and Total Fixation Duration in Consumer Choice.” Food Quality and Preference 39: 46–55. doi:10.1016/j.foodqual.2014.06.015.
  • Vines, B. W., C. L. Krumhansl, M. M. Wanderley, and D. J. Levitin. 2006. “Cross-modal Interactions in the Perception of Musical Performance.” Cognition 101 (1): 80–113. doi:10.1016/j.cognition.2005.09.003.
  • Walters, M. L., D. S. Syrdal, K. L. Koay, K. Dautenhahn, and R. te Boekhorst. 2008. “Human Approach Distances to a Mechanical-Looking Robot with Different Robot Voice Styles.” In RO-MAN 2008 – The 17th IEEE International Symposium on Robot and Human Interactive Communication.
  • Wang, J., P. Antonenko, and K. Dawson. 2020. “Does Visual Attention to the Instructor in Online Video Affect Learning and Learner Perceptions? An Eye-Tracking Analysis.” Computers & Education 146: 103779. doi:10.1016/j.compedu.2019.103779.
  • Wiese, E., G. A. Buzzell, A. Abubshait, and P. J. Beatty. 2018. “Seeing Minds in Others: Mind Perception Modulates low-Level Social-Cognitive Performance and Relates to Ventromedial Prefrontal Structures.” Cognitive, Affective, & Behavioral Neuroscience 18 (5): 837–856. doi:10.3758/s13415-018-0608-2.
  • Wolf, A., K. Ounjai, M. Takahashi, S. Kobayashi, T. Matsuda, and J. Lauwereyns. 2018. “Evaluative Processing of Food Images: A Conditional Role for Viewing in Preference Formation.” Frontiers in Psychology 9: 936. doi:10.3389/fpsyg.2018.00936.
  • Wolfe, J. M., and T. S. Horowitz. 2004. “What Attributes Guide the Deployment of Visual Attention and How Do They Do It?” Nature Reviews Neuroscience 5 (6): 495–501. doi:10.1038/nrn1411.
  • Xu, K. 2019. “First Encounter with Robot Alpha: How Individual Differences Interact with Vocal and Kinetic Cues in Users’ Social Responses.” New Media & Society 21 (11–12): 2522–2547. doi:10.1177/1461444819851479.
  • Yang, S., X. Chang, S. Chen, S. Lin, and W. T. Ross. 2022. “Does Music Really Work? The two-Stage Audiovisual Cross-Modal Correspondence Effect on Consumers’ Shopping Behavior.” Marketing Letters 33 (2): 251–276. doi:10.1007/s11002-021-09582-8.
  • Zajonc, R. B., and H. Markus. 1982. “Affective and Cognitive Factors in Preferences.” Journal of Consumer Research 9 (2): 123–131. doi:10.1086/208905.
  • Zhang, H., M. Liu, W. Li, and W. Sommer. 2020. “Human Voice Attractiveness Processing: Electrophysiological Evidence.” Biological Psychology 150: 107827. doi:10.1016/j.biopsycho.2019.107827.
