
Evaluating Users’ Auditory Affective Preference for Humanoid Robot Voices through Neural Dynamics

Pages 3875–3893 | Received 07 Mar 2022, Accepted 28 Jul 2022, Published online: 15 Aug 2022

References

  • Aftanas, L. I., & Golocheikine, S. A. (2001). Human anterior and frontal midline theta and lower alpha reflect emotionally positive state and internalized attention: High-resolution EEG investigation of meditation. Neuroscience Letters, 310(1), 57–60. https://doi.org/10.1016/S0304-3940(01)02094-8
  • Ariely, D., & Berns, G. S. (2010). Neuromarketing: The hope and hype of neuroimaging in business. Nature Reviews Neuroscience, 11(4), 284–292. https://doi.org/10.1038/nrn2795
  • Aronovitch, C. D. (1976). The voice of personality: Stereotyped judgments and their relation to voice quality and sex of speaker. The Journal of Social Psychology, 99(2), 207–220. https://doi.org/10.1080/00224545.1976.9924774
  • Babel, M., McGuire, G., & King, J. (2014). Towards a more nuanced view of vocal attractiveness. PLoS One, 9(2), e88616. https://doi.org/10.1371/journal.pone.0088616
  • Baird, A., Parada-Cabaleiro, E., Hantke, S., Burkhardt, F., Cummins, N., & Schuller, B. (2018). The perception and analysis of the likeability and human likeness of synthesized speech [Paper presentation]. 19th Annual Conference of the International Speech Communication Association (Interspeech 2018), Hyderabad, India (pp. 2863–2867). https://doi.org/10.21437/Interspeech.2018-1093
  • Bamford, S., Broyd, S. J., Benikos, N., Ward, R., Wiersema, J. R., & Sonuga-Barke, E. (2015). The late positive potential: A neural marker of the regulation of emotion-based approach-avoidance actions? Biological Psychology, 105, 115–123. https://doi.org/10.1016/j.biopsycho.2015.01.009
  • Başar, E., Güntekin, B., & Öniz, A. (2006). Principles of oscillatory brain dynamics and a treatise of recognition of faces and facial expressions. Progress in Brain Research, 159, 43–62. https://doi.org/10.1016/s0079-6123(06)59004-1
  • Beauchemin, M., De Beaumont, L., Vannasing, P., Turcotte, A., Arcand, C., Belin, P., & Lassonde, M. (2006). Electrophysiological markers of voice familiarity. The European Journal of Neuroscience, 23(11), 3081–3086. https://doi.org/10.1111/j.1460-9568.2006.04856.x
  • Bechara, A., Damasio, H., & Damasio, A. R. (2000). Emotion, decision making and the orbitofrontal cortex. Cerebral Cortex, 10(3), 295–307. https://doi.org/10.1093/cercor/10.3.295
  • Behrens, S. I., Egsvang, A. K. K., Hansen, M., & Møllegård-Schroll, A. M. (2018). Gendered robot voices and their influence on trust [Paper presentation]. Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA. https://doi.org/10.1145/3173386.3177009
  • Bekkedal, M. Y., Rossi, J., III, & Panksepp, J. (2011). Human brain EEG indices of emotions: Delineating responses to affective vocalizations by measuring frontal theta event-related synchronization. Neuroscience & Biobehavioral Reviews, 35(9), 1959–1970. https://doi.org/10.1016/j.neubiorev.2011.05.001
  • Belin, P. (2006). Voice processing in human and non-human primates. Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, 361(1476), 2091–2107. https://doi.org/10.1098/rstb.2006.1933
  • Beran, T. N., Ramirez-Serrano, A., Kuzyk, R., Fior, M., & Nugent, S. (2011). Understanding how children understand robots: Perceived animism in child-robot interaction. International Journal of Human-Computer Studies, 69(7–8), 539–550. https://doi.org/10.1016/j.ijhcs.2011.04.003
  • Bradley, M. M., Miccoli, L., Escrig, M. A., & Lang, P. J. (2008). The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology, 45(4), 602–607. https://doi.org/10.1111/j.1469-8986.2008.00654.x
  • Burleigh, T. J., Schoenherr, J. R., & Lacroix, G. L. (2013). Does the uncanny valley exist? An empirical test of the relationship between eeriness and the human likeness of digitally created faces. Computers in Human Behavior, 29(3), 759–771. https://doi.org/10.1016/j.chb.2012.11.021
  • Carretie, L., Martin-Loeches, M., Hinojosa, J. A., & Mercado, F. (2001). Emotion and attention interaction studied through event-related potentials. Journal of Cognitive Neuroscience, 13(8), 1109–1128. https://doi.org/10.1162/089892901753294400
  • Chang, R. C.-S., Lu, H.-P., & Yang, P. (2018). Stereotypes or golden rules? Exploring likable voice traits of social robots as active aging companions for tech-savvy baby boomers in Taiwan. Computers in Human Behavior, 84, 194–210. https://doi.org/10.1016/j.chb.2018.02.025
  • Charest, I., Pernet, C. R., Rousselet, G. A., Quiñones, I., Latinus, M., Fillion-Bilodeau, S., Chartrand, J.-P., & Belin, P. (2009). Electrophysiological evidence for an early processing of human voices. BMC Neuroscience, 10(1), 127. https://doi.org/10.1186/1471-2202-10-127
  • Chaudhry, A. M., Parkinson, J. A., Hinton, E. C., Owen, A. M., & Roberts, A. C. (2009). Preference judgements involve a network of structures within frontal, cingulate and insula cortices. The European Journal of Neuroscience, 29(5), 1047–1055. https://doi.org/10.1111/j.1460-9568.2009.06646.x
  • Chita-Tegmark, M., Lohani, M., & Scheutz, M. (2019, March 14). Gender effects in perceptions of robots and humans with varying emotional intelligence [Paper presentation]. 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea (South). https://doi.org/10.1109/HRI.2019.8673222
  • Cohen, J. (1992). Statistical power analysis. Current Directions in Psychological Science, 1(3), 98–101. https://doi.org/10.1111/1467-8721.ep10768783
  • Cohen, M. X. (2014). Analyzing neural time series data: Theory and practice. MIT Press.
  • Cohen, M. X., & van Gaal, S. (2013). Dynamic interactions between large-scale brain networks predict behavioral adaptation after perceptual errors. Cerebral Cortex, 23(5), 1061–1072. https://doi.org/10.1093/cercor/bhs069
  • Cuthbert, B. N., Schupp, H. T., Bradley, M. M., Birbaumer, N., & Lang, P. J. (2000). Brain potentials in affective picture processing: Covariation with autonomic arousal and affective report. Biological Psychology, 52(2), 95–111. https://doi.org/10.1016/S0301-0511(99)00044-7
  • Delorme, A., & Makeig, S. (2004). EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. Journal of Neuroscience Methods, 134(1), 9–21. https://doi.org/10.1016/j.jneumeth.2003.10.009
  • Dou, X., Wu, C. F., Lin, K. C., Gan, S. Z., & Tseng, T. M. (2020). Effects of different types of social robot voices on affective evaluations in different application fields. International Journal of Social Robotics, 13(4), 615–628. https://doi.org/10.1007/s12369-020-00654-9
  • Dou, X., Wu, C. F., Niu, J., & Pan, K. R. (2022). Effect of voice type and head-light color in social robots for different applications. International Journal of Social Robotics, 14(1), 229–244. https://doi.org/10.1007/s12369-021-00782-w
  • Edwards, C., Edwards, A., Stoll, B., Lin, X., & Massey, N. (2019). Evaluations of an artificial intelligence instructor’s voice: Social identity theory in human-robot interactions. Computers in Human Behavior, 90, 357–362. https://doi.org/10.1016/j.chb.2018.08.027
  • Eyssel, F., Kuchenbrandt, D., Bobinger, S., De Ruiter, L., & Hegel, F. (2012, January 1). If you sound like me, you must be more human [Paper presentation]. Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction - HRI '12, Boston, MA, USA. https://doi.org/10.1145/2157689.2157717
  • Eyssel, F., Kuchenbrandt, D., Hegel, F., & De Ruiter, L. (2012). Activating elicited agent knowledge: How robot and user features shape the perception of social robots. In 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication (pp. 851–857). IEEE. https://doi.org/10.1109/ROMAN.2012.6343858
  • Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191. https://doi.org/10.3758/bf03193146
  • Feil-Seifer, D., & Mataric, M. J. (2005, June 28–July 1). Defining socially assistive robotics. In 9th International Conference on Rehabilitation Robotics, 2005. ICORR 2005 (pp. 465–468). IEEE. https://doi.org/10.1109/ICORR.2005.1501143
  • Feinberg, D. R., DeBruine, L. M., Jones, B. C., & Perrett, D. I. (2008). The role of femininity and averageness of voice pitch in aesthetic judgments of women’s voices. Perception, 37(4), 615–623. https://doi.org/10.1068/p5514
  • Friedman, D., Cycowicz, Y. M., & Gaeta, H. (2001). The novelty P3: An event-related brain potential (ERP) sign of the brain’s evaluation of novelty. Neuroscience & Biobehavioral Reviews, 25(4), 355–373. https://doi.org/10.1016/S0149-7634(01)00019-7
  • Garcia-Sierra, A., Rivera-Gaxiola, M., Percaccio, C. R., Conboy, B. T., Romo, H., Klarman, L., Ortiz, S., & Kuhl, P. K. (2011). Bilingual language learning: An ERP study relating early brain responses to speech, language input, and later word production. Journal of Phonetics, 39(4), 546–557. https://doi.org/10.1016/j.wocn.2011.07.002
  • Guo, F., Ding, Y., Wang, T., Liu, W., & Jin, H. (2016). Applying event related potentials to evaluate user preferences toward smartphone form design. International Journal of Industrial Ergonomics, 54, 57–64. https://doi.org/10.1016/j.ergon.2016.04.006
  • Guo, F., Hu, M., Duffy, V. G., Shao, H., & Ren, Z. (2021). Kansei evaluation for group of users: A data-driven approach using dominance-based rough sets. Advanced Engineering Informatics, 47, 101241. https://doi.org/10.1016/j.aei.2020.101241
  • Guo, F., Li, M. M., Chen, J. H., & Duffy, V. G. (2022). Evaluating users’ preference for the appearance of humanoid robots via event-related potentials and spectral perturbations. Behaviour & Information Technology, 41(7), 1381–1397. https://doi.org/10.1080/0144929X.2021.1876763
  • Guo, F., Wang, X.-S., Liu, W.-L., & Ding, Y. (2018). Affective preference measurement of product appearance based on event-related potentials. Cognition, Technology & Work, 20(2), 299–308. https://doi.org/10.1007/s10111-018-0463-5
  • Hajcak, G., & Olvet, D. M. (2008). The persistence of attention to emotion: Brain potentials during and after picture presentation. Emotion, 8(2), 250–255. https://doi.org/10.1037/1528-3542.8.2.250
  • Handy, T. C., Smilek, D., Geiger, L., Liu, C., & Schooler, J. W. (2010). ERP evidence for rapid hedonic evaluation of logos. Journal of Cognitive Neuroscience, 22(1), 124–138. https://doi.org/10.1162/jocn.2008.21180
  • Harris, K. D., & Mrsic-Flogel, T. D. (2013). Cortical connectivity and sensory coding. Nature, 503(7474), 51–58. https://doi.org/10.1038/nature12654
  • Hu, L., Xiao, P., Zhang, Z. G., Mouraux, A., & Iannetti, G. D. (2014). Single-trial time–frequency analysis of electrocortical signals: Baseline correction and beyond. NeuroImage, 84, 876–887. https://doi.org/10.1016/j.neuroimage.2013.09.055
  • Huffmeijer, R., Bakermans-Kranenburg, M. J., Alink, L. R. A., & van Ijzendoorn, M. H. (2014). Reliability of event-related potentials: The influence of number of trials and electrodes. Physiology & Behavior, 130, 13–22. https://doi.org/10.1016/j.physbeh.2014.03.008
  • James, J., Watson, C. I., & Macdonald, B. (2018, August 1). Artificial empathy in social robots: An analysis of emotions in speech [Paper presentation]. 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China. https://doi.org/10.1109/ROMAN.2018.8525652
  • Kang, J.-H., Kim, S. J., Cho, Y. S., & Kim, S.-P. (2015). Modulation of alpha oscillations in the human EEG with facial preference. PLoS One, 10(9), e0138153. https://doi.org/10.1371/journal.pone.0138153
  • Kawasaki, M., & Yamaguchi, Y. (2012a). Effects of subjective preference of colors on attention-related occipital theta oscillations. NeuroImage, 59(1), 808–814. https://doi.org/10.1016/j.neuroimage.2011.07.042
  • Kawasaki, M., & Yamaguchi, Y. (2012b). Individual visual working memory capacities and related brain oscillatory activities are modulated by color preferences. Frontiers in Human Neuroscience, 6, 318. https://doi.org/10.3389/fnhum.2012.00318
  • Keil, A., Bradley, M. M., Hauk, O., Rockstroh, B., Elbert, T., & Lang, P. J. (2002). Large-scale neural correlates of affective picture processing. Psychophysiology, 39(5), 641–649. https://doi.org/10.1111/1469-8986.3950641
  • Kim, H., Adolphs, R., O'Doherty, J. P., & Shimojo, S. (2007). Temporal isolation of neural processes underlying face preference decisions. Proceedings of the National Academy of Sciences of the United States of America, 104(46), 18253–18258. https://doi.org/10.1073/pnas.0703101104
  • Klimesch, W. (1999). EEG alpha and theta oscillations reflect cognitive and memory performance: A review and analysis. Brain Research Reviews, 29(2–3), 169–195. https://doi.org/10.1016/S0165-0173(98)00056-3
  • Klimesch, W., Sauseng, P., & Hanslmayr, S. (2007). EEG alpha oscillations: The inhibition–timing hypothesis. Brain Research Reviews, 53(1), 63–88. https://doi.org/10.1016/j.brainresrev.2006.06.003
  • Knyazev, G. G., Slobodskoj-Plusnin, J. Y., & Bocharov, A. V. (2009). Event-related delta and theta synchronization during explicit and implicit emotion processing. Neuroscience, 164(4), 1588–1600. https://doi.org/10.1016/j.neuroscience.2009.09.057
  • Kotz, S. A., & Paulmann, S. (2011). Emotion, language, and the brain. Language and Linguistics Compass, 5(3), 108–125. https://doi.org/10.1111/j.1749-818X.2010.00267.x
  • Kühne, K., Fischer, M. H., & Zhou, Y. (2020). The human takes it all: Humanlike synthesized voices are perceived as less eerie and more likable. Evidence from a subjective ratings study. Frontiers in Neurorobotics, 14, 593732. https://doi.org/10.3389/fnbot.2020.593732
  • Kwak, S. S., Kim, J. S., & Choi, J. J. (2017). The effects of organism-versus object-based robot design approaches on the consumer acceptance of domestic robots. International Journal of Social Robotics, 9(3), 359–377. https://doi.org/10.1007/s12369-016-0388-1
  • Lee, E. J., Nass, C., & Brave, S. (2000, April). Can computer-generated speech have gender?: An experimental test of gender stereotype. In M. Tremaine (Ed.), CHI ’00 extended abstracts on human factors in computing systems (pp. 289–290). ACM. https://doi.org/10.1145/633292.633461
  • Levy, D. A., Granot, R., & Bentin, S. (2003). Neural sensitivity to human voices: ERP evidence of task and attentional influences. Psychophysiology, 40(2), 291–305. https://doi.org/10.1111/1469-8986.00031
  • Lin, H., & Liang, J. (2019). Contextual effects of angry vocal expressions on the encoding and recognition of emotional faces: An event-related potential (ERP) study. Neuropsychologia, 132, 107147. https://doi.org/10.1016/j.neuropsychologia.2019.107147
  • Lindsen, J. P., Jones, R., Shimojo, S., & Bhattacharya, J. (2010). Neural components underlying subjective preferential decision making. NeuroImage, 50(4), 1626–1632. https://doi.org/10.1016/j.neuroimage.2010.01.079
  • Liu, T. S., Pinheiro, A. P., Deng, G. H., Nestor, P. G., McCarley, R. W., & Niznikiewicz, M. A. (2012). Electrophysiological insights into processing nonverbal emotional vocalizations. Neuroreport, 23(2), 108–112. https://doi.org/10.1097/WNR.0b013e32834ea757
  • Liu, W., Liang, X., Wang, X., & Guo, F. (2019). The evaluation of emotional experience on webpages: An event-related potential study. Cognition, Technology & Work, 21(2), 317–326. https://doi.org/10.1007/s10111-018-0507-x
  • Luck, S. J. (2014). An introduction to the event-related potential technique (2nd ed.). MIT Press.
  • Makeig, S., Debener, S., Onton, J., & Delorme, A. (2004). Mining event-related brain dynamics. Trends in Cognitive Sciences, 8(5), 204–210. https://doi.org/10.1016/j.tics.2004.03.008
  • McGinn, C., & Torre, I. (2019, March 1). Can you tell the robot by the voice? An exploratory study on the role of voice in the perception of robots [Paper presentation]. 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea (South). https://doi.org/10.1109/HRI.2019.8673305
  • Müller, M., Höfel, L., Brattico, E., & Jacobsen, T. (2010). Aesthetic judgments of music in experts and laypersons—An ERP study. International Journal of Psychophysiology, 76(1), 40–51. https://doi.org/10.1016/j.ijpsycho.2010.02.002
  • Nass, C. I., & Brave, S. (2005). Wired for speech: How voice activates and advances the human-computer relationship. MIT Press.
  • Nass, C. I., Moon, Y., Fogg, B. J., Reeves, B., & Dryer, D. C. (1995). Can computer personalities be human personalities? International Journal of Human-Computer Studies, 43(2), 223–239. https://doi.org/10.1006/ijhc.1995.1042
  • Niculescu, A., van Dijk, B., Nijholt, A., Li, H., & See, S. L. (2013). Making social robots more attractive: The effects of voice pitch, humor and empathy. International Journal of Social Robotics, 5(2), 171–191. https://doi.org/10.1007/s12369-012-0171-x
  • Niculescu, A., van Dijk, B., Nijholt, A., & See, S. L. (2011, November 29–December 1). The influence of voice pitch on the evaluation of a social robot receptionist [Paper presentation]. 2011 International Conference on User Science and Engineering (i-USEr), Selangor, Malaysia. https://doi.org/10.1109/iUSEr.2011.6150529
  • Norman, D. A. (2004). Emotional design: Why we love (or hate) everyday things. Basic Books.
  • O'Connor, J. J. M., & Barclay, P. (2017). The influence of voice pitch on perceptions of trustworthiness across social contexts. Evolution and Human Behavior, 38(4), 506–512. https://doi.org/10.1016/j.evolhumbehav.2017.03.001
  • Oostenveld, R., Fries, P., Maris, E., & Schoffelen, J.-M. (2011). FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data. Computational Intelligence and Neuroscience, 2011, 156869. https://doi.org/10.1155/2011/156869
  • Parvaz, M. A., Macnamara, A., Goldstein, R. Z., & Hajcak, G. (2012). Event-related induced frontal alpha as a marker of lateral prefrontal cortex activation during cognitive reappraisal. Cognitive, Affective & Behavioral Neuroscience, 12(4), 730–740. https://doi.org/10.3758/s13415-012-0107-9
  • Paulmann, S., & Kotz, S. A. (2008). Early emotional prosody perception based on different speaker voices. Neuroreport, 19(2), 209–213. https://doi.org/10.1097/WNR.0b013e3282f454db
  • Pell, M. D., Rothermich, K., Liu, P., Paulmann, S., Sethi, S., & Rigoulot, S. (2015). Preferential decoding of emotion from human non-linguistic vocalizations versus speech prosody. Biological Psychology, 111, 14–25. https://doi.org/10.1016/j.biopsycho.2015.08.008
  • Pinheiro, A. P., Vasconcelos, M., Dias, M., Arrais, N., & Gonçalves, Ó. F. (2015). The music of language: An ERP investigation of the effects of musical training on emotional prosody processing. Brain and Language, 140, 24–34. https://doi.org/10.1016/j.bandl.2014.10.009
  • Polka, L. (1991). Cross-language speech perception in adults: Phonemic, phonetic, and acoustic contributions. The Journal of the Acoustical Society of America, 89(6), 2961–2977. https://doi.org/10.1121/1.400734
  • Prentice, D. A., & Carranza, E. (2002). What women and men should be, shouldn't be, are allowed to be, and don't have to be: The contents of prescriptive gender stereotypes. Psychology of Women Quarterly, 26(4), 269–281. https://doi.org/10.1111/1471-6402.t01-1-00066
  • Přibil, J., Přibilová, A., & Matoušek, J. (2020). GMM-based evaluation of synthetic speech quality using 2D classification in pleasure-arousal scale. Applied Sciences, 11(1), 2. https://doi.org/10.3390/app11010002
  • Ramsøy, T. Z., Michael, N., & Michael, I. (2019). A consumer neuroscience study of conscious and subconscious destination preference. Scientific Reports, 9(1), 1–8. https://doi.org/10.1038/s41598-019-51567-1
  • Ramsøy, T. Z., Skov, M., Christensen, M. K., & Stahlhut, C. (2018). Frontal brain asymmetry and willingness to pay. Frontiers in Neuroscience, 12, 138. https://doi.org/10.3389/fnins.2018.00138
  • Robinson, F. A., Bown, O., & Velonaki, M. (2022). Designing sound for social robots: Candidate design principles. International Journal of Social Robotics. https://doi.org/10.1007/s12369-022-00891-0
  • Robinson, N. L., & Kavanagh, D. J. (2021). A social robot to deliver a psychotherapeutic treatment: Qualitative responses by participants in a randomized controlled trial and future design recommendations. International Journal of Human-Computer Studies, 155, 102700. https://doi.org/10.1016/j.ijhcs.2021.102700
  • Roesler, E., Manzey, D., & Onnasch, L. (2021). A meta-analysis on the effectiveness of anthropomorphism in human-robot interaction. Science Robotics, 6(58), eabj5425. https://doi.org/10.1126/scirobotics.abj5425
  • Rozenkrants, B., & Polich, J. (2008). Affective ERP processing in a visual oddball task: Arousal, valence, and gender. Clinical Neurophysiology, 119(10), 2260–2265. https://doi.org/10.1016/j.clinph.2008.07.213
  • Sammler, D., Grigutsch, M., Fritz, T., & Koelsch, S. (2007). Music and emotion: Electrophysiological correlates of the processing of pleasant and unpleasant music. Psychophysiology, 44(2), 293–304. https://doi.org/10.1111/j.1469-8986.2007.00497.x
  • Sandygulova, A., & O’Hare, G. M. P. (2018). Age- and gender-based differences in children’s interactions with a gender-matching robot. International Journal of Social Robotics, 10(5), 687–700. https://doi.org/10.1007/s12369-018-0472-9
  • Schirmer, A., & Gunter, T. C. (2017). Temporal signatures of processing voiceness and emotion in sound. Social Cognitive and Affective Neuroscience, 12(6), 902–909. https://doi.org/10.1093/scan/nsx020
  • Schirmer, A., & Kotz, S. A. (2006). Beyond the right hemisphere: Brain mechanisms mediating vocal emotional processing. Trends in Cognitive Sciences, 10(1), 24–30. https://doi.org/10.1016/j.tics.2005.11.009
  • Schreibelmayr, S., & Mara, M. (2022). Robot voices in daily life: Vocal human-likeness and application context as determinants of user acceptance. Frontiers in Psychology, 13, 787499. https://doi.org/10.3389/fpsyg.2022.787499
  • Schupp, H. T., Flaisch, T., Stockburger, J., & Junghöfer, M. (2006). Emotion and attention: Event-related brain potential studies. In H. T. Schupp, T. Flaisch, J. Stockburger, & M. Junghöfer (Eds.), Progress in brain research (Vol. 156, pp. 31–51). Elsevier. https://doi.org/10.1016/S0079-6123(06)56002-9
  • Seaborn, K., Miyake, N. P., Pennefather, P., & Otake-Matsuura, M. (2022). Voice in human-agent interaction: A survey. ACM Computing Surveys, 54(4), 1–43. https://doi.org/10.1145/3386867
  • Siegel, M., Breazeal, C., & Norton, M. I. (2009). Persuasive robotics: The influence of robot gender on human behavior. In 2009 IEEE/RSJ international conference on intelligent robots and systems (pp 2563–2568). IEEE. https://doi.org/10.1109/IROS.2009.5354116
  • Song, S., Baba, J., Nakanishi, J., Yoshikawa, Y., & Ishiguro, H. (2020). Mind the voice!: Effect of robot voice pitch, robot voice gender, and user gender on user perception of teleoperated robots. In R. Bernhaupt, F. F. Mueller, D. Verweij, J. Andres, J. McGrenere, A. Cockburn, I. Avellino, A. Goguey, P. Bjørn, S. Zhao, B. Paul Samson, & R. Kocielnik (Eds.), Extended abstracts of the 2020 CHI conference on human factors in computing systems, Honolulu, HI, USA (pp. 1–8). ACM. https://doi.org/10.1145/3334480.3382988
  • Steber, S., König, N., Stephan, F., & Rossi, S. (2020). Uncovering electrophysiological and vascular signatures of implicit emotional prosody. Scientific Reports, 10(1), 5807. https://doi.org/10.1038/s41598-020-62761-x
  • Tamagawa, R., Watson, C. I., Kuo, I. H., MacDonald, B. A., & Broadbent, E. (2011). The effects of synthesized voice accents on user perceptions of robots. International Journal of Social Robotics, 3(3), 253–262. https://doi.org/10.1007/s12369-011-0100-4
  • Tashiro, N., Sugata, H., Ikeda, T., Matsushita, K., Hara, M., Kawakami, K., Kawakami, K., & Fujiki, M. (2019). Effect of individual food preferences on oscillatory brain activity. Brain and Behavior, 9(5), e01262. https://doi.org/10.1002/brb3.1262
  • Tay, B., Jung, Y., & Park, T. (2014). When stereotypes meet robots: The double-edge sword of robot gender and personality in human–robot interaction. Computers in Human Behavior, 38, 75–84. https://doi.org/10.1016/j.chb.2014.05.014
  • Thorpe, S., Fize, D., & Marlot, C. (1996). Speed of processing in the human visual system. Nature, 381(6582), 520–522. https://doi.org/10.1038/381520a0
  • Trovato, G., Ramos, J. G., Azevedo, H., Moroni, A., Magossi, S., Simmons, R., Ishii, H., & Takanishi, A. (2017). A receptionist robot for Brazilian people: Study on interaction involving illiterates. Paladyn, Journal of Behavioral Robotics, 8(1), 1–17. https://doi.org/10.1515/pjbr-2017-0001
  • Türk, O., & Schröder, M. (2010). Evaluation of expressive speech synthesis with voice conversion and copy resynthesis techniques. IEEE Transactions on Audio, Speech, and Language Processing, 18(5), 965–973. https://doi.org/10.1109/TASL.2010.2041113
  • Uusberg, A., Thiruchselvam, R., & Gross, J. J. (2014). Using distraction to regulate emotion: Insights from EEG theta dynamics. International Journal of Psychophysiology, 91(3), 254–260. https://doi.org/10.1016/j.ijpsycho.2014.01.006
  • Walters, M. L., Syrdal, D. S., Koay, K. L., Dautenhahn, K., & Te Boekhorst, R. (2008). Human approach distances to a mechanical-looking robot with different robot voice styles. In RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication (pp. 707–712). IEEE.
  • Wang, R. W. Y., Chang, Y. C., & Chuang, S. W. (2016). EEG spectral dynamics of video commercials: Impact of the narrative on the branding product preference. Scientific Reports, 6, 36487. https://doi.org/10.1038/srep36487
  • Werheid, K., Schacht, A., & Sommer, W. (2007). Facial attractiveness modulates early and late event-related brain potentials. Biological Psychology, 76(1–2), 100–108. https://doi.org/10.1016/j.biopsycho.2007.06.008
  • Winkler, I., Haufe, S., & Tangermann, M. (2011). Automatic classification of artifactual ICA-components for artifact removal in EEG signals. Behavioral and Brain Functions, 7(1), 30. https://doi.org/10.1186/1744-9081-7-30
  • Xu, K. (2019). First encounter with robot Alpha: How individual differences interact with vocal and kinetic cues in users' social responses. New Media & Society, 21(11–12), 2522–2547. https://doi.org/10.1177/1461444819851479
  • Zajonc, R. B., & Markus, H. (1982). Affective and cognitive-factors in preferences. Journal of Consumer Research, 9(2), 123–131. https://doi.org/10.1086/208905
  • Zhang, H., Liu, M., Li, W., & Sommer, W. (2020). Human voice attractiveness processing: Electrophysiological evidence. Biological Psychology, 150, 107827. https://doi.org/10.1016/j.biopsycho.2019.107827
  • Zheng, Y., Compton, B. J., Heyman, G. D., & Jiang, Z. (2020). Vocal attractiveness and voluntarily pitch-shifted voices. Evolution and Human Behavior, 41(2), 170–175. https://doi.org/10.1016/j.evolhumbehav.2020.01.002
  • Zougkou, K., Weinstein, N., & Paulmann, S. (2017). ERP correlates of motivating voices: Quality of motivation and time-course matters. Social Cognitive and Affective Neuroscience, 12(10), 1687–1700. https://doi.org/10.1093/scan/nsx064
