Full Papers

Multiparty conversation via multirobot system: incorporation of nonverbal user responses for continued conversation

Pages 482-491 | Received 11 Aug 2023, Accepted 20 Feb 2024, Published online: 09 Mar 2024

References

  • Tanioka T. Nursing and rehabilitative care of the elderly using humanoid robots. J Med Invest. 2019;66(1.2):19–23. doi: 10.2152/jmi.66.19
  • Pino O, Palestra G, Trevino R, et al. The humanoid robot nao as trainer in a memory program for elderly people with mild cognitive impairment. Int J Soc Robot. 2020;12(1):21–33. doi: 10.1007/s12369-019-00533-y
  • Wood LJ, Zaraki A, Robins B, et al. Developing Kaspar: a humanoid robot for children with autism. Int J Soc Robot. 2019;13:491–508. doi: 10.1007/s12369-019-00563-6
  • Yabuki K, Sumi K. Learning support system for effectively conversing with individuals with autism using a humanoid robot. In: 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC); 2018; Miyazaki, Japan. p. 4266–4270.
  • Nishio T, Yoshikawa Y, Iio T, et al. Actively listening twin robots for long-duration conversation with the elderly. ROBOMECH J. 2021;8(1):18. doi: 10.1186/s40648-021-00205-5
  • Nomura T, Kanda T, Suzuki T, et al. Do people with social anxiety feel anxious about interacting with a robot? AI Soc. 2020;35(2):381–390. doi: 10.1007/s00146-019-00889-9
  • Arimoto T, Yoshikawa Y, Ishiguro H. Multiple-robot conversational patterns for concealing incoherent responses. Int J Soc Robot. 2018.
  • Sugiyama H, Meguro T, Yoshikawa Y, et al. Avoiding breakdown of conversational dialogue through inter-robot coordination. In: Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems; 2018; Stockholm, Sweden. p. 2256–2258.
  • Saunderson S, Nejat G. How robots influence humans: a survey of nonverbal communication in social human–robot interaction. Int J Soc Robot. 2019;11(4):575–608. doi: 10.1007/s12369-019-00523-0
  • Lee S, Lee G, Kim S, et al. Expressing personalities of conversational agents through visual and verbal feedback. Electronics. 2019;8(7):794.
  • Li Y, Ishi CT, Inoue K, et al. Expressing reactive emotion based on multimodal emotion recognition for natural conversation in human–robot interaction. Adv Robot. 2019;33(20):1030–1041. doi: 10.1080/01691864.2019.1667872
  • Kucherenko T. Data driven non-verbal behavior generation for humanoid robots. In: Proceedings of the 20th ACM International Conference on Multimodal Interaction; 2018; Boulder, CO, USA. p. 520–523.
  • Huang HH, Kimura S, Kuwabara K, et al. Generation of head movements of a robot using multimodal features of peer participants in group discussion conversation. Multimodal Technologies and Interaction. 2020;4(2). doi: 10.3390/mti4020015
  • Karatas N, Yoshikawa S, De Silva PRS, et al. Namida: Multiparty conversation based driving agents in futuristic vehicle. In: Human-Computer Interaction: Users and Contexts; 2015; Los Angeles, CA, USA. p. 198–207.
  • Iio T, Yoshikawa Y, Chiba M, et al. Twin-robot dialogue system with robustness against speech recognition failure in human-robot dialogue with elderly people. Appl Sci. 2020;10(4). doi: 10.3390/app10041522
  • Iio T, Yoshikawa Y, Ishiguro H. Retaining human-robots conversation: comparing single robot to multiple robots in a real event. J Adv Comput Intell Intell Inform. 2017;21(4):675–685. doi: 10.20965/jaciii.2017.p0675
  • Iio T, Yoshikawa Y, Ishiguro H. Pre-scheduled turn-taking between robots to make conversation coherent. In: Proceedings of the Fourth International Conference on Human Agent Interaction; 2016; New York, NY, USA. p. 19–25.
  • Khalifa A, Kato T, Yamamoto S. Joining-in-type humanoid robot assisted language learning system. In: Proceedings of the Tenth International Conference on Language Resources and Evaluation (LREC'16); 2016; Portorož, Slovenia. p. 245–249. Available from: https://www.aclweb.org/anthology/L16-1037.
  • Mehrabian A. Communication without words. Psycholog Today. 1968;2:53–55.
  • Argyle M, Cook M, Cramer D. Gaze and mutual gaze. Br J Psychiatry. 1994;165(6):848–850. doi: 10.1017/S0007125000073980
  • Andrist S, Tan XZ, Gleicher M, et al. Conversational gaze aversion for humanlike robots. In: Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction, HRI '14. New York, NY, USA: Association for Computing Machinery; 2014. p. 25–32. doi: 10.1145/2559636.2559666.
  • Ephratt M. Linguistic, paralinguistic and extralinguistic speech and silence. J Pragmat. 2011;43(9):2286–2307. doi: 10.1016/j.pragma.2011.03.006. Special issue: Silence as a Pragmatic Phenomenon. Available from: https://www.sciencedirect.com/science/article/pii/S0378216611000853.
  • Simonyan K, Zisserman A. Very deep convolutional networks for large-scale image recognition. In: Proceedings of the 3rd International Conference on Learning Representations, ICLR 2015; 2015; San Diego, CA, USA. Available from: http://arxiv.org/abs/1409.1556.
  • Goodfellow IJ, Erhan D, Luc Carrier P, et al. Challenges in representation learning: A report on three machine learning contests. Neural Netw. 2015;64:59–63. doi: 10.1016/j.neunet.2014.09.005
  • Mishima N, Shinkoda H. The way of positive listening for nursing: toward the communication with each other (in Japanese). Medicus Shuppan; 1999. p. 178–181.
  • Zizzo DJ. Experimenter demand effects in economic experiments. Exp Econ. 2010 Mar;13(1):75–98. doi: 10.1007/s10683-009-9230-z
  • Hayashi K, Sakamoto D, Kanda T, et al. Humanoid robots as a passive-social medium: a field experiment at a train station. In: 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI); 2007; New York, NY, USA. p. 137–144.
  • Xiao Z, Zhou MX, Chen W, et al. If I hear you correctly: building and evaluating interview chatbots with active listening skills. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems; 2020; Honolulu, HI, USA. p. 1–14.
  • Rincon JA, Costa A, Novais P, et al. A new emotional robot assistant that facilitates human interaction and persuasion. Knowl Inf Syst. 2019;60(1):363–383. doi: 10.1007/s10115-018-1231-9
