Full Papers

Speed effects in touching behaviours: impact on perceived relationships in robot-robot interactions

Pages 492-509 | Received 10 Jul 2023, Accepted 14 Feb 2024, Published online: 05 Mar 2024
