Research Article

Human-human-robot interaction: robotic object’s responsive gestures improve interpersonal evaluation in human interaction

Pages 333–359 | Received 08 Feb 2019, Accepted 19 Jan 2020, Published online: 24 Feb 2020

References

  • Bartneck, C., & Forlizzi, J. (2004). A design-centred framework for social human-robot interaction. RO-MAN 2004. 13th IEEE International Workshop on Robot and Human Interactive Communication, Kurashiki, Japan. (IEEE Catalog No. 04TH8759) (pp. 591–594). IEEE.
  • Bartneck, C., Kulić, D., Croft, E., & Zoghbi, S. (2009). Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. International Journal of Social Robotics, 1, 71–81. doi:10.1007/s12369-008-0001-3
  • Beck, A., Cañamero, L., & Bard, K. A. (2010). Towards an affect space for robots to display emotional body language. 19th International symposium in Robot and Human Interactive Communication (pp. 464–469). IEEE, Viareggio, Italy.
  • Bethel, C. L., & Murphy, R. R. (2010). Emotive non-anthropomorphic robots perceived as more calming, friendly, and attentive for victim management. 2010 AAAI Fall Symposium Series, Arlington, Virginia.
  • Bohus, D., & Horvitz, E. (2014). Managing human-robot engagement with forecasts and … um … hesitations. Proceedings of the 16th international conference on Multimodal Interaction (pp. 2–9), Istanbul, Turkey. ACM.
  • Boyatzis, R. E. (1998). Transforming qualitative information: Thematic analysis and code development. Sage, Thousand Oaks, California.
  • Breed, G. (1972). The effect of intimacy: Reciprocity or retreat? British Journal of Social and Clinical Psychology, 11, 135–142. doi:10.1111/bjc.1972.11.issue-2
  • Bretan, M., Hoffman, G., & Weinberg, G. (2015). Emotionally expressive dynamic physical behaviors in robots. International Journal of Human-Computer Studies, 78, 1–16. doi:10.1016/j.ijhcs.2015.01.006
  • Bull, P. E. (2016). Posture & gesture (Vol. 16). Elsevier, Amsterdam, Netherlands.
  • Burgoon, J. K., Buller, D. B., Hale, J. L., & de Turck, M. A. (1984). Relational messages associated with nonverbal behaviors. Human Communication Research, 10, 351–378. doi:10.1111/hcre.1984.10.issue-3
  • Burgoon, J. K., & Hale, J. L. (1987). Validation and measurement of the fundamental themes of relational communication. Communications Monographs, 54, 19–41. doi:10.1080/03637758709390214
  • Burgoon, J. K., & Koper, R. J. (1984). Nonverbal and relational communication associated with reticence. Human Communication Research, 10, 601–626. doi:10.1111/hcre.1984.10.issue-4
  • Clark, H. H. (1996). Using language. Cambridge university press, Cambridge, England.
  • Clark, H. H. (2005). Coordinating with each other in a material world. Discourse Studies, 7, 507–525. doi:10.1177/1461445605054404
  • Correia, F., Mascarenhas, S., Prada, R., Melo, F. S., & Paiva, A. (2018). Group-based emotions in teams of humans and robots. Proceedings of the 2018 ACM/IEEE international conference on Human-Robot Interaction, Chicago, Illinois (pp. 261–269). ACM.
  • Cuijpers, R. H., & Van den Goor, V. J. P. (2017). Turn-taking cue delays in human-robot communication. CEUR Workshop Proceedings, 2059, 19–29.
  • Duffy, B. R. (2003). Anthropomorphism and the social robot. Robotics and Autonomous Systems, 42, 177–190. doi:10.1016/S0921-8890(02)00374-3
  • Eiser, J. R. (1986). Social psychology: Attitudes, cognition and social behaviour. Cambridge University Press, Cambridge, England.
  • Emmers-Sommer, T. M. (2004). The effect of communication quality and quantity indicators on intimacy and relational satisfaction. Journal of Social and Personal Relationships, 21, 399–411. doi:10.1177/0265407504042839
  • Erel, H., Hoffman, G., & Zuckerman, O. (2018). Interpreting non-anthropomorphic robots’ social gestures. Proceedings of the 2018 ACM/IEEE international conference on Human-Robot Interaction, Chicago, Illinois. ACM.
  • Fong, T., Nourbakhsh, I., & Dautenhahn, K. (2003). A survey of socially interactive robots. Robotics and Autonomous Systems, 42(3–4), 143–166. doi:10.1016/S0921-8890(02)00372-X
  • Forlizzi, J. (2007). How robotic products become social products: An ethnographic study of cleaning in the home. Proceedings of the ACM/IEEE international conference on Human-robot interaction (pp. 129–136). ACM.
  • Foster, M. E., Gaschler, A., Giuliani, M., Isard, A., Pateraki, M., & Petrick, R. (2012). Two people walk into a bar: Dynamic multi-party social interaction with a robot agent. Proceedings of the 14th ACM international conference on Multimodal interaction, Santa Monica, California (pp. 3–10). ACM.
  • Fukuda, T., Jung, M. J., Nakashima, M., Arai, F., & Hasegawa, Y. (2004). Facial expressive robotic head system for human-robot communication and its application in home environment. Proceedings of the IEEE, 92, 1851–1865. doi:10.1109/JPROC.2004.835355
  • Galletta, A. (2013). Mastering the semi-structured interview and beyond: From research design to analysis and publication. NYU Press, New York, NY.
  • Gemeinboeck, P., & Saunders, R. (2017). Movement matters: How a robot becomes body. Proceedings of the 4th international conference on Movement Computing, London, United Kingdom (p. 8). ACM.
  • Gibbs, G. R. (2008). Analysing qualitative data. Sage, Thousand Oaks, California.
  • Goffman, E. (1979). Footing. Semiotica, 25, 1–30. doi:10.1515/semi.1979.25.1-2.1
  • Goffman, E. (1981). Forms of talk. University of Pennsylvania Press, Philadelphia, Pennsylvania.
  • Goodwin, C. (1981). Conversational organization: Interaction between speakers and hearers. Academic Press, New York, NY.
  • Groom, V., & Nass, C. (2007). Can robots be teammates?: Benchmarks in human–robot teams. Interaction Studies, 8, 483–500.
  • Han, J., Campbell, N., Jokinen, K., & Wilcock, G. (2012). Investigating the use of non-verbal cues in human-robot interaction with a Nao robot. 2012 IEEE 3rd international conference on Cognitive Infocommunications (CogInfoCom), Kosice, Slovakia (pp. 679–683). IEEE.
  • Heider, F. (2013). The psychology of interpersonal relations. Psychology Press, East Sussex, England.
  • Henkel, Z., Bethel, C. L., Murphy, R. R., & Srinivasan, V. (2014). Evaluation of proxemic scaling functions for social robotics. IEEE Transactions on Human-Machine Systems, 44, 374–385. doi:10.1109/THMS.6221037
  • Hoffman, G., Bauman, S., & Vanunu, K. (2016). Robotic experience companionship in music listening and video watching. Personal and Ubiquitous Computing, 20, 51–63. doi:10.1007/s00779-015-0897-1
  • Hoffman, G., Birnbaum, G. E., Vanunu, K., Sass, O., & Reis, H. T. (2014). Robot responsiveness to human disclosure affects social impression and appeal. Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction, Bielefeld, Germany (pp. 1–8). ACM.
  • Hoffman, G., & Ju, W. (2014). Designing robots with movement in mind. Journal of Human-Robot Interaction, 3, 78–95. doi:10.5898/JHRI.3.1.Hoffman
  • Hoffman, G., & Weinberg, G. (2011). Interactive improvisation with a robotic marimba player. Autonomous Robots, 31, 133–153. doi:10.1007/s10514-011-9237-0
  • Hoffman, G., Zuckerman, O., Hirschberger, G., Luria, M., & Shani Sherman, T. (2015). Design and evaluation of a peripheral robotic conversation companion. Proceedings of the tenth annual ACM/IEEE international conference on Human-Robot Interaction, Portland, Oregon (pp. 3–10). ACM.
  • Iio, T., Yoshikawa, Y., & Ishiguro, H. (2017). Starting a conversation by multi-robot cooperative behavior. International conference on Social Robotics (pp. 739–748). Cham: Springer.
  • Johnson, D. W., & Johnson, R. (1985). Classroom conflict: Controversy versus debate in learning groups. American Educational Research Journal, 22, 237–256. doi:10.3102/00028312022002237
  • Jung, M. F., Martelaro, N., & Hinds, P. J. (2015). Using robots to moderate team conflict: The case of repairing violations. Proceedings of the tenth annual ACM/IEEE international conference on Human-Robot Interaction, Portland, Oregon (pp. 229–236). ACM.
  • Kendrick, K. H., & Holler, J. (2017). Gaze direction signals response preference in conversation. Research on Language and Social Interaction, 50, 12–32. doi:10.1080/08351813.2017.1262120
  • Kleinke, C. L. (1986). Gaze and eye contact: A research review. Psychological Bulletin, 100, 78. doi:10.1037/0033-2909.100.1.78
  • Luria, M., Hoffman, G., Megidish, B., Zuckerman, O., & Park, S. (2016). Designing Vyo, a robotic Smart Home assistant: Bridging the gap between device and social agent. 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY (pp. 1019–1025). IEEE. doi:10.1109/ROMAN.2016.7745234
  • Luria, M., Hoffman, G., & Zuckerman, O. (2017). Comparing social robot, screen and voice interfaces for smart-home control. Proceedings of the 2017 CHI conference on human factors in computing systems, Denver, Colorado (pp. 580–628). ACM.
  • Matsusaka, Y., Fujie, S., & Kobayashi, T. (2001). Modeling of conversational strategy for the robot participating in the group conversation. Seventh European conference on speech communication and technology, Aalborg, Denmark.
  • Matsuyama, Y., Akiba, I., Fujie, S., & Kobayashi, T. (2015). Four-participant group conversation: A facilitation robot controlling engagement density as the fourth participant. Computer Speech & Language, 33(1), 1–24. doi:10.1016/j.csl.2014.12.001
  • Matsuyama, Y., Taniyama, H., Fujie, S., & Kobayashi, T. (2010). Framework of communication activation robot participating in multiparty conversation. 2010 AAAI Fall Symposium Series, Arlington, Virginia.
  • Maxwell, G. M., Cook, M. W., & Burr, R. (1985). The encoding and decoding of liking from behavioral cues in both auditory and visual channels. Journal of Nonverbal Behavior, 9, 239–263. doi:10.1007/BF00986883
  • Mohr, J., & Spekman, R. (1994). Characteristics of partnership success: Partnership attributes, communication behavior, and conflict resolution techniques. Strategic Management Journal, 15, 135–152. doi:10.1002/(ISSN)1097-0266
  • Mori, M. (1970). The uncanny valley. Energy, 7, 33–35.
  • Morse, J. M. (1995). The significance of saturation. Qualitative Health Research, 5, 147–149.
  • Mutlu, B., Kanda, T., Forlizzi, J., Hodgins, J., & Ishiguro, H. (2012). Conversational gaze mechanisms for humanlike robots. ACM Transactions on Interactive Intelligent Systems (TiiS), 1(2), 12.
  • Mutlu, B., Shiwa, T., Kanda, T., Ishiguro, H., & Hagita, N. (2009). Footing in human-robot conversations: How robots might shape participant roles using gaze cues. Proceedings of the 4th ACM/IEEE international conference on Human robot interaction, La Jolla, California (pp. 61–68). ACM.
  • Mwangi, E. N., Barakova, E. I., Diaz, M., Mallofré, A. C., & Rauterberg, M. (2017). Who is a better tutor?: Gaze hints with a human or humanoid tutor in game play. Proceedings of the Companion of the 2017 ACM/IEEE international conference on Human-Robot Interaction, Vienna, Austria (pp. 219–220). ACM.
  • Nagao, K., & Takeuchi, A. (1994). Social interaction: Multimodal conversation with social agents. AAAI, 94, 22–28.
  • Nichols, A. L., & Maner, J. K. (2008). The good-subject effect: Investigating participant demand characteristics. The Journal of General Psychology, 135, 151–166. doi:10.3200/GENP.135.2.151-166
  • Oliveira, R., Arriaga, P., Alves-Oliveira, P., Correia, F., Petisca, S., & Paiva, A. (2018). Friends or foes?: Socioemotional support and gaze behaviors in mixed groups of humans and robots. Proceedings of the 2018 ACM/IEEE international conference on Human-Robot Interaction, Chicago, Illinois (pp. 279–288). ACM.
  • Opdenakker, R. (2006). Advantages and disadvantages of four interview techniques in qualitative research. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research, 7, 4.
  • Otsuka, K., Yamato, J., Takemae, Y., & Murase, H. (2006). Quantifying interpersonal influence in face-to-face conversations based on visual attention patterns. CHI’06 Extended Abstracts on Human Factors in Computing Systems, Montréal, Canada (pp. 1175–1180). ACM.
  • Pacchierotti, E., Christensen, H. I., & Jensfelt, P. (2006). Design of an office-guide robot for social interaction studies. 2006 IEEE/RSJ international conference on Intelligent Robots and Systems, Beijing, China (pp. 4965–4970). IEEE.
  • Parlitz, C., Hägele, M., Klein, P., Seifert, J., & Dautenhahn, K. (2008). Care-o-bot 3: Rationale for human-robot interaction design. Proceedings of 39th International Symposium on Robotics (ISR) (pp. 275–280). Seoul, Korea.
  • Riek, L. D. (2012). Wizard of oz studies in hri: A systematic review and new reporting guidelines. Journal of Human-Robot Interaction, 1, 119–136. doi:10.5898/JHRI.1.1.Riek
  • Robins, B., Dautenhahn, K., Te Boekhorst, R., & Billard, A. (2005). Robotic assistants in therapy and education of children with autism: Can a small humanoid robot help encourage social interaction skills? Universal Access in the Information Society, 4, 105–120. doi:10.1007/s10209-005-0116-3
  • Sakamoto, D., & Ono, T. (2006). Sociality of robots: Do robots construct or collapse human relations? Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction, Salt Lake City, Utah (pp. 355–356). ACM.
  • Shah, J., Wiken, J., Williams, B., & Breazeal, C. (2011). Improved human-robot team performance using chaski, a human-inspired plan execution system. Proceedings of the 6th international conference on Human-robot interaction, Lausanne, Switzerland (pp. 29–36). ACM.
  • Shen, S., Slovak, P., & Jung, M. F. (2018). Stop. I see a conflict happening.: A robot mediator for young children’s interpersonal conflict resolution. Proceedings of the 2018 ACM/IEEE international conference on Human-Robot Interaction, Chicago, Illinois (pp. 69–77). ACM.
  • Sidner, C. L., Lee, C., Kidd, C. D., Lesh, N., & Rich, C. (2005). Explorations in engagement for humans and robots. Artificial Intelligence, 166, 140–164. doi:10.1016/j.artint.2005.03.005
  • Simons, T., Pelled, L. H., & Smith, K. A. (1999). Making use of difference: Diversity, debate, and decision comprehensiveness in top management teams. Academy of Management Journal, 42, 662–673.
  • Spexard, T., Li, S., Wrede, B., Fritsch, J., Sagerer, G., Booij, O., … Krose, B. (2006). BIRON, where are you? Enabling a robot to learn new places in a real home environment by integrating spoken dialog and visual localization. 2006 IEEE/RSJ international conference on Intelligent Robots and Systems, Beijing, China (pp. 934–940). IEEE.
  • Takayama, L., Dooley, D., & Ju, W. (2011). Expressing thought: Improving robot readability with animation principles. 2011 6th ACM/IEEE international conference on Human-Robot Interaction (HRI), Lausanne, Switzerland (pp. 69–76). IEEE.
  • Takayama, L., & Pantofaru, C. (2009). Influences on proxemic behaviors in human-robot interaction. 2009 IEEE/RSJ international conference on Intelligent Robots and Systems, St. Louis, Missouri (pp. 5495–5502). IEEE.
  • Tan, X. Z., Vázquez, M., Carter, E. J., Morales, C. G., & Steinfeld, A. (2018). Inducing bystander interventions during robot abuse with social mechanisms. Proceedings of the 2018 ACM/IEEE international conference on Human-Robot Interaction, Chicago, Illinois (pp. 169–177). ACM.
  • Tanaka, F., & Ghosh, M. (2011). The implementation of care-receiving robot at an English learning school for children. Proceedings of the 6th international conference on Human-robot interaction, Lausanne, Switzerland (pp. 265–266). ACM.
  • Taylor, S. E., & Fiske, S. T. (1978). Salience, attention, and attribution: Top of the head phenomena. In Advances in experimental social psychology (Vol. 11, pp. 249–288). Elsevier, Amsterdam, Netherlands
  • Tennent, H., Shen, S., & Jung, M. (2019). Micbot: A peripheral robotic object to shape conversational dynamics and team performance. 2019 14th ACM/IEEE international conference on Human-Robot Interaction (HRI), Daegu, Korea (pp. 133–142). IEEE.
  • Vázquez, M., Carter, E. J., McDorman, B., Forlizzi, J., Steinfeld, A., & Hudson, S. E. (2017). Towards robot autonomy in group conversations: Understanding the effects of body orientation and gaze. Proceedings of the 2017 ACM/IEEE international conference on Human-Robot Interaction, Vienna, Austria (pp. 42–52). ACM.
  • Verplanck, W. S. (1955). The control of the content of conversation: Reinforcement of statements of opinion. The Journal of Abnormal and Social Psychology, 51(3), 668. doi:10.1037/h0046514
  • Vertegaal, R., Slagter, R., Van der Veer, G., & Nijholt, A. (2001). Eye gaze patterns in conversations: There is more to conversational agents than meets the eyes. Proceedings of the SIGCHI conference on Human factors in computing systems, Seattle, Washington (pp. 301–308). ACM.
  • Wu, Y. H., Fassert, C., & Rigaud, A. S. (2012). Designing robots for the elderly: Appearance issue and beyond. Archives of Gerontology and Geriatrics, 54, 121–126. doi:10.1016/j.archger.2011.02.003
  • Zaga, C., de Vries, R. A., Li, J., Truong, K. P., & Evers, V. (2017). A simple nod of the head: The effect of minimal robot movements on children’s perception of a low-anthropomorphic robot. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, Colorado (pp. 336–341). ACM.
  • Zuckerman, O., & Hoffman, G. (2015). Empathy objects: Robotic devices as conversation companions. Proceedings of the ninth international conference on tangible, embedded, and embodied interaction, Stanford, California (pp. 593–598). ACM.