
