
Enhancing the quality of service of mobile video technology by increasing multimodal synergy

Pages 874-883 | Received 04 Jul 2018, Accepted 16 Jul 2018, Published online: 08 Sep 2018

References

  • Microsoft. 2018. “About DirectShow.” Accessed March 16, 2018. https://msdn.microsoft.com/en-us/library/windows/desktop/dd373389(v=vs.85).aspx.
  • CITO. 2018. Accessed June 15, 2018. https://msdn.microsoft.com/enus/library/windows/desktop/dd373389(v=vs.85).aspx.
  • Banbury, J. R. 1983. “Wide Field of View Head-up Displays.” Displays 4 (2): 89–96. doi: 10.1016/0141-9382(83)90165-8
  • Baraković, S., and L. Skorin-Kapov. 2015. “Multidimensional Modelling of Quality of Experience for Mobile Web Browsing.” Computers in Human Behavior 48: 314–332. doi: 10.1016/j.chb.2015.03.071
  • Bernhaupt, R., and M. M. Pirker. 2014. “User Interface Guidelines for the Control of Interactive Television Systems via Smart Phone Applications.” Behaviour & Information Technology 33 (8): 784–799. doi: 10.1080/0144929X.2013.810782
  • Calvert, G. A., C. Spence, and B. E. Stein. 2004. The Handbook of Multisensory Processes. Cambridge: The MIT Press / A Bradford Book.
  • Cao, Y., F. van der Sluis, M. Theune, R. op den Akker, and A. Nijholt. 2010. “Evaluating Informative Auditory and Tactile Cues for In-vehicle Information Systems.” In Proceedings of the 2nd International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI'10), edited by A. K. Dey, A. Schmidt, S. Boll, and A. L. Kun, Pittsburgh, PA, 102–109, November 11–12. New York: ACM.
  • Costanza, E., S. A. Inverso, E. Pavlov, R. Allen, and P. Maes. 2006. “eye-q: Eyeglass Peripheral Display for Subtle Intimate Notifications.” In MobileHCI '06: ACM Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services, edited by M. Röykkee, A. Kaikkonen, M. Nieminen, and K. Väänänen-Vainio-Mattila, Vol. 159 of ACM International Conference Proceeding Series, Espoo, Finland, 211–218, September 12–15. New York: ACM Press.
  • Cousineau, D. 2005. “Confidence Intervals in Within-subject Designs: A Simpler Solution to Loftus and Masson's Method.” Tutorials in Quantitative Methods for Psychology 1 (1): 42–45. doi: 10.20982/tqmp.01.1.p042
  • de Gelder, B., and J. Vroomen. 2000. “The Perception of Emotions by Ear and by Eye.” Cognition and Emotion 14 (3): 289–311. doi: 10.1080/026999300378824
  • Dixon, N. F., and L. Spitz. 1980. “The Detection of Auditory Visual Desynchrony.” Perception 9 (6): 719–721. doi: 10.1068/p090719
  • Dodgson, N. A. 2004. “Variation and Extrema of Human Interpupillary Distance.” Proceedings of SPIE (Stereoscopic Displays and Virtual Reality Systems) 5291: 36–46.
  • Drullman, R. 1995. “Speech Intelligibility in Noise: Relative Contribution of Speech Elements Above and Below the Noise Level.” The Journal of the Acoustical Society of America 98 (3): 1796–1798. doi: 10.1121/1.413378
  • Ekman, I., L. Ermi, J. Lahti, J. Nummela, P. Lankoski, and F. Mäyrä. 2005. “Designing Sound for a Pervasive Mobile Game.” In Proceedings of the 2005 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE 2005), edited by N. Lee, Valencia, Spain, 110–116, June 15–17. New York: ACM.
  • Erber, N. P. 1975. “Auditory-visual Perception of Speech.” Journal of Speech and Hearing Disorders 40 (4): 481–492. doi: 10.1044/jshd.4004.481
  • Ernst, M. O., and H. H. Bülthoff. 2004. “Merging the Senses Into a Robust Percept.” TRENDS in Cognitive Sciences 8 (4): 162–169. doi: 10.1016/j.tics.2004.02.002
  • Fernandez-Lopez, A., O. Martinez, and F. M. Sukno. 2017. “Towards Estimating the Upper Bound of Visual-speech Recognition: The Visual Lip-reading Feasibility Database.” In Proceedings of the 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), edited by K. Bowyer, R. Chellappa, J. Cohn, H. Gunes, A. O'Toole, C. Pelachaud, Y. Tong, and V. Patel, Washington, DC, 208–215, May 30–June 03. Los Alamitos, CA: IEEE Computer Society.
  • Ferris, T. K., and N. B. Sarter. 2008. “Cross-modal Links Among Vision, Audition, and Touch in Complex Environments.” Human Factors 50 (1): 17–26. doi: 10.1518/001872008X250566
  • Freeman, J. B., and N. Ambady. 2011. “When two Become One: Temporally Dynamic Integration of the Face and Voice.” Journal of Experimental Social Psychology 47 (1): 259–263. doi: 10.1016/j.jesp.2010.08.018
  • Frowein, H. W., G. F. Smoorenburg, L. Pyters, and D. Schinkel. 1991. “Improved Speech Recognition Through Videotelephony: Experiments with the Hard of Hearing.” IEEE Journal on Selected Areas in Communications 9 (4): 611–616. doi: 10.1109/49.81956
  • Grabe, M. E., M. Lombard, R. D. Reich, C. C. Bracken, and T. B. Ditton. 1999. “The Role of Screen Size in Viewer Experiences of Media Content.” Visual Communication Quarterly 6 (2): 4–9. doi: 10.1080/15551399909363403
  • Harper, S., Y. Yesilada, and T. Chen. 2011. “Mobile Device Impairment … Similar Problems, Similar Solutions?” Behaviour & Information Technology 30 (5): 673–690. doi: 10.1080/01449291003801943
  • Hess, J., H. Knoche, and V. Wulf. 2014. “Thinking Beyond the Box: Designing Interactive TV Across Different Devices.” Behaviour & Information Technology 33 (8): 781–783. doi: 10.1080/0144929X.2014.927163
  • Jang, S. B., Y. G. Kim, and Y.-W. Ko. 2017. “Mobile Video Communication based on Augmented Reality.” Multimedia Tools and Applications 76 (16): 16893–16909. doi: 10.1007/s11042-016-3627-4
  • Janssen, J. H., P. Tacken, J. J. G. de Vries, E. L. van den Broek, J. H. D. M. Westerink, P. Haselager, and W. A. IJsselsteijn. 2013. “Machine Beats Human Emotion Recognition through Audio, Visual, and Physiological Modalities.” Human–Computer Interaction 28 (6): 479–517. doi: 10.1080/07370024.2012.755421
  • Johnson, C., and P. Grainge. 2015. Promotional Screen Industries. Oxon: Routledge / Taylor & Francis Group.
  • Jung, Y., B. Perez-Mira, and S. Wiley-Patton. 2009. “Consumer Adoption of Mobile TV: Examining Psychological Flow and Media Content.” Computers in Human Behavior 25 (1): 123–129. doi: 10.1016/j.chb.2008.07.011
  • Karapantazis, S., and F.-N. Pavlidou. 2009. “VoIP: A Comprehensive Survey on a Promising Technology.” Computer Networks 53 (12): 2050–2090. doi: 10.1016/j.comnet.2009.03.010
  • Kelly, S. D., D. J. Barr, R. B. Church, and K. Lynch. 1999. “Offering a Hand to Pragmatic Understanding: The Role of Speech and Gesture in Comprehension and Memory.” Journal of Memory and Language 40 (4): 577–592. doi: 10.1006/jmla.1999.2634
  • Kim, K. J. 2017. “Shape and Size Matter for Smartwatches: Effects of Screen Shape, Screen Size, and Presentation Mode in Wearable Communication.” Journal of Computer-Mediated Communication 22 (3): 124–140. doi: 10.1111/jcc4.12186
  • Lim, J. S., S. Y. Ri, B. D. Egan, and F. A. Biocca. 2015. “The Cross-platform Synergies of Digital Video Advertising: Implications for Cross-media Campaigns in Television, Internet and Mobile TV.” Computers in Human Behavior 48: 463–472. doi: 10.1016/j.chb.2015.02.001
  • Liu, Y., and H. Li. 2011. “Exploring the Impact of Use Context on Mobile Hedonic Services Adoption: An Empirical Study on Mobile Gaming in China.” Computers in Human Behavior 27 (2): 890–898. doi: 10.1016/j.chb.2010.11.014
  • McGurk, H., and J. MacDonald. 1976. “Hearing Lips and Seeing Voices.” Nature 264 (5588): 746–748. doi: 10.1038/264746a0
  • O'Hara, K., A. Black, and M. Lipson. 2006. “Everyday Practices with Mobile Video Telephony.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, edited by R. Grinter, T. Rodden, P. Aoki, E. Cutrell, R. Jeffries, and G. Olson, 871–880. Montreal, QC: ACM Press.
  • Ouni, S., and G. Gris. 2018. “Dynamic Lip Animation from a Limited Number of Control Points: Towards an Effective Audiovisual Spoken Communication.” Speech Communication 96: 49–57. doi: 10.1016/j.specom.2017.11.006
  • Perakakis, M., and A. Potamianos. 2008. “Multimodal System Evaluation using Modality Efficiency and Synergy Metrics.” In Proceedings of the 10th International Conference on Multimodal Interfaces (ICMI'08), edited by V. Digalakis, A. Potamianos, M. Turk, R. Pieraccini, and Y. Ivanov, Chania, Crete, Greece, 9–16, October 20–22. New York: ACM.
  • Powell, H. 2017. “Always On: Mobile Culture and its Temporal Consequences.” Chap. 4, 99–117. London: World Scientific Publishing Europe Ltd.
  • Rimell, A. N., N. J. Mansfield, and D. Hands. 2007. “The Influence of Content, Task and Sensory Interaction on Multimedia Quality Perception.” Ergonomics 51 (2): 85–97. doi: 10.1080/00140130701526432
  • Risberg, A., and J. Lubker. 1978. “Prosody and Speechreading.” Speech Transmission Laboratory Quarterly Progress and Status Report 4: 1–16.
  • Rogers, J. A., T. Someya, and Y. Huang. 2010. “Materials and Mechanics for Stretchable Electronics.” Science 327 (5973): 1603–1607. doi: 10.1126/science.1182383
  • Roring, R. W., F. G. Hines, and N. Charness. 2006. “Age-related Identification of Emotions at Different Image Sizes.” Human Factors 48 (4): 675–681. doi: 10.1518/001872006779166406
  • Schulte, S., S. Chen, and K. Nahrstedt. 2014. “Stevens' Power Law in 3D Tele-immersion: Towards Subjective Modeling of Multimodal Cyber Interaction.” In Proceedings of the 22nd ACM International Conference on Multimedia, edited by K. A. Hua, Y. Rui, R. Steinmetz, A. Hanjalic, A. (P.) Natsev, and W. Zhu, Orlando, FL, 1133–1136, November 3–7. New York: ACM Press.
  • Shaheen, S., A. Cohen, and E. Martin. 2017. “Smartphone App Evolution and Early Understanding from a Multimodal App User Survey.” Chap. 10, 149–164. Lecture Notes in Mobility (LNMOB). Cham: Springer International Publishing AG.
  • Shaked, N., and U. Winter. 2016. Design of Multimodal Mobile Interfaces. Berlin: Walter De Gruyter.
  • Shams, L., and R. Kim. 2010. “Crossmodal Influences on Visual Perception.” Physics of Life Reviews 7 (3): 269–284. doi: 10.1016/j.plrev.2010.04.006
  • Shoukry, L., and S. Göbel. 2017. “Reasons and Responses: A Multimodal Serious Games Evaluation Framework.” IEEE Transactions on Emerging Topics in Computing. doi: 10.1109/TETC.2017.2737953
  • Škařupová, K., K. Ólafsson, and L. Blinka. 2016. “The Effect of Smartphone Use on Trends in European Adolescents' Excessive Internet Use.” Behaviour & Information Technology 35 (1): 68–74. doi: 10.1080/0144929X.2015.1114144
  • Srinivasan, S., P. J. Hsu, T. Holcomb, K. Mukerjee, S. L. Regunathan, B. Lin, J. Liang, M.-C. Lee, and J. Ribas-Corbera. 2004. “Windows Media Video: Overview and Applications.” Signal Processing: Image Communication 19 (9): 851–875.
  • Stein, B. E. 2012. The New Handbook of Multisensory Processing. Cambridge: The MIT Press.
  • Sumby, W. H., and I. Pollack. 1954. “Visual Contribution to Speech Intelligibility in Noise.” The Journal of the Acoustical Society of America 26 (2): 212–215. doi: 10.1121/1.1907309
  • Takahashi, A., H. Yoshino, and N. Kitawaki. 2004. “Perceptual QoS Assessment Technologies for VoIP.” IEEE Communications Magazine 42 (7): 28–34. doi: 10.1109/MCOM.2004.1316526
  • Tan, D. S., D. Gergle, P. Scupelli, and R. Pausch. 2006. “Physically Large Displays Improve Performance on Spatial Tasks.” ACM Transactions on Computer-Human Interaction 13 (1): 71–99. doi: 10.1145/1143518.1143521
  • Tasaka, S., and Y. Ishibashi. 2002. “Mutually Compensatory Property of Multimedia QoS.” Proceedings of 2002 IEEE International Conference on Communications, Vol. 2, New York, USA, 1105–1111.
  • Thompson, M., A. I. Nordin, and P. Cairns. 2012. “Effect of Touch-screen Size on Game Immersion.” Proceedings of the 26th Annual BCS Interaction Specialist Group Conference on People and Computers, BCS-HCI '12, Swinton, UK, 280–285. British Computer Society.
  • van den Broek, E. L. 2011. “Affective Signal Processing (ASP): Unraveling the Mystery of Emotions.” PhD diss., Human Media Interaction (HMI), Faculty of Electrical Engineering, Mathematics, and Computer Science, University of Twente, Enschede.
  • van den Broek, E. L. 2017. “ICT: Health's Best Friend and Worst Enemy?” In BioSTEC 2017: 10th International Joint Conference on Biomedical Engineering Systems and Technologies, Proceedings Volume 5: HealthInf, edited by E. L. van den Broek, A. Fred, H. Gamboa, and M. Vaz, 611–616, February 21–23. Porto: SciTePress – Science and Technology Publications, Lda.
  • van den Broek, E. L., F. van der Sluis, and Th. E. Schouten. 2010. “User-centered Digital Preservation of Multimedia.” ERCIM (European Research Consortium for Informatics and Mathematics) News 80: 45–47.
  • van der Sluis, F., J. H. Ginn, and T. van der Zee. 2016. “Explaining Student Behavior at Scale: The Influence of Video Complexity on Student Dwelling Time.” Proceedings of the Third (2016) ACM Conference on Learning @ Scale, L@S '16, New York, NY, 51–60. ACM.
  • van der Sluis, F., E. L. van den Broek, R. J. Glassey, E. M. A. G. van Dijk, and F. M. G. de Jong. 2014. “When Complexity Becomes Interesting.” Journal of the American Society for Information Science and Technology 65 (7): 1478–1500.
  • van der Sluis, F., T. van der Zee, and J. H. Ginn. 2017. “Learning About Learning at Scale: Methodological Challenges and Recommendations.” Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale, L@S '17, New York, NY, 131–140. ACM.
  • Vroomen, J., M. Keetels, B. de Gelder, and P. Bertelson. 2004. “Recalibration of Temporal Order Perception by Exposure to Audio-visual Asynchrony.” Cognitive Brain Research 22 (1): 32–35. doi: 10.1016/j.cogbrainres.2004.07.003
  • Watts, L. 2008. “Advanced Noise Reduction for Mobile Telephony.” IEEE Computer 41 (8): 72–79. doi: 10.1109/MC.2008.278
  • Westheimer, G. 1979. “The Spatial Sense of the Eye. Proctor Lecture.” Investigative Ophthalmology & Visual Science 18 (9): 893–912.
  • Yuan, Z., G. Ghinea, and G.-M. Muntean. 2015. “Beyond Multimedia Adaptation: Quality of Experience-aware Multi-sensorial Media Delivery.” IEEE Transactions on Multimedia 17 (1): 104–117. doi: 10.1109/TMM.2014.2371240
  • Yuen, P. C., Y. Y. Tang, and P. S. P. Wang. 2002. Multimodal Interface for Human–machine Communication, Series in Machine Perception and Artificial Intelligence, Vol. 48. River Edge, NJ: World Scientific Publishing Co.
