REFERENCES
- Bertelsen, O. W., & Nielsen, C. (2000). Augmented reality as a design tool for mobile interfaces. Proceedings of the 3rd Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, 185–192.
- Billinghurst, M., & Thomas, B. H. (2011). Mobile collaborative augmented reality. In L. Alem & W. Huang (Eds.), Recent trends of mobile collaborative augmented reality systems (pp. 1–19). New York, NY: Springer.
- Bowman, D., Kruijff, E., LaViola, J. J., Jr., & Poupyrev, I. (2004). 3D user interfaces: Theory and practice. Reading, MA: Addison-Wesley.
- Broll, W., Lindt, I., Ohlenburg, J., Wittkämper, M., Yuan, C., Novotny, T., … Strothmann, A. (2004). ARTHUR: A collaborative augmented environment for architectural design and urban planning. Journal of Virtual Reality and Broadcasting, 1(1). Retrieved from http://www.jvrb.org/past-issues/1.2004/34
- Chen, Y.-C., Chi, H.-L., Hung, W.-H., & Kang, S.-C. (2011). Use of tangible and augmented reality models in engineering graphics courses. Journal of Professional Issues in Engineering Education & Practice, 137, 267–276.
- Cheng, K.-Y., Liang, R.-H., Chen, B.-Y., Liang, R.-H., & Kuo, S.-Y. (2010). iCon: Utilizing everyday objects as additional, auxiliary and instant tabletop controllers. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems CHI’10, 1155–1164.
- Cootes, T. F., Edwards, G. J., & Taylor, C. J. (2001). Active appearance models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23, 681–685.
- CSI The Hague (2012). Digitise the crime scene: Fiction becomes reality. Retrieved from http://www.csithehague.com/
- Dachselt, R., & Hübner, A. (2007). Three-dimensional menus: A survey and taxonomy. Computers & Graphics, 31, 53–65.
- Datcu, D., & Lukosch, S. (2013, July). Free hands interaction in augmented reality. Paper presented at the ACM Symposium on Spatial User Interaction (SUI), Los Angeles, CA.
- Datcu, D., Swart, T., Lukosch, S., & Rusak, Z. (2012). Multimodal collaboration for crime scene investigation in mediated reality. 14th ACM International Conference on Multimodal Interaction – ICMI, 299–300.
- Ehnes, J. (2009). A tangible interface for the AMI content linking device—The Automated Meeting Assistant. IEEE HSI 2009, 306–313.
- Gu, N., Kim, M. J., & Maher, M. L. (2011). Technological advancements in synchronous collaboration: The effect of 3D virtual worlds and tangible user interfaces on architectural design. Automation in Construction, 20, 270–278.
- Hart, S. G. (2006). NASA-Task Load Index (NASA-TLX); 20 years later. Human Factors and Ergonomics Society Annual Meeting Proceedings, 50, 904–908.
- Henderson, S. J., & Feiner, S. (2008). Opportunistic controls: Leveraging natural affordances as tangible user interfaces for augmented reality. Proceedings of the ACM Symposium on Virtual Reality Software and Technology, 211–218.
- Jones, B. R., Sodhi, R., Campbell, R. H., Garnett, G., & Bailey, B. P. (2010). Build your world and play in it: Interacting with surface particles on complex objects. International Symposium on Mixed and Augmented Reality – ISMAR’10, 165–174.
- Kanade, T., & Hebert, M. (2012). First-person vision. Proceedings of the IEEE, 100, 2442–2453.
- Kinect. (n.d.). In Wikipedia. Retrieved August 28, 2014, from http://en.wikipedia.org/wiki/Kinect
- Kölsch, M., Turk, M., & Höllerer, T. (2004). Vision-based interfaces for mobility. International Conference on Mobile and Ubiquitous Systems (MobiQuitous), August 22–26, Boston, MA.
- Lee, J. Y., Rhee, G. W., & Seo, D. W. (2010). Hand gesture-based tangible interactions for manipulating virtual objects in a mixed reality environment. International Journal of Advanced Manufacturing Technology, 51, 1069–1082.
- Lee, M., Green, R., & Billinghurst, M. (2008). 3D natural hand interaction for AR applications. 23rd International Conference on Image and Vision Computing, 26–28.
- Livingston, M. A., Ai, Z., Karsch, K., & Gibson, G. O. (2011). User interface design for military AR applications. Virtual Reality, 15, 175–184.
- López, G., López, M., Guerrero, L. A., & Bravo, J. (2014). Human–objects interaction: A framework for designing, developing and evaluating augmented objects. International Journal of Human–Computer Interaction, 30, 787–801. doi:10.1080/10447318.2014.927281
- Lucas, B. D., & Kanade, T. (1981). An iterative image registration technique with an application to stereo vision. Proceedings of the Image Understanding Workshop, 121–130.
- Maier, P., Tönnis, M., Klinker, G., Raith, A., Drees, M., & Kuhn, F. (2010). What do you do when two hands are not enough? Interactive selection of bonds between pairs of tangible molecules. IEEE Symposium on 3D User Interfaces – 3DUI’10, 83–90.
- Mair, E., Hager, G. D., Burschka, D., Suppa, M., & Hirzinger, G. (2010). Adaptive and generic corner detection based on the accelerated segment test. Computer Vision – ECCV’10, Lecture Notes in Computer Science, 6312, 183–196.
- Marco, J., Cerezo, E., & Baldassarri, S. (2012). Bringing tabletop technology to all: Evaluating a tangible farm game with kindergarten and special needs children. Personal and Ubiquitous Computing, 17, 1577–1591.
- Mateu, J., Lasala, M. J., & Alamán, X. (2014). VirtualTouch: A tool for developing mixed reality educational applications and an example of use for inclusive education. International Journal of Human–Computer Interaction, 30, 815–828. doi:10.1080/10447318.2014.927278
- Merrill, D., & Maes, P. (2007). Augmenting looking, pointing and reaching gestures to enhance the searching and browsing of physical objects. Proceedings of the 5th International Conference on Pervasive Computing, PERVASIVE’07, 1–18.
- bin Mohd Sidik, M. K., bin Sunar, M. S., bin Ismail, I., bin Mokhtar, M. K., & Jusoh, N. B. M. (2011). A study on natural interaction for human body motion using depth image data. Workshop on Digital Media and Digital Content Management - DMDCM, 97–102.
- Nagel, T., & Heidmann, F. (2011). Exploring faceted geo-spatial data with tangible interaction. GeoViz 2011, March 10–11, 2011, Hamburg, Germany.
- Ness, S. R., Reimer, P., Krell, N., Odowichuck, G., Schloss, W. A., & Tzanetakis, G. (2010). Sonophenology: A tangible interface for sonification of geo-spatial phenological data at multiple time scales. The 16th International Conference on Auditory Display (ICAD-2010), 335–341.
- Newton-Dunn, H., Nakano, H., & Gibson, J. (2003). Block Jam: A tangible interface for interactive music. Proceedings of the 2003 Conference on New Interfaces for Musical Expression, NIME’03, 170–177.
- Piumsomboon, T., Clark, A., & Billinghurst, M. (2011). Physically-based interaction for tabletop augmented reality using a depth-sensing camera for environment mapping. Proceedings of Image and Vision Computing New Zealand (IVCNZ-2011), 161–166.
- Poelman, R., Akman, O., Lukosch, S., & Jonker, P. (2012). As if being there: Mediated reality for crime scene investigation. Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work, CSCW’12, 1267–1276.
- Radkowski, R., & Stritzke, C. (2012). Interactive hand gesture-based assembly for augmented reality applications. The Fifth International Conference on Advances in Computer-Human Interactions ACHI’12, 303–308.
- Reifinger, S., Wallhoff, F., Ablassmeier, M., Poitschke, T., & Rigoll, G. (2007). Static and dynamic hand-gesture recognition for augmented reality applications. Proceedings of the 12th International Conference on Human-Computer Interaction: Intelligent Multimodal Interaction Environments, 728–737.
- Schiettecatte, B., & Vanderdonckt, J. (2008). AudioCubes: A distributed cube tangible interface based on interaction range for sound design. Proceedings of the Second International Conference on Tangible and Embedded Interaction (TEI’08), 3–10.
- Schweighofer, G., & Pinz, A. (2006). Robust pose estimation from a planar target. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28, 2024–2030.
- Shen, Y., Ong, S. K., & Nee, A. Y. C. (2011). Vision-based hand interaction in augmented reality environment. International Journal of Human–Computer Interaction, 27, 523–544. doi:10.1080/10447318.2011.555297
- Smith, C. G. (1995, May/June). The hand that rocks the cradle. I.D., pp. 60–65.
- Swart, T. (2012). Design of a gesture controlled graphic interface for Head Mounted Displays for CSI The Hague (Unpublished master’s thesis). Delft University of Technology, Delft, the Netherlands.
- Viola, P., & Jones, M. (2002). Robust real-time object detection. International Journal of Computer Vision, 57, 137–154.
- White, S., Feng, D., & Feiner, S. (2009). Interaction and presentation techniques for shake menus in tangible augmented reality. 8th IEEE International Symposium on Mixed and Augmented Reality, ISMAR’09, 39–48.
- Wiedenmaier, S., Oehme, O., Schmidt, L., & Luczak, H. (2003). Augmented Reality (AR) for assembly processes design and experimental evaluation. International Journal of Human–Computer Interaction, 16, 497–514. doi:10.1207/S15327590IJHC1603_7
- Williams, A., & Botet Escriba, V. J. (n.d.). Boost C++ Libraries. Chapter 28. Thread 3.1.0. Retrieved from http://www.boost.org/doc/libs/1_52_0/doc/html/thread.html
- Xu, D., Read, J. C., Mazzone, E., & Brown, M. (2007). Designing and testing a tangible interface prototype. IDC 2007 Proceedings: Methodology, 25–28.