
Impact of Heads-Up Displays on Small Unmanned Aircraft System Operator Situation Awareness and Performance: A Simulated Study

