The Role of bi-Directional Graphic Communication in Human-Unmanned Operations

Pages 1926-1943 | Received 29 Mar 2021, Accepted 25 Mar 2022, Published online: 19 Apr 2022
