Shaping the development and use of Artificial Intelligence: how human factors and ergonomics expertise can become more pertinent

Pages 1702-1710 | Received 10 May 2023, Accepted 30 Oct 2023, Published online: 06 Nov 2023

References

  • Arntz, M., T. Gregory, and U. Zierahn. 2016. “The Risk of Automation for Jobs in OECD Countries: A Comparative Analysis.” OECD Social, Employment and Migration Working Papers, No. 189.
  • Babic, Boris, Sara Gerke, Theodoros Evgeniou, and I. Glenn Cohen. 2019. “Algorithms on Regulatory Lockdown in Medicine.” Science 366 (6470): 1202–1204. doi:10.1126/science.aay9547.
  • Bentley, T., N. Green, D. Tappin, and R. Haslam. 2021. “State of Science: The Future of Work–Ergonomics and Human Factors Contributions to the Field.” Ergonomics 64 (4): 427–439. doi:10.1080/00140139.2020.1841308.
  • Berente, N., B. Gu, J. Recker, and R. Santhanam. 2021. “Managing Artificial Intelligence.” MIS Quarterly 45: 1433–1450.
  • Bessen, J., S. M. Impink, and R. Seamans. 2023. “The Role of Ethical Principles in AI Startups.” SSRN Electronic Journal. https://ssrn.com/abstract=4378280 or doi:10.2139/ssrn.4378280.
  • Bieder, C., and M. Bourrier, Eds. 2013. Trapping Safety into Rules: How Desirable and Avoidable is Proceduralization of Safety? Farnham: Ashgate.
  • Billings, C. E. 1991. “Human-Centered Aircraft Automation: A Concept and Guidelines.” Retrieved from NASA, United States. https://ntrs.nasa.gov/search.jsp?R=19910022821.
  • Boos, Daniel, Hannes Guenter, Gudela Grote, and Katharina Kinder. 2013. “Controllable Accountabilities. The Internet of Things and Its Challenges for Organisations.” Behaviour & Information Technology 32 (5): 449–467. doi:10.1080/0144929X.2012.674157.
  • Bovens, M. 2007. “Analysing and Assessing Accountability: A Conceptual Framework.” European Law Journal 13 (4): 447–468. doi:10.1111/j.1468-0386.2007.00378.x.
  • Brady, A., and N. Naikar. 2022. “Development of Rasmussen’s Risk Management Framework for Analysing Multi-Level Sociotechnical Influences in the Design of Envisioned Work Systems.” Ergonomics 65 (3): 485–518. doi:10.1080/00140139.2021.2005823.
  • Brown, T. 2008. “Design Thinking.” Harvard Business Review 86 (6): 84–92, 141.
  • Carroll, J. M. 1996. “Encountering Others: Reciprocal Openings in Participatory Design and User-Centred Design.” Human–Computer Interaction 11 (3): 285–290. doi:10.1207/s15327051hci1103_5.
  • Castelvecchi, D. 2016. “The Black Box of AI.” Nature 538 (7623): 20–23. doi:10.1038/538020a.
  • Challenger, R., C.W. Clegg, and C. Shepherd. 2013. “Function Allocation in Complex Systems: Reframing an Old Problem.” Ergonomics 56 (7): 1051–1069. doi:10.1080/00140139.2013.790482.
  • Chartered Institute of Ergonomics & Human Factors. 2022. “Human Factors in Highly Automated Systems.” White Paper. https://ergonomics.org.uk/resource/human-factors-in-highly-automated-systems-white-paper.html
  • Chui, M., M. Harryson, J. Manyika, R. Roberts, R. Chung, A. van Heteren, and P. Nel. 2018. Notes from the AI Frontier: Applying AI for Social Good. McKinsey Global Institute.
  • Clegg, S. R., D. Courpasson, and N. Phillips. 2006. Power and Organizations. Foundations for Organizational Science. London: Sage.
  • Clegg, C. W., M. A. Robinson, M. C. Davis, L. E. Bolton, R. L. Pieniazek, and A. McKay. 2017. “Applying Organizational Psychology as a Design Science: A Method for Predicting Malfunctions in Socio-Technical Systems (PreMiSTS).” Design Science 3: e6. doi:10.1017/dsj.2017.4.
  • Davis, M. C., R. Challenger, D.N. Jayewardene, and C.W. Clegg. 2014. “Advancing Socio-Technical Systems Thinking: A Call for Bravery.” Applied Ergonomics 45 (2): 171–180. doi:10.1016/j.apergo.2013.02.009.
  • Dempsey, Patrick G., Michael S. Wogalter, and Peter A. Hancock. 2000. “What’s in a Name? Using Terms from Definitions to Examine the Fundamental Foundation of Human Factors and Ergonomics Science.” Theoretical Issues in Ergonomics Science 1 (1): 3–10. doi:10.1080/146392200308426.
  • Dul, J., R. Bruder, P. Buckle, P. Carayon, P. Falzon, W.S. Marras, J. R. Wilson, and B. van der Doelen. 2012. “A Strategy for Human Factors/Ergonomics: Developing the Discipline and Profession.” Ergonomics 55 (4): 377–395. doi:10.1080/00140139.2012.661087.
  • Dul, J., and P. W. Neumann. 2009. “Ergonomics Contributions to Company Strategies.” Applied Ergonomics 40 (4): 745–752. doi:10.1016/j.apergo.2008.07.001.
  • Emmenegger, C., and D. Norman. 2019. “The Challenges of Automation in the Automobile.” Ergonomics 62 (4): 512–513. doi:10.1080/00140139.2019.1563336.
  • Endsley, M. R. 1995. “Toward a Theory of Situation Awareness in Dynamic Systems.” Human Factors: The Journal of the Human Factors and Ergonomics Society 37 (1): 32–64. doi:10.1518/001872095779049543.
  • European and US Technology Policy Committees of the ACM. 2022. “Statement on Principles for Responsible Algorithmic Systems.” https://www.acm.org/binaries/content/assets/public-policy/final-joint-ai-statement-update.pdf
  • Faraj, S., S. Pachidi, and K. Sayegh. 2018. “Working and Organizing in the Age of the Learning Algorithm.” Information and Organization 28 (1): 62–70. doi:10.1016/j.infoandorg.2018.02.005.
  • Frey, C.B, and M.A. Osborne. 2017. “The Future of Employment: How Susceptible Are Jobs to Computerisation?” Technological Forecasting and Social Change 114: 254–280. doi:10.1016/j.techfore.2016.08.019.
  • Goes, P. G. 2014. “Editor’s Comments: Design Science Research in Top Information Systems Journals.” MIS Quarterly 38 (1): III–VIII.
  • Grote, G. 2014. “Adding a Strategic Edge to Human Factors/Ergonomics: Principles for the Management of Uncertainty as Cornerstones for System Design.” Applied Ergonomics 45 (1): 33–39. doi:10.1016/j.apergo.2013.03.020.
  • Hancock, P. A. 2019. “Some Pitfalls in the Promises of Automated and Autonomous Vehicles.” Ergonomics 62 (4): 479–495. doi:10.1080/00140139.2018.1498136.
  • Hancock, P. A., I. Nourbakhsh, and J. Stewart. 2019. “On the Future of Transportation in an Era of Automated and Autonomous Vehicles.” Proceedings of the National Academy of Sciences of the United States of America 116 (16): 7684–7691. doi:10.1073/pnas.1805770115.
  • High-Level Expert Group on Artificial Intelligence. 2019. Ethics Guidelines for Trustworthy AI. Brussels: European Commission.
  • Hignett, S., J. R. Wilson, and W. Morris. 2005. “Finding Ergonomic Solutions–Participatory Approaches.” Occupational Medicine 55 (3): 200–207. doi:10.1093/occmed/kqi084.
  • Hwang, T. J., A. S. Kesselheim, and K. N. Vokinger. 2019. “Lifecycle Regulation of Artificial Intelligence- and Machine Learning-Based Software Devices in Medicine.” JAMA 322 (23): 2285–2286. doi:10.1001/jama.2019.16842.
  • IEA. 2000. “What Is Ergonomics?” https://iea.cc/about/what-is-ergonomics/
  • Kasneci, Enkelejda, Kathrin Sessler, Stefan Küchemann, Maria Bannert, Daryna Dementieva, Frank Fischer, Urs Gasser, Georg Groh, Stephan Günnemann, Eyke Hüllermeier, Stephan Krusche, Gitta Kutyniok, Tilman Michaeli, Claudia Nerdel, Jürgen Pfeffer, Oleksandra Poquet, Michael Sailer, Albrecht Schmidt, Tina Seidel, Matthias Stadler, Jochen Weller, Jochen Kuhn, and Gjergji Kasneci. 2023. “ChatGPT for Good? On Opportunities and Challenges for Large Language Models for Education.” Learning and Individual Differences 103: 102274. doi:10.1016/j.lindif.2023.102274.
  • Kellogg, K. C., M. Valentine, and A. Christin. 2020. “Algorithms at Work: The New Contested Terrain of Control.” Academy of Management Annals 14 (1): 366–410. doi:10.5465/annals.2018.0174.
  • Kim, B., and F. Doshi-Velez. 2021. “Machine Learning Techniques for Accountability.” AI Magazine 42 (1): 47–52. doi:10.1002/j.2371-9621.2021.tb00010.x.
  • Kirwan, B., A. R. Hale, and A. Hopkins, Eds. 2002. Changing Regulation: Controlling Hazards in Society. Oxford: Pergamon.
  • Langer, M., D. Oster, T. Speith, H. Hermanns, L. Kästner, E. Schmidt, A. Sesing, and K. Baum. 2021. “What Do We Want from Explainable Artificial Intelligence (XAI)? - A Stakeholder Perspective on XAI and a Conceptual Model Guiding Interdisciplinary XAI Research.” Artificial Intelligence 296: 103473. doi:10.1016/j.artint.2021.103473.
  • McLean, S., G. J. M. Read, J. Thompson, C. Baber, N. A. Stanton, and P. M. Salmon. 2023. “The Risks Associated with Artificial General Intelligence: A Systematic Review.” Journal of Experimental & Theoretical Artificial Intelligence 35 (5): 649–663. doi:10.1080/0952813X.2021.1964003.
  • Möhlmann, M., L. Zalmanson, O. Henfridsson, and R. W. Gregory. 2021. “Algorithmic Management of Work on Online Labor Platforms: When Matching Meets Control.” MIS Quarterly 45 (4): 1999–2022. doi:10.25300/MISQ/2021/15333.
  • Moray, N. 1995. “Ergonomics and the Global Problems of the Twenty-First Century.” Ergonomics 38 (8): 1691–1707. doi:10.1080/00140139508925220.
  • Mumford, E. 2000. “A Socio-Technical Approach to Systems Design.” Requirements Engineering 5 (2): 125–133. doi:10.1007/PL00010345.
  • Murray, A., J. Rhymer, and D. G. Sirmon. 2021. “Human and Technology: Forms of Conjoined Agency in Organizations.” Academy of Management Review 46 (3): 552–571. doi:10.5465/amr.2019.0186.
  • NIST. 2023. Artificial Intelligence Risk Management Framework (AI RMF 1.0). doi:10.6028/NIST.AI.100-1.
  • Norman, D., and J. Euchner. 2023. “Design for a Better World.” Research-Technology Management 66 (3): 11–18. doi:10.1080/08956308.2023.2183015.
  • Norros, L. 2014. “Developing Human Factors/Ergonomics as a Design Discipline.” Applied Ergonomics 45 (1): 61–71. doi:10.1016/j.apergo.2013.04.024.
  • Oswald, F. L., M. R. Endsley, J. Chen, E. K. Chiou, M. H. Draper, N. J. McNeese, and E. M. Roth. 2022. “The National Academies Board on Human-Systems Integration (BOHSI) Panel: Human-AI Teaming: Research Frontiers.” Proceedings of the Human Factors and Ergonomics Society Annual Meeting 66 (1): 130–134. doi:10.1177/1071181322661007.
  • Parker, S. K., and G. Grote. 2022. “Automation, Algorithms, and beyond: Why Work Design Matters More than Ever in a Digital World.” Applied Psychology 71 (4): 1171–1204. doi:10.1111/apps.12241.
  • Raisch, S., and S. Krakowski. 2021. “Artificial Intelligence and Management: The Automation-Augmentation Paradox.” Academy of Management Review 46 (1): 192–210. doi:10.5465/amr.2018.0072.
  • Rasmussen, J. 1997. “Risk Management in a Dynamic Society: A Modelling Problem.” Safety Science 27 (2–3): 183–213. doi:10.1016/S0925-7535(97)00052-0.
  • Ross, P. E. 2023. “A Former Pilot on Why Autonomous Vehicles Are So Risky: Five Questions for Missy Cummings.” IEEE Spectrum 60 (6): 21, June. doi:10.1109/MSPEC.2023.10147081.
  • Rudin, C. 2019. “Stop Explaining Black Box Machine Learning Models for High Stakes Decisions and Use Interpretable Models instead.” Nature Machine Intelligence 1 (5): 206–215. doi:10.1038/s42256-019-0048-x.
  • Salmon, P. M., C. Baber, C. Burns, T. Carden, N. Cooke, M. Cummings, P. Hancock, S. McLean, G. J. M. Read, and N. A. Stanton. 2023. “Managing the Risks of Artificial General Intelligence: A Human Factors and Ergonomics Perspective.” Human Factors and Ergonomics in Manufacturing & Service Industries 33 (5): 366–378. doi:10.1002/hfm.20996.
  • Salmon, P. M., T. Carden, and P. Hancock. 2021. “Putting the Humanity into Inhuman Systems: How Human Factors and Ergonomics Can Be Used to Manage the Risks Associated with Artificial General Intelligence.” Human Factors and Ergonomics in Manufacturing & Service Industries 31 (2): 223–236. doi:10.1002/hfm.20883.
  • Salmon, P. M., G. J. M. Read, G. H. Walker, N. J. Stevens, A. Hulme, S. McLean, and N. A. Stanton. 2022. “Methodological Issues in Systems Human Factors and Ergonomics: Perspectives on the Research-Practice Gap, Reliability and Validity, and Prediction.” Human Factors and Ergonomics in Manufacturing & Service Industries 32 (1): 6–19. doi:10.1002/hfm.20873.
  • Sanneman, Lindsay, and Julie A. Shah. 2022. “The Situation Awareness Framework for Explainable AI (SAFE-AI) and Human Factors Considerations for XAI Systems.” International Journal of Human–Computer Interaction 38 (18–20): 1772–1788. doi:10.1080/10447318.2022.2081282.
  • Sauer, J., A. Sonderegger, and S. Schmutz. 2020. “Usability, User Experience and Accessibility: Towards an Integrative Model.” Ergonomics 63 (10): 1207–1220. doi:10.1080/00140139.2020.1774080.
  • Sheridan, T. B. 1987. “Supervisory Control.” In Handbook of Human Factors, edited by G. Salvendy, 1243–1268. New York: Wiley.
  • Shneiderman, B. 2016. “The Dangers of Faulty, Biased, or Malicious Algorithms Requires Independent Oversight.” Proceedings of the National Academy of Sciences of the United States of America 113 (48): 13538–13540. doi:10.1073/pnas.1618211113.
  • Siddiqui, F., and J. B. Merrill. 2023. “17 Fatalities, 736 Crashes: The Shocking Toll of Tesla’s Autopilot.” The Washington Post, June 10.
  • Simon, H. A. 1996. The Sciences of the Artificial. 3rd ed. Cambridge, MA: MIT Press.
  • Slota, S. C., K. R. Fleischmann, S. Greenberg, N. Verma, B. Cummings, L. Li, and C. Shenefiel. 2023. “Many Hands Make Many Fingers to Point: Challenges in Creating Accountable AI.” AI & Society 38 (4): 1287–1299. doi:10.1007/s00146-021-01302-0.
  • Stanton, N. A., and C. Harvey. 2017. “Beyond Human Error Taxonomies in Assessment of Risk in Sociotechnical Systems: A New Paradigm with the EAST ‘Broken-Links’ Approach.” Ergonomics 60 (2): 221–233. doi:10.1080/00140139.2016.1232841.
  • Sujan, M., R. Pool, and P. Salmon. 2022. “Eight Human Factors and Ergonomics Principles for Healthcare Artificial Intelligence.” BMJ Health & Care Informatics 29 (1): e100516. doi:10.1136/bmjhci-2021-100516.
  • Teodorescu, M. H. M., L. Morse, Y. Awwad, and G. C. Kane. 2021. “Failures of Fairness in Automation Require a Deeper Understanding of Human-ML Augmentation.” MIS Quarterly 45 (3): 1483–1500. doi:10.25300/MISQ/2021/16535.
  • Thatcher, A., P. Waterson, A. Todd, and N. Moray. 2018. “State of Science: Ergonomics and Global Issues.” Ergonomics 61 (2): 197–213. doi:10.1080/00140139.2017.1398845.
  • Tippins, N. T., F. L. Oswald, and S. M. McPhail. 2021. “Scientific, Legal, and Ethical Concerns about AI-Based Personnel Selection Tools: A Call to Action.” Personnel Assessment and Decisions 7 (2): 1–22. doi:10.25035/pad.2021.02.001.
  • Tracy, R. 2023. “Biden Administration Weighs Possible Rules for AI Tools like ChatGPT.” The Wall Street Journal, April 11.
  • van den Broek, E., A. Sergeeva, and M. Huysman. 2021. “When the Machine Meets the Expert: An Ethnography of Developing AI for Hiring.” MIS Quarterly 45 (3): 1557–1580. doi:10.25300/MISQ/2021/16559.
  • Waterson, P. 2014. “Health Information Technology and Sociotechnical Systems: A Progress Report on Recent Developments within the UK National Health Service (NHS).” Applied Ergonomics 45 (2): 150–161. doi:10.1016/j.apergo.2013.07.004.
  • Waterson, P. 2019. “Autonomous Vehicles and Human Factors/Ergonomics: A Challenge but Not a Threat.” Ergonomics 62 (4): 509–511. doi:10.1080/00140139.2019.1563335.
  • Wilson, J. R., and A. Rutherford. 1989. “Mental Models: Theory and Application in Human Factors.” Human Factors: The Journal of the Human Factors and Ergonomics Society 31 (6): 617–634. doi:10.1177/001872088903100601.