References
- Ægisdóttir, S., White, M.J., Spengler, P.M., Maugherman, A.S., Anderson, L.A., Cook, R.S., Nichols, C.N., Lampropoulos, G.K., Walker, B.S., Cohen, G., & Rush, J.D. (2006). The meta-analysis of clinical judgment project: Fifty-six years of accumulated research on clinical versus statistical prediction. The Counseling Psychologist, 34(3), 341–382. https://doi.org/10.1177/0011000005285875
- Ahamed, T., Lederman, R., Bosua, R., Verspoor, K., Buntine, W., & Hart, G. (2016). Towards a methodology for nursing-specific Clinical Decision Support Systems (CDSS). Journal of Decision Systems, 25(sup1), 23–34. https://doi.org/10.1080/12460125.2016.1187387
- Aversa, P., Cabantous, L., & Haefliger, S. (2018). When decision support systems fail: Insights for strategic information systems from Formula 1. The Journal of Strategic Information Systems, 27(3), 221–236. https://doi.org/10.1016/j.jsis.2018.03.002
- Berner, E.S. (Ed.). (2016). Clinical decision support systems: Theory and practice. Springer.
- Bowker, G.C., & Star, S.L. (2000). Sorting things out: Classification and its consequences. MIT Press.
- Bozdag, E. (2013). Bias in algorithmic filtering and personalization. Ethics and Information Technology, 15(3), 209–227. https://doi.org/10.1007/s10676-013-9321-6
- Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 1–12. https://doi.org/10.1177/2053951715622512
- Chae, B., Paradice, D., Courtney, J.F., & Cagle, C.J. (2005). Incorporating an ethical perspective into problem formulation: Implications for decision support systems design. Decision Support Systems, 40(2), 197–212. https://doi.org/10.1016/j.dss.2004.02.002
- Charmaz, K. (2014). Constructing grounded theory. SAGE.
- Deng, X., Joshi, K.D., & Galliers, R.D. (2016). The duality of empowerment and marginalization in micro-task crowdsourcing: Giving voice to the less powerful through value sensitive design. MIS Quarterly, 40(2), 279–302. https://doi.org/10.25300/MISQ/2016/40.2.01
- Faraj, S., Pachidi, S., & Sayegh, K. (2018). Working and organizing in the age of the learning algorithm. Information and Organization, 28(1), 62–70. https://doi.org/10.1016/j.infoandorg.2018.02.005
- Friedman, B., & Kahn, P.H., Jr. (2003). Human values, ethics, and design. In J. A. Jacko & A. Sears (Eds.), The human–computer interaction handbook: Fundamentals, evolving technologies and emerging applications (pp. 1177–1201). Lawrence Erlbaum Associates.
- Friedman, B., Kahn, P.H., & Borning, A. (2008). Value sensitive design and information systems. In K. E. Himma & H. T. Tavani (Eds.), The handbook of information and computer ethics (pp. 69–101). Wiley.
- Friedman, B., Hendry, D.G., & Borning, A. (2017). A survey of value sensitive design methods. Foundations and Trends in Human–Computer Interaction, 11(2), 63–125. https://doi.org/10.1561/1100000015
- Galliers, R., Newell, S., Shanks, G., & Topi, H. (2017). Datification and its human, organizational and societal effects: The strategic opportunities and challenges of algorithmic decision-making. The Journal of Strategic Information Systems, 26(3), 185–190. https://doi.org/10.1016/j.jsis.2017.08.002
- Gerth, J., Rossegger, A., Bauch, E., & Endrass, J. (2017). Assessing the discrimination and calibration of the ontario domestic assault risk assessment in Switzerland. Partner Abuse, 8(2), 168–189. https://doi.org/10.1891/1946-6560.8.2.168
- Gerth, J., Rossegger, A., Singh, J.P., & Endrass, J. (2015). Assessing the risk of severe intimate partner violence: Validating the DyRiAS in Switzerland. Archives of Forensic Psychology, 1(2), 1–15.
- Grove, W.M., & Meehl, P.E. (1996). Comparative efficiency of informal (subjective, impressionistic) and formal (mechanical, algorithmic) prediction procedures: The clinical–statistical controversy. Psychology, Public Policy, and Law, 2(2), 293–323. https://doi.org/10.1037/1076-8971.2.2.293
- Han, W., Ada, S., Sharman, R., & Rao, H.R. (2015). Campus emergency notification systems: An examination of factors affecting compliance with alerts. MIS Quarterly, 39(4), 909–929. https://doi.org/10.25300/MISQ/2015/39.4.8
- Hsu, J.S.-C., Lin, T.-C., Cheng, K.-T., & Linden, L.P. (2012). Reducing requirement incorrectness and coping with its negative impact in information system development projects. Decision Sciences, 43(5), 929–955. https://doi.org/10.1111/j.1540-5915.2012.00368.x
- Jarvenpaa, S., & Standaert, W. (2018). Digital probes as opening possibilities of generativity. Journal of the Association for Information Systems, 19(10), 982–1000. https://doi.org/10.17705/1jais.00516
- Johnson, D.G. (2015). Technology with no human responsibility? Journal of Business Ethics, 127(4), 707–715. https://doi.org/10.1007/s10551-014-2180-1
- Kawamoto, K., Houlihan, C.A., Balas, E.A., & Lobach, D.F. (2005). Improving clinical practice using clinical decision support systems: A systematic review of trials to identify features critical to success. BMJ, 330(7494), 765. https://doi.org/10.1136/bmj.38398.500764.8F
- Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14–29. https://doi.org/10.1080/1369118X.2016.1154087
- Kizilcec, R.F. (2016). How much information? Effects of transparency on trust in an algorithmic interface. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. San Jose.
- Le Dantec, C.A., Poole, E.S., & Wyche, S.P. (2009). Values as lived experience: Evolving value sensitive design in support of value discovery. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. Boston.
- Marjanovic, O., & Cecez-Kecmanovic, D. (2017). Exploring the tension between transparency and datification effects of open government IS through the lens of complex adaptive systems. The Journal of Strategic Information Systems, 26(3), 210–232. https://doi.org/10.1016/j.jsis.2017.07.001
- Martin, K. (2019). Ethical implications and accountability of algorithms. Journal of Business Ethics, 160(4), 835–850. https://doi.org/10.1007/s10551-018-3921-3
- Miller, R.A. (1990). Why the standard view is standard: People, not machines, understand patients’ problems. Journal of Medicine and Philosophy, 15(6), 581–591. https://doi.org/10.1093/jmp/15.6.581
- Mittelstadt, B.D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping the debate. Big Data & Society, 3(2), 1–21. https://doi.org/10.1177/2053951716679679
- Mitzinneck, B.C., & Besharov, M.L. (2019). Managing value tensions in collective social entrepreneurship: The role of temporal, structural, and collaborative compromise. Journal of Business Ethics, 159(2), 381–400. https://doi.org/10.1007/s10551-018-4048-2
- Neville, K., O’Riordan, S., Pope, A., Rauner, M., Rochford, M., Madden, M., Sweeney, J., Nussbaumer, A., McCarthy, N., & O’Brien, C. (2016). Towards the development of a decision support system for multi-agency decision-making during cross-border emergencies. Journal of Decision Systems, 25(sup1), 381–396. https://doi.org/10.1080/12460125.2016.1187393
- Newell, S., & Marabelli, M. (2015). Strategic opportunities (and challenges) of algorithmic decision-making: A call for action on the long-term societal effects of ‘datification’. The Journal of Strategic Information Systems, 24(1), 3–14. https://doi.org/10.1016/j.jsis.2015.02.001
- Poulsen, A., & Burmeister, O.K. (2019). Overcoming carer shortages with care robots: Dynamic value trade-offs in run-time. Australasian Journal of Information Systems, 23. https://doi.org/10.3127/ajis.v23i0.1688
- Power, D.J., Cyphert, D., & Roth, R.M. (2019). Analytics, bias, and evidence: The quest for rational decision making. Journal of Decision Systems, 28(2), 120–137. https://doi.org/10.1080/12460125.2019.1623534
- Ransbotham, S., Fichman, R.G., Gopal, R., & Gupta, A. (2016). Special section introduction—Ubiquitous IT and digital vulnerabilities. Information Systems Research, 27(4), 834–847. https://doi.org/10.1287/isre.2016.0683
- Reiman, T., & Rollenhagen, C. (2012). Competing values, tensions and trade-offs in management of nuclear power plants. Work, 41(Suppl 1), 722–729. https://doi.org/10.3233/WOR-2012-0232-722
- Robbins, R.W., & Wallace, W.A. (2007). Decision support for ethical problem solving: A multi-agent approach. Decision Support Systems, 43(4), 1571–1587. https://doi.org/10.1016/j.dss.2006.03.003
- Rossegger, A., Endrass, J., & Gerth, J. (2012). Einführung ins Risk-Assessment [Introduction to risk assessment]. In J. Endrass, A. Rossegger, F. Urbaniok, & B. Borchard (Eds.), Interventionen bei Gewalt- und Sexualstraftätern [Interventions for violent and sexual offenders] (pp. 91–97). Medizinisch Wissenschaftliche Verlagsgesellschaft.
- Rossegger, A., Gerth, J., Singh, J.P., & Endrass, J. (2013). Examining the predictive validity of the SORAG in Switzerland. Sexual Offender Treatment, 8(2), 1–12. http://www.sexual-offender-treatment.org/123.html
- Rowe, F. (2018). Being critical is good, but better with philosophy! From digital transformation and values to the future of IS research. European Journal of Information Systems, 27(3), 380–393. https://doi.org/10.1080/0960085X.2018.1471789
- Rowe, F. (2020). Contact tracing apps and values dilemmas: A privacy paradox in a neo-liberal world. International Journal of Information Management, 55, 102178. https://doi.org/10.1016/j.ijinfomgt.2020.102178
- Saldaña, J. (2013). The coding manual for qualitative researchers. SAGE.
- Shamsuzzoha, A., Ferreira, F., Azevedo, A., & Helo, P. (2017). Collaborative smart process monitoring within virtual factory environment: An implementation issue. International Journal of Computer Integrated Manufacturing, 30(1), 167–181. https://doi.org/10.1080/0951192X.2016.1185156
- Sheridan, T.B., & Parasuraman, R. (2005). Human-automation interaction. Reviews of Human Factors and Ergonomics, 1(1), 89–129. https://doi.org/10.1518/155723405783703082
- Stahl, B.C. (2008). Researching ethics and morality in information systems: Some guiding questions. In Proceedings of the International Conference on Information Systems 2008. Paris.
- Stahl, B.C. (2012). Morality, ethics, and reflection: A categorization of normative IS research. Journal of the Association for Information Systems, 13(8), 636–656. https://doi.org/10.17705/1jais.00304
- Syed, D., Chang, T.-H., Svetinovic, D., Rahwan, T., & Aung, Z. (2017). Security for complex cyber-physical and industrial control systems: Current trends, limitations, and challenges. In Proceedings of the Pacific Asia Conference on Information Systems 2017. Langkawi.
- Tahat, L., Elian, M.I., Sawalha, N.N., & Al-Shaikh, F.N. (2014). The ethical attitudes of information technology professionals: A comparative study between the USA and the Middle East. Ethics and Information Technology, 16(3), 241–249. https://doi.org/10.1007/s10676-014-9349-2
- Turilli, M., & Floridi, L. (2009). The ethics of information transparency. Ethics and Information Technology, 11(2), 105–112. https://doi.org/10.1007/s10676-009-9187-9
- van den Hoven, J. (2007). ICT and value sensitive design. In P. Goujon, S. Lavelle, P. Duquenoy, K. Kimppa, & V. Laurent (Eds.), The information society: Innovation, legitimacy, ethics and democracy in honor of Professor Jacques Berleur s.j. (pp. 67–72). Springer.
- Xu, H., Crossler, R.E., & Bélanger, F. (2012). A value sensitive design investigation of privacy enhancing tools in web browsers. Decision Support Systems, 54(1), 424–433. https://doi.org/10.1016/j.dss.2012.06.003
- Zeng, J., Ustun, B., & Rudin, C. (2017). Interpretable classification models for recidivism prediction. Journal of the Royal Statistical Society: Series A (Statistics in Society), 180(3), 689–722. https://doi.org/10.1111/rssa.12227