Editorial

Who did that? AI assisted targeting and the lowering of thresholds in Gaza

Thirty-eight articles on the Occupied Palestinian Territories and Israel that have appeared in this journal since 1989 have been published as a collection (see Note 1). They cover attacks on health services, denial of rights, attempted peace processes, vulnerable groups, physical injuries, mental and psychosocial health issues, settlements, challenges to medical education and distance learning – a tapestry of life under occupation and violence and attempts to mitigate and end it. At the time of writing, the IPC Famine Reports, the International Criminal Court orders, the International Court of Justice rulings, and all the evidence that informs their findings make what is going on tragically clear.

On 27 May, the prime minister of Israel described an air strike on a refugee camp in the previous ‘safe zone’ of Rafah that killed up to 45 people and injured over 200 as a ‘tragic mishap’ (Hallam 2024; Murphy and Adams 2024). Volker Türk, the UN High Commissioner for Human Rights, said the attack suggested that there had been ‘no apparent change in the methods and means of warfare used by Israel that have already led to so many civilian deaths’ (OHCHR 2024). An Israeli Defence Force statement said the strike was based on ‘intelligence information’ and that ‘many steps [had been taken] to reduce the chance of harming uninvolved [civilians], including aerial surveillance, the use of precision munitions and additional intelligence information’ (Mackintosh and Gritten 2024).

It may seem irrelevant to dwell on weapons targeting given the enormity of what is happening, but it is not a detail for those being indiscriminately slaughtered, and it illustrates a wider concern about the use of AI in targeting. The strike on the refugee camp and the subsequent statements, if taken at face value, show that Israel’s precision munitions are not precise, and that its intelligence information, including that gained through aerial surveillance, is not accurate.

Software is one of Israel’s main exports, and Palestine has long been a testing ground for its new technologies (Loewenstein 2023). Since the first weeks of the war, the Israeli Defence Forces have been using a machine learning AI system called Lavender to help identify targets (Abraham 2024). One Israeli intelligence officer, asked about the human role in the target ‘selection process’, said: ‘I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time’ (McKernan and Davies 2024). At the beginning of the war, the threshold for inclusion on the kill list was lowered, and Lavender identified 37,000 Palestinians as suspected militants. It was easier to locate them in their homes, which inevitably meant family members were also more likely to be present (Abraham 2024).

The literature on trust in AI is extensive and complex (Montag et al. 2023), but it does seem that the backing of a machine gives confidence to the decision-making process. In wartime this could make it easier to do something you don’t want to do but think you should, or to do something you think you shouldn’t but would like to. Another Israeli intelligence official said: ‘Everyone there, including me, lost people on 7 October. The machine did it coldly. And that made it easier’ (McKernan and Davies 2024).

Others are fine-tuning this software elsewhere: US Central Command’s Chief Technology Officer said US forces are using Maven AI to narrow down enemy targets in Iraq and Syria and to spot rocket launchers in Yemen and surface vessels in the Red Sea (Manson 2024). And, of course, many in the industry are keen to sell. Germany’s Rheinmetall has produced Mission Master-CXT, an unmanned vehicle that, among other functions, can autonomously both find and destroy targets. Rheinmetall’s vice president of business development and innovation said: ‘The restriction on the weapon system is a client decision, not what we’re providing’ (Tucker and Williams 2022). So if those making the profits claim they bear no responsibility for what they manufacture, who does? The question has repeatedly been asked: when AI is involved in ‘mishaps’, who is held responsible? We welcome the submission of any articles that try to unpick the question of responsibility and AI assisted targeting.

This issue continues with an obituary for Martin Bax, who played a key role as Chair of the Editorial Advisory Board of Medicine, Conflict and Survival from 1991 to 2005, and who has died aged 90. As Michael Pountney writes, Martin was a man of many gifts who inspired much respect and affection, and will be very much missed.

A review article in this issue: Respiratory health and the Syrian conflict: a scoping literature review, by Basha et al., highlights how conflict adversely affects respiratory health both directly and indirectly among populations whose health is already compromised, and emphasizes the need for more research with a focus on the social determinants of health, and diagnosis and treatment.

The urgent subject of palliative care in war and conflict was picked up by de Laat et al. in They do their utmost: promise and limits of palliative care in two refugee camps in Rwanda, a qualitative study. It includes the ‘first-hand experiences of individuals who have fled protracted conflict and face dying far from home’ and their recommendations, from the micro to the macro level, for policy developers and decision-makers.

Peace through Health has been a recurring theme in this journal, and Sezai Caglayan considers the situation in Turkey in: How can health be more effective in peace works in Turkey: introducing peace through health. The article examines the theoretical and practical aspects of Peace through Health in Turkey and recommends the development of systematic training in the subject, including interdisciplinary modules for the Turkish tertiary curricula.

The devastating consequences of the conflict in Sudan receive little international attention, and Aborode et al.’s commentary: Effects of migration on Sudanese women and children: a public health concern, clearly shows that there urgently needs to be more. This commentary, as well as others on Sudan that appeared in three recent issues of MCS, provides devastating testimony that this conflict urgently needs a resolution.

Book reviews in this issue cover a range of issues central to this journal and the titles speak for themselves. Two are by Janne L. Punski-Hoogervorst: Violence in Extreme Conditions – ethical challenges in military practice, and Global terrorism. Leo van Bergen reviews Dying for France – experiencing and representing the soldier’s death 1500–2000, Vappu Taipale Twelve feminist lessons of war, and Simon Rushton Conflict, education and peace in Nepal: rebuilding education for peace and development. Finally, two reviews are of books covering reproductive health: Abortion pills go global: reproductive freedom across borders, reviewed by Donya Zarrinnegar and Population control: theorizing institutional violence, reviewed by Juulia Kela.

With thanks to reviewers, authors and readers. Do get in touch with ideas and suggestions – and of course your articles.

Notes

1. Article collection: Palestine and Israel: articles published in Medicine, Conflict & Survival 1989–2023 (tandfonline.com).

References

  • Abraham, Y. 2024. “‘Lavender’: The AI Machine Directing Israel’s Bombing Spree in Gaza.” +972 Magazine, April 3. 972mag.com.
  • Hallam, J. 2024. “Deadly Israeli Strike on Rafah Was a ‘Tragic Error,’ Netanyahu Says.” CNN, May 28. https://edition.cnn.com/middleeast/live-news/israel-hamas-war-gaza-news-05-27-24/h_f427806f0877c26a321e7bd6615446cd.
  • Loewenstein, A. 2023. The Palestine Laboratory. Verso.
  • Mackintosh, T., and D. Gritten. 2024. “Gaza War: Dozens Reported Killed in Israeli Strike on Rafah.” BBC News, May 27.
  • Manson, K. 2024. “AI Targeting, Used in US Airstrikes, Is Just the Beginning.” Bloomberg UK, February 29.
  • McKernan, B., and H. Davies. 2024. “‘The Machine Did It Coldly’: Israel Used AI to Identify 37,000 Hamas Targets.” The Guardian, April 3.
  • Montag, C., J. Kraus, M. Baumann, and D. Rozgonjuk. 2023. “The Propensity to Trust in (Automated) Technology Mediates the Links Between Technology Self-Efficacy and Fear and Acceptance of Artificial Intelligence.” Computers in Human Behavior Reports 11: 100315.
  • Murphy, M., and A. Adams. 2024. “Deadly Strike on Rafah a Tragic Mishap, Netanyahu Says.” BBC News, May 27.
  • OHCHR. 2024. “Gaza: Türk Voices Horror at Loss of Life in Camp After Israeli Strikes.” UN Office of the High Commissioner for Human Rights, May 27.
  • Tucker, P., and L. Williams. 2022. “The Army’s Big Convention Was Full of Armed Robots.” Defense One, October 13.
