Dr Ashraf Abul Saud
The Fourth Geneva Convention of 1949 and its Additional Protocols were written to shield civilians from the horrors of war, offering them strict and unambiguous protections during armed conflict.
At its core, Article 49 explicitly prohibits the forcible transfer or deportation of civilians from occupied territory, whether carried out individually or en masse.
The only narrow exception is when such movement is essential for their immediate safety or demanded by overwhelming military necessity.
Even then, they must be allowed to return home the moment the fighting ends.
Nevertheless, in the Gaza war, Israel turned to advanced artificial intelligence on an unprecedented scale to identify and strike targets.
Three systems in particular, namely Lavender, “Where’s Daddy?”, and “The Gospel” (Habsora), have sparked intense controversy, with many accusing them of quietly eroding the very foundations of international humanitarian law.
Lavender, an AI target-generation platform, uses machine learning models, reportedly trained on data about known militants, to sift mass surveillance data on Gaza’s population.
It examines digital footprints, from participation in certain chat groups and frequent SIM card changes to shifting addresses and movement patterns, then assigns every individual in its database a score from 1 to 100.
In the frantic early weeks of the war, this system reportedly flagged over 37,000 Palestinians as potential targets.
Israel is said to have accepted an error rate of around 10 per cent, a figure that, applied to a list of 37,000 names, implied the possible deaths of thousands of misidentified civilians.
Disturbingly, reports also described permissive collateral-damage thresholds: up to 15 to 20 civilian deaths were said to be acceptable in a strike on a single low-level fighter, and more than 100 for a high-value target.
Compounding the tragedy is the “Where’s Daddy?” system, which alerts forces the moment a targeted person returns home, often late at night, when families are gathered together.
Because of delays between detection and strike, homes were sometimes flattened, and entire families killed, after the intended target had already left.
Instead of using expensive precision munitions, the military frequently opted for cheaper unguided “dumb” bombs, levelling residential blocks with devastating efficiency and at minimal cost.
Then there is The Gospel, a system that generates targets among civilian infrastructure, including homes, schools, universities, and vital facilities, far faster than any human analyst ever could.
Together, these tools have led many respected voices, including experts from the United Nations, Human Rights Watch, Amnesty International, and the International Committee of the Red Cross, to warn that the scale, automation, and apparent absence of meaningful human oversight may amount to serious violations of international law – and, in some cases, war crimes.
Dr Ashraf Abul Saud is a writer and an international relations scholar.