Urgent need to investigate role of technology, social media companies in killing Gazan civilians
News and Press Release
Source: Euro-Med Monitor
Posted: 21 April 2024
Palestinian Territory – The role of major technology companies and international social media platforms in the killing of Palestinian civilians during Israel’s genocidal war against the Gaza Strip, ongoing since 7 October 2023, must be investigated. These companies must be held accountable if found to be complicit, or to have failed to take adequate precautions against access to, and exploitation of, their users’ information. They must ensure that their services are not exploited in conflict zones and that their users’ privacy is respected.
There are frequent reports that Israel uses a number of artificial intelligence-supported technological systems, including Where’s Daddy?, Fire Factory, Gospel, and Lavender, to illegally track and monitor Palestinians. These systems identify possible suspects and classify them as legitimate targets by searching for similarities and patterns across all Gaza Strip residents, particularly men, and among members of armed factions, relying on information that is often only loosely connected to the individual or location in question.
Studies have shown that the Israeli army usually does not verify the accuracy of the information provided by these systems, even though it is aware of their significant margin of error, a product of how these systems operate and of their inability to provide accurate information, particularly regarding the real-time whereabouts of people placed on the targeting list.
For example, Israel’s army uses the Lavender system extensively to identify suspects in the Strip before targeting them; its use knowingly results in a high number of civilian casualties.
The Lavender system uses the logic of probabilities, a distinguishing characteristic of machine learning algorithms. The algorithm looks through large data sets for patterns that correspond to fighter behaviour, and the amount and quality of that data determine how successful the algorithm is at finding such patterns. It then recommends targets based on the resulting probabilities.
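For illustration only, the sketch below shows the general shape of such probability-based scoring: a model converts a person’s behavioural features into a score, and anyone whose score crosses a threshold is recommended as a target. Every feature name, weight, and threshold here is invented for this example and does not describe the actual Lavender system.

```python
# A minimal, hypothetical sketch of probability-based target recommendation:
# score a feature vector, convert the score to a probability, and flag
# anyone above a threshold. All names and numbers are invented.
import math

def sigmoid(x: float) -> float:
    """Map a raw score to a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-x))

# Invented weights over behavioural "pattern" features.
WEIGHTS = {"pattern_a": 1.2, "pattern_b": 0.8, "pattern_c": -0.5}
BIAS = -2.0
THRESHOLD = 0.7  # invented cut-off: anyone above it is "recommended"

def suspicion_probability(features: dict[str, float]) -> float:
    """Score a person's features and convert the score to a probability."""
    raw = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return sigmoid(raw)

person = {"pattern_a": 1.0, "pattern_b": 1.0}  # resembles two "fighter" patterns
p = suspicion_probability(person)
print(p)               # 0.5 for this invented example
print(p >= THRESHOLD)  # False: below the cut-off, not recommended
```

The point of the sketch is that the output is only ever a probability of resemblance to a learned pattern, never a verified fact about the person, which is why unverified reliance on such scores carries the error margin described above.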
Amid concerns that the Lavender system may rely on tracking social media accounts, Israeli military and intelligence sources have acknowledged attacking potential targets without regard for the principle of proportionality or for collateral damage.
These suspicions are supported by a book, The Human Machine Team, written by the current commander of the elite Israeli army Unit 8200, which offers instructions on how to create a “target machine” akin to the Lavender artificial intelligence system. The book also describes hundreds of signals that can raise the severity of a person’s classification, such as switching cell phones every few months, frequently changing addresses, or simply being in the same group on Meta’s WhatsApp application as a known “fighter”.
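Purely as an illustration of the signal-accumulation mechanism the book describes, such a scheme can be sketched as follows; the signal names and weights are invented for this example and are not drawn from the book.

```python
# A hypothetical sketch of signal accumulation: each observed signal raises
# a person's severity rating. Signal names and weights are invented for
# illustration and are not taken from The Human Machine Team.
SIGNAL_WEIGHTS = {
    "switched_phone_recently": 2,
    "moved_address_recently": 1,
    "shares_whatsapp_group_with_listed_person": 3,
}

def severity_rating(observed_signals: set[str]) -> int:
    """Sum the weights of every signal observed for this person."""
    return sum(SIGNAL_WEIGHTS.get(s, 0) for s in observed_signals)

# Two observed signals yield a rating of 5; under such a scheme, higher
# ratings mean a harsher classification.
print(severity_rating({"switched_phone_recently",
                       "shares_whatsapp_group_with_listed_person"}))
```

Under a scheme of this shape, entirely ordinary behaviour, such as changing a phone or joining a chat group, mechanically raises a person’s rating regardless of intent.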
Additionally, it was recently revealed that Google and Israel are collaborating on several technology initiatives, including Project Nimbus, which provides the Israeli army with tools for expanded monitoring and illegal data collection on Palestinians, thereby broadening Israeli policies of denial and persecution and enabling other crimes against the Palestinian people. This project in particular has sparked significant human rights criticism, prompting dozens of company employees to protest and resign; others have been fired over their protests.
The Israeli army also uses Google Photos’ facial recognition feature to surveil Palestinian civilians in the Gaza Strip and create a “hit list”. It has gathered as many images as possible from the 7 October event, known as Al-Aqsa Flood, in which the faces of Palestinians were visible as they stormed the separation fence and entered the settlements. The technology is then used to sort photos and store images of faces, which has resulted in the recent arrest of thousands of Palestinians from the Gaza Strip, in violation of the company’s explicit rules and the United Nations Guiding Principles on Business and Human Rights.
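As a generic illustration of how facial-recognition matching of this kind typically works, the sketch below compares two face embeddings by cosine similarity and declares a match above a threshold; the vectors and threshold are invented and have no connection to Google Photos’ internals.

```python
# A generic, hypothetical sketch of face matching: each face is reduced to a
# numeric embedding, and two faces are declared a "match" when their
# embeddings are sufficiently similar. All numbers are invented.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity of two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

MATCH_THRESHOLD = 0.9  # invented; real systems tune this trade-off

stored_face = [0.1, 0.9, 0.4]   # embedding extracted from an archived photo
new_face = [0.12, 0.88, 0.41]   # embedding extracted from a new photo

if cosine_similarity(stored_face, new_face) >= MATCH_THRESHOLD:
    print("flagged as the same person")  # basis for sorting photos by identity
```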
The Euro-Med Monitor field team has documented accounts of Palestinian civilians who, as a consequence of their social media activity, have been singled out as suspects by Israel, despite having taken no military action.
A young Palestinian man who requested to be identified only as “A.F.” due to safety concerns, for instance, was seriously injured in an Israeli bombing that targeted a residential house in Gaza City’s Al-Sabra neighbourhood.
The house was targeted shortly after A.F. posted a video clip on Instagram, which is owned by Meta, in which he joked that he was on a “field reconnaissance mission”.
His relative told Euro-Med Monitor that A.F. had merely been attempting to mimic press reporters when he posted the brief video clip on his personal Instagram account. Shortly afterwards, A.F. was targeted by an Israeli reconnaissance plane while on the roof of the house.
A separate Israeli bombing on 16 April claimed the lives of six young Palestinians who had gathered to access Internet services. One of the victims was using a group chat on WhatsApp—a Meta subsidiary—to report news about the Sheikh Radwan neighbourhood of Gaza City.
The deceased man’s relative, who requested anonymity due to safety fears, informed Euro-Med Monitor that the victim was near the Internet access point when the group was directly hit by a missile from an Israeli reconnaissance plane. The victim had been voluntarily sharing news in family and public WhatsApp groups about the Israeli attacks and the humanitarian situation in Sheikh Radwan.
Deeply concerning is the Israeli army’s covert strategy of launching highly destructive air and artillery attacks on the basis of data that falls short of the minimum standard of accurate target assessment: cell phone records, photos, and social media contact and communication patterns, all within the larger context of an extraordinarily haphazard killing programme.
The evidence presented by global technology experts points to a likely connection between the Israeli army’s use of the Lavender system, which has been used to identify targets in Israel’s military assaults on the Gaza Strip, and the Meta company. This suggests that the Israeli army may have targeted individuals merely for being in WhatsApp groups with other people on the suspect list. The experts also question how Israel could have obtained this data without Meta disclosing it.
Earlier, British newspaper The Guardian exposed Israel’s use of artificial intelligence (Lavender) to murder large numbers of Palestinian civilians. The Israeli military used machine learning systems to identify potential “low-ranking” fighters, with the aim of targeting them without considering the level of permissible collateral damage. A “margin of tolerance” was adopted that allowed the killing of up to 20 civilians for every low-ranking target; when targeting “higher-ranking” fighters, this margin allowed up to 100 civilian deaths per fighter.
Google, Meta, and other technology and social media companies may have colluded with Israel in crimes and violations against the Palestinian people, including extrajudicial killings, in defiance of international law and these companies’ stated human rights commitments.
Social networks should not be releasing this kind of personal data about their users or taking part in the Israeli genocide against Palestinian civilians in the Gaza Strip. An international investigation is required in order to guarantee accountability for those responsible and justice for the victims.
Meta’s overt and obvious bias towards Israel, its substantial suppression of content that supports the Palestinian cause, and its policy of stifling criticism of Israel’s crimes, together with reports of close ties between top Meta officials and Israel, suggest that the company may plausibly be involved in the killing of Palestinian civilians.
Given the risks of failing to take reasonable steps to confirm that a target is legitimate under international humanitarian law, the aforementioned companies must fully commit to ending all cooperation with the Israeli military and to ceasing to provide Israel with access to data and information that violate Palestinians’ rights and put their lives in jeopardy.
Israel’s failure to exercise due diligence and to consider human rights when using artificial intelligence for military purposes must be immediately investigated, as must its failure to comply with international law and international humanitarian law.
These companies must promptly address the information circulating about their involvement in Israeli crimes against the Palestinian people. Where warranted, serious investigations into their policies and practices in relation to Israeli crimes and human rights violations must be opened, and the companies must be held accountable if found to be complicit or to have failed to take reasonable precautions to prevent the exploitation of user information for criminal activities.