New Delhi: A previously undisclosed AI system called Lavender was used by the Israeli military to identify potential bombing targets in Gaza, according to a report by The Guardian citing intelligence sources involved in the conflict. The AI database had at one point flagged 37,000 Palestinians as having possible links with Hamas or the Palestinian Islamic Jihad (PIJ).
According to the report, the intelligence sources claimed the system played a key role in target identification, rapidly processing data to link individuals to the militant groups. They also alleged that Israeli officials permitted high numbers of civilian casualties, especially in the early months of the war.
Lavender was developed by Israel’s elite Unit 8200 intelligence division, comparable to the US’s National Security Agency or the UK’s GCHQ. The intelligence sources said the military applied pre-set allowances for estimated civilian deaths before approving strikes on lower-level targets, The Guardian reported.
Two of the intelligence sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians while targeting militants. “Attacks on these targets were carried out using unguided munitions known as ‘dumb bombs’,” the sources said, adding that the attacks would target entire homes and kill all the occupants.
“This is unparalleled, in my memory. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier,” one of the officers told The Guardian.
“You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of those bombs],” another said.
Meanwhile, the IDF rejected the claims as “baseless” and said its operations aimed to dismantle Hamas following the October 7 attacks.
“Some of the claims are baseless in fact, while others reflect a flawed understanding of IDF directives and international law. Following the murderous attack by the Hamas terror organisation on October 7, the IDF has been operating to dismantle Hamas’ military capabilities,” the IDF said in a statement.
“Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process,” the IDF added.