AI Incident Database

Incident 672: Lavender AI System Reportedly Directs Gaza Strikes with High Civilian Casualty Rate

Responded
Description: The AI system "Lavender" has reportedly been used by the Israel Defense Forces (IDF) to identify targets in Gaza with minimal human oversight, allegedly resulting in high civilian casualty rates. The system, designed to speed up target identification, is reported to have led to significant errors and mass casualties.
Editor Notes: "The Gospel" is an AI-based decision support system used by the Israel Defense Forces (IDF) to recommend buildings and structures as bombing targets in Gaza. It works alongside another AI system called "Lavender," which generates a database of individuals linked to Hamas or PIJ militants. While "Lavender" identifies human targets, "The Gospel" focuses on selecting physical targets, significantly increasing the number of potential bombing sites and accelerating the targeting process.

Entities

View all entities
Alleged: Unit 8200 and Israel Defense Forces developed and deployed an AI system, which harmed Palestinians and Gazans.

Incident Stats

Incident ID: 672
Report Count: 7
Incident Date: 2024-04-03
Editors:

Applied Taxonomies
MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain
A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.
7.3. Lack of capability or robustness

Risk Domain
The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.
7. AI system safety, failures & limitations

Entity
Which, if any, entity is presented as the main cause of the risk.
Human

Timing
The stage in the AI lifecycle at which the risk is presented as occurring.
Post-deployment

Intent
Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.
Intentional
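
For readers who handle these classifications programmatically, the fields above map onto a simple flat record. The sketch below is a hypothetical Python representation of this incident's MIT taxonomy values; the field names are illustrative only and are not AIID's actual export schema or API.

# Hypothetical, simplified record of the MIT taxonomy classification shown above.
# Field names are illustrative; this is not AIID's actual export schema.
incident_672_mit_classification = {
    "incident_id": 672,
    "taxonomy": "MIT",
    "classified_by": "Machine-Classified",
    "risk_domain": "7. AI system safety, failures & limitations",
    "risk_subdomain": "7.3. Lack of capability or robustness",
    "entity": "Human",
    "timing": "Post-deployment",
    "intent": "Intentional",
}

# Print the record as "field: value" lines, mirroring the page layout above.
for field, value in incident_672_mit_classification.items():
    print(f"{field}: {value}")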

Incident Reports

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
972mag.com · 2024

In 2021, a book titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World" was released in English under the pen name "Brigadier General Y.S." In it, the …

Israeli Military Using AI to Select Targets in Gaza With 'Rubber Stamp' From Human Operator: Report
yahoo.com · 2024

Israel has been using an artificial intelligence system called Lavender to create a “kill list” of at least 37,000 people in Gaza, according to a new report from Israel’s +972 magazine, confirmed by the Guardian. Lavender is t…

‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
theguardian.com · 2024

The Israeli military's bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources …

Israel Defence Forces’ response to claims about use of ‘Lavender’ AI database in Gaza
theguardian.com · 2024
Post-incident response by The Guardian

IDF statement in response to an article about the use of the AI-powered database named Lavender in the bombardment of Gaza:

Some of the claims portrayed in your questions are baseless in fact, while others reflect a flawed und…

Israel offers a glimpse into the terrifying world of military AI
washingtonpost.com · 2024

It's hard to concoct a more airy sobriquet than this one. A new report published by +972 magazine and Local Call indicates that Israel has allegedly used an AI-powered database to select suspected Hamas and other militant targ…

What War by A.I. Actually Looks Like
nytimes.com · 2024

In November the left-wing Israeli outlets +972 magazine and Local Call published a disturbing investigation by the journalist Yuval Abraham into the Israel Defense Forces' use of an artificial intelligence system for identifyi…

Inside Israel’s Bombing Campaign in Gaza
newyorker.com · 2024

Since the war began in Gaza, more than six months ago, the Israeli magazine +972 has published some of the most penetrating reporting on the Israel Defense Forces' conduct. In November, +972, along with the Hebrew publication …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than indexing variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the Incident Database, variants are not required to have reporting as evidence external to the Incident Database. Learn more from this research paper.