AI Incident Database

Incident 335: UK Visa Streamline Algorithm Allegedly Discriminated Based on Nationality

Summary: The UK Home Office's algorithm for assessing visa application risk explicitly considered nationality, allegedly causing applicants from certain countries to face heightened scrutiny and discrimination.
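The Home Office never disclosed how its "streaming tool" worked, so the following is purely a hypothetical sketch of the mechanism the summary describes: when nationality is an explicit input to a risk-scoring rule, two applicants with identical histories can be routed to different levels of scrutiny on nationality alone. The function name, placeholder country names, and traffic-light thresholds are all illustrative assumptions, not the actual system.

```python
# Hypothetical sketch (NOT the Home Office's actual, undisclosed code):
# a streaming rule that scores applications and, because nationality is
# an explicit feature, assigns a harsher rating to applicants from
# listed countries regardless of their individual circumstances.

SUSPECT_NATIONALITIES = {"CountryA", "CountryB"}  # illustrative placeholder list


def stream_application(nationality: str, prior_refusals: int) -> str:
    """Return a traffic-light risk rating for a visa application."""
    score = prior_refusals
    if nationality in SUSPECT_NATIONALITIES:
        score += 2  # nationality alone raises the risk score
    if score >= 2:
        return "red"    # slower, more sceptical review
    if score == 1:
        return "amber"
    return "green"      # faster, lighter-touch review


# Identical histories, different ratings, purely because of nationality --
# the alleged discrimination.
assert stream_application("CountryA", prior_refusals=0) == "red"
assert stream_application("CountryC", prior_refusals=0) == "green"
```

A further concern raised in the reporting below is the feedback loop such a design can create: a "red" rating invites more refusals, and those refusals can then be cited as evidence that the nationality is high-risk.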


Entities

Alleged: UK Visas and Immigration and UK Home Office developed an AI system deployed by UK Visas and Immigration, which harmed UK visa applicants from some countries.

Incident Stats

Incident ID
335
Report Count
8
Incident Date
2015-03-01
Editors
Khoa Lam
Applied Taxonomies
MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.

1.1. Unfair discrimination and misrepresentation

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.
 
  1. Discrimination and Toxicity

Entity

Which, if any, entity is presented as the main cause of the risk.

AI

Timing

The stage in the AI lifecycle at which the risk is presented as occurring.

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.

Intentional

Incident Reports


Legal action to challenge Home Office use of secret algorithm to assess visa applications
foxglove.org.uk · 2017

It has come to light that the Home Office is using a secretive algorithm, which it describes as a digital “streaming tool,” to sift visa applications. So far they have refused to disclose much information about how the algorithm works, hiding…

AI system for granting UK visas is biased, rights groups claim
theguardian.com · 2019

Immigrant rights campaigners have begun a ground-breaking legal case to establish how a Home Office algorithm that filters UK visa applications actually works.

The challenge is the first court bid to expose how an artificial intelligence pr…

The use of Artificial Intelligence by the Home Office to stream visa applications
kingsleynapley.co.uk · 2019

The growth of technology has brought a great deal of efficiency and security to almost all organisations and businesses. But such progress may have taken a slightly wrong turn as the reliance on artificial intelligence by the Home Office as…

Update: papers filed for judicial review of the Home Office’s visa algorithm
foxglove.org.uk · 2020

Foxglove is supporting the Joint Council for the Welfare of Immigrants (JCWI) to challenge the Home Office’s use of a secret algorithm to sift visa applications, which it describes as a digital “streaming tool”.

We share JCWI’s concerns tha…

UK commits to redesign visa streaming algorithm after challenge to 'racist' tool
techcrunch.com · 2020

The U.K. government is suspending the use of an algorithm used to stream visa applications after concerns were raised the technology bakes in unconscious bias and racism.

The tool had been the target of a legal challenge. The Joint Council …

Home Office says it will abandon its racist visa algorithm - after we sued them
foxglove.org.uk · 2020

Home Office lawyers wrote to us yesterday, to respond to the legal challenge which we’ve been working on with the Joint Council for the Welfare of Immigrants (JCWI). 

We were asking the Court to declare the streaming algorithm unlawful, and…

Home Office drops 'racist' algorithm from visa decisions
bbc.com · 2020

The Home Office has agreed to stop using a computer algorithm to help decide visa applications after allegations that it contained "entrenched racism".

The Joint Council for the Welfare of Immigrants (JCWI) and digital rights group Foxglove…

We won! Home Office to stop using racist visa algorithm
jcwi.org.uk · 2020

We are delighted to announce that the Home Office has agreed to scrap its 'visa streaming' algorithm, in response to legal action we launched with tech-justice group Foxglove.

From Friday, 7 August, Home Secretary Priti Patel will suspend t…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

Opaque Fraud Detection Algorithm by the UK’s Department of Work and Pensions Allegedly Discriminated against People with Disabilities

DWP urged to reveal algorithm that ‘targets’ disabled for benefit fraud

Oct 2019 · 6 reports

Facial Recognition Trial Performed Poorly at Notting Hill Carnival

Don’t Believe the Algorithm

Aug 2017 · 4 reports

Tinder's Personalized Pricing Algorithm Found to Offer Higher Prices for Older Users

A Consumer Investigation into Personalised Pricing

Mar 2015 · 4 reports


2023 - AI Incident Database
