Incident Status
Incident Reports
Report Timeline
- View the original report at its source
- View the report on the Internet Archive
There have been reports in the press about the results of a research project at Stanford University, according to which the LAION training set 5B contains potentially illegal content in the form of CSAM. We would like to comment on this as …

A Stanford Internet Observatory (SIO) investigation identified hundreds of known images of child sexual abuse material (CSAM) in an open dataset used to train popular AI text-to-image generation models, such as Stable Diffusion.
A massive public dataset that served as training data for a number of AI image generators has been found to contain thousands of instances of child sexual abuse material (CSAM).
In a study published today, the Stanford Internet Observatory …
Stable Diffusion, one of the most popular text-to-image generative AI tools on the market from the $1 billion startup Stability AI, was trained on a trove of illegal child sexual abuse material, according to new research from the Stanford I…
Researchers from the Stanford Internet Observatory say that a dataset used to train AI image generation tools contains at least 1,008 validated instances of child sexual abuse material. The Stanford researchers note that the presence of CSA…
A popular training dataset for AI image generation contained links to child abuse imagery, Stanford’s Internet Observatory found, potentially allowing AI models to create harmful content.
LAION-5B, a dataset used by Stable Diffusion creat…

This piece is published with support from The Capitol Forum.
The LAION-5B machine learning dataset used by Stable Diffusion and other major AI products has been removed by the organization that created it after a Stanford study found that i…
A massive open-source AI dataset, LAION-5B, which has been used to train popular AI text-to-image generators like Stable Diffusion 1.5 and Google's Imagen, contains at least 1,008 instances of child sexual abuse material, a new report from …
Over 1,000 images of sexually abused children have been discovered inside the largest dataset used to train image-generating AI, shocking everyone except for the people who have warned about this exact sort of thing for years.
The dataset w…
There have been significant problems with AI's training data, with various complaints already filed by those who claimed their work was stolen, but the most recent discovery saw child sexual abuse images in their dataset. In a recent study,…

An influential machine learning dataset—the likes of which has been used to train numerous popular image-generation applications—includes thousands of suspected images of child sexual abuse, a new academic report reveals.
The report, put to…
A widely-used artificial intelligence data set used to train Stable Diffusion, Imagen and other AI image generator models has been removed by its creator after a study found it contained thousands of instances of suspected child sexual abus…
Child sexual abuse material (CSAM) has been located in LAION, a major data set used to train AI.
The Stanford Internet Observatory revealed thousands of images of child sexual abuse in the LAION-5B data set, which supports many different AI…
The integrity of a major AI image training dataset, LAION-5B, utilized by influential AI models like Stable Diffusion, has been compromised after the discovery of thousands of links to Child Sexual Abuse Material (CSAM). This revelation has…
Generative AI has been democratized. The toolkits to download, set up, use, and fine-tune a variety of models have been turned into one-click frameworks for anyone with a laptop to use. While this technology allows users to generate and exp…
In The Ones Who Walk Away From Omelas, the fiction writer Ursula K. Le Guin describes a fantastic city wherein technological advancement has ensured a life of abundance for all who live there. Hidden beneath the city, where nobody needs to …