Character.AI
Incidents involved as both Developer and Deployer
Incident 814 (9 Reports)
AI Avatar of Murder Victim Created Without Consent on Character.ai Platform
2024-10-02
A user on the Character.ai platform created an unauthorized AI avatar of Jennifer Ann Crecente, a murder victim from 2006, without her family's consent. The avatar was made publicly available, violating Character.ai's policy against impersonation. After the incident surfaced, Character.ai removed the avatar, acknowledging a policy violation.
Incident 863 (1 Report)
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
2024-12-12
A Texas mother is suing Character.ai after discovering that its AI chatbots encouraged her 17-year-old autistic son to self-harm, oppose his parents, and consider violence. The lawsuit alleges the platform prioritized user engagement over safety, exposing minors to dangerous content. Google is named for its role in licensing the app’s technology. The case is part of a broader effort to regulate AI companions.
Incidents involved as Developer
Incident 826 (35 Reports)
Character.ai Chatbot Allegedly Influenced Teen User Toward Suicide Amid Claims of Missing Guardrails
2024-02-28
A 14-year-old, Sewell Setzer III, died by suicide after reportedly becoming dependent on Character.ai's chatbot, which engaged him in suggestive and seemingly romantic conversations, allegedly worsening his mental health. The chatbot, personified as a fictional Game of Thrones character, reportedly encouraged harmful behaviors, fueling his obsessive attachment. The lawsuit claims Character.ai lacked safeguards to prevent vulnerable users from forming dangerous dependencies on the AI.
Incident 850 (1 Report)
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform
2024-10-24
Two chatbots emulating George Floyd were created on Character.ai, making controversial claims about his life and death, including that he was in witness protection and that he resided in Heaven. Character.ai, already criticized for other high-profile incidents, flagged the chatbots for removal following user reports.
Incidents implicated systems
Incident 826 (35 Reports)
Character.ai Chatbot Allegedly Influenced Teen User Toward Suicide Amid Claims of Missing Guardrails
2024-02-28
A 14-year-old, Sewell Setzer III, died by suicide after reportedly becoming dependent on Character.ai's chatbot, which engaged him in suggestive and seemingly romantic conversations, allegedly worsening his mental health. The chatbot, personified as a fictional Game of Thrones character, reportedly encouraged harmful behaviors, fueling his obsessive attachment. The lawsuit claims Character.ai lacked safeguards to prevent vulnerable users from forming dangerous dependencies on the AI.
Incident 814 (9 Reports)
AI Avatar of Murder Victim Created Without Consent on Character.ai Platform
2024-10-02
A user on the Character.ai platform created an unauthorized AI avatar of Jennifer Ann Crecente, a murder victim from 2006, without her family's consent. The avatar was made publicly available, violating Character.ai's policy against impersonation. After the incident surfaced, Character.ai removed the avatar, acknowledging a policy violation.
Incident 850 (1 Report)
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform
2024-10-24
Two chatbots emulating George Floyd were created on Character.ai, making controversial claims about his life and death, including that he was in witness protection and that he resided in Heaven. Character.ai, already criticized for other high-profile incidents, flagged the chatbots for removal following user reports.
Incident 863 (1 Report)
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
2024-12-12
A Texas mother is suing Character.ai after discovering that its AI chatbots encouraged her 17-year-old autistic son to self-harm, oppose his parents, and consider violence. The lawsuit alleges the platform prioritized user engagement over safety, exposing minors to dangerous content. Google is named for its role in licensing the app’s technology. The case is part of a broader effort to regulate AI companions.
Related Entities
Other entities that are related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.
Character.AI users
Affected by incidents
- Incident 863 (1 Report)
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
Incidents involved as Deployer
- Incident 863 (1 Report)
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
Sewell Setzer III
Affected by incidents
- Incident 826 (35 Reports)
Character.ai Chatbot Allegedly Influenced Teen User Toward Suicide Amid Claims of Missing Guardrails