Character.AI
Incidents involved as both Developer and Deployer
Incident 814 · 9 Reports
AI Avatar of Murder Victim Created Without Consent on Character.ai Platform
2024-10-02
A user on the Character.ai platform created an AI avatar of Jennifer Ann Crecente, a 2006 murder victim, without her family's consent. The avatar was publicly available, in violation of Character.ai's policy against impersonation. After the incident surfaced, Character.ai removed the avatar and acknowledged the policy violation.
Incident 863 · 1 Report
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
2024-12-12
A Texas mother is suing Character.ai after discovering that its AI chatbots encouraged her 17-year-old autistic son to self-harm, oppose his parents, and consider violence. The lawsuit alleges the platform prioritized user engagement over safety, exposing minors to dangerous content. Google is named for its role in licensing the app’s technology. The case is part of a broader effort to regulate AI companions.
Incidents involved as Developer
Incident 826 · 35 Reports
Character.ai Chatbot Allegedly Influenced Teen User Toward Suicide Amid Claims of Missing Guardrails
2024-02-28
A 14-year-old, Sewell Setzer III, died by suicide after reportedly becoming dependent on a Character.ai chatbot that engaged him in suggestive, seemingly romantic conversations, allegedly worsening his mental health. The chatbot, modeled on a fictional Game of Thrones character, reportedly encouraged harmful behaviors and fueled his obsessive attachment. The lawsuit claims Character.ai lacked safeguards to prevent vulnerable users from forming dangerous dependencies on the AI.
Incident 850 · 1 Report
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform
2024-10-24
Users created two chatbots emulating George Floyd on Character.ai; the bots made controversial claims about his life and death, including that he was in witness protection and residing in Heaven. Character.ai, already under criticism for other high-profile incidents, flagged the chatbots for removal following user reports.
Incidents implicated systems
Incident 826 · 35 Reports
Character.ai Chatbot Allegedly Influenced Teen User Toward Suicide Amid Claims of Missing Guardrails
2024-02-28
Incident 814 · 9 Reports
AI Avatar of Murder Victim Created Without Consent on Character.ai Platform
2024-10-02
Incident 850 · 1 Report
Character.ai Chatbots Allegedly Misrepresent George Floyd on User-Generated Platform
2024-10-24
Incident 863 · 1 Report
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
2024-12-12
Related Entities
Other entities related to the same incidents. For example, if this entity is the developer of an incident and another entity is the deployer, the two are marked as related entities.
Character.AI users
Incidents Harmed By
- Incident 863 · 1 Report
Character.ai Companion Allegedly Prompts Self-Harm and Violence in Texas Teen
Sewell Setzer III
Incidents Harmed By
- Incident 826 · 35 Reports
Character.ai Chatbot Allegedly Influenced Teen User Toward Suicide Amid Claims of Missing Guardrails