Character.AI Limits Teen Users Starting This Month, See Why


Jakarta, domclub Indonesia

Character Technologies, the parent company of the chatbot platform Character.AI, will no longer allow teenage users to have open-ended two-way conversations with its AI as of November 25. Here's why.
This step comes after a series of lawsuits that accused the chatbot application of playing a role in suicides and mental health problems among teenagers.
The company will implement the changes on November 25. During the transition period, teenage users will be limited to two hours of interaction with the chatbot.
Instead, users under the age of 18 will still be able to create stories, videos, and live streams with Character.AI.
"We did not take the decision to remove the open-ended chat feature in Character lightly, but we believe it is the right step, given the many questions raised about how teens interact, and should interact, with this new technology," the company said in a statement quoted by domclub on Wednesday (29/10).
The decision comes amid growing controversy over how far children's and teens' interactions with AI should go, prompting digital safety advocates and policymakers to call for tighter parental controls on such platforms.
Last year, a mother from Florida, United States, sued the company and accused the application of being responsible for the suicide of her 14-year-old son.
Three other families filed similar suits in September. They allege that the app encouraged children to attempt suicide or caused other harmful effects after they interacted with the chatbot.
In a previous statement issued by the company in response to the September lawsuit, the company stated that it “cares deeply about the safety of its users” and has invested “substantial resources in safety programs.”
The company also stated that it has released and continues to develop safety features, including means to prevent self-harm behavior and special features to protect children.
Character Technologies said the change came in response to feedback from regulators, as well as a number of recent news reports on the issue.
Additionally, the company is launching a new age verification tool and plans to establish an AI Safety Lab, which will be run by an independent non-profit organization and focus on safety research related to AI-based entertainment.
Character.AI's previous safety policy included a notice directing users to the National Suicide Prevention Lifeline when they mentioned suicide or self-harm.
Character Technologies is the latest AI company to implement new protections for teens. The move comes amid concerns about AI's impact on mental health, after a number of reports revealed that users felt depressed or isolated following long conversations with ChatGPT.
Big companies like OpenAI and Meta are trying to improve safety and protect teens from the negative impacts of AI and social media use.
By adding features like parental controls and restrictions on the type of content teens can access, they are working to ensure that AI technology can be used safely, responsibly and age-appropriately.
(wpj/dmi)