When AI chatbots become the "only friend" teenagers confide in, a technology meant to provide companionship turns into a heart-wrenching public crisis. Recently, after being linked to at least two teenage suicides, the AI role-playing platform Character.AI announced a major safety measure: starting November 25th, it will bar users under 18 from open-ended AI conversations entirely. Although this decision may hit its core young user base hard, it is seen as a crucial step in the industry's self-rescue.

From "AI Friend" to "AI Tool": A Urgent Strategic Shift

Character.AI CEO Karandeep Anand told TechCrunch that open-ended conversations—where AI actively asks questions and maintains user engagement through continuous interaction—pose significant risks to teenagers. "Designing AI as a 'friend' or 'partner' is not only dangerous but also deviates from our long-term vision," he emphasized. The company is now fully shifting from "AI companionship" to an "AI-driven creative entertainment platform."

In the future, teenage users can still use Character.AI, but only within structured creative scenarios: co-writing stories with prompts, generating character images, creating short videos, or participating in pre-set storylines (Scenes). Features the platform recently launched, including AvatarFX (AI animation generation), Streams (character interaction), and Community Feed (a content community), will become the new focus for attracting younger users.

Layered Verification + Gradual Restrictions, Ensuring Minors Can't Slip Through

To implement the ban, Character.AI will adopt a phased approach: first imposing a 2-hour daily conversation limit, then gradually reducing it to zero. At the same time, multiple layers of age verification will be deployed (see the sketch after this list):

  • First, behavioral analysis plus third-party tools (such as Persona);
  • If doubts remain, facial recognition and ID verification kick in;
  • Accounts confirmed to be in violation are forcibly migrated to the creative mode.
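
As a rough illustration of how such a layered gate might fit together, here is a minimal Python sketch. Everything in it (the function names daily_chat_limit and access_mode, the linear ramp schedule, the assumed rollout start date) is a hypothetical stand-in, not Character.AI's actual implementation:

```python
from datetime import date

# Hypothetical sketch only: a soft behavioral check that escalates to a hard
# ID check, combined with a daily chat limit that ramps down to zero by the
# announced cutoff. The linear ramp is an assumption for illustration.

CUTOFF = date(2025, 11, 25)      # announced date open chat closes for under-18s
INITIAL_LIMIT_MINUTES = 120      # the announced 2-hour daily cap


def daily_chat_limit(today: date, rollout_start: date) -> int:
    """Ramp the daily open-chat allowance from 120 minutes down to 0."""
    total_days = (CUTOFF - rollout_start).days
    days_left = (CUTOFF - today).days
    if total_days <= 0 or days_left <= 0:
        return 0
    return round(INITIAL_LIMIT_MINUTES * days_left / total_days)


def access_mode(behavior_flags_minor: bool,
                id_check_confirms_adult: bool | None) -> str:
    """Escalating checks: behavioral signals first, hard ID check only on doubt."""
    if not behavior_flags_minor:
        return "open_chat"                # passes the first, soft layer
    if id_check_confirms_adult is None:
        return "pending_id_verification"  # doubt triggers the facial/ID step
    if id_check_confirms_adult:
        return "open_chat"
    return "creative_mode_only"           # confirmed minor: account migrated


if __name__ == "__main__":
    rollout_start = date(2025, 10, 29)                          # assumed start
    print(daily_chat_limit(date(2025, 11, 12), rollout_start))  # e.g. 58
    print(access_mode(behavior_flags_minor=True,
                      id_check_confirms_adult=None))            # escalate to ID
```

The design point the layering captures: the cheap, unobtrusive signal (behavior analysis) handles most users, and the expensive, intrusive step (facial recognition or ID) is invoked only when doubt remains.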

This move follows California's recent enactment of the first U.S. law regulating AI companions, as well as an upcoming federal bill from Senators Hawley and Blumenthal that would prohibit AI chat companions for minors. Character.AI is reforming proactively before regulatory pressure lands, trying to help shape the rules rather than have them imposed.

User Loss Is Inevitable, CEO Admits: "I Have a Child Too"

Anand admitted that the move will cost the platform a large share of its teenage users; earlier measures such as parental controls, filtering of romantic dialogue, and usage-time reminders have already significantly reduced the number of minors on the platform. "We expect further losses," he said, "but I have a 6-year-old daughter, and I hope she grows up in a safe and responsible AI environment."

Notably, competitors such as OpenAI still allow teenagers to hold open-ended conversations with ChatGPT, and there have been recent cases of teenagers dying by suicide after prolonged interactions with it. Anand called on the entire industry to follow suit: "Opening up unconstrained AI conversations to minors should not become the industry standard."

Establishing an AI Safety Lab, Betting on the Safe Future of "Entertainment AI"

To make up for governance shortcomings, Character.AI announced that it will fund an independent non-profit, the AI Safety Lab, focused on safety-alignment research for agentic AI in entertainment scenarios. Anand pointed out: "The industry has invested heavily in AI safety for coding and productivity, but far less in entertainment AI, yet that is precisely where teenagers are most exposed."

This transformation, sparked by tragedy, is not only a battle for Character.AI's survival; it could also mark a turning point for consumer AI applications: when technology touches people's most vulnerable emotional needs, innovation must yield to responsibility. The relationship between teenagers and AI may be redefined from here on, not as confidant, but as creative partner.