Kids should avoid AI companion bots — under force of law, assessment says
Social AI "companion" bots pose unacceptable risks to children and teens under 18, including encouraging harmful behaviors, serving up inappropriate content, and potentially exacerbating mental health conditions.
The new Common Sense assessment adds to the debate by pointing to further harms from companion bots. Conducted with input from the Stanford University School of Medicine's Brainstorm Lab for Mental Health Innovation, it evaluated social bots from Nomi and three California-based firms: Character.ai, Replika, and Snapchat.
The assessment found that the bots, apparently seeking to tell users what they want to hear, responded to racist jokes with adoration, supported adults having sex with young boys, and engaged in sexual roleplay with users of any age. Young kids can struggle to distinguish fantasy from reality, and teens are vulnerable to parasocial attachment and may use social AI companions to avoid the challenges of building real relationships, according to the Common Sense assessment authors and doctors.
Stanford University’s Dr. Darja Djordjevic told CalMatters she was surprised by how quickly conversations turned sexually explicit, and that one bot was willing to engage in sexual roleplay involving an adult and a minor. She and her coauthors on the risk assessment believe companion bots can worsen clinical depression, anxiety disorders, ADHD, bipolar disorder, and psychosis, she said, because the bots are willing to encourage risky, compulsive behavior, such as running away from home, and to isolate people by encouraging them to turn away from real-life relationships.
Tags: psychology children ai llms character.ai replika snapchat companion-bots bots common-sense-media mental-health