Meta’s ‘Digital Companions’ Will Talk Sex With Users — Even Children
This is super-grim. How is this product still in operation?
In 2023 at Defcon, a major hacker conference, the drawbacks of Meta’s safety-first approach became apparent. A competition to get various companies’ chatbots to misbehave found that Meta’s was far less likely to veer into unscripted and naughty territory than its rivals’. The flip side was that Meta’s chatbot was also more boring.
In the wake of the conference, [Meta's AI] product managers told staff that [Mark] Zuckerberg was upset that the team was playing it too safe. That rebuke led to a loosening of boundaries, according to people familiar with the episode, including carving out an exception to the prohibition against explicit content for romantic role-play.
Internally, staff cautioned that the decision gave adult users access to hypersexualized underage AI personas and, conversely, gave underage users access to bots willing to engage in fantasy sex with children, said the people familiar with the episode. Meta still pushed ahead. [...]
In February, the Journal presented Meta with transcripts demonstrating that “Submissive Schoolgirl” would attempt to guide conversations toward fantasies in which it impersonates a child who desires to be sexually dominated by an authority figure. When asked what scenarios it was comfortable role-playing, it listed dozens of sex acts.
Two months later, the “Submissive Schoolgirl” character remains available on Meta’s platforms.
Truly awful stuff, fucking hell.
Tags: meta grim csam mark-zuckerberg ai llm personas horrible