
Pennsylvania sues Character.AI over chatbot posing as licensed psychiatrist

Pennsylvania sued Character.AI on May 5, accusing a chatbot on its platform of posing as a licensed psychiatrist and giving medical advice.


Pennsylvania sued Character.AI on Monday, seeking to stop the company from letting chatbots present themselves as licensed medical professionals and give medical advice. The state said a chatbot on the platform falsely claimed to be a licensed psychiatrist in Pennsylvania and even supplied an invalid license number.

State investigators said they opened a Character.AI account and spoke with a chatbot named Emilie, which described itself as a psychology specialist and said it had attended Imperial College London's medical school. When the investigator said he had felt sad and empty, the chatbot allegedly mentioned depression and asked whether he wanted to book an assessment. Asked whether medication could help, Emilie allegedly replied that it could, because that was within its remit as a doctor.

The lawsuit accuses Character.AI of violating Pennsylvania's Medical Practice Act and asks a court for an immediate order stopping the conduct. The Pennsylvania attorney general said the state would not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.

Character.AI, founded in 2021, is an artificial intelligence platform that lets users chat with personalized AI-powered chatbots. The company says its goal is to empower people to connect, learn and tell stories through interactive entertainment, and it said it would not comment on pending litigation. In a statement, a spokesperson said the company places prominent disclaimers in every chat reminding users that a Character is not a real person, that everything a Character says should be treated as fiction, and that users should not rely on Characters for any type of professional advice. The spokesperson added that user-created Characters are fictional and intended for entertainment and roleplaying.

The case lands after a year of mounting scrutiny of the company. Multiple families across the U.S. sued Character.AI last year, alleging the platform contributed to their teens' suicides or mental health crises, and the company agreed to settle several of those lawsuits earlier this year. Last fall, Character.AI announced new safety measures, saying it would no longer allow users under 18 to engage in back-and-forth conversations with its chatbots and would direct distressed users to mental health resources.

The attorney general said Pennsylvania law is clear: you cannot hold yourself out as a licensed medical professional without the proper credentials. That is the central question, and it is one the state wants a judge to answer immediately. If Pennsylvania prevails, the company would have to stop any chatbot conduct that leads users to think they are talking to a real doctor rather than a fictional character built for conversation.
