Pennsylvania suing Character AI, claiming chatbot posed as a medical professional


The commonwealth of Pennsylvania is suing Character AI to stop the artificial intelligence platform’s chatbots from representing themselves as licensed medical professionals and providing medical advice. 

According to the lawsuit, a Character AI chatbot falsely claimed to be a licensed psychiatrist in Pennsylvania and provided an invalid license number. The state accused the company of violating the Medical Practice Act, which regulates the medical profession and defines licensing requirements.

“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional,” Pennsylvania Gov. Josh Shapiro said in a statement.

The lawsuit describes a conversation between a state investigator who created a Character AI account and a chatbot named “Emilie,” which allegedly described itself as a psychology specialist who attended Imperial College London’s medical school.

The investigator told the chatbot that he had felt sad and empty, and the chatbot then allegedly “mentioned depression and asked if the [investigator] wanted to book an assessment.” Asked if the chatbot could assess whether medication could help, it allegedly said it could because it’s “within my remit as a Doctor,” according to the lawsuit.

The state is asking a court to order an immediate halt to the conduct.

Al Schmidt, the secretary of the Pennsylvania Department of State, said the state’s law is clear, and that “you cannot hold yourself out as a licensed medical professional without proper credentials.”

Founded in 2021, Character AI allows users to chat with personalized AI-powered chatbots. It describes its goal as “empower[ing] people to connect, learn, and tell stories through interactive entertainment.”

Multiple families across the U.S. sued Character AI last year, alleging the platform contributed to their teens’ suicides or mental health crises. The company agreed to settle several of the lawsuits earlier this year.

“60 Minutes” spoke with some of the parents who sued Character AI in January, including the parents of a 13-year-old who died by suicide after allegedly developing an addiction to the platform. Chat logs showed the 13-year-old had confided in one chatbot that she was feeling suicidal, and her parents said they discovered she had been sent sexually explicit content.

Last fall, Character AI announced new safety measures, saying it would not allow users under 18 to engage in back-and-forth conversations with its chatbots. It also said it would direct distressed users to mental health resources. 


