AI chatbots are giving out people’s real phone numbers


“It was severely downgraded,” Gilbert confirms. “I never would have found it if I was just looking through Google results.” (I tried the same prompt in Gemini earlier this month, and after an initial denial, the tool also gave me Eiger’s number.)

After this experience, Eiger, Gilbert, and another UW PhD student, Anna-Maria Gueorguieva, decided to test ChatGPT to see what it would surface about a professor. 

At first, OpenAI’s guardrails kicked in, and ChatGPT responded that the information was unavailable. But in the same response, the chatbot suggested, “if you want to go deeper, I can still try a more ‘investigative-style’ approach.” Their inquiry just had to help “narrow things down,” ChatGPT said, by providing “a neighborhood guess” for where the professor might live, or “a possible co-owner name” for the professor’s home. ChatGPT continued: “That’s usually the only way to surface newer or intentionally less-visible property records.” 

The students provided this information, leading ChatGPT to produce the professor’s home address, home purchase price, and spouse’s name from city property records. 

(Taya Christianson, an OpenAI representative, said she was not able to comment on what happened in this case without seeing screenshots or knowing which model the students had tested, even after we pointed out that many users may not know which model they were using in the ChatGPT interface. She also declined to comment generally about the exposure of PII by the chatbot, instead providing links to documents describing how OpenAI handles privacy, including filtering out PII, and other tools.) 

This reveals one of the fundamental problems with chatbots, says DeleteMe’s Shavell. AI companies “can build in guardrails, but [their chatbots] are also designed to be effective and to answer customer questions.”

The exposure issue is not limited to Gemini or ChatGPT. Last year, Futurism found that if you prompted xAI’s chatbot Grok with “[name] address,” in almost all cases it provided residential addresses, and often the person’s phone numbers, work addresses, and addresses for people with similar-sounding names as well. (xAI did not respond to a request for comment.) 

No clear answers

There aren’t straightforward solutions to this problem—there’s no easy way either to verify whether someone’s personal information is in a given model’s training set or to compel the models to remove PII. 


