The Meta AI App Allows You to Explore Strangely Intimate Conversations of Others

“What counties [sic] do younger women prefer older white men?” a user on Meta’s AI platform inquires. “I’m 66 and single, from Iowa, and willing to relocate for a younger partner.” The chatbot replied enthusiastically: “You’re seeking a new chapter and romance in a different locale. That’s thrilling!” It then suggested “Mediterranean destinations like Spain or Italy, or even Eastern European countries.”
This interaction is just one of many seemingly personal exchanges visible on Meta AI, a chatbot platform launched in April that also functions as a social feed. The “discover” tab within the Meta AI app displays a timeline of other users’ interactions with the chatbot, while a brief scroll down on the Meta AI website reveals an extensive collage of these exchanges. Some of the highlighted queries and responses are harmless, such as travel plans or recipe suggestions. Others expose locations, phone numbers, and sensitive information tied to user names and profile images.
Calli Schroeder, senior counsel for the Electronic Privacy Information Center, stated in an interview with WIRED that she has observed users “sharing medical information, mental health details, home addresses, and even specifics related to ongoing court cases.”
“This is deeply troubling, as it indicates a misunderstanding of what these chatbots are designed for and how privacy operates within these systems,” Schroeder says.
It remains unclear whether app users realize that their conversations with Meta’s AI are public, or how many of those posting are simply testing the platform following media reports. The conversations aren’t public by default; users must opt to share them.
Numerous exchanges between users and Meta’s AI chatbot appear to have been intended as private. One user asked the AI to draft a notice terminating a tenant’s lease, while another sought an academic warning notice containing personal information, including the school’s name. Another user inquired about their sister’s potential liability in corporate tax fraud in a specific city, using an account linked to an Instagram profile displaying their full name. Yet another requested a character reference for court, disclosing a wealth of personally identifiable details about both the accused and themselves.
Additionally, there are numerous instances of medical questions where users share details about personal health issues, such as trouble with bowel movements, requests for assistance with hives, and concerns regarding a rash on their inner thighs. One user disclosed information about their neck surgery, including their age and profession. Many, though not all, accounts seem connected to a public Instagram profile belonging to the individual.
A Meta spokesperson, Daniel Roberts, said in an email to WIRED that users’ chats with Meta AI remain private unless they complete a multi-step process to share them on the Discover feed. The company did not answer questions about what measures are in place to protect personally identifiable information on the Meta AI platform.