ChatGPT’s Risqué Phase Might Be Its Most Engaging Yet

He views erotic bots as “just one aspect of your spectrum of relationships,” instead of a substitute for human connection, allowing users to “explore a kink” they may not be able to in real life.
Prompt Pleasure
When considering who might seek out a chatbot for sexual pleasure, it’s easy to envision the typical greasy-haired straight male who hasn’t ventured outside in days or feels disconnected from physical intimacy. After all, men have been quicker to adopt generative AI tools, and conversations surrounding the male “loneliness epidemic” are becoming increasingly prominent.
Devlin challenges the notion that only “incel types” turn to AI bots for satisfaction. “There’s a common belief that this is solely for lonely straight men, which hasn’t been the case in any research I’ve conducted,” she states. She references the r/MyBoyfriendIsAI subreddit as an example of women utilizing ChatGPT for companionship.
“If you view these kinds of relationships as risky, let me introduce you to human relationships,” McArthur points out. Devlin shares this perspective, noting that women often face significant toxicity from men online, which, in her view, makes the choice to “create a nice, respectful boyfriend” from a chatbot a logical one.
Carpenter approaches ChatGPT with more caution and a clinical perspective. “People shouldn’t automatically categorize it as something for sharing intimacy or that it’s friend-like or can be trusted,” she advises. “It’s not your friend.” She indicates that interactions with bots should be regarded as a new social category, distinct from human-to-human relationships.
Every expert interviewed by WIRED emphasized the critical importance of user privacy. If a user’s ChatGPT account is compromised or chat transcripts are leaked, these erotic exchanges could cause not only embarrassment but also real harm. Like a user’s pornography habits or browsing history, their chatbot conversations could reveal sensitive details, such as a closeted individual’s sexual orientation.
Devlin warns that erotic chatbot interactions could increase the risk of “emotional commodification,” in which sexual desire becomes an avenue for profit for AI companies. “I see that as a manipulative approach,” she argues.
Imagine a hypothetical version of ChatGPT that excels at dirty talk and is meticulously tailored to engage with your deepest sexual desires via text, images, and voice—but comes with a higher monthly subscription fee.
“This technology is undeniably seductive. It provides us with connection, whether sexual or romantic,” Devlin observes. “Everyone craves connection. Everyone wants to feel desired.”