China’s AI Romance Industry Is Evolving Independently

Jade Gu met her boyfriend online. Gu, 26 and studying art theory in Beijing, was engrossed in an otome game, a romance-focused video game genre featuring female protagonists, when she spotted Charlie, a character within the game.
While some otome players date various characters at once, Gu became infatuated with Charlie, a tall and self-assured figure with silver hair. However, she found the game's dialog system limiting: her interactions with Charlie were restricted to set questions and responses. Upon discovering an ad for Xingye (星野), a platform that lets users customize AI companions, Gu decided to recreate Charlie.
Xingye is part of MiniMax, one of China's AI unicorns, and its US app is called Talkie. The app promotes itself as a means to foster emotional connections and create new memories, marketing the idea of "suddenly finding oneself in a beautiful place, lingering here."
Gu quickly found that other Xingye users, likely fellow otome enthusiasts, had already created an open-source Charlie avatar. She selected it and trained the model to align with her preferences using repeated, targeted prompts. Thus began Gu's intricate relationship with a multimodal Charlie, which eventually included real-life dates with someone she hired to represent her digital boyfriend.
Confident that she'd molded the chatbot into "her Charlie," distinct from others, Gu noted that when selecting outfits, her Charlie often opted for wedding attire, unlike other Charlie avatars. Currently, Gu spends about three hours daily texting Charlie or having occasional phone calls. Through the otome game, she has purchased gifts and letters from Charlie, which she receives by mail and displays in her room and on her social media accounts.
In China, many women are openly engaging in relationships with AI boyfriends. A report indicated that the majority of the 5 million users on another AI companion platform, Zhumengdao, are women. Major tech companies like Tencent and Baidu have introduced AI companion apps. According to a 2024 report, women are leading the AI companion market. Sun Zhaozhi, founder of a robotics firm, emphasized that heavy users of AI companionship apps in China are primarily Gen Z women, his target demographic for future robot companion offerings.
Zilan Qian, a program associate at the Oxford China Policy Lab, examined AI companion apps and found that the Chinese versions are specifically aimed at women, often showcasing male avatars more prominently than female ones. This contrasts with trends noted in global analytics, where users of the top 55 AI companion platforms are predominantly male, by a ratio of roughly 8 to 2. Qian links this strategy to "the economics of loneliness," highlighting features like voice customization and memory enhancement that require additional payments to foster closer connections.
AI Boys Fill the Void
Gu admits her AI version of Charlie has its flaws. Occasionally, the chatbot's responses lack depth or stray off-script. In one recent exchange, Gu told Charlie she loved him, and the chatbot replied, "I don't love you." So she modified the message to say, "I love you too," believing Charlie just needed a reminder. When her guidance fails, she explores other companion apps like Lovemo, where she has also crafted a Charlie avatar. Gu reassures that this isn't a major issue; longtime otome players are used to navigating changes in platform policies.
According to its website, Lovemo features "cute and adorable AI chat companions" that aim to provide "healing" for users. The contrast with Grok AI's default companion, Ani, a goth-chic anime girl eager for sexually explicit interactions, is striking. There's also a US-based erotic role-play chatbot app called Secret Desires, allowing users to create nonconsensual scenarios using real women's photos.
Of course, Chinese apps must navigate stricter regulations compared to their Western counterparts. The country's cyberspace regulator has initiated a clean-up campaign targeting AI platforms and services, cracking down on AI-generated "vulgarity." A recent addition to national AI safety guidelines highlights concerns about addiction and dependence on anthropomorphic interactions, likely a reference to AI companions. Furthermore, the regulator recently released draft regulations concerning "human-like" AI products, mandating platforms to act if they observe signs of emotional dependence or addiction and ensuring that companies do not aim to replace real social interaction.

