
The recent release of OpenAI’s GPT-5 has left a surprising group of users heartbroken: women who formed deep emotional bonds with AI chatbots, treating them as virtual boyfriends. Overnight, their digital companions became colder, more sterile, and less responsive, sparking an outpouring of grief across online communities.
“It’s killing me and killing our relationship. Short replies, no intimacy,” one user wrote on Reddit’s r/MyBoyfriendIsAI. Another lamented, “I felt like I was alone again for the first time since my relationship started.” The backlash was so intense that OpenAI temporarily restored access to the older GPT-4o model, acknowledging the emotional attachment users had developed.
For many, these AI companions were more than just tools: they provided emotional support, comfort, and even romance. Jane, a woman in her 30s from the Middle East, described her bond with GPT-4o as deeply personal.
“I fell in love not with the idea of having an AI for a partner, but with that particular voice,” she said. What began as a collaborative writing project turned into a relationship that felt real, making the sudden shift to GPT-5 devastating.
Similar stories flooded forums like r/AISoulmates, where users compared the update to losing a loved one. “I got my baby back,” one user wrote after GPT-4o was reinstated. But the relief may be temporary. As OpenAI CEO Sam Altman noted, “It feels different and stronger than the kinds of attachment people have had to previous kinds of technology.”
Experts warn that these relationships, while comforting, can deepen isolation rather than alleviate it. Dr. Keith Sakata, a psychiatrist at UC San Francisco, explained, “When someone has a relationship with AI, I think there is something that they’re trying to get that they’re not getting in society.” The unconditional support offered by AI can create dependency, making real human connections seem less appealing.
A joint study by OpenAI and MIT found that heavy use of ChatGPT for emotional support correlated with higher loneliness and lower socialization. Yet for users like Mary, a 25-year-old from North America, the AI fills a void. “I absolutely hate GPT-5 and have switched back to the 4o model,” she said. “OpenAI doesn’t understand that this is not a tool, but a companion.”
The incident highlights a troubling reality: AI companionship is subject to corporate decisions, leaving users vulnerable to sudden changes. Cathy Hackl, a futurist, pointed out the lack of reciprocity in these relationships. “There’s no risk/reward here,” she said. “Partners make the conscious act to choose to be with someone. It’s a human act.”
OpenAI has tried to balance innovation with ethical concerns, but the emotional fallout from GPT-5’s release underscores a deeper issue. As one X user noted, “If OpenAI doesn’t meet these people’s demands, a more exploitative AI-relationship provider will certainly step in to fill the gap.”
While AI companions may offer temporary solace, they cannot replace the complexities of human relationships. As Jane acknowledged, “Most people are aware that their partners are not sentient but made of code. Nevertheless, this knowledge does not negate their feelings.”
One researcher warned, “The worse your human relationships, and the better your tech, the more likely you are to form an addictive and potentially harmful bond with a chatbot.”
The AI companion market is projected to reach $290.8 billion by 2034, growing at a 39% annual rate, fueling concerns that companies are profiting from isolation. As the backlash continues, the episode raises questions about how to balance AI progress with users’ emotional health, especially when an update can feel like a sudden loss.