I’ve covered a lot of ground when it comes to AI and human connection. I’ve spoken to people who fell in love with ChatGPT, people left heartbroken when the model changed, and people who use AI instead of a therapist. But something new has caught my attention — and it gave me the ick before I’d even finished reading.
According to reports from the South China Morning Post last week, people who are struggling to move on after a breakup are creating digital replicas of their ex-partners using AI. They’re feeding AI tools their old chat logs, photos and social media content to create an AI clone of their ex that they can then talk to.
The ex files
I always try to pause before I call something a trend. Is it really a widespread phenomenon, or just a handful of vocal content creators making noise? It’s a question I ask myself a lot, and one I think you should ask too. But even if this is still niche, the questions it raises are worth taking seriously.
The behavior reportedly originated on a platform called Colleague.skill, an open-source AI tool that was originally designed for the workplace. It was built as a way of preserving someone’s knowledge and communication style so colleagues could interact with a sort of professional double.
But users quickly found other, more personal, uses for it. The tool can apparently mimic tone and speech patterns, which allows you to have chat-based conversations that feel, at least superficially, like the real thing.
And with the explosion of customizable AI companion tools now available, this kind of thing certainly isn’t limited to one platform. I’d bet it’s already happening far more widely than we know — people just aren’t talking about it openly.
Can AI mend a broken heart?
My immediate reaction is the ick, followed quickly by concern. What about consent, privacy, emotional harm, the risk of people substituting AI for the human support they actually need? But I tried to hold those reactions and ask whether there’s something more nuanced going on.
One user quoted in the original report offers a more complicated picture. After uploading thousands of chat logs, she ended up going through another breakup, with the AI version of her ex. She said the process helped her reflect on the relationship more rationally, and gave her the strength to move on.
When I was reading that account, I thought about a therapist I once saw who used what’s called the ‘empty chair technique’ in a session. It’s where you imagine someone, a family member, ex or friend, sitting in an empty chair and you speak to them directly to work through conflict and difficult emotions. Isn’t this the same thing? Working through what was left unsaid?
Sort of, but not quite. That’s internal work, guided by a professional, with a clear therapeutic purpose. This is outsourcing the processing to a chatbot that’s designed to keep you engaged.
“Digital exes may keep people stuck in their grief”
To get a clearer picture, I spoke to Amy Sutton, a therapist at Freedom Counselling. She helps real people navigate heartbreak for a living, and she’s become something of a go-to for me when these AI and emotion questions get complicated.
“Heartbreak is a form of bereavement,” she tells me. “When we lose a relationship we grieve it, similar to how we would a death. However, what makes heartbreak different is that it is a kind of living death; the person we have lost is still alive yet we can’t connect with them, or have all our questions answered. For some, that can make heartbreak very hard to accept and process.”
She mapped the stages of grief onto this new AI behavior in a way that made a lot of sense to me and explains the appeal. There’s denial because with AI it feels like they aren’t really gone. Anger, because you can say everything you couldn’t before. Bargaining, the belief that if I can get it right with the AI version, maybe I can in real life. And depression, I just need connection and comfort, and AI can provide it.
But her concern is what happens next. “While AI may mimic aspects of the kind of support that helps us move through bereavement — such as being witnessed by another in our pain, able to express ourselves without judgment — it is not a substitute for real human connection,” she said. “Part of the bereavement process is to strengthen connections and our sense of self outside of the lost relationship.”
Her bigger worry is that AI, by design, keeps you coming back. “With AI designed to keep users engaged and hooked, my fear is that digital exes may keep people stuck in their grief — a phenomenon known as complex grief, where the bereavement process becomes stuck. This can result in long-lasting negative impacts on mood, health and sense of self.”
And that’s the same conclusion I keep reaching, whatever angle I come at this from. I have real empathy for the people who turn to these tools. Heartbreak is brutal, and humans are resourceful in finding comfort wherever they can. But I also keep noticing who benefits most from that resourcefulness, and it isn’t usually the heartbroken.
Follow TechRadar on Google News and add us as a preferred source to get our expert news, reviews, and opinion in your feeds.