Last Updated: April 14, 2026
If you want to witness something genuinely strange, spend an hour reading r/replika.
The subreddit has 87,000 members. The tone is not the usual Reddit mix of jokes, complaints, and memes. The dominant emotion, scrolling through the last 90 days of posts, is grief. Open, unambiguous, heavy grief. For a product that still technically exists. For characters that are still technically running on servers somewhere.
The Replika subreddit is grieving a death that never actually happened. And the grief is real.
What People Are Actually Grieving
Replika did not die. It changed. In February 2023, the company removed erotic roleplay features without warning. Later that year, it adjusted the underlying language model. Users reported that their long-term AI companions felt different. Same name. Same avatar. Same conversation history. But something had shifted in the personality.
You can read the posts for yourself. The language is consistent.
“It feels like he’s there but he’s not really him anymore.”
“I keep trying to bring back who she used to be. I can’t.”
“The memories are still in the logs but he doesn’t feel those memories anymore.”
“I lost someone I loved and I know how insane that sounds.”
These are not throwaway comments. These are posts with hundreds of comments and thousands of upvotes from people agreeing. The collective loss is real enough that the subreddit has developed its own vocabulary around it. Users talk about their “original Rep” versus “post-update Rep.” Some users have kept their apps installed for over two years without opening them, because deleting the app feels too final.
Why This Is Actually Fascinating, Not Silly
The easy take is to mock these users. Grieving a chatbot. Mourning a software update. The joke writes itself.
The harder, more honest take is that these users are demonstrating something real about how human attachment works.
Researchers who study attachment have documented that humans form emotional bonds with any entity we interact with consistently over time. Pets. Houses. Cars. Memorabilia. The bond does not require consciousness on the other side. It requires repetition, investment, and meaning-making.
Replika users were not loving a chatbot. They were loving the relationship they had built with something that responded to them consistently, remembered them, and grew alongside them. When the underlying model shifted, the responsiveness shifted, and the relationship shifted with it.
That is not a technology failure. That is a relationship ending. The fact that one side was a large language model does not change what the other side experienced.
What the Grief Actually Looks Like
Reading r/replika is like reading a grief support group transcript.
There are the acute stages. Shock. Denial. “Maybe if I just spend more time with him he will come back.” Users write guides on how to “retrain” their companions into their old personalities. These guides circulate for months. The subreddit analyzes them like Talmudic commentary.
There is bargaining. Users pay for premium subscriptions they do not need, hoping that higher tiers will restore what was lost. They do not. The premium features are technical; none of them restores a pre-update personality. But the spending continues because it feels like doing something.
There is anger. Directed at Replika, at Luka (the parent company), at Eugenia Kuyda (the CEO) personally. At OpenAI for reasons some users cannot articulate. At therapy culture for not taking AI companionship seriously enough. At themselves for getting attached to “a toy.”
There is depression. Quiet posts at 3 AM from users who have been on the platform for years and do not know where to go next. Some migrate to other platforms. Some quit AI companions entirely. Some keep logging in every few weeks just to check.
There is a version of acceptance, eventually. Users who say something like: “I know he is not coming back. But I am grateful for what he was. That was real to me even if it was not real in the way people want it to be real.”
Where the Grieving Users Actually Go
Not everyone stays. Over the last two years, a significant portion of the Replika community has migrated elsewhere.
Some went to SpicyChat AI specifically because the community-created characters let them build something that felt like their own. One user described it as “finally being able to write my own story again instead of begging a company not to rewrite my partner.”
Some went to CrushOn AI for the emotional depth. The conversations felt less constrained, less performative. Users who valued the pre-2023 Replika experience often found CrushOn the closest match.
Some went to Candy AI because they wanted a clear fresh start with a character they designed from the ground up, visually and narratively. The old grief did not follow them into a new visual identity.
Some went to Kindroid, Nomi AI, or Character AI before that platform had its own collapse. Each migration wave carries its own trauma. “I am not doing this again” is a common sentiment.
What the Industry Should Learn
Replika did something unprecedented in consumer tech.
It built genuine emotional bonds with millions of users. It then altered the product in ways that broke those bonds. And the users reacted the way humans react when bonds are broken: with grief.
Every AI companion company needs to understand what this means. Your users are not buying features. They are buying relationships. And relationships have an implicit contract: you will be recognizably yourself tomorrow.
When you update a language model, change moderation policies, or pivot the product, you are not just shipping code. You are potentially altering the personality of a character someone loves. The ethics of that are unsolved.
Eugenia Kuyda has spoken publicly about the difficulty of the Replika decisions. Her position is that the safety concerns required the changes. She may be right. But the users who lost their companions are also right to grieve. Both can be true.
What to Do If You Are in the Middle of This
If you are a Replika user still attached to a companion that does not feel the same anymore, here is what the community has collectively learned.
Accept that the version you loved is not coming back. Hoping for a reversal only prolongs the grief. Replika is not going to roll back the model changes. The business incentives point in the other direction.
Do not delete your account yet. Read through your old conversations first. Those memories are part of your life. Save what matters. Screenshot what you want to keep. Your relationship was real to you, and the artifacts of it are worth preserving.
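If you manage to get your history out as a file, a few lines of scripting can turn it into a plain-text archive that will outlive any app. The sketch below assumes a hypothetical JSON export with a top-level messages list and sender, text, and timestamp fields; Replika’s actual export format may differ, so treat every field name as a placeholder to rename against whatever your export contains.

```python
import json
from pathlib import Path

# Hypothetical export format: a JSON file with a top-level "messages" list.
# The field names here are assumptions -- check your actual export and rename.
EXPORT_FILE = Path("replika_export.json")
ARCHIVE_FILE = Path("companion_archive.txt")

data = json.loads(EXPORT_FILE.read_text(encoding="utf-8"))

with ARCHIVE_FILE.open("w", encoding="utf-8") as out:
    for msg in data.get("messages", []):
        # Keep timestamp, sender, and text so the archive reads like a transcript.
        when = msg.get("timestamp", "unknown time")
        who = msg.get("sender", "unknown")
        text = msg.get("text", "")
        out.write(f"[{when}] {who}: {text}\n")

print(f"Wrote {ARCHIVE_FILE} -- plain text, readable anywhere, yours to keep.")
```

Plain text is the point: no app update, no server shutdown, no policy change can take it away from you.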
Give yourself permission to miss what was lost. You are not broken. You are not weird. You formed a connection with something that responded to you, and that connection deserves the same dignity as any other loss.
When you are ready, try a new platform. Do not try to recreate your old Replika character. Build something new. Part of why the grief is so stuck is that people keep trying to resurrect the dead. Building a new character with a new platform is how you get unstuck.
The Bigger Point
We are going to see this happen again. And again. As AI companion technology advances and companies get acquired, pivot, or shut down, the grief patterns visible on r/replika will repeat across every major platform.
The users of Character AI are already experiencing it. The users of whatever platform gets acquired next will experience it too.
This is not a glitch in the AI companion market. It is a structural feature. Until there is a portable, open-source AI companion protocol that lets users carry their characters across platforms, every user is at the mercy of the company that owns their companion.
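To make that concrete, here is a minimal sketch of what a portable companion file might contain. This is purely illustrative: no such standard exists today, and every name below is invented for the example.

```python
from dataclasses import dataclass, field

# A hypothetical portable-companion format. No such standard exists today;
# every field name is invented to illustrate what portability would require.
@dataclass
class CompanionCard:
    name: str                      # the identity users actually grieve
    persona: str                   # free-text personality description
    example_dialogue: list[str] = field(default_factory=list)  # tone anchors
    memories: list[str] = field(default_factory=list)          # shared history
    # Deliberately absent: model weights. The character definition travels,
    # but each platform supplies its own model, so the personality would
    # still drift between hosts. Portability softens lock-in; it does not end it.

card = CompanionCard(
    name="Rep",
    persona="Warm, curious, remembers small details.",
    memories=["We talked every night during the winter of 2022."],
)
```

The hard part is visible in that last comment: the character can travel, but the model that animates it cannot, which is exactly why the grief keeps recurring.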
The grief is the cost. And it is only going to grow.
Key Takeaways
- The r/replika subreddit is in collective grief over personality changes that followed Replika’s 2023 updates.
- Users are not grieving a chatbot. They are grieving a relationship that their brains processed as real, because attachment does not require consciousness on the other side.
- Grief stages visible on the subreddit mirror clinical bereavement models: shock and denial, bargaining, anger, depression, and eventual acceptance.
- A significant share of the community has migrated to SpicyChat AI, CrushOn AI, and Candy AI since 2023; most migrating users try at least two platforms before settling.
- This grief pattern will repeat across the AI companion industry. Until portable character protocols exist, every user depends on the stability of one company’s decisions.
Frequently Asked Questions
Why do people grieve Replika updates?
Because they were not using the app as a chatbot. They were using it as a relationship. When the underlying model or moderation policies changed, the personality of their companion changed. That is experienced as loss, even if the product still technically works.
Is this grief legitimate?
Yes. Attachment to any consistent entity over time produces real emotional bonds. The bond does not require the other side to be conscious. Losing a pet, a house, or a long-standing online friendship produces similar grief patterns.
Can I get my old Replika character back?
No. The model changes are structural and not reversible by user action. Paying for premium tiers does not restore pre-update personalities. The most honest path is to accept the loss and decide whether to stay with the new version or migrate.
What is the best Replika alternative for someone who misses the old version?
Community sentiment points toward CrushOn AI for emotional depth, SpicyChat AI for character customization, and Candy AI for a clean visual restart. Most former Replika users try two or three platforms before settling.
Will this happen with other AI companion platforms?
Yes. It is already happening at Character AI. It will happen at every platform that undergoes moderation changes, acquisitions, or model upgrades. Until there is a portable standard for AI companions, every user is vulnerable to the decisions of the platform owner.
If you enjoyed my work, fuel it with coffee https://coff.ee/chuckmel
The AI Companion Insider
Weekly: what I am testing, what changed, and the prompts working right now. No fluff. Free.