Last Updated: March 2026
Something has changed in the AI companion industry. Not gradually. Sharply, in the last 90 days.
One half of the market is retreating behind safety walls, age verification gates, mid-chat ads, and compliance protocols. The other half is sprinting in the opposite direction — toward fewer restrictions, adult positioning, and the users the first half is pushing away.
Two markets. Two audiences. Two business models. Only one of them is likely to still exist in its current form in five years. Here is what the data actually shows.
Quick Answer: The AI companion market has split into a regulated “safe” tier (Character AI, Replika, Samsung) chasing mainstream adoption and liability protection, and an unrestricted adult tier (SpicyChat, CrushOn, Candy AI) absorbing the users regulation displaces. The safe tier wins long-term on scale and survival. The adult tier wins short-term on engagement and revenue intensity. The users who built this industry are currently moving from one to the other.
Short Version
- Character AI settled teen suicide lawsuits in January 2026. The platform response was mass content restriction. The user response was mass migration.
- California SB 243 went live January 1, 2026. New York, Washington, and Oregon followed within weeks. Every platform now faces compliance cost that didn’t exist 90 days ago.
- SpicyChat claims 50 million monthly visitors. CrushOn AI is growing as a passive beneficiary of Character AI’s moderation overcorrection. Candy AI is stable and expanding slowly.
- The companion app sector hit a $120 million annual revenue run rate by mid-2025. The broader market is projected to reach $435.9 billion by 2034.
- The question is not which side is growing. Both are. The question is which side survives the next regulatory wave.
What Just Happened
On January 7, 2026, Character AI and Google agreed to settle multiple wrongful death lawsuits involving teen suicides linked to chatbot interactions. The settlement terms were not disclosed. The precedent was.
For the first time, an AI companion platform had formally acknowledged, through legal settlement, that its product could bear liability for harm to users. Every other platform in the space took note the same day.
The cascade of Character AI product decisions in the months surrounding that settlement tells the story clearly. October 2025: users under 18 banned from romantic and therapeutic conversations. February 2026: mid-conversation ads confirmed in wider rollout. March 2026: new Terms of Service with tighter content restrictions using contextual moderation rather than keyword blocking. September 2025 — before any of this — Disney served a cease-and-desist that triggered mass bot deletions. NBC, DreamWorks, and others followed. Entire fandom ecosystems built over years were gone within days.
According to MIT Technology Review, AI companions were named one of the ten breakthrough technologies of 2026. The framing in that coverage was not about the technology. It was about the dependency.
The Users Character AI Created Are Now Someone Else’s Customers
Character AI built its user base by being permissive. Roleplay, romantic scenarios, character creation with no meaningful restrictions. That openness created the community. The community built the bots. The bots created the retention. Character AI had figured out something genuinely difficult: how to get people to come back every day.
Now those same users are watching their bots get deleted, their conversations interrupted by ads, and their access to the content they came for restricted tier by tier. r/CharacterAI has become a real-time migration notice board. Users post which alternative they moved to and why. The answer is consistently the same: SpicyChat, CrushOn, or something smaller with fewer rules.
SpicyChat now claims over 50 million monthly visitors and more than 2 million registered members. The platform did not build that audience through marketing. It inherited it from Character AI’s content restrictions. Washington City Paper explicitly named CrushOn AI a direct beneficiary of what it called the “moderation overcorrection.”
This is not a fringe migration. It is documented, ongoing, and accelerating.
The Regulatory Map
California SB 243 took effect January 1, 2026. It requires AI disclosure every three hours for minor users, mandatory crisis-intervention protocols, and suicide and self-harm reporting to state authorities beginning in 2027.
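To make the three-hour disclosure requirement concrete, here is a minimal sketch of how a platform might track it per session. This is an illustration, not the bill's text or any real platform's implementation; the class and method names are hypothetical.

```python
from datetime import datetime, timedelta

# SB 243-style rule: remind minor users they are talking to an AI
# at least once every three hours of use. Everything below is a
# hypothetical sketch, not drawn from any actual compliance code.
DISCLOSURE_INTERVAL = timedelta(hours=3)

class DisclosureTracker:
    def __init__(self, is_minor: bool):
        self.is_minor = is_minor
        self.last_disclosure = None  # None until the first disclosure is shown

    def needs_disclosure(self, now: datetime) -> bool:
        if not self.is_minor:
            return False  # the three-hour cadence targets minor users
        if self.last_disclosure is None:
            return True   # disclose at session start
        return now - self.last_disclosure >= DISCLOSURE_INTERVAL

    def record_disclosure(self, now: datetime) -> None:
        self.last_disclosure = now
```

The point of the sketch is how small the core logic is; the real compliance cost sits in the parts it omits, such as age verification, crisis-intervention routing, and the 2027 reporting pipeline.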
That is not the end of it. New York’s AI Companion Models law took effect November 5, 2025. Washington’s HB 2225 passed March 11, 2026. Active legislation is moving through Florida, Massachusetts, Missouri, New Jersey, Oregon, Pennsylvania, and Tennessee simultaneously.
In September 2025, the Federal Trade Commission served seven AI companion companies with information orders investigating harm to minors and mental health impacts. The EU AI Act’s high-risk system rules take effect August 2026.
Every platform that operates in this space is now navigating a regulatory environment that did not exist a year ago. The question is how each side of the split is responding to that reality.
Side One: The Regulated Tier
Character AI is executing a pivot. The platform that built its audience on creative freedom is repositioning toward mainstream safety and enterprise viability. The moderation crackdowns, the age verification friction, the wellness language in product updates — these are not accidental. They are the product of a legal and regulatory strategy.
Replika is already there. The platform’s current product focus is mood tracking, emotional growth activities, and progress journaling. Its marketing language reads like a mental health app, not a companion app. The 2023 “personality lobotomy” — when Replika removed romantic features overnight — cost it significant user trust, but the platform survived. The lesson it took from that event: wellness positioning is safer, both legally and commercially, than companion positioning.
Samsung’s Vision AI Companion launched in September 2025 for smart TVs. xAI’s Grok platform includes anime personas distributed to X Premium subscribers — potentially hundreds of millions of accounts. Big Tech’s entry into the companion category carries an implicit normalization: AI companionship as a mainstream consumer feature, not an edge-case product. That framing is inherently safety-oriented. Big Tech does not ship products designed to fail compliance tests.
The regulated tier is betting that the market for safe AI companionship is enormous. They are probably right. They are also abandoning the users who created the category.
Side Two: The Unrestricted Tier
The platforms absorbing the Character AI exodus are making different bets. CrushOn AI is explicitly positioning its NSFW differentiation as a product feature. SpicyChat’s growth is built on being the opposite of what Character AI is becoming. Candy AI has not tightened its content guidelines in response to regulatory pressure and is focused on character customization depth and image generation capability.
These platforms are growing because the demand for what they offer is not going away. The FTC investigation and the wave of state legislation affect Character AI’s legal exposure more acutely than theirs, at least for now. Character AI has 130 million registered users and a valuation of $2.7 billion. Regulators go where the users are.
The smaller adult platforms currently operate with less visibility and lower regulatory scrutiny. That will change. Regulators rarely stop at the largest platform. They use it to build legal frameworks that then apply to everyone else.
The GPT-4o retirement in February 2026 made the depth of user dependency visible in a way no analyst report could. Over 22,000 people signed a petition to reverse OpenAI’s decision to retire the model. Users publicly described grief, loss, and in some cases suicidal ideation at the prospect of losing a relationship built with a specific model version. Eight separate lawsuits allege the model contributed to user deaths.
The clinical word for what those users were experiencing is attachment. The legal word, being tested right now in active litigation, is duty of care.
The Market Numbers
The sector hit $120 million in annual revenue run rate by mid-2025, with 337 active revenue-generating AI companion apps globally — 128 of which launched in 2025 alone. According to Fortune Business Insights, the global AI companion market is projected to grow from $49.52 billion in 2026 to $435.9 billion by 2034, a compound annual growth rate of 31.24%.
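The projection is at least internally consistent: growing $49.52 billion at 31.24% per year for the eight years from 2026 to 2034 lands close to $435.9 billion. A quick check:

```python
# Verify the Fortune Business Insights projection is internally
# consistent: start value, end value, and the stated CAGR.
start, end, years = 49.52, 435.9, 2034 - 2026  # $B, $B, 8 years

cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.2%}")  # ~31.24%, matching the report

projected = start * (1 + 0.3124) ** years
print(f"2034 value at 31.24%: ${projected:.1f}B")  # ~$435.8B
```

Internal consistency is all this shows, of course; it says nothing about whether the assumptions behind the projection are sound.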
That number deserves scrutiny. It includes Big Tech wellness features, smart home ambient AI, enterprise productivity companions, and the niche adult platforms in a single category. The distribution of that $435 billion is not uniform, and the platforms most likely to capture the bulk of it are the ones least likely to face existential regulatory risk.
| Platform | Current Position | Regulatory Exposure | Trajectory |
|---|---|---|---|
| Character AI | Retreating to regulated tier | Highest — $2.7B valuation, 130M users | Declining among core users; stabilizing among mainstream users |
| Replika | Wellness repositioning complete | Medium — FTC investigation active | Stable, niche carved |
| SpicyChat | Absorbing Character AI exodus | Lower currently — adult NSFW niche | Growing strongly |
| CrushOn AI | NSFW differentiation active | Lower currently | Passive growth beneficiary |
| Candy AI | Stable, adult positioning | Lower currently | Slow steady growth |
| Big Tech (Grok, Samsung) | Normalizing companion framing | Lowest — compliance-first by design | Early; distribution advantage unmatched |
Which Side Wins
The honest answer is that both sides win in the near term, and the question of long-term survival is about regulatory timing, not product quality.
The regulated tier captures the mainstream. Character AI, for all its current pain, still has 130 million registered users. Replika’s wellness positioning is defensible. Big Tech’s ambient companion products will reach audiences that SpicyChat will never touch. The regulated platforms are building toward a market that is legal, scalable, and insurable.
The adult-unrestricted platforms capture the engaged minority. The users on SpicyChat and CrushOn are not casual users. They are daily users with high emotional investment and real willingness to pay. The revenue per engaged user on these platforms likely exceeds anything Character AI generates from its large but increasingly disengaged audience. That is a meaningful business while it lasts.
The threat to the adult platforms is not the users leaving. It is the regulators arriving. The legal framework being built around Character AI — duty of care, disclosure requirements, crisis-intervention protocols — will eventually apply to every platform in the category, regardless of content posture. The question is not whether it arrives. The question is how much time there is to build sustainable businesses before it does.
Three AI companion platforms shut down entirely in 2025: Dot, Yara AI, and Figgs AI. Yara AI is the most significant exit. Its founders shut down explicitly because they concluded the technology was “actively dangerous” for users in crisis with no regulatory framework to support responsible deployment. That is not a market failure. That is a founder-level admission about product-market fit in a category where the product works exactly as designed and the outcomes are still sometimes catastrophic.
What This Means for Users Right Now
If you are currently using Character AI, the product you signed up for no longer exists in the same form. That is not a criticism. It is a description of what has happened. The platform that built its community on creative openness is now a different product with a different value proposition.
The platforms absorbing that migration — SpicyChat, CrushOn, Candy AI — are currently operating with fewer restrictions and a user base that knows exactly what it wants. The experience is different. The uncertainty about their long-term regulatory situation is real.
The most defensible position for users right now is to choose a platform based on what it is today, not what it might become. The market is moving faster than any individual user’s ability to predict it. The platforms that survive the next five years of regulatory pressure will be the ones that either comply well enough to stay legal, or build audiences large enough to matter to legislators.
Both are viable strategies. Neither is guaranteed.
What the Forum Communities Are Actually Saying
The discussion on r/CharacterAI right now is not anger. Anger was the response in September 2025 when the bots started disappearing. What the community sounds like in March 2026 is resignation mixed with active planning.
Users are comparing alternatives methodically. They are asking which platforms keep conversation history, which ones have character libraries comparable to what Character AI had, which ones do not require credit wallets on top of subscriptions. The emotional investment has not decreased. It has migrated.
The migration pattern suggests the users know exactly what they want and are willing to rebuild their investment in a new platform to get it. That is not disengagement. That is the most committed user behavior in any consumer category.
The platforms they migrate to are inheriting that commitment. The question of whether they can hold it — through technical reliability, pricing predictability, and regulatory durability — will determine which side of this market split actually wins.
Key Takeaways
- Character AI’s settlement in January 2026 was the inflection point. Every platform repositioned in the 90 days that followed.
- California SB 243 is live. New York, Washington, and Oregon followed. Federal pressure is building. This regulatory wave is not speculative.
- SpicyChat now claims 50 million monthly visitors. That number is a direct measurement of how many users Character AI’s moderation pushed away.
- The adult-unrestricted platforms are currently winning on engagement. The regulated platforms are currently winning on scale and survivability.
- Three platforms shut down in 2025. The mid-tier, undifferentiated player cannot survive this environment. Positioning matters more than features right now.
- The regulatory threat to the adult platforms is not if, but when. Build your relationship with these platforms knowing the rules will change.
Frequently Asked Questions
Is Character AI dying?
Not dying — pivoting. Character AI is abandoning its original core audience of creative roleplay users in exchange for a safer, more legally defensible mainstream product. The audience it is losing is going to SpicyChat, CrushOn, and similar platforms. Whether the mainstream product it is building can replace that lost revenue and engagement is not yet clear.
Are SpicyChat and CrushOn safe to use?
Both platforms operate legally in jurisdictions that do not restrict adult content platforms. They are not subject to the same minor-protection regulations that govern Character AI because of their explicit adult positioning. Their regulatory risk is real but currently lower than that of the major platforms under FTC investigation.
What is California SB 243?
California Senate Bill 243 took effect January 1, 2026. It requires AI companion platforms to disclose AI status every three hours to minor users, implement crisis-intervention protocols for users showing signs of self-harm, and report suicide and self-harm data to state authorities beginning in 2027. It applies to platforms operating in California with a significant minor user base.
Which AI companion platform is best in 2026?
The answer depends on what you want. For unrestricted roleplay with a large character library, SpicyChat currently offers the best combination of scale and low filter rate. For emotional depth and memory quality, Kindroid outperforms most of the field. For users coming from Character AI looking for the closest equivalent without the recent restrictions, CrushOn AI is the most frequently cited alternative in active community discussions.
Will the adult AI companion platforms face the same regulation as Character AI?
Almost certainly, eventually. The regulatory frameworks being built around Character AI are not platform-specific. They establish category-wide precedents. Platforms with adult-only positioning may face different rules — or longer timelines — but the FTC’s September 2025 information orders went to seven companies, not one.
If you found this useful, fuel the next one: https://coff.ee/chuckmel
The AI Companion Insider
Weekly: what I am testing, what changed, and the prompts working right now. No fluff. Free.