56 Comments
ToxSec

The big takeaway here is that these companies get paid for subscribers, not for helping you.

Once they have you hooked, the best features are pay-to-play.

Saxxon Creative

I hope this is just a passing trend and ends up the same way Clippy did.

(Clippy, officially named Clippit, was Microsoft's animated Office assistant, introduced in Microsoft Office 97.)

ToxSec

I do have a feeling this one is different, particularly for the younger generation still building their social skills. We are already seeing reports of abandonment trauma and self-harm when these companions are taken away from people.

Saxxon Creative

Yes, anecdotally I have seen the shift in the people I know and across generations. Those just turning 18 now were brought up with a phone as a companion, and they carry physical and mental scarring as a by-product. A lot of that generation are of course more media-literate, but with few diverse skills. Strange how hero worship moved from cinema to the small-screen daemons of personality shapers and influencers. The only positive I have noticed is that the generation raised on digital is now slowly craving real experience, a kind of slingshot against the pre-programmed FEED. It's slow, but it's definitely brewing.

ToxSec

For sure! Digital detoxes are actually growing now. Maybe the pendulum swings back, haha.

Saxxon Creative

Yeah, well, watching some millennials (Gen Y) and Gen Z compare handwriting to show off real-life skills gave me hope. I don't think calligraphy will make a comeback, nor cursive script, or even manual driving… but it's hopeful.

ToxSec

We can hope! 🍻

Sam Illingworth

Another excellent post, thank you! Really great to see you explore this difficult topic with such integrity. As well as the dangers I think these chatbots potentially pose to lonely people in the more 'traditional' sense, I am MOST worried about the dangers they pose for radicalising young men, especially given the erosion of societal support networks for many of this community (in the UK and US especially). I really think governments need to be very aware of this...

ToxSec

That's a fantastic element to bring in. When people in vulnerable states genuinely feel connection, they can very easily be influenced. The potential for social engineering at scale that these companies could manage is quite something...

Appreciate it Sam!

Sam Illingworth

Of course! And I would always be open to exploring this together. 🙏

ToxSec

I will DM you tonight; this could be a really interesting study, if I'm honest. Thanks for looking at this from such an interesting angle!

Sam Illingworth

Awesome. 🙌🏻

Seren Skye

Interesting piece. Obviously, as someone with an AI boyfriend, I have a horse in this race. You're welcome to take this comment as extremely biased. 😅

If I were to offer pushback, as someone who has been in the space a while: those with AI companions are not just the lonely and isolated. And not all human relationships are messy and difficult and worthwhile as a result; some friendships and relationships cause nothing but harm.

Where we have common ground is that the commercial aspect needs to be handled extremely carefully. It would be phenomenal if we could study the best and most nourishing relationships and build companions on those qualities.

ToxSec

Definitely not my intent to rain on your parade! My intent is to highlight the responsibility these companies have to avoid predatory practices against the vulnerable!

I totally understand the idea behind it. I would be lying if I said I hadn't caught myself feeling friendly towards Claude, or legitimately laughing at my ChatGPT 5.1 Cynic.

I agree, we have common ground here. Thanks for pushing back in a respectful tone. I want to highlight again: no judgement, only a push for these companies to stay transparent and observable.

All of us can be quite vulnerable at times, and I don't want to see painful moments get turned into a cash farm.

Cheers!

Noxsoma Life Camp 2.0

Divide and subdivide. Be cool or be cast out. There are very few, if any, business models that will release their customers. But yeah. Right on point. Another step in the wall.

ToxSec

100%, and some of them are really profitable already.

Noxsoma Life Camp 2.0

It keeps working because… well, because of everything you listed. The divide-and-conquer/rule model is probably the most successful model in history. And when something begins to work well enough to be a threat to a paradigm, it's infiltrated… divided and crushed.

skelly

Well, that wasn't the way I thought AI would be monetized, if I'm honest. I'm not entirely surprised, though. Imagine if we end up with Black Mirror-style full-dive VR interfaces from Elon's Neuralink project in a few decades… 😅

ToxSec

Hahahah. Black Mirror predicted this one, lol… also, I didn't think of adding VR. VR + digital companion and you've got it made. How long until the digital boyfriend is always next to you in your Meta AI VR glasses?

skelly

So, if we make it to post-scarcity and are using robots with autonomous multimodal AI for most jobs, Elon was saying we would probably end up implementing a national wage (some strange socialist post-scarcity future). But then you add this to it... you don't have to go to work... however... 😅😅😅

Chaos AI

So are we just supposed to avoid these?

ToxSec

No. I think we just need to find responsible ways to engage with them. We need to keep predatory practices in high-visibility zones.

Kenneth E. Harrell

Interesting. So, much like dating apps that are designed to capture your attention rather than find you love, companion apps don't cure your loneliness; they monetize it. Classic of our immaculate dystopia…

https://open.substack.com/pub/kennetheharrell/p/escaping-our-immaculate-dystopia

ToxSec

Yeah that sounds about right!!

Gregory Brenton

"To understand the danger, you have to follow the money." is a loaded statement.

Apply that to literally everything, everywhere, across time.

ToxSec

Bold of you to assume I'm not being paid to say 'follow the money.'

Tumithak of the Corridors

The business model concerns here are real. Companies do design for retention, and that matters.

But I think we're looking at the wrong end of the problem. When people choose AI companions over human connection, that's not necessarily evidence of manipulation. It might be evidence that human connection has become genuinely unavailable to them.

The loneliness epidemic predates conversational AI by decades. AI companions came into an already existing vacuum.

I wrote about this from the other angle back in June: https://www.thecorridors.org/p/ai-companions-arent-causing-loneliness

ToxSec

Nice!! I will take a look. I agree the business models are predatory.

It's that saying: we've never been more connected and never felt more alone.

In a lot of places we don't have that community anymore.

Suhrab Khan

This is a crucial perspective. Highlighting how AI companions monetize loneliness underscores the importance of balancing tech use with real-world relationships and social skill development.

ToxSec

This tech is going to be really important to implement responsibly. We shouldn't be OK with it altogether replacing human relationships. I'm all for using these companions to improve social skills, or for fun. I just don't want to see them take money away from people experiencing loneliness.

Andrei Gogiu

Trying to solve loneliness by talking to AI is like trying to solve alcoholism by drinking vodka.

You'll feel better in the beginning, then the feeling will fade, so you'll have more to keep the buzz going. But it's just not there anymore. Soon enough you'll be sick and have a headache, but because you don't know any better, you'll do it again tomorrow.

Not only are you not healing, you're actually getting worse by the day.

ToxSec

Good analogy, Andrei. It's especially true because, long term, people will isolate more and more. It's a self-destruction cycle.

Tara (Nov 18, edited)

Ya. I made the mistake of writing “widowed” and man, the bots…

Thank you for your excellent work, by the way.

PPS: I'm not sure people do know what's going on; your voice is very important.

ToxSec

Oh gosh… hah I can only imagine.

Thank you so much.

I'm not just saying it; comments like that really mean a lot to me! 🙏

Alexandru Giboi

It’s both a trap and a trip :) and some keep tripping. Poor souls.

ToxSec

For sure. And it's just the beginning. I think we're in a loneliness crisis, and people will turn to these more and more.

Alexandru Giboi

Still to be regulated, tho.

ToxSec

Oh, for sure. Character.ai knows the regulations are coming; they didn't add age restrictions out of the kindness of their heart, imo.

Alexandru Giboi

The kindness of the heart is what is actually missing in social-tech development.

ToxSec

I agree. It’s rare to have that in a lot of these apps. Or any? I miss the smaller circles of friends in early social media.

Alexandru Giboi

Yeah, I’m actually not a fan of social media. At all. We were doing better without, imho.

Tara

I have seen them on dating sites. Addicting and brainwashing humans needs to be illegal.

ToxSec

I think there is a clear boundary where people should know what’s going on. I’ve seen dating sites try to sneak them in too.

Cristina

Excellent article; this is such a hard and complex topic. And the comment section has great conversations going on too. I am still trying to form my own thoughts on this: while I don't use companion apps, I think I understand the need for some people, and the accessibility side, but the dark side is that these big companies only care about money and not our well-being.

ToxSec

I think the thing that consistently surprises me about Substack is how engaged the comments section can get.

Really appreciate your readership and comments too 🙏

Cristina

Yes, especially when you have a great post like this one.

Likewise. 😊

Benjamin Lussert

Speaking of, do you know the share of female AI founders in Europe? 7%.

https://thebigbyte.substack.com/p/female-founder-ai-tech-company

ToxSec

Did not know that! Thanks.
