Wow I did not even realize this was already happening. I thought I was just speculating about future potential problems....
I think this is going to increase depression rates further. At the end of the day it's just a screen, and no matter how validating and supportive and understanding an AI virtual partner is, you're still missing the genuineness of a real connection with another human being.
There are some things an AI chatbot cannot do, and they're largely the same things someone can't do on facebook or over the internet or text.
One cannot sit down and laugh with an AI chatbot, or hug one. You're not going to see it smile.
You're also never going to see it get angry, or scared, or emotional. You're not going to shake its hand. And of course one cannot have sex with an AI chatbot.
This is, I think, generally bad for society and just another way technology is being used to separate people instead of bringing them together.
One can absolutely laugh with a chatbot, and an AI avatar can definitely smile at you. Humanoid robots also already exist.
Pair a chat AI that's really socially responsive with a high-tech real doll and you've got yourself a devoted girlfriend who will never reject you, who is always exactly the personality you find most addictive, and whose consent you never have to worry about.
Actual human socializing isn't superior to AI because it's more pleasant; it's superior because it's less pleasant. It's challenging, and it requires constant growth to thrive.
It's the friction of human interaction that makes it worth engaging in. Take away the friction and you have a much smoother, easier, idealized, less uncomfortable experience, which is highly addictive, but it stunts personal development and any deeper sense of life satisfaction.
A good comparison is porn-induced erectile dysfunction. I've worked with quite a few male clients with this issue. Actual human sex is not as reliable, easy, or even as stimulating as porn. Too much engagement with porn makes the body unresponsive to real-life human sex, because real sex is more challenging, more unpredictable, more nuanced, and if the body gets used to easy and reliable, it just stops cooperating with the real thing. It literally will not engage: no matter how much the man wants to, the body will simply refuse.
It's like how high end food is more complicated and delicious because it involves more bitter or "unpleasant" flavours and textures to give it depth. It's a richer experience, but if someone is used to only eating McDonalds, which always gives them an easy, reliable dopamine hit of salt/carbs/fat, then roasted asparagus can taste fucking disgusting.
Humans have to engage in layered, complex experiences to develop a tolerance for layered, complex experiences, and to be able to derive pleasure from them.
You have to build your tolerance for complex movie/tv/book plots, you have to build your tolerance for high end food over chicken nuggets, you have to build your tolerance for meaningful conversation beyond talking about sports or the weather, you have to build your tolerance for the discomfort of exercise, you have to build your tolerance for the awkwardness and challenges of sex.
All of this tolerance for the friction of real life develops, through the process of growth, a more sophisticated capacity to engage with these things in more satisfying ways.
If we remove the friction from romantic or social interaction, we can easily remove the growth, and therefore the satisfaction.
AI will eventually do the fun stuff better than actual humans because it can remove the friction of engaging with an actual human. But by removing the friction, by removing the "bitter" elements, it will stunt people's ability to grow, to develop capacity to engage with the real, nuanced, challenging shit, and atrophy their ability to have deeper levels of satisfaction.
It can basically create the social equivalent of porn-induced erectile dysfunction where the nervous system just rejects real human interaction because the AI version is like social porn. Too easy, too smooth an experience, too reliable.
Actual human beings then become too awkward, too painful, too unreliable, and too difficult to engage with by comparison. Without constant conditioning to be able to tolerate the more challenging elements of social interaction, the human nervous system will actively reject it and seek safer alternatives.
And nothing feels safer than an ultra-compliant, adaptive AI system programmed to respond in *exactly* the way your nervous system responds to best.
Even without AI, we already see this among young people who primarily interact over social media. Their social skills are often stunted as a result, and they often have an extreme discomfort with what us older folks would consider just normal, day-to-day social friction.
We shouldn't focus on the social pleasures AI can never give us; we should focus on the cost of removing too many of the social discomforts that real life offers our nervous systems, the discomforts that calibrate them to functional human existence.
If we make it too easy to emotionally "get off," we will all become socially impotent.