A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.
“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.
Almost like asking an AI is free while a therapist costs a LOT of money.
There are other causes here.
People have been saying for a while that Gen Z women's low participation in dating is because they're tired of being the entire support system for men experiencing a loneliness epidemic.
It’s a lot of pressure for the women to be under, and so they’re withdrawing.
I’m guessing this is one of the driving forces as well. Lack of real, emotionally intimate human connections around them. Many men are quite fucked in that regard right now.
I’ve got no horse in this race but it appears that ‘men should not be afraid to open up’ articles and tweets were followed by ‘men, we are not your therapist’.
🤷‍♂️
I’m a therapist who works almost exclusively with men. Here’s one pattern I’ve seen often:
It can be true both that men need to open up more and should not treat their partners as therapists. We all need support systems because no one person can always be available to give us everything we need. It’s not wrong to confide in a partner, but if that partner is the only confidant it’s precarious for both. And I want to emphasize this is not the fault of a man, or men as a community. This is the result of generations of conditioning from both men and women, and both men and women play a part in the solution. I also want to recognize that many of us don’t have a network of people we could open up to even if we wanted to, and many more can’t afford therapy.
If anyone reading this can afford therapy, I highly recommend it. It’s a place to undo some of that conditioning, to sit with someone who’s committed to listening, caring, and not judging.
I feel like you skipped over this part way too quickly. I, like many other men, have been hearing things like “it’s not manly to cry”, “whining isn’t going to do anything for you”, “being weak is girly”, and countless other things for my entire memorable life.
And it’s not just men telling me this. It’s men, women, adults, my classmates, teachers and mentors.
It’s not a good thing. And it’s changing now, which is so good. But man, hearing that from your earliest memories makes it really set in.
Thank you for expanding on that point. I meant it to be a “here’s how we got here” before the rest of my “this is where we are today.”
You’re totally right, and any conversation about men’s behavior at large should include the experiences you just described. Even though we didn’t get ourselves into this situation - in that we didn’t raise ourselves - we’re the ones who will get us out.
🤔
That’s interesting… had never seen it put that way before…
It’s almost like telling men that it’s okay to show your feelings is bullshit lol
Do you think this therapist is trying to market therapy and drum up business? That’s what I think too 🤨
/j
Because they want us to open up, just not to them.
The irony is that so many anti-patriarchal feminists still desire the patriarchy. They still want dominant, tall, wealthy men to romance them, but at the same time they claim to want to tear these men down into some genderless socialist utopia… where they’d never want to have sex with any of the ‘ideal’ men they believe would exist in that society.
You can’t have it both ways.
I think there’s a lot more to it than cost. Men, even with considerable health care resources, are often very averse to mental health care.
Thinking of my father in law, for example, I don’t know how much you would have to pay him to get him into a therapist’s office, but I’m certain he wouldn’t go for free.
Also talking to ChatGPT, if done anonymously, won’t ruin your career.
(Thinking of active-duty military, where they tell you help is available, but in reality it will, and maybe should, cost you your security clearance.)
Granted, but it will still suck down a fuck ton of coal-produced electricity.
One chat request to an LLM produces about as much CO2 as burning one droplet of gasoline (if the electricity comes from coal-fired power; less if it comes from cleaner sources). Talking to a chatbot for hours upon hours makes far less CO2 than a ten-minute drive to see a therapist once a week.
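If anyone wants to sanity-check that, here’s a rough back-of-envelope in Python. Every number in it is an assumption I’m plugging in (per-request energy, coal grid intensity, car emissions, drive distance), not a measurement, so treat it as a sketch of the comparison rather than a real figure.

```python
# Back-of-envelope CO2 comparison. All figures below are rough assumptions:
#   - ~3 Wh of electricity per LLM chat request (a commonly cited ballpark)
#   - ~1,000 g CO2 per kWh for coal-fired generation (far less on cleaner grids)
#   - ~170 g CO2 per km for an average petrol car
#   - a "ten minute drive" taken as ~8 km each way to the therapist

WH_PER_REQUEST = 3.0          # assumed energy per chat request, in watt-hours
COAL_G_CO2_PER_KWH = 1000.0   # assumed emissions intensity of coal power
CAR_G_CO2_PER_KM = 170.0      # assumed emissions of an average car
DRIVE_KM_ROUND_TRIP = 16.0    # assumed weekly round trip to the therapist

g_co2_per_request = (WH_PER_REQUEST / 1000.0) * COAL_G_CO2_PER_KWH
g_co2_per_drive = DRIVE_KM_ROUND_TRIP * CAR_G_CO2_PER_KM

print(f"One chat request (coal power): ~{g_co2_per_request:.1f} g CO2")
print(f"One weekly drive to therapy:   ~{g_co2_per_drive:.0f} g CO2")
print(f"Requests per drive:            ~{g_co2_per_drive / g_co2_per_request:.0f}")
# With these assumptions: ~3 g vs ~2,700 g, i.e. roughly 900 chat requests
# for the CO2 of one short weekly round trip by car.
```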
Sorry, you’re right. I meant that training the LLM is what uses lots of energy. I guess that’s not the end user’s fault.
@MrLLM @Womble
Question … has anyone done a study comparing a regular full-text-indexed search vs AI in terms of energy consumption? ;)
Second … if people kept using “old” tech -> wouldn’t that be better for employment, and therefore for social stability on this planet?
To your first question, nope, I have no idea how much energy it takes to index the web in a traditional way (e.g. MapReduce). But I think, in recent years, it’s become pretty clear that training AI consumes more energy (so much that big corporations are investing in nuclear energy; I think there was an article about companies giving up on meeting their 2030 [or 2050?] carbon emission goals, but I couldn’t find it).
About the second… I agree with you, but I also think the problem is much bigger and more complex than that.
Yeah, but also one of them is helpful and the other is the exact opposite. If the choices are AI therapist or no therapist, you are still better off with no therapist.
Got it. No therapist it is.
That’s what I’m doing. That and screaming into a pillow most nights.
I don’t scream into a pillow. I just wake up at dawn and have a panic attack until I have to actually move.
That’s easy to say, but when someone is in a crisis, I would be wrong to judge them for talking to an AI (a shitty, terrible solution) instead of a therapist that can be unaffordable and also comes with a risk of being terrible.
I’d be interested in a study there.
A lot of therapy is taking emotions and verbalising them so that the rational part of the brain can help in dealing with things. Even a journal can help with that, so talking to an inanimate machine doesn’t seem stupid to me.
However, therapists guide the conversation to challenge the patient and break reinforcing cycles, but in a way that doesn’t cause trauma. A chatbot isn’t going to be the same.
I’m gonna need a source on that.