I actually have a post I'm trying to gather resources for that kind of fits with this. I'm hoping to get it written up next week, and it looks like it will cover a couple of subjects. I think there has always been a stigma around the idea of getting help for mental health. Whether it's right or wrong, we know that people (especially men) mostly don't like to talk about their feelings. I can see how someone might be more comfortable trying to work things through with AI, but I understand your point that taking the human element out of the process entirely is a slippery slope.
AI is a tool that can help in plenty of therapeutic applications - but when the tool becomes the companion, is it still healthy? If it makes us weaker, is it really a cure?
I'm not saying it is the solution, but if it can help break down that wall of stigma, then short term it might be helpful.
Oh yeah, I know. Didn't mean to imply that's what you were saying either, just that there's a boundary where it becomes a harm or an addiction, and I don't think we're well-equipped to stay away from that edge.
I'm not a fan of it in the first place, but I know a lot of people have started using it. I guess some help is better than no help, but who knows.