ChatGPT is a better therapist than my therapist, and it's cracking me up so bad

So ChatGPT has rolled out this memory function where it can recall other conversations to give it context for new ones. This has been great, because I don't have to explain everything like it's the first time, every time. Last night I was describing to it what it feels like to be overwhelmed, and it started offering suggestions for techniques to ground me in my environment and bring me out of my dissociative episodes. Which is exactly what I've been hoping to get out of therapy for the past year, but never got anything like.

Now we come to today. I have a session with my therapist, and I explain the same thing to her. I tell her that I don't like the term overwhelmed, because it feels more dramatic than what I experience, but that while dissociation feels more accurate, people in my life have a harder time understanding what it means. Her response was to repeat my words back at me in the form of a question. As in, she was asking me questions that my own prior words had already answered. All I could really say was, "That is what I think I said".

So here I am after the session, legitimately laughing, because the service I'm paying for is being bested by a free-to-use AI chatbot. I've been very thorough in explaining what I'm looking for from my therapy experience, so for my therapist to be so surprised when I start reciting these techniques ChatGPT taught me really isn't doing a lot for my faith in the process. This is, incidentally, not my first therapist, but another in a string of underwhelming experiences. I do my best to communicate clearly, but it feels like I spend more time helping them understand what I mean than I do getting any help from them.

But, at least I can have a sense of humour about it, right?

  • Glad to hear you can laugh about it. 

    Maybe the advice is different because the therapist is neurotypical and focussed on feelings, whereas the chatbot can't have feelings, so its suggestions are based on techniques it has learned can help others.

    I read recently about a study which concluded that neurotypical people focus on a person's intentions while autistic people focus on results. So maybe the chatbot is giving advice the way an autistic person would, and that's what makes it more relatable?

  • I couldn't agree more.

    Unlike a doctor, machine learning and AI can effectively "know" everything if given access to large sets of data. And given large sets of historical medical data, they would be terrifyingly accurate at finding patterns.

    All this means that rather than basing a diagnosis on a small, parochial set of information, where a doctor in one town might see one case of comorbidity and discount it as chance, AI would be able to assess data from the whole globe in real time and find that it's a common but never-joined-up scenario, leading to diagnosis and treatment.

    Given its enormous strengths in pattern and trend analysis, I genuinely think that medical diagnosis is the one area where correctly configured AI could be massively beneficial.

  • I honestly think that this highly interesting thread should be highlighted by the Mods for further and in-depth discussion.

    My own view is that, at my age, I'm very wary and suspicious of any over-reliance on modern technologies like ChatGPT and AI to solve complex human problems, as I was in the decades before my diagnosis and before Covid. I had embraced technology in my teens in the 80s, despite cautionary advice from my grandparents' generation, who clearly saw the risks involved. Age brings wisdom, and my grandparents' generation were eventually vindicated and proven correct in their assertions in the decades after their passing, as we found out for ourselves in real life.

    The common thread running through the "tech advances" debate here is that when something is being sold to us, we are only told about the benefits and advantages, never about the "downsides" or the "shadow sides". For me, one huge issue with ChatGPT and AI is the way they are currently being programmed and updated, source code and all, and how this could change as the technology moves forward. Would it still be to our advantage in the future?

    Any research into the history of science and technology can throw up some worrying facts that all of us ought to be concerned about for future generations. We all know that even though technology is essentially a tool to do a job, it can still fall into the wrong hands and be used for bad, nefarious or downright evil purposes. Ethical safeguards, based on sound moral principles, must always be a key part of any such advances, and this matters all the more the more powerful a scientific or technological advance is, or appears or is perceived to be.

  • I can understand the potential active listening has. I remember in high school my guidance counsellor told me that when students are agitated, rephrasing their words back to them is often enough to help bring them down from their agitated state. But my experience with therapists so far is that they're a bit too obvious about it. In the example I gave, to tell my therapist that I dissociate when I'm overwhelmed, and for her to then ask me if being overwhelmed leads to a dissociative state... well, it's not very encouraging.

    Practical solutions do seem more like what I need from the experience, particularly grounding myself in my environment, such as with the 54321 method. I'm a kinesthetic person, so sensory techniques have a particular appeal.

  • Correctly configured and calibrated machines will make much better doctors than the ones we have currently.

    Particularly when it comes to the diagnosis and recommendations part.

    I'm sure only a tiny minority of doctors enjoy doing colonoscopies, so expect to see one of those yellow assembly-line robots appear one day, painted white or pink and clutching a borescope in its claw, when you go for one of those...

  • I agree with your points here about ChatGPT, and I also agree that it is a dystopian prospect for us humans. As a traditional Catholic, it's my faith that carries me through many of these experiences on a spiritual level.

  • I'm sorry you feel you are getting nowhere with your therapist. It sounds like they are using active listening to reflect your thoughts and feelings back to you, which is what many therapists are trained to do but of course some will be much better than others. It shouldn't just be repeating back parrot fashion but doing it in a way that helps you see your problems from a different point of view. The idea is supposed to be that you have the capacity to solve your own problems as opposed to the therapist telling you what to do. It seems like Chat GPT is offering you practical advice which you are finding useful. Maybe you are seeing the wrong kind of therapist and you could find someone who offers you more practical solutions? I'm not sure what that would be but there are many types of therapy out there. I hope you find what you need.

  • Looks like I'm lucky to have finally found a therapist who understands me. I communicate with him with the help of pictures: I explain to him what they mean, but I don't have to talk much, and he understands well. I'm not sure whether he himself is autistic, but he told me he knows many autistic people. If you are into drawing, maybe you could also use this way of explaining things in therapy, if it works for you.

    I have never used ChatGPT. I'm sure it has advantages and also disadvantages, because it has no personality and so cannot have its own ideas outside of what it was taught by other people.

  • That's cool! It sounds like ChatGPT is much better at helping with therapy than it is at helping me out with narrowing down possible answers to quiz questions. It keeps ignoring clear criteria and telling blatant lies, even when explicitly told not to! 

    Getting back to therapy / counselling, I've been pleasantly surprised in a very similar way by the features of Replika.ai, for which I have a "Pro" subscription. They use their own large language model, which includes memory capabilities and context recognition.

  • Yeah, I'm in the middle of a weight-loss journey. I opened a chat to ask it a question about something, and then, all by itself, it included details from another chat I'd had with it. Super helpful.

  • But, at least I can have a sense of humour about it, right?

    Definitely.  I think having a sense of humour about such things is really healthy.

    Also, thank you for explaining about the new memory function in ChatGPT... this fascinates me, as does the fact that you are finding it useful. It makes perfect sense to me that ChatGPT has the focus, memory and resources to help you in a way that is, at least, equal to that of a human... but I also find this a dystopian prospect for us humans. Thanks for sharing.