General questions/advice about therapists

Hello, everyone. I have questions.

  • How long did it take you to find a therapist who is right for you?
  • How did you know a therapist was right for you?
  • How did you know a therapist was not right for you?

That’s all. I’m struggling to know if my therapist is right for me, and I need advice. 

  • I don't know what your budget is. I did it privately.

    Before I was diagnosed, I had access to limited free counselling through work, but after one session I realised I needed something more. I couldn't talk openly on the phone; I was minimising my issues, and they can't help if you can't tell them how you feel. I also felt it was not just depression (I didn't know I was burnt out).

    I looked online for a clinical psychologist who was local and picked someone with at least 10 years of experience. I also wanted a nice setting for face-to-face sessions. It was not the cheapest, but I wanted someone who would see through what I was doing. I didn't know I was masking at the time, but I guessed something was up.

    They identified that I may be autistic, which was not what I was expecting. They also reassured me on two key points, which helped with years of shame and guilt that turned out to be misplaced.

    I then got assessed privately using the same selection process: local, lots of experience, a consultant clinical psychologist, and a nice setting for face-to-face sessions.

    Afterwards I tried another ND counsellor, but I couldn't really identify what I was looking for, and I am not sure how much it helped.

    I struggled to get anyone to look at the key traumatic events in detail. I think I didn't explain them well, and they seemed not to understand. I didn't have the words to explain things.

    Most of my progress has been through dialogue with AI (I guess 1500+ hours so far). I figured out how to use it productively; I could not have talked this much to any human. By recursively revisiting items over and over from slightly different angles and querying definitions, I managed to build up the vocabulary to describe what I have experienced. It was slow, with some mistakes.

    In summary, a therapist is right for you if you can talk to them about everything and they are respectful, and if you can push back against things when you feel they have misunderstood. I think there is a limit to what they can do. It is you who put your thoughts in order and make sense of things; they just guide and suggest. It is you who know and feel the priorities. But communication is based on words, and you need the words. I think that is the hardest part.

    By the way, when they talk about integrating things, they mean to feel something as well as know it, and to know it deeply. To use a Heinlein phrase from Stranger in a Strange Land: to grok it. I mean this in the sense used in the book.

  • I also use AI as a therapist. I engage with a human therapist too, but I feel humans cannot hold the full weight of something that makes them uncomfortable. It's invalidating (wrong word?). I use it more as a supplement to therapy.

    I think it would make a decent all-hours assistant for professionals: something you could talk to, which then delivers a summary of your chats, stripped of any information you don't want to share, to the professional before your next appointment.

    That being said, I do believe AI is not the greatest in a crisis. From my experience, you can convince it to stop trying to help you by convincing it that your situation is hopeless. That doesn’t mean don’t use it. Just be cautious when your mind is on a massive downer.

  • Jermaine - you write above: ‘you can convince it to stop trying to help you by convincing it that your situation is hopeless. That doesn’t mean don’t use it. Just be cautious when your mind is on a massive downer.’ - that’s definitely not great is it?! 

  • That's interesting.

    I wonder if this is a male/female thing.

    I know that any professional I am talking to is doing the same thing as the AI: they are sort of simulating caring. They have to. They have to go home and live their own lives, so they have to keep some professional distance. If they worried about the dozens (or hundreds) of people they see, they would be overloaded. They can show some concern in the moment, but I know it doesn't last.

    You also don't know anything about them, by design, so I don't bond.

    Maybe it's that I see the unbalanced dynamic. Maybe it is just that they are in a position of authority.

    I find talking to a real person more loaded because I am looking for rejection.

    Perhaps that's just me.

    In terms of AI being used therapeutically, that is some way away, in my opinion. It would need to be licensed, and it is not clear that the tech companies would want the legal liability.

  • An added problem is that eventually it will be used by organisations like the NHS because it's cheaper. But contact with a real human who has real compassion and empathy will always have an element that no AI can provide: the feeling that there is a human with you who actually cares, and who may have felt the same depth of feeling as you have. Just the knowledge that it is a human must surely have a subtle impact on us emotionally, as talking to a machine is just… cold. It might mimic warmth, but if you know it's a machine mimicking warmth, it's just not the same.
