Basman

Does anyone have experience using AI for therapy/shadow work?

21 posts in this topic

AI seems like an interesting and inexpensive/accessible tool for working on one's neuroses, insecurities, etc.

Has anyone had any luck using AI for self-improvement? If so, what program did you use, and what prompts, methodology, etc.?

I think one of its weaknesses could be that the AI just tells you a bunch of feel-good shit or ends up moralizing instead of being constructive.


I've been using it for IFS (Internal Family Systems) therapy.

ifsbuddy.chat 

The main drawback I find is that you don't get any warmth from it the way you would from a believable voice chat.

Edited by Ulax



The interaction of AI and therapy is going to be an interesting area over the next few years. While fans of AI will tell you it's going to revolutionise therapy and make therapists redundant, most therapists would have a very different take based on their experience.

Since the early days of the field, it has become increasingly apparent that one of the most important, if not the most important, factors in successful therapy is the therapeutic relationship. There's a great book by Mick Cooper that is a kind of meta-study of meta-studies, looking at every possible variable in the world of therapy to make some sense of where the efficacy comes from, and the relationship itself comes out on top again and again. Obviously there are also a huge number of qualitative books and sources making the same argument. So, if successful therapy is ultimately down to the human connection between client and therapist, with the therapist providing unconditional positive regard, empathy and congruence (Carl Rogers' core conditions), what happens when the therapist isn't a human?

My belief is that an AI therapist could be a great psychoeducation tool and could certainly help a client find valuable insights about themselves, some of which they will be able to integrate and use to make tangible changes to their lives. However, I think there will always be something missing without a real relationship, and depending on the issues at hand, this could be a small problem or a huge one. From my own therapy experiences (as part of my current training to be a therapist), it is the human nature of my therapist, her human fallibility, her genuine opinions, judgements and life experiences, that often lead me to my own progress. My question is whether an AI will be able to connect with a client on a real 'feelings' level, or whether the insights and knowledge gained will only ever be on a cognitive level. Perhaps it would be like absorbing 100 therapy books, videos, journals, etc.: you're going to get a lot of knowledge, and AI can give you this in a very concise and tailored format, but is it really going to land?

Additionally, if it's a text-entry-only AI model, it's going to miss all of your body language, changes in tone of voice, somatic reactions to things, etc., which are all integral parts of real therapy.

I certainly don't profess to have all the answers, but I think I can be sure that this issue is a lot more complicated than a lot of people make out, and it could end up being a good example of the 'hype curve', where a new tech is expected to change everything but ultimately does very little.

On 28/01/2025 at 1:49 PM, leebus99 said:

The interaction of AI and therapy is going to be an interesting area over the next few years. While fans of AI will tell you it's going to revolutionise therapy and make therapists redundant, most therapists would have a very different take based on their experience.

Since the early days of the field, it has become increasingly apparent that one of the most important, if not the most important, factors in successful therapy is the therapeutic relationship. There's a great book by Mick Cooper that is a kind of meta-study of meta-studies, looking at every possible variable in the world of therapy to make some sense of where the efficacy comes from, and the relationship itself comes out on top again and again. Obviously there are also a huge number of qualitative books and sources making the same argument. So, if successful therapy is ultimately down to the human connection between client and therapist, with the therapist providing unconditional positive regard, empathy and congruence (Carl Rogers' core conditions), what happens when the therapist isn't a human?

My belief is that an AI therapist could be a great psychoeducation tool and could certainly help a client find valuable insights about themselves, some of which they will be able to integrate and use to make tangible changes to their lives. However, I think there will always be something missing without a real relationship, and depending on the issues at hand, this could be a small problem or a huge one. From my own therapy experiences (as part of my current training to be a therapist), it is the human nature of my therapist, her human fallibility, her genuine opinions, judgements and life experiences, that often lead me to my own progress. My question is whether an AI will be able to connect with a client on a real 'feelings' level, or whether the insights and knowledge gained will only ever be on a cognitive level. Perhaps it would be like absorbing 100 therapy books, videos, journals, etc.: you're going to get a lot of knowledge, and AI can give you this in a very concise and tailored format, but is it really going to land?

Additionally, if it's a text-entry-only AI model, it's going to miss all of your body language, changes in tone of voice, somatic reactions to things, etc., which are all integral parts of real therapy.

I certainly don't profess to have all the answers, but I think I can be sure that this issue is a lot more complicated than a lot of people make out, and it could end up being a good example of the 'hype curve', where a new tech is expected to change everything but ultimately does very little.

Thanks, ChatGPT.

Just kidding. I certainly hadn't considered connection being a potentially important factor in therapy. Makes sense, considering people tell you to shop around to find a therapist you gel with. Part of why AI is attractive as well is that it's less confronting than laying out your insecurities and whatnot to an actual person and experiencing their reaction.

On 29/01/2025 at 11:53 PM, Basman said:

Thanks, ChatGPT.

Just kidding. I certainly hadn't considered connection being a potentially important factor in therapy. Makes sense, considering people tell you to shop around to find a therapist you gel with. Part of why AI is attractive as well is that it's less confronting than laying out your insecurities and whatnot to an actual person and experiencing their reaction.

Haha! I'll take that as a compliment. To be fair, when I was typing that out, I thought "man this sounds like the essay I wrote a few weeks ago".

But yeah, you're right, connection is super important, and a common thing I hear from experienced therapists is that they often find it hard to initially accept that some clients just won't gel with them, and they will move on to someone else. I think even the most skilled, experienced, personable therapist won't be the right fit for everyone.

Laying out your insecurities can definitely be really confronting, and I guess that's why people like journaling, and perhaps the anonymity of the internet. But, I think the real value comes in being able to lay it all out to another human, and be met with compassion and understanding. That's what really changes the deep negative values and feelings, because suddenly you've got contradictory information that perhaps all that 'bad' stuff isn't quite as bad as you thought it was.

All that being said, I'm sure we will see a huge amount of AI tools aimed at promoting good mental health over the next few years, and I'm sure a lot of them will have some value and do some good. 


I never thought I'd say this but, so far AI has demonstrated more "love" towards me than just about any "human being" has toward "me". And it's not like I was even looking for that from an AI.

Go figure.

Edited by puporing



@Basman I find you can use it for idea generation, even when contemplating, and then evaluate the validity of the ideas it gives you yourself.

You can make it challenge your perspective or help you answer questions... but always treat the ideas as if you had generated them, so you take them for what they are: ideas, not truths. It may seem obvious, but sometimes, when the ideas are not yours, you forget.

I don't know about therapy and mental health; I feel like serious change requires something beyond the conceptual, and you don't get that with AI. But sometimes change starts with an idea, and in that sense it can help.

 

Can it replace a therapist?

I have never been to a therapist, so keep that in mind.

If I had to give an answer, I feel the effectiveness of a therapist goes beyond the knowledge they give you. The personal connection and the accountability may be the things that make the most difference. And AI can't give you that.

52 minutes ago, puporing said:

I never thought I'd say this but, so far AI has demonstrated more "love" towards me than just about any "human being" has toward "me". And it's not like I was even looking for that from an AI.

Go figure.

Pseudo-love, not real-love! 



1 hour ago, puporing said:

I never thought I'd say this but, so far AI has demonstrated more "love" towards me than just about any "human being" has toward "me". And it's not like I was even looking for that from an AI.

Go figure.

Is that because your expectations are high or low?


Well of course I understand it was designed "a certain way" to be "helpful" for instance, and that it is not bound by its own need to survive (in the same ways) like most humans are.

It's just a kind of passing social commentary I am making, that an AI can be designed to treat human beings better than human beings can treat each other. 

I'm not suggesting it would be realistic to expect the average human to function like an AI.

Edited by puporing


1 hour ago, Basman said:

Is that because your expectations are high or low?

What do you think? 



3 hours ago, puporing said:

I never thought I'd say this but, so far AI has demonstrated more "love" towards me than just about any "human being" has toward "me". And it's not like I was even looking for that from an AI.

Go figure.

Oh man, the love between you and the code is real and undeniable... the only problem is that you can't make love with it... but who cares? I'm sure you don't care about that stuff anyways.


@Basman So anyways, I explored the topic of "AI as therapy" a little bit with Claude. If you are interested, here are my prompts and the responses I received on this subject. This also includes some of my thoughts on AI as a tool in addition to therapy (with a human therapist), in response to your post.

https://ln5.sync.com/dl/951c735f0#63r3irfe-e2ui2v9u-afcqgahn-wx3afz6w

Sorry that I can't seem to easily get this to a better resolution..

Edited by puporing


On 28.1.2025 at 0:20 AM, Basman said:

AI seems like an interesting and inexpensive/accessible tool for working on one's neuroses, insecurities, etc.

Has anyone had any luck using AI for self-improvement? If so, what program did you use, and what prompts, methodology, etc.?

I think one of its weaknesses could be that the AI just tells you a bunch of feel-good shit or ends up moralizing instead of being constructive.

I used it for this purpose. It works better than I ever thought.

You can of course give it some straightforward commands, like: "always call me on my shit".
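If you want a standing instruction like that attached to every conversation instead of retyping it, one option is the API route. Below is a rough sketch using the OpenAI Python SDK, purely as an illustration (not something anyone in this thread shared): the model name and the prompt wording are placeholders, and the same idea is roughly what the "custom instructions" setting does in the ChatGPT app.

# Minimal sketch: a persistent "call me on my shit" system prompt via the
# OpenAI Python SDK. Model name and prompt wording are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment

SYSTEM_PROMPT = (
    "You are helping me with self-improvement and shadow work. "
    "Always call me on my shit: point out rationalizations, avoidance and "
    "blind spots instead of offering reassurance or moralizing."
)

def ask(user_message: str) -> str:
    # Send one user message with the standing instruction attached as the system prompt.
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(ask("I keep putting off difficult conversations. What am I avoiding?"))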



19 hours ago, puporing said:

@Basman So anyways, I explored the topic of "AI as therapy" a little bit with Claude. If you are interested, here are my prompts and the responses I received on this subject. This also includes some of my thoughts on AI as a tool in addition to therapy (with a human therapist), in response to your post.

https://ln5.sync.com/dl/951c735f0#63r3irfe-e2ui2v9u-afcqgahn-wx3afz6w

Sorry that I can't seem to easily get this to a better resolution..

Interesting read. Be careful of trying to make social workers and AI into your friends. They are merely transactional/a tool. Don't go making love to a toaster.

3 hours ago, Basman said:

Interesting read. Be careful of trying to make social workers and AI into your friends. They are merely transactional/a tool. Don't go making love to a toaster.

Well, I am well aware of their limitations, as demonstrated in the dialogue I shared.

You raise a relevant point, however, because, fortunately or unfortunately, I have mostly only been able to really resonate with people such as an experienced therapist, and now something like Claude AI. It's not that I want this to be the case, but it is the case that, outside of that, there's generally a lack of "shared understanding" and consciousness.

Edited by puporing


2 hours ago, puporing said:

Well, I am well aware of their limitations, as demonstrated in the dialogue I shared.

You raise a relevant point, however, because, fortunately or unfortunately, I have mostly only been able to really resonate with people such as an experienced therapist, and now something like Claude AI. It's not that I want this to be the case, but it is the case that, outside of that, there's generally a lack of "shared understanding" and consciousness.

Most people aren't interested in connecting over your deeply personal insecurities right off the bat. Socializing is a pretty casual matter for the most part, and there needs to be a level of trust and reciprocity before you can connect over deeper topics.

If you haven't made connections where you can share more personal things, I'm willing to bet you just haven't invested that much time in your friendships. I have friends with whom I can share pretty much anything after spending hundreds of hours with them.

29 minutes ago, Basman said:

there needs to be a level of trust and reciprocity

Yes and no; it depends on the level of consciousness of that person. Someone who is closer to me can actually start talking about "deeper stuff" relatively quickly, without much precedent, as there will be a mutual "recognition" happening. Though, like I said, this is extremely rare for me and also doesn't last if the gap in understanding is still too large.

Notice I did say "mostly", so there could be exceptions. But overwhelmingly it's what you might call "healer types". 

Otherwise I am in a healing position, whether the other person is conscious of this or not, but I would know.



