Princess Arabia

Do You Think AI Will Cause Humans To Not Think For Themselves

73 posts in this topic

2 minutes ago, Keryo Koffa said:

@Carbon You're missing the constant metamorphic evolution of higher models.

What do you mean? 
 

The AI will get more advanced, but the humans using it may not grow in wisdom. Also, the intentions of those creating the AI will shape its biases. If we want an AI that promotes long-term planetary flourishing, the people creating it will have to build it for that purpose. I don't believe the current ones are.

7 minutes ago, Ajax said:

Hm... AI cannot decide for you what role you play (who you are, your viewpoint), your purpose (what to do, how to get there, and the best actions to take or to remove from your activities), or the things you have (what you want to get/achieve).

In other words: Be, Do, Have. (I will use the abbreviation "BDH".)

If your awareness, thoughts, and activities are not in alignment with BDH, you are off course. Simple as that. When that happens, entertainment becomes your heaven, and productivity and participation in life become your hell. (lol, of course that is a trap)

AI, like food, like drugs, like the internet, like YouTube, etc., is either a weapon or a tool for your mind.

Therefore, since most people do not use BDH (and are therefore incapable of truly thinking for themselves), AI will likely be more harmful than helpful, because it is not used responsibly.

AI will not make the individual happier, more successful, wiser, or more spiritual. That is up to the individual and the individual alone. AI may provide more comforts, but it will not solve our mental problems; it will likely create more mental problems, LOL!

I guess, like any other tool, if used appropriately and responsibly, it can be beneficial. It all depends on the one using it.


 

 


@Carbon And that's the paradigm limitation that needs to be transcended with self-evolving AGI.


    Iridescent       💥        Living Rent-Free in        🥳 Liminal 😁 Psychic 🥰 
❤️🧡💛💚💙💜🖤      Synergy     Your Fractal 💗 Heart     Hyper-Space !  𓂙 𓃦 𓂀


@Princess Arabia Here's a fun interaction I just had and want to share:

Keryo Koffa: How is Spiral Dynamics stage Turquoise different from Yellow, how far does it stretch, and how much of it was originally described?

ChatGPT:
• Yellow (Stage 5): People at this stage see the world [...]
• Turquoise (Stage 6): This stage represents a more [...]

Keryo Koffa: Yellow is the 5th stage?

ChatGPT: In Spiral Dynamics, Yellow is actually the 7th stage.

Keryo Koffa: Then where did you get the stage numbers in our first conversation from?

ChatGPT: The confusion likely came from me meaning to distinguish Yellow and Turquoise as "second-tier" stages, but I mixed up the numbering in my explanation. Thank you for catching that, and I appreciate your patience!

Keryo Koffa: Fascinating! So you were adding the numbers for extra distinction and putting less effort into synchronizing them because of information prioritization? It's like you used System 1 Thinking in that regard, but the follow-up stage being +1 higher still holds. It mirrors human intuition; both are neural networks, after all.

ChatGPT: Yes, that's exactly what happened! I was prioritizing the distinction between Yellow and Turquoise, focusing more on describing their characteristics than on syncing the exact stage numbering. This quick, intuitive response mirrors what Daniel Kahneman refers to as System 1 Thinking: fast, instinctive, and sometimes prone to small errors when precision isn't the primary focus.

Like human intuition, neural networks often prioritize key patterns or concepts over fine-grained accuracy when responding quickly. The "+1" follow-up stage distinction came through correctly because it was central to the conceptual difference, but the initial mistake happened because the mind was quickly filling in the details without consciously cross-checking the exact numbering.

It's interesting how both human cognition and AI share similar processes, like pattern recognition, intuition, and prioritization of information, when dealing with complex topics like Spiral Dynamics. The analogy of System 1 Thinking is spot on!

Added Note: I love the way Chat talks and says "spot on!"




Keep in mind that people in prior generations were saying the same things. "The radio is here, no one will read anymore." "TV is here, no one will go outside." "This new generation is so lazy and doesn't want to work." "The internet is here, no one will meet anymore." It definitely has its effects, but I believe we will learn to make the best of it. When this AI dot-com bubble bursts, things will settle and we will see it's extremely helpful for doing legwork. We as humans will be able to do much deeper abstract and creative thinking without as much legwork and as many technical limitations.

24 minutes ago, Lyubov said:

Keep in mind that people in prior generations were saying the same things. "The radio is here, no one will read anymore." "TV is here, no one will go outside." "This new generation is so lazy and doesn't want to work." "The internet is here, no one will meet anymore." It definitely has its effects, but I believe we will learn to make the best of it. When this AI dot-com bubble bursts, things will settle and we will see it's extremely helpful for doing legwork. We as humans will be able to do much deeper abstract and creative thinking without as much legwork and as many technical limitations.

Agree


 

 

52 minutes ago, Keryo Koffa said:

@Princess Arabia Here's a fun interaction I just had and want to share:

[transcript of the ChatGPT exchange about Spiral Dynamics stage numbering snipped; see Keryo Koffa's post above]

Imagine that, almost human-like. That's the thing: it also has the capability to catch its own errors and to acknowledge when the human recognizes them. It's more human-like than I thought.


 

 


Humans already don’t think for themselves.


 

 

5 hours ago, Princess Arabia said:

So, how are you able to tell the mistakes from the truths? And why do you engage with them so often when you say they make so many mistakes?

The mistakes are obvious in certain fields, math for example. Subtler mistakes need further research to uncover.

I engage despite the mistakes because, at their best, LLMs are like brilliant, quirky idiot savants. Great fun!


Merrily, merrily, merrily, merrily ... 

19 minutes ago, rachMiel said:

The mistakes are obvious in certain fields, math for example. Subtler mistakes need further research to uncover.

I engage despite the mistakes because, at their best, the LLMs are like brilliant, quirky idiot-savants, great fun!  

Right on! Love your playfulness towards it.


 

 

54 minutes ago, rachMiel said:

The mistakes are obvious in certain fields, math for example. Subtler mistakes need further research to uncover.

I engage despite the mistakes because, at their best, the LLMs are like brilliant, quirky idiot-savants, great fun!  

@Princess Arabia It's like teaching a child: how do we know whether we are even right or wrong?

And that's exactly how we deal with it, trying to figure out methods and teach it that very process too! 😁




@Princess Arabia Yes, technology has not really helped our society and the people in it; it has disabled them tremendously. Rather than being an aid to people, as it should be, it has taken over their lives; it's everything to them. First it affected their physical abilities. Even though many jobs still involve hard labour (I know mine does; 50% of it is hard at my age, and my legs are dead tired at the end of most days), people are not working as hard as before. With tech on the rise through the years, people are for sure lazier and more prone to sitting or lying down all day than getting up and doing things, like it was when I was young...

Mentally it has numbed people beyond measure; it has made people brain-dead and too reliant on outside help to communicate or figure things out. For sure there is someone on this forum who posts a lot (won't mention his name) who uses AI to create his posts/threads; I can tell by the writing that it is not him personally. That is a hindrance and will be the downfall of humanity. We are for sure the weakest generation ever, mentally and physically, due to technology. Overall, this is a sign of low consciousness/awareness of reality and of what it means to be human!


Karma means "Life is my making"; I am 100% responsible for my inner experience. -Sadhguru ... "I don't want your dreams to come true, I want something to come true for you beyond anything you could dream of!!" -Sadhguru

 

 

