Carl-Richard

Moderator
  • Content count

    13,373
Everything posted by Carl-Richard

  1. Nope. Again: correlates on the screen of perception. The brain does not cause the screen to arise, but brain activity correlates with certain perceptions, i.e. emotions and thoughts.
  2. No. I'm saying that neural correlates can be used to predict the experience of thoughts and emotions specifically. I'm not saying that neural correlates produce the basic experience of qualities. Like zurew is saying, it's more of a scientific statement than an ontological one. I'm talking about correlations on the screen of perception, not the screen itself. It's like "here is an observation: birds flying correlate with bird shit falling on people" and you answer "I think you're mixing up birds with qualia".
  3. Depends on when I wake up
  4. It's perfectly fine to think that the most basic types of phenomenological experience (like the experience of red and blue) simply exist "out there" in the aether, so to speak, independent of any structural-functional configuration of stuff. Panpsychism (which is most likely what the paper refers to when it says "ontologically pansentient universe") and idealism are both compatible with that position. However, again, the question about AI sentience is not really about that. It's about very complex experiences like emotions and thoughts. When people say that the AI writes like a human and is therefore sentient, they're claiming that it also feels or thinks at least somewhat like a human. That claim goes way beyond any discussion about the most basic levels of phenomenal consciousness, to the point that such a discussion is frankly irrelevant here, unless you claim that emotions and thoughts generally arise independently of any structural-functional configuration of stuff (which is patently absurd). According to our best current knowledge, emotions and thoughts are somehow tied to a certain structural-functional configuration of stuff known as biology. Therefore, to even start to question whether AI is sentient, you have to talk about the plausibility that these complex inner experiences can arise in a medium that is not biological. Again, bringing up basic phenomenological experiences is simply a red herring.
  5. I had some lower back sensitivity, particularly last year, and the times I would go into a spontaneous no-mind state (which scares me to death), I would first sense it through my spine. The only thing that helped was to stop all meditation for a couple of years and to eat regularly and a lot
  6. Haha no. "Western values" is just as much an Orange slogan as a Blue one. It just means modern techno-democratic values instead of traditional Christian values. The war on terror was and is a modernist intervention in the "regressive" Middle East. The Ukraine war is, in a broad sense, similarly a clash between Orange and Red/Blue. Hello? Ever heard of "punch Nazis"? You'll rarely find a Green who doesn't let standing up for the oppressed derail their perspectival nuance.
  7. That the things that were previously thought of as universals and absolutes had in fact a level of relativity to them, e.g. the idea that language, cultural values and even scientific knowledge are more nuanced and complex than to give such clear-cut answers (you mentioned positivism and the contracted perspective of the traditional dogmatist). It opened up to a more pluralistic and self-aware worldview, which is a necessary step toward creating a greater unity (socially, spiritually and epistemically), but at the extremes it manifested as naive skepticism, nihilism and crippling self-criticism. That is up to the metamodernists to figure out, i.e. "granted, or even despite, a level of relativism, how do we proceed?"
  8. Let's turn it down a couple of notches, shall we, guys? ☺
  9. The lesson there is that absolute skepticism or epistemic neutrality is untenable: in the act of opposing yourself to some framework, you've created just another framework. That is also why religion became irrelevant in the modern world. It became an ossified, literalist and moralist doctrine, rather than a source of meaning and liturgy that could evolve with the times. Individualistic spirituality arose because the collective spirituality (religion) got severely co-opted by survival constraints. This is what the modern meaning crisis is about: restoring a relevant collective spirituality. Postmodernism nevertheless identified the problems with the cemented and corrupted aspects of the spiritual traditions (as well as the intellectual traditions) and thereby defined the metamodern project.
  10. I think the most problematic dogma or mode of thinking is naive skepticism, i.e. the unnuanced dismissal or shutdown of intellectual thought and collective wisdom (rationality & science, tradition & myth). It's related to the narcissistic tendencies of (post)modern spirituality and self-help in that it's fundamentally separated from any underlying tradition or principles, and that everybody has to reconstruct the basics from scratch ("the religion of the self"). In this setting, where nobody knows what the fuck is going on, skepticism is the safest bet.
  11. I don't want to know how many comments got deleted here in the rollback
  12. I think some defining characters of Tier 2 cognition are context awareness (e.g. "people are shaped by their environment in complex ways") and construct awareness (e.g. "symbols, language, thoughts, beliefs, frameworks, worldviews etc. shape perception and behavior"). Green has decent context awareness, but struggles a bit with construct awareness: Green mainly likes to deconstruct, while Tier 2 is more constructively oriented (e.g. being critical of systems vs. working with systems).
  13. I'm missing a few posts.
  14. I've been to the gym on a microdose of LSD, and I felt weaker and more easily fatigued. Might be related to 5-HT2A activity elevating cortisol and lowering testosterone.
  15. How does a systems understanding of psychology differ from a non-systems understanding?
  16. Which "you" are you talking to? There is no you or me, only consciousness. Do you see the problem? You're not being consistent in your use of language (and you're also not at all being courteous to what I'm trying to communicate), and that is because you're playing the Advaita guru game: you're not talking about AI sentience — you're trying to teach me about non-duality. Do you acknowledge that this is happening or will you continue to not address the frame?
  17. ...wait, did you just invoke statistics, i.e. a spectrum from 0-100%? Where are the misogynist types?
  18. Don't just assert that. Enlighten me. I spoiled your Advaitan magic trick. Were you doing something else? Let's do this then (even though you were alluding to another Advaitan escape route, i.e. "logic is futile"): Do you and I experience thoughts? Do we experience each other's thoughts? Can AI experience thoughts? That is the question of AI sentience. Consciousness as a transpersonal field of qualities is irrelevant to this discussion. Do you agree?
  19. There are no thoughts, just awareness. There is no awareness, just is-ness. Is-ness is only a construct and a way of speaking. Absolute Truth cannot be spoken, etc. You've fallen into the Advaita trap, my friend. I will now ask you to use concepts in a consistent fashion and communicate like a normal person, which is especially important when talking about AI sentience.
  20. How would you explain how different people report different thoughts popping into awareness (singular): person A reports x thoughts, and person B reports y thoughts?
  21. Ok, so now you're retreating back into transpersonal consciousness again. It's not just a semantic rabbit hole: it's Neo-Advaitan whack-a-mole. How would you explain how different people report different thoughts popping into awareness (Person A reports x thoughts, and person B reports y thoughts)?
  22. Do you want to continue, or do you want to escape into a semantic rabbit hole? We agree that consciousness is not a thought and that consciousness is not bound to anything. You are aware of thoughts, and I am aware of thoughts, and you are not aware of my thoughts. Is AI aware of thoughts? That is the question of AI sentience. Do you agree?