aurum

Member · Content count: 5,923

Everything posted by aurum

  1. But how are you thinking about love in this case? If you mean "strong feelings", then that's a false binary. Love can be relatively stable over the years. And yes, some entropy may happen over time. That does not necessarily mean your marriage is unhappy or unsuccessful.
  2. Just frame control your toilet
  3. Looking for love in all the wrong places, as they say. I think the biggest thing is having proper expectations. Most of our frustrations come from expecting things that can't be fulfilled. Your toilet has never let you down, because you never expect it to be anything but a toilet.
  4. That feels like an impossible expectation at a certain point.
  5. The total scale of human civilization is somewhat new, yes. But scale increase itself is not new. The scale is always bigger than ever. 200 years ago, the scale was bigger than ever.
  6. Natural resources have always been scarce and a source of worry. Again, nothing new. People just lack perspective.
  7. There's nothing uniquely chaotic about our current times. If anything, our times are pretty stable.
  8. Yes it's not a game technique, but it's more than mere self-esteem. It's autonomy of mind.
  9. You don't need to "frame control" her, but you should be able to maintain your own frame. If you can't maintain your frame when it matters, you are genuinely cooked. That would be a steelman of the frame control argument.
  10. @thierry My guess is she just knows she’s going back to real life. And she’s not wanting to get more invested now that vacation is over. It doesn’t sound like anything specifically wrong you did.
  11. Start with understanding feminism.
  12. Don't overcomplicate it. If she matched with you, assume she's already interested in a date. You don't have to "land" anything. A full conversation is often not necessary on a dating app. Just a few messages can be enough. As soon as things seem like they're heading in the right direction, get to setting up the logistics of the date.
  13. Of course. But the entire purpose of genuine truth-seeking is to overcome such self-deception. If the concern is self-deception, then what people need is more truth-seeking, not less.
  14. Conspiracy theorists are atrocious with their epistemology. They lack construct-awareness and introspection. They speculate beyond what they can verify. They focus all their skepticism outward rather than on self. They rationalize from their survival. Their perspectives are unholistic. They are usually completely unaware of their own bias. Which is not to say all conspiracy theories are automatically wrong. It just means the sense-making process is atrocious.
  15. @theleelajoker That could be the case for some people. But it would be a mistake to reduce truth-seeking to just that. Truth-seeking by definition involves confrontation with reality and facing imaginary fears. And yes, that is possible. Also, consider that everyone needs truth to be able to navigate reality. To frame this as an inherently pathological need for control is a mistake. It would set a bar for being non-controlling that no one will be able to reach.
  16. The argument has not been that it "can't" eventually evolve to AGI. The argument has been that tech CEOs are selling a fantasy about how soon it's going to happen and what the ramifications are for society. They are selling this fantasy because of some combination of genuinely not understanding how intelligence works + financial incentive + Silicon Valley groupthink. I personally think we will eventually get AGI.
  17. DeepMind is somewhat different from strict LLMs, but it is essentially still just a prediction engine. Reinforcement learning is based on reward functions that allow it to predict the probability of the best move.
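     The reward-function idea above can be sketched as a toy loop: sample rewards for each candidate move, average them into value estimates, and pick the move with the highest estimate. The moves and reward numbers here are invented for illustration; real systems like DeepMind's learn over vastly larger state spaces.

     ```python
     import random

     def estimate_move_values(rewards_by_move, trials=1000, seed=0):
         """Average sampled rewards per move to estimate each move's value.

         rewards_by_move maps a move name to (mean reward, noise spread);
         both are hypothetical inputs for this sketch.
         """
         rng = random.Random(seed)
         values = {}
         for move, (mean, spread) in rewards_by_move.items():
             total = sum(rng.uniform(mean - spread, mean + spread)
                         for _ in range(trials))
             values[move] = total / trials
         return values

     # Three made-up moves with (mean reward, noise spread).
     moves = {"a": (0.2, 0.1), "b": (0.8, 0.1), "c": (0.5, 0.1)}
     values = estimate_move_values(moves)
     best = max(values, key=values.get)
     print(best)  # "b" has the highest expected reward
     ```

     The point of the sketch: the system never "understands" the game; it just converges on whichever move the reward signal favors.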
  18. Yes that would be a good start. Although labeling intelligence as a binary, i.e. true intelligence or not true intelligence, can be a bit misleading. Really what you have is a spectrum of intelligence. Saying “true intelligence” is mostly short-hand for a relatively high intelligence that is more conscious of its unity. Lower levels of intelligence still exist, but they are less conscious of their unity.
  19. No, not at all. The LLM is at its core just a prediction algorithm based on its training data. It cannot perceive, self-reflect, reason, generalize principles, generate deeply novel insight, update itself or act informally. It does not even know it exists. This is why the jagged intelligence phenomenon happens. LLMs are just brute force compute. They are not a true intelligence.
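     The "prediction algorithm based on its training data" claim can be illustrated with a bigram model: count which token follows which in a corpus, then emit the most frequent continuation. Real LLMs use learned neural weights rather than raw counts, and the corpus below is an invented toy example.

     ```python
     from collections import Counter, defaultdict

     def train_bigram_model(corpus):
         """Count which token follows which in the training text."""
         counts = defaultdict(Counter)
         tokens = corpus.split()
         for current, nxt in zip(tokens, tokens[1:]):
             counts[current][nxt] += 1
         return counts

     def predict_next(model, token):
         """Return the most frequent continuation seen in training."""
         followers = model.get(token)
         return followers.most_common(1)[0][0] if followers else None

     corpus = "the cat sat on the mat the cat ran"
     model = train_bigram_model(corpus)
     print(predict_next(model, "the"))  # "cat": seen twice vs "mat" once
     ```

     Note the model has no idea what a cat is; it only mirrors frequencies in its training data, which is the core of the argument being made.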
  20. But what would it even mean to sell "pure truth" anyway? Truth must include relative survival advice.
  21. To perceive and act in alignment with truth.
  22. Maybe. But then I'd say it shouldn't be considered intelligence.
  23. Intelligence should not be reduced to problem-solving ability. If we create a reductionistic definition for intelligence, of course our conclusions about a super-intelligent AI will be wrong.
  24. The brain paradigm only gets you so far. Consciousness cannot be understood at deep levels through the paradigm of having a brain.
  25. I have frameworks and overarching principles which I use during sense-making. I would not say I use any formal process. My sense-making tends to be highly informal.