Hello folks.
I was wondering what you think happens when we die.
When I say "die", I mean it in the sense most people think of it (some believe we go to heaven, some think it's simply nothing, and so on).
As I understand it, what many spiritual teachers teach is that you're not limited to what dies, such as the physical body.
So when I "die", wouldn't it make sense that I realize it was all an illusion, and in turn see the truth as it is (enlightenment)?
(And no, I won't kill myself.)
I could probably phrase this better, but I think you can see my point, despite how limited language is for this kind of thing.
Thanks.
BR, Simon Christiansen.