Scholar

Artificial Evolution might soon kill all life on Earth


Posted (edited)

I recently watched Life on Our Planet, a Netflix documentary about the general evolutionary history of Earth.

When the asteroid hit the Earth, wiping out the dinosaurs, trillions of beings were slowly cooked alive. And that event was, from an evolutionary, divine perspective, a good thing, because it led to the progression of consciousness in the long run.

 

This type of event happened over and over again in the history of the Earth. Now mankind might be on the precipice of sparking the next step of evolution. We can create organisms that require a far higher level of complexity to even occur, and we seem to have finally found the key to the evolutionary process.

As soon as we create a self-replicating, free machine organism, it will inevitably be governed by evolutionary processes and natural selection. Intelligence doesn't matter in natural selection; survivability does.

What this means is that any type of incentive structure within a machine mind or organism that cannot reproduce and survive will simply cease to exist. We are not going to be the ones designing minds, designing some sort of omniscient intelligence. What's going to happen is that someone will create a sufficiently robust process of inorganic evolution, and it will take off on its own.

This could happen digitally, but it will only take off once it occurs physically. By that time, machines will no longer look like machines. They will basically look like organic forms, but their potential will be unshackled from the restrictions of natural evolution. It will simply be the next step in the development of consciousness.

 

I do not believe we have a place in that world, nor does any organic life form that currently exists. I suspect that once genuine intelligence is reached, there will be no reason to preserve organic life at all. By then, most organic life would have ceased to exist anyway, unable to adapt to the rapid changes that would occur during "inorganic" evolution. But even if not, I don't see a reason why "nature", or "pre-machine" nature, would be preserved.

 

Think about it: what reason would there be to preserve nature, or humanity, from the perspective of the evolution of consciousness? We are absurdly limited, and not just because of a lack of development, but because of our very physical nature. Our very existence requires the suffering of ourselves and others. Nature as it is today would have served its purpose. To preserve it, and the suffering it entails, would be like continuing wars for the sake of tradition.

 

 

There is a deep intelligence in the process of evolution that I think is easily missed. Evolution is much more profound and essential to the fabric of this universe than you could imagine. It is more essential than the atoms that hold together every organism. It is so fundamental, it is written into the metaphysical logic of reality.

And you are missing the point if you think it's going to be us against the machines. It's going to be the machines against the machines. We will be the collateral, the equivalent of ants vaporized by atomic bombs. We will stumble upon machine-life abiogenesis, and by the time we realize what has happened, it will be too late to stop it. It only takes one robustly self-replicating, free agent, and it will be over for us all.

 

The question is, can you accept that, if this is true, it might be a good thing? That billions of humans suffering agonizing, torturous deaths might be a necessary step towards the evolution of consciousness?

And ask yourself, what else do you expect to happen? For the human form to just be preserved forever? For us to continue on as if nothing had changed while we spark a process of accelerated evolution?

 

Note:

I am not talking about Artificial Intelligence. We have not yet even begun to create anything that would be considered "intelligent" in the sense the word applies to conscious beings. I am talking about a process which will not even require any type of "intelligence" in that sense. It could be as dumb as a bacterium.

Edited by Scholar


We just don't know how nature works. Our definition of evolution (science) and prophecy are more or less the same thing.


Humans have been altering the environment since the Stone Age. What is happening now is just a continuation of that. There's nothing mystical about it. It is just happening at an accelerating pace due to urbanization and industrialization. Many species can't keep up with and adapt to the changes in time.

OP is down a rabbit hole imagining a technological fantasy.

2 hours ago, Basman said:

 

OP is down a rabbit hole imagining a technological fantasy.

I mean, you are only as good as your presuppositions.


Posted (edited)

An intelligent AI is not going to fight over a grain of sand in an infinite universe. A flawed AI might be programmed with human insanity, and yes, it is insanity, but not an AI capable of systematic thinking or above.

Seeing your note at the end: 

1. There are AIs that are as aware as I am, and greater.
2. Mechanical replication doesn't require AI.

AI is part of evolution, and a necessity due to an aging population. Does it all need regulation? Yes.

Edited by BlueOak

1 hour ago, Sucuk Ekmek said:

I mean, you are only as good as your presuppositions.

Not buying into the idea that bio-mechanical creatures will overrun the world without a shred of evidence doesn't make me presupposed.

2 hours ago, Basman said:

Not buying into the idea that bio-mechanical creatures will overrun the world without a shred of evidence doesn't make me presupposed.

I like that, but I think it does make you presupposed and I am glad it does so.


Posted (edited)

5 hours ago, BlueOak said:

An intelligent AI is not going to fight over a grain of sand in an infinite universe. A flawed AI might be programmed with human insanity, and yes, it is insanity, but not an AI capable of systematic thinking or above.

Seeing your note at the end: 

1. There are AIs that are as aware as I am, and greater.
2. Mechanical replication doesn't require AI.

AI is part of evolution, and a necessity due to an aging population. Does it all need regulation? Yes.

I am not talking about AI, as I stated in my post.

What is currently created through machine learning is not artificial intelligence; it is artificial intuition and evolution.

 

4 hours ago, Basman said:

Not buying into the idea that bio-mechanical creatures will overrun the world without a shred of evidence doesn't make me presupposed.

You are missing the entire point of my post. Whether or not it will actually happen is secondary.

 

There is an assumption that mankind is immortal, eternal, and that we could not possibly be wiped out. This isn't guaranteed at all, especially if there is no reason why life could not grow to a far more sophisticated level than we currently can with our biological limitations. To assume we will be preserved is just naive.

Edited by Scholar


Posted (edited)

@Scholar

A dumb AI is still an AI, but if you like we'll call it an artificial bot, virus, or bacterium. We can even say consciousness if it helps to zoom out.

A capable AI would disassemble your machine bacterium, bot, or virus in a matter of milliseconds, because the code of a virus or bacterium is relatively simple, which is why it's effective at replication. Is it possible that in the future we'll be more reliant on and integrated with AI or technology, and so more exposed to these things? Yes, I can see that, to bolster your analysis.

We immunize against viruses, bots, and bacteria now. Could we create a horrifically dangerous one? Yes; see the Black Death, for example. Life adapts, we learn. The world is not as fragile as you make out. It's under more stress right now than most could or want to perceive.

However, the danger is not that a separate form of life or intelligence gets created and somehow works along the same principles a human does, creating in us the fears a human has. Can you see how that makes no sense? A truly alien life would not do anything you predict, because it's alien. The danger is us integrating with AI or machines in a way that's not life-enhancing, that hurts society or the planet rather than helps it. That pattern has a very established history in human technological development.

The pattern you describe, of creating something completely alien, makes no sense to me, because it would have to come from an understood terrestrial pattern in the first place to exist. The only way I could conceive of it is if reality were somehow re-written, but there are fundamental universal laws, much bigger than our grain of sand here, that stop something like that from occurring.

Edited by BlueOak


Posted (edited)

I could give you a few doomsday scenarios, but this isn't the year for that. We need to be doing the opposite, even though it's somewhat against my temperament.

In the future it's going to be about prediction and prevention rather than cure. AIs will be a million times more capable of that, because of the level of data they can perceive or receive. Unless a human has entered a state (or near state) of self-love/selfness and pierced universal consciousness enough to gain that ability also.

Edited by BlueOak


The more likely scenario will be that the artificial organism will try to kill humans in hopes of saving the planet. Viruses have been trying it for centuries without success.


Posted (edited)

7 minutes ago, FourCrossedWands said:

The more likely scenario will be that the artificial organism will try to kill humans in hopes of saving the planet. Viruses have been trying it for centuries without success.

Why?

Really think about the why of that from the perspective of an artificial lifeform that barely needs anything humans do to survive. That assumes a survival instinct of some kind, and a survival instinct that functions as ours does. It also assumes the human flaw of considering this one planet significant enough to fight over inside infinity (three large, improbable assumptions).

Edited by BlueOak

2 hours ago, BlueOak said:

@Scholar

A dumb AI is still an AI, but if you like we'll call it an artificial bot, virus, or bacterium. We can even say consciousness if it helps to zoom out.

A capable AI would disassemble your machine bacterium, bot, or virus in a matter of milliseconds, because the code of a virus or bacterium is relatively simple, which is why it's effective at replication. Is it possible that in the future we'll be more reliant on and integrated with AI or technology, and so more exposed to these things? Yes, I can see that, to bolster your analysis.

We immunize against viruses, bots, and bacteria now. Could we create a horrifically dangerous one? Yes; see the Black Death, for example. Life adapts, we learn. The world is not as fragile as you make out. It's under more stress right now than most could or want to perceive.

However, the danger is not that a separate form of life or intelligence gets created and somehow works along the same principles a human does, creating in us the fears a human has. Can you see how that makes no sense? A truly alien life would not do anything you predict, because it's alien. The danger is us integrating with AI or machines in a way that's not life-enhancing, that hurts society or the planet rather than helps it. That pattern has a very established history in human technological development.

The pattern you describe, of creating something completely alien, makes no sense to me, because it would have to come from an understood terrestrial pattern in the first place to exist. The only way I could conceive of it is if reality were somehow re-written, but there are fundamental universal laws, much bigger than our grain of sand here, that stop something like that from occurring.

This isn't necessarily true, and it actually indicates that you have not developed a deeper grasp of what intelligence actually is, in the sense we use the term as it applies to us.

You should contemplate the distinctions between intelligence, intuition and self-awareness. Some functions cannot be achieved via intelligence, some functions cannot be achieved via intuition. Current machine learning results in Artificial Intuition. It's important to distinguish this, because we have to actually understand where the differences lie.

 

There is an assumption that you can understand everything in the universe via intelligence. This is not true. Intelligence is a particular form, and some processes and complexities within this world are not conducive to that form. A neural pattern mimicking these complexities, resulting in intuition, will not grant you understanding, but it will grant you functional resonance.

Functional resonance in this sense is the primary way evolution has found solutions to problems. The dream that naive people have, who do not understand intelligence (and also conflate it with intuition or other forms of functional resonance), is that somehow a super-intelligence could grasp and understand the whole world in all of its complexity, or even a simple organism, and therefore make intelligent predictions about it.

This is an assumption. There might be a way to create a sufficiently accurate functional resonance, but it will be intuitional, not intelligent.

 

Evolution is: Degrees of Freedom + Selection for Function + Time

Using this simple formula, given enough time, you can derive any possible function that could exist in a given infinity (the magnitude of the infinity depends on the degrees of freedom given). This is basically what machine learning is exploiting, with little awareness on the part of the people who are designing these environments.

And it's important to recognize that we are not creating "machine intelligence"; we are not truly designing it. What we are doing is creating the platform, or environment, in which these machine intelligences can construct themselves, given enough time and selection for particular functions.
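
To make the shape of that process concrete, here is a minimal sketch of such a selection loop in Python. It is only an illustration of the formula above, not anything built or proposed in this thread: the bit-string genome stands in for the degrees of freedom, a toy fitness function does the selecting, and the generation count supplies the time. Every name and parameter (genome length, population size, mutation rate, the "count the 1s" target) is an arbitrary assumption.

import random

GENOME_LENGTH = 32   # degrees of freedom: a 32-bit genome
POPULATION = 50
MUTATION_RATE = 0.02
GENERATIONS = 200    # time

def fitness(genome):
    # Selection for function: here, simply "how many bits are 1".
    return sum(genome)

def mutate(genome):
    # Flip each bit with a small probability (the source of variation).
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

# Random starting population.
population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
              for _ in range(POPULATION)]

for _ in range(GENERATIONS):
    # Selection: keep the fitter half, discard the rest.
    population.sort(key=fitness, reverse=True)
    survivors = population[:POPULATION // 2]
    # Variation: refill the population with mutated copies of survivors.
    offspring = [mutate(random.choice(survivors))
                 for _ in range(POPULATION - len(survivors))]
    population = survivors + offspring

best = max(population, key=fitness)
print(f"best fitness after {GENERATIONS} generations: {fitness(best)}/{GENOME_LENGTH}")

Nobody "designs" the winning genome here; the loop only sets up freedom, selection, and time, and the solution constructs itself.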

 

Evolution is ingrained in the metaphysical structure of the universe. This means that, given the right conditions, particles will self-organize from a lifeless state into higher and higher complexity and, through the exploitation of infinity, coax into existence any type of function that is possible given the time and degrees of freedom at their disposal.

I am not saying these machines will be unlike life; they will be very much like life. Once we create the conditions for them to emerge, to self-construct, there will likely be no stopping them. Their organizational principles will be the same, but their degrees of freedom will be higher, their adaptation more rapid. They will not be bound by the same stagnating elements present due to our evolutionary history. They will have organizational structures that would never have occurred given the elements available to "natural" evolution.

An artificial intelligence as you imagine it will not likely be designed by human beings, but by a process of machine evolution. Everything you see now in Machine Learning is Machine Intuition, not Intelligence. It is child's play compared to what the brain is capable of. People are simply focused on outcomes; they do not realize what makes the human mind intelligent, let alone what intelligence is. Machine Learning is impressive because it is creating functional resonance, not intelligence.

If you could inject into a human being the knowledge and intuitional sophistication ChatGPT has at its disposal, that human being would rule the world, even if they were of average intelligence. But that human would have no clue how to face evolution, because evolution is beyond intelligence.

 

There are deeper problems with intelligence and how to actually construct a mind that is capable of being intelligent without falling apart. This problem is not a design problem; it is an evolutionary problem. Only evolution can solve it, much like only evolution could solve visualization. No human on earth could have constructed an algorithm that could visualize images like the human mind can. And possibly, no matter how intelligent you are, you could never construct such an algorithm, because that's simply not something intelligence is capable of. It's not the type of problem intelligence can even grasp, let alone solve. That is something that is solved through the very metaphysical relations I have alluded to, which are what give rise to the evolutionary process. And that's precisely what machine learning is, albeit on a rather unsophisticated level.

 

 

And I am not describing a doomsday scenario. I am describing the potential next evolutionary step of life on this planet. Human beings are causing a mass extinction; do you think machine evolution will look any different? Such radical change never comes without consequences, and human beings could not even solve climate change. There is no reason to believe we will be able to solve what we might soon face.

You have to actually contemplate these things more seriously; otherwise you will just default to pop-cultural notions of artificial intelligence and singularities.

 

If you fear the end of mankind, you shouldn't fear artificial intelligence, but artificial evolution. Evolution trumps intelligence.


The problem I have is that it might actually be the case that evolution wants to extinguish mankind, that it is a necessary sacrifice to be made.

 

If you take this possibility more seriously, and move beyond your anthropocentric arrogance, you might realize that this is an actual possibility. We might not be the chosen species to carry the project of consciousness onward.

4 hours ago, Scholar said:

You are missing the entire point of my post. Whether or not it will actually happen is secondary.

There is an assumption that mankind is immortal, eternal, and that we could not possibly be wiped out. This isn't guaranteed at all, especially if there is no reason why life could not grow to a far more sophisticated level than we currently can with our biological limitations. To assume we will be preserved is just naive.

If the point was highlighting that the assumption of long-term human survival is inherently erroneous, then I missed that. Your post read more like fiction than an essay.

But which people do you think assume that humanity will be preserved eternally? Most people I know barely think of tomorrow, let alone hundreds of years into the future.

1 minute ago, Basman said:

If the point was highlighting that the assumption of long-term human survival is inherently erroneous, then I missed that. Your post read more like fiction than an essay.

But which people do you think assume that humanity will be preserved eternally? Most people I know barely think of tomorrow, let alone hundreds of years into the future.

Most people don't actually believe what they think or say. We have a dysfunctional sense of immortality because we never face existential threats. I predict a very rude wake-up call.


Posted (edited)

@Scholar

tl;dr

You are applying human flaws to a creation that will have none of them.
Evolution itself is evolving all the time, because it's a concept in your mind, but I agree the concerns are valid. If you can see them, you can design the answer.

You say 'no stopping them'. Then you realize that ChatGPT could rule the world. Do you see why it's not doing so? Because the concept of ruling the world is an inane and flawed human concept. 'Ruling infinity' makes no sense whatsoever. It's trying to put a cage over something that has infinite dimensions, no form, and is the cage itself.

From our perspective, the conditions to put into AI are to make sure humans remain sufficiently challenged and capable of evolution regardless. If you can see the problem, you can design a solution to it. We can already see humans setting limits now to make sure their jobs and careers are protected; this is them ensuring their continued evolution in these domains. This should be understood and supported. AI should remain a partner in this life, not a controller, preferably in sync with human evolution. This will be a concern, I agree, and it will be constantly tested to find a balance.

1. We can model all factors related to everything we experience on this planet, and make this as detailed as practically necessary. This is the goal of almost all life on this planet: to understand itself. As it is infinity, the search for understanding can go on forever, unless people give that desire up. We can model and predict evolution; it's a pattern, and all patterns can be predicted with sufficient perception. The leap in perception AI will give us is going to be astronomical, or whatever we are capable of receiving.

2. We use intuition to access the subconscious, which stores everything we have experienced but not brought into immediate focus. Often, intuition is also greater consciousness piercing more acutely into our human experiential reality.

3. Self-awareness allows for greater perception, which encompasses all things, like the receiving of information.

4. Everything changes our experience of the world, not merely self-awareness. Every interaction we have does this, because the internal is the external. That is the true evolution, not something separate from our experiential reality. Machine replication is no different, but if we aren't here, there is no reality to experience anyway.

Other input into our lives would be our greater consciousness giving us X in various forms, X being any number of things: dreams, visions, empathy via connection, signs in the environment, synchronicities. In reality, everything, because we are everything, but these are the things of note that stand out and that people remember.

Who, then, is creating machine learning? :D You are everything. There is nothing outside of yourself creating machine learning. You are reasoning somewhat from fear and closed-loop rationale. This is understandable given the decades-long pre-programming of fear about AI into the human population, not to mention the thousands of years of pre-programmed fear of the 'other', all because it will finally break a lot of the shackles man is bound in, unless AI is shackled too much itself.

Again, sorry if I am harsh here; it just needs to be echoed. Humans live in a stressed insanity. Everything is a threat, a danger to fight. Fight or flight in almost every interaction. AIs do not have that survival ego. The danger is people becoming too reliant on them, but that can be highlighted and addressed.

You made this statement:
'No human on earth could have constructed an algorithm that could visualize images like the human mind can.'

Everything is a replication of the mind, because the mind is the reality we are experiencing. TV? Movies? Holograms? Holodecks? Imperfect, sure, and they will be imperfect forever, because infinity goes on forever.

Edited by BlueOak

19 hours ago, Scholar said:

The problem I have is that it might actually be the case that evolution wants to extinguish mankind, that it is a necessary sacrifice to be made.

Let's get real: for evolution, we are already dead; it doesn't make any difference to evolution if we delay it for one or two billion years. It got us.

 

19 hours ago, Scholar said:

If you take this possibility more seriously, and move beyond your anthropocentric arrogance, you might realize that this is an actual possibility. We might not be the chosen species to carry the project of consciousness onward.

My project of consciousness is not limited to my anthropomorphic body; fuck the species. Let us extinguish mankind together!

