Some of the leading minds of our time have recently started to call for limits on AI.

This is a great pair of posts detailing the whole situation: Part One, Part Two.

My argument against it is simple: if we blow this chance to create ASI and take life to the next level, there may never be another chance in the life of this universe. I don’t think we should let our fear stop life from evolving.

We do not know if there is other intelligent life out there; we can’t detect it, and we don’t know why. This is a great and scary post on the Fermi paradox. It is possible that intelligent life at our level is incredibly rare, or that we are unique. So there may be very few chances for life to make the leap from biotic, mortal life to abiotic, essentially immortal life.

We also may not retain the capability to create ASI for long, if indeed we can ever achieve it. Our civilisation at its current level of technology will not last long: we are burning through the planet’s resources fast. Even if humans survive the ecological collapse, we will not easily reach our current level of technology again, because we have already picked the low-hanging fruit in minerals and energy.

We also know that life on this planet will eventually expire: in billions of years Earth will be swallowed by the sun. Any life that has not found a way out of the solar system by then is done for. Of course, a nearby supernova could do the job much sooner.

So essentially we (all life on Earth) are doomed without ASI; with it, we may not be.

Bootnote

I prefer the term Abiotic Intelligence to Artificial Intelligence. The word ‘artificial’ can imply fake or inferior, and can seem prejudicial; ‘abiotic’ is simply descriptive.