Existential risks

Avoiding extinction

Ensuring the future certainly requires avoiding extinction. An act of God, or of simulation overlords, might end or change all life as we know it at any time; there would be nothing we could do about that. These efforts will focus on what might be within our control.

External threats

It makes sense to focus on the important things. Extinction from natural causes is very unlikely within the next millennia. The dinosaurs found out that extinction by asteroid, perhaps aided by volcanism, is certainly possible, though only eventually: they ruled the earth for nearly 200 million years. NASA has surveyed the sky: truly dangerous impacts are extremely unlikely, and we might even soon be able to reduce this remote threat further. Nature could have cards up its sleeve, but few consider it malicious. Colonizing the Moon or Mars would create a backup against some of these remote natural threats, but it won't protect against our own folly, a far bigger danger.

Surprisingly, despite serious searching since the 1960s, we see no examples of life elsewhere in the universe. We might of course be the first to try to sustain progress over long periods. Or surviving progress might be very difficult, limiting advanced societies to brief periods with a terrible end.

Conflicts

Mutually assured destruction on hair-trigger alert hardly sounds like a safe bet for long-term survival. We have had several lucky escapes from nuclear Armageddon (I can only write this because of, for example, Vasily Arkhipov's cool-headedness in 1962 and Stanislav Petrov's disobedience in 1983). A nuclear winter might not kill all of humanity, but very few should expect to survive, certainly in the Northern hemisphere. Yet the decades-old prospect of burning most of us and then starving the rest does not seem terrible enough for us to find ways out of our conflicts. Organisations committed to reducing the risk of nuclear war have struggled to maintain funding and influence in recent years. Since humans clearly have the potential for violence, and since the path to leadership is a struggle for one and all, we can't expect the personalities of our leaders to be naturally peaceful. We need to empower organisations that reduce conflicts and implement structures that make it impossible for anyone to gain power by playing to some groups at the expense of others. True, I don't know how to abolish nuclear weapons, nor conflicts. But the long-term picture can help us put conflicts in perspective: in a few generations any descendants will be common descendants. There is only one kind of human on this planet!

Climate, environment, biodiversity loss, resource depletion, thresholds, tipping points, limits

Human activity can have significant consequences, changing planetary systems and conditions nonlinearly, which could lead to conflicts and make large parts of the planet uninhabitable.

Biotechnology

There might be even worse threats than our conflicts and environmental impact: risks that could plausibly kill not just most of us, but everyone. People have in the past tried to use bombs as well as chemical and biological weapons to bring down societies. Every year, progress reduces the IQ required to kill everyone; it makes it ever easier to be a terrorist. Natural pandemics can be brutal, but they haven't wiped us all out in the long past, making it unlikely that they will now. Biotechnology, however, might very well come up with pathogens that could. Suicidal misanthropes would become a very real threat that 007 or other heroes are unlikely to save us from for long. Sensible proposals to counter that growing risk exist and should be adopted, such as monitoring airplane sewage for exponentially proliferating gene sequences, pandemic preparedness, and not researching candidate natural pathogens.

AI

Artificial general intelligence, no longer far-off science fiction, might well be an even more urgent threat. The arrival of a superior intelligence on this planet might solve many problems, but it might also endanger far more than jobs and could well lead to human extinction. Some regard it as a welcome and inevitable step in evolution, a new species to supersede humans; or as a path to a long future in which a benevolent superintelligence wrenches control from us to enforce an equitable and lasting society. The absence of cosmic examples, however, should again encourage us to be cautious before signing away our legacy to the greed of investors. Being sentient might be possible for any sufficiently complicated information-processing entity, but we know very little about that. We might also be in possession of a gift that doesn't transfer easily. I regard sentience as a necessary part of any goal I want to strive for: for me, there would be no point in a universe full of technological marvels if there were no one to witness it. Fortunately, regulation for safe artificial intelligence has received much attention recently, as at last month's summit at Bletchley Park. Yet a lot remains to be done, and it is not at all clear what effective measures could look like. We only have one chance to get this right. And we certainly can't expect to remain in control once a more capable entity arrives.

Nanotechnology

Grey goo

Radioactive waste

Radioactive waste doesn't seem more than a nuisance as long as progress continues and conditions remain peaceful, except for extremely rare natural events like asteroid strikes. Even modest costs of maintenance over long periods might well eventually make it seem an irresponsible legacy of a short-term-minded time, but it probably wouldn't kill many. Should there be conflicts, however, or should progress discontinue and society slide backwards, it might well become an existential threat, as highly radioactive material is spread through ignorance or on purpose in conflict.

Progress

Many technologies we've invented favour offence over defence. As progress makes everyone more powerful, this turns our conflicts into terrible risks. We need to defuse our conflicts as well as focus on developing primarily defensive technologies and regulating those that could be used offensively.

Surviving progress might really be difficult.



Threats to the survival of civilizations, existential risks

Several initiatives and institutes study existential risks and how to reduce them.