A team of researchers at Oxford University, the Global Challenges Foundation and the Global Priorities Project recently issued a report, “Global Catastrophic Risks.” In it, they discuss and rank events that could eliminate ten percent or more of the human population within the next five years. The report suggests that a person may be five times more likely to die in an extinction event than in a car crash.
Sebastian Farquhar, director at the Global Priorities Project, told the Press Association,
There are some things that are on the horizon, things that probably won’t happen in any one year but could happen, which could completely reshape our world and do so in a really devastating and disastrous way. History teaches us that many of these things are more likely than we intuitively think. Many of these risks are changing and growing as technologies change and grow and reshape our world. But there are also things we can do about the risks.
Mr. Farquhar says,
What we want to worry about in the future, though, is that it becomes easier and cheaper to do a lot of things in an almost off-the-shelf kind of way, or to order the parts for, say, a smallpox virus off the Internet. That may start to change. We have seen in the field of synthetic biology and genetic manipulation of small organisms, or things like viruses, that costs have come down unbelievably in the last decade. It is still too expensive to worry about rogue groups trying to use the technology, but that might not remain true.
Another major threat in the future is something most people don’t think about: the emergence of artificial intelligence.
Mr. Farquhar explains,
There is really no particular reason to think that humans are the pinnacle of creation and the best thing that is possible to have in the world. It seems conceivable that some AI systems might at some point in the future be able to systematically out-compete humans in a bunch of different domains and if you have a sufficiently powerful form of that kind of artificially intelligent system, then it might be the case that if its goals don’t match with what humanity’s values are then there might be some sort of adverse consequences. So this doesn’t depend on it becoming conscious, it doesn’t depend on it hating humanity, it is just a matter of it being powerful, its objectives being opaque or hard to determine for its creators, and it being in some sense indifferent to at least some of the things we find valuable.
This may seem like far-out science fiction, but it is a concern discussed among some of the world’s most brilliant minds. In 2014, Stephen Hawking, Elon Musk, Steve Wozniak and hundreds of others signed a letter, unveiled at the International Joint Conference, warning that artificial intelligence could be more dangerous than nuclear weapons.
In 2015, Bill Gates said,
I am in the camp that is concerned about super intelligence. First, the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that, though, the intelligence is strong enough to be a concern. I agree with Elon Musk and some others on this and don’t understand why some people are not concerned.
The report was not issued to frighten people but to call for cooperation in the international community: to implement planning for pandemics, to prepare health systems, to investigate the risks of biotechnology and artificial intelligence, and to continue to decrease the number of nuclear weapons.
Mr. Farquhar sums up,
What is really important to remember is that many of these risks don’t stop at the borders and wait patiently for their passports to be checked; they are truly global in nature. This is not the sort of thing where one country can say, “Oh well, we are prepared and the rest of the world can fend for itself.” That is one of the things we saw with the Ebola crisis: how it spilled over national borders.
The report offers a sobering view of how close humanity may be to extinction and what the world community must do to lessen the risk of an apocalypse. Scary stuff!