As we face an uncertain future for the world as we knew it, it's worth recalling Stephen Hawking's warning from 2016 that humanity has roughly 1,000 years left on Earth.
The famous physicist added that our only option would be to colonize other planets. So, how did the late scientist arrive at such an ominous prediction?
The end of the world as we know it
The physicist, one of the greatest minds who ever lived, said that advances in science and technology could lead to the end of the world.
He gave several possible scenarios, as he explained to BBC:
"Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years."
"By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race."
During a talk at the Oxford Union debating society, Hawking further added:
"We must also continue to go into space for the future of humanity."
"I don't think we will survive another 1,000 years without escaping beyond our fragile planet."
He continued explaining that technological advances in nuclear weaponry may lead to our demise.
Another reason Hawking believed our time is limited is climate change. He said:
"We don't know where global warming will stop."
"But the worst-case scenario is that Earth will become like its sister planet Venus with a temperature of 250 [Celsius] and raining sulfuric acid. The human race could not survive in those conditions."
The scientist was also afraid of what AI would bring to humankind:
"Artificial Intelligence technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms."
Finally, Hawking mentioned man-made viruses as yet another way the human race could be destroyed.
Hawking on human-made viruses
Back in 2016, the scientist warned that genetically engineered viruses could wipe out entire populations.
Though he didn't say much, Hawking warned us that it's only a matter of time before "antibiotic-resistant viruses that we never imagined could exist will develop."
Just four years later, we're facing one of the biggest pandemics in history, and though the virus isn't lab-made, it shows how powerless we are against such an enemy.
Taken together, his fears suggest the human race may well destroy itself. That's why this great mind encouraged everyone to think about a plan B.
Many great minds tried to put a stop to AI-driven autonomous weapons back in 2015. Apart from Hawking, Elon Musk, Steve Wozniak, and Noam Chomsky also expressed their concerns. Like lab-made viruses, autonomous weapons powered by artificial intelligence could devastate the planet with frightening speed.
Life on Mars
Apart from all the human-made threats, Hawking believed that contact with alien life is more likely than ever. He stated:
"I am more convinced than ever that we are not alone…they will be vastly more powerful and may not see us as any more valuable than we see bacteria."
Though it sounds like something out of The X-Files, many scientists agree that alien life could end us. Nikola Tesla certainly thought so, and however eccentric, he was the man who pioneered alternating-current electricity.
For all our advances, we have never been this vulnerable. So an extraterrestrial invasion doesn't sound so far-fetched if you think about it.
Every achievement comes with consequences. Scientists are already exploring other planets, searching for a possible new home for humans. It would be arrogant to think we're alone in the universe, and one of these days the explorers might find more than they were hoping for.
The disadvantages of AI
The theoretical physicist claimed:
"The development of full artificial intelligence could spell the end of the human race."
Professor Hawking acknowledged that "the primitive forms of artificial intelligence developed so far have already proved very useful." Still, he feared the consequences of creating something that could match or surpass humans.
In other words, Hawking warned us about a Terminator scenario. If artificial intelligence keeps advancing, the machines will eventually become smarter than us and be able to operate without any human involvement.
The late professor added:
"It would take off on its own and re-design itself at an ever-increasing rate."
"Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."
Where do we go from here?
We have two options: embrace the pending doom, or try to correct our wrongdoings. Though the second option seems more reasonable, it's also less likely.
Despite our many great qualities, and unlike our cave-dwelling ancestors, we put too much effort into becoming rich and powerful. It would be foolish to believe that someone whose greed has brought them so much would give up their fortune to save a planet that isn't going to stop existing tomorrow.
However, despite the chaos we are still experiencing due to Covid-19, we can still do something. Going back to nature and putting our health and wellbeing above indulgence is a necessity. Facing our mortality is a strange yet liberating feeling.
Keep that in mind and, as Hawking once said, "look up to the stars and not down at your feet." Even if our planet has an expiration date, it's our duty to be more informed and aware of our actions.