Stephen Hawking, probably the world's most popular physicist, says that humanity has only about 1,000 years left on Earth, and that the only way to survive is to establish colonies elsewhere in the solar system. “[W]e must . . . continue to go into space for the future of humanity. I don’t think we will survive another 1,000 years without escaping beyond our fragile planet,” Professor Hawking explains.
His genuine concern for humanity’s survival has led him to the question of artificial intelligence, which he says could be the worst thing ever to happen to humanity. Elon Musk, the famous founder of SpaceX, has meanwhile announced his desire to establish a human colony on Mars. “I don’t have a doomsday prophecy,” he says, “but history suggests some doomsday event will happen.” Hawking estimates that human colonies on Mars are at least a hundred years away, and says that we should be very careful in the coming decades. “Although the chance of disaster to planet Earth in a given year may be quite low, it adds up over time
and becomes a near certainty in the next 1,000 or 10,000 years. By that time, we should have spread out into space and to other stars, so a disaster on Earth would not mean the end of the human race,” Hawking says.
Aside from climate change, global pandemics, antibiotic resistance, and every nation’s nuclear capacities, we may soon be confronted with unknown enemies. Recently, Hawking and Musk joined a coalition of more than 20,000 experts calling for a ban on the development of autonomous weapons. Musk is currently involved in research dedicated to AI ethics; he considers today’s robots submissive, but cannot predict what might happen if robots had no limitations. “AI systems today have impressive but narrow capabilities. It seems that we’ll keep whittling away at their constraints, and in the extreme case they will reach human performance on virtually every intellectual task. It’s hard to fathom how much human-level AI could benefit society, and it’s equally hard to imagine how much it could damage society if built or used incorrectly,” the coalition’s founders said.
Imagine if we were to create robots smarter than ourselves and, at the same time, discover that aliens had picked up the signals we have been broadcasting into the universe for decades. What if those aliens, sensing our climate troubles, suddenly became aggressive and sniffed out a weakened enemy that had created artificial life far smarter than itself? The result could be an extraterrestrial war. “I am more convinced than ever that we are not alone,” Hawking says in a new online film called Stephen Hawking’s Favorite Places. And if the aliens do know of us, “they will be vastly more powerful and may not see us as any more valuable than we see bacteria.” He says that we should leave our planet and find a new home in the solar system as soon as possible, yet notes that it’s a glorious time to be alive, considering how far our fundamental understanding of the universe has advanced during his lifetime. “Our picture of the universe has changed a great deal in the last 50 years and I am happy if I have made a small contribution. The fact that we humans, who are ourselves mere fundamental particles of nature, have been able to come this close to understanding the laws that govern us and the universe is certainly a triumph,” Professor Hawking explains.