Stephen Hawking, as he has done before, expresses a common sentiment: that we need to colonize space in order to survive.
> Humans should go and live in space within the next 1,000 years, or it will die out, Stephen Hawking has warned.
>
> “We must continue to go into space for the future of humanity,” Mr Hawking said. “I don’t think we will survive another 1,000 years without escaping beyond our fragile planet.”
>
> …In February, he said that humans should colonise other planets as “life insurance” for the species, and could be the only way of ensuring that humanity lives on.
My first reaction to this is that if we’re looking to space colonies to ensure the survival of the human race, we have a long way to go. It seems to me that the first goal is simply to create a viable, long-term closed ecological system that can support humans. As I understand it, every experiment attempting this so far has failed. We need to succeed convincingly at that before attempting it in habitats millions of miles away, such as on Mars. Until we do, any space colony is going to be crucially dependent on a thin and fragile lifeline from Earth’s biosphere.
It’s also worth noting that, once we can create a closed ecological system, we might be better off creating colonies here on Earth. A closed hardened underground habitat would be a lot easier to build and maintain and would probably do just as much to ensure humanity’s survival.
Anyone who thinks off-world colonies are a substitute for fixing our environmental and social problems doesn’t understand the obstacles involved in any foreseeable colony. Mars, the best candidate right now, is cold and desolate in a way that makes Antarctica look like the Garden of Eden. Add an atmosphere with no oxygen and very low pressure, and we have an environment that humans can’t exist in without spacesuits. Add radiation exposure from Mars’s lack of a magnetic field, which would force colonists to stay underground most of the time, and the idea of consigning humans to live there for the rest of their lives starts to look a bit sadistic.
(None of this is to say that I think we shouldn’t have researchers and scientists on Mars, just as we currently do in Antarctica. But no one is really tempted to colonize Antarctica.)
Looking at the longer term, people talk about things like terraforming. But I strongly suspect that, by the time we have the technology and power to actually have a chance at terraforming an environment, we’re going to find that it’s a lot cheaper and easier to modify ourselves for the environment rather than the environment for us. We will likely colonize other worlds, but doing so will probably force us to give up the evolved forms that are fine-tuned for Earth’s biosphere and location.
At the end of the lecture, Hawking encouraged his audience to “look up at the stars and not down at your feet”.
I’ve written before about the immense difficulties in any foreseeable interstellar travel. In short, FTL (faster-than-light) travel, a common plot device in science fiction, would most likely require new physics. But before you let that bother you, consider that even getting to an appreciable fraction of the speed of light will require appalling amounts of energy. (Think in terms of fuel equivalent to the mass of a planet possibly being necessary to accelerate a decent-sized manned ship to, say, 10% of the speed of light.)
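To get a feel for why the fuel numbers are so extreme, here is a rough sketch using the Tsiolkovsky rocket equation. The exhaust velocities are illustrative assumptions (a chemical rocket and a hypothetical fusion drive), not mission designs, and relativistic corrections at 10% of light speed are ignored:

```python
import math

# Tsiolkovsky rocket equation: mass_ratio = exp(delta_v / v_exhaust).
# Inputs below are rough, commonly cited figures, used only for illustration.

C = 299_792_458.0          # speed of light, m/s
delta_v = 0.1 * C          # target: 10% of light speed

def log10_mass_ratio(v_exhaust):
    """Base-10 log of (initial mass / dry mass) needed to reach delta_v."""
    return (delta_v / v_exhaust) / math.log(10)

# Chemical rocket, ~4.5 km/s exhaust (hydrogen/oxygen class):
chem = log10_mass_ratio(4_500)        # ~2,893 -- i.e. fuel mass is roughly
# 10^2893 times the ship's dry mass, vastly more matter than any planet holds.

# Hypothetical fusion drive with exhaust at 5% of light speed:
fusion = log10_mass_ratio(0.05 * C)   # ~0.87 -- fuel only ~6.4x dry mass
```

With chemical propellant the required mass ratio is around 10^2893, which makes “fuel equivalent to the mass of a planet” a wild understatement; only exhaust velocities that are themselves a healthy fraction of light speed bring the ratio down to anything sane.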
Our most likely path to the stars will be microscopic probes, with enough intelligence to bootstrap an infrastructure at the destination solar system using local resources, and to transmit their findings back to us. It’s hard to see human interstellar travel being anything but the most extravagant of vanity projects, unless mind uploading of some type or another becomes possible.
Stephen Hawking has repeatedly warned of the dangers humanity faces from the rise of artificial intelligence and from human aggression and barbarity.
I’ve written repeatedly about why I think the dangers of AI, although real to some degree, are vastly overblown. I won’t reopen that debate here. The only thing I’ll point out is that if AIs are a danger on Earth, they’d also be a danger in a space colony, or anywhere else we’d go and be tempted to use them.
On the dangers of human aggression and barbarity: if we did solve the problems of closed ecosystems and had colonies around the solar system, and humanity reached a point where it destroyed Earth’s biosphere in a war, it’s not clear to me why such a war would stop there. It’s extremely difficult to protect yourself from a space-based attack. The attacker can always go further out to accelerate an asteroid or something similar at you, letting kinetic energy wreak the destruction. Space colonies might slightly increase the probability that humanity survives such a war, but not nearly as much as people like to think.
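To put a number on the kinetic-energy point, here is a back-of-the-envelope calculation for a single diverted asteroid; every input is an illustrative assumption, not data about any real object:

```python
import math

# Kinetic energy of a diverted asteroid -- rough illustration only.
density = 3000.0     # kg/m^3, typical for a rocky body (assumed)
radius = 500.0       # m, i.e. a 1 km diameter asteroid (assumed)
speed = 20_000.0     # m/s, a plausible impact speed (assumed)

mass = density * (4 / 3) * math.pi * radius**3   # ~1.6e12 kg
energy_j = 0.5 * mass * speed**2                 # ~3.1e20 J

# Express in megatons of TNT (1 megaton = 4.184e15 J):
megatons = energy_j / 4.184e15                   # ~75,000 megatons
```

Even this modest one-kilometer body delivers on the order of tens of thousands of megatons, thousands of times the largest nuclear weapon ever tested, and there is no practical way to shield a planet or a colony from it.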
None of this is to say that I think humans shouldn’t colonize space, in the long term. But thinking that we are doing it to preserve the species is misguided, except in the very broadest of terms and time scales. (Think human intelligence, in one form or another, surviving the evolution and eventual death of the Sun.)
In the meantime, our best chance of survival, it seems to me, is to address the real issues we have here, because we’re a lot more likely to destroy ourselves than to have nature do it to us. The threats of nuclear war or terrorism, global warming, biological warfare, or overpopulation worry me a lot more than a species-ending asteroid strike or other mass extinction event, which happens only once every 50-100 million years. (Not that we shouldn’t do what we can to protect against asteroid strikes. Even one that doesn’t endanger the whole world can cause a lot of devastation.)
I think the best way to protect against the threats of us destroying ourselves, indeed the only way over the long term, is to give as much of humanity as possible a stake in the success of human civilization. This involves fighting poverty worldwide, and promoting women’s rights, which will help with the population problem, which in turn helps with just about every other problem.
If we really want to maximize humanity’s long term survivability, that’s where we should start. The good news is that, when viewed through the broad sweep of history, things are moving in the right direction. The only question is whether that movement will be fast enough.