The July issue of Wired magazine includes an interview with Peter Diamandis, who can fairly be described as one of the more prominent technological enthusiasts on the planet. Among other things, Diamandis is a co-founder of Singularity University, co-author of Abundance: The Future is Better than You Think, and co-chairman of Planetary Resources, the company that recently announced plans to mine precious metals from asteroids in outer space.
The first question Wired asks Diamandis is whether he's always wanted to change the world.
"No," he answers. "My first ambition was to get off the world."
He goes on to explain that he's dreamed since childhood of helping humanity become a "multiplanetary species." We're driven genetically to explore, he says, but there's more to it than that.
"I believe we have a moral obligation to back up the biosphere, take it off-planet, and give ourselves the safety of ubiquity."
The phrase "the safety of ubiquity" caught my eye. It reminded me of a comment by another of the world's leading technological enthusiasts, Ray Kurzweil.
Kurzweil is the author of The Singularity is Near, which predicts that by 2045 humans will merge with their machines, creating a new race of immortal super-beings. He's also Peter Diamandis' co-founder at Singularity University. Among the many fantastic predictions Kurzweil makes in his book is this one, on page 29:
The law of accelerating returns will continue until nonbiological intelligence comes close to 'saturating' the matter and energy in our vicinity of the universe with our human-machine intelligence… Ultimately the entire universe will become saturated with our intelligence. This is the destiny of the universe.
Set aside for the moment the presumption that we can know the destiny of the universe. The question I would like to ask Kurzweil and Diamandis is whether they've taken a good look lately at conditions here on Planet Earth.
If they have, I would then hope they might be able to tell us how they can possibly look forward with eagerness to the day when human intelligence will "saturate the universe," and on what basis they could describe such a state of affairs as "the safety of ubiquity."
Photo credit: The Guardian/Andrew Brusso/Corbis
©Doug Hill, 2012