Filmed on Friday June 11, 02004
One reason many people don't want to think long term these days is that technology keeps accelerating so rapidly that we assume the world will become unrecognizable within a few years, and unimaginable soon after. Long-term thinking, then, must be either impossible or irrelevant.
The commonest shorthand term for the runaway acceleration of technology is "the Singularity," a concept introduced by science fiction writer Vernor Vinge in 1984. The term has been enthusiastically embraced by technology historians, futurists, Extropians, and various transhumanists and posthumanists, who have generated variants such as "the techno-rapture," "the Spike," and so on.
It takes a science fiction writer to critique a science fiction idea.
Along with being one of America's leading science fiction writers and technology journalists, Bruce Sterling is a celebrated speaker armed with lethal wit. His books include The Zenith Angle (just out), The Hacker Crackdown, Holy Fire, Distraction, Mirrorshades (the cyberpunk anthology), Schismatrix, The Difference Engine (with William Gibson), Tomorrow Now, and Islands in the Net.
The Seminar About Long-term Thinking on June 11 was Bruce Sterling examining "The Singularity: Your Future as a Black Hole." He treated the subject of hyper-acceleration of technology both as a genuine threat worth alleviating and as a fond fantasy worth cruel dismemberment.
Sterling noted that the first statement of the Singularity metaphor and threat came from John von Neumann in the 1950s, in conversation with Stan Ulam: "the rate of change in technology accelerates until it is mind-boggling and beyond social control or even comprehension." But it was science fiction writer Vernor Vinge who first published the idea, in novels and a lecture in the early 1980s, and it was based on the expectation of artificial intelligence surpassing human intelligence. Vinge wrote: "I believe that the creation of greater than human intelligence will occur during the next thirty years. I'll be surprised if this event occurs before 2005 or after 2030." Vinge was not thrilled at the prospect.
The world-changing event would happen relatively soon, it would be sudden, and it would be irrevocable.
“It’s an end-of-history notion,” Sterling drawled, “and like most end-of-history notions, it is showing its age.” It’s almost 2005, and the world is still intelligible. Computer networks have accelerated wildly, but water networks haven’t—in fact we’re facing a shortage of water.
The Singularity feels like a 90s dot-com bubble idea now—it has no business model. “Like most paradoxes it is a problem of definitional systems involving sci-fi handwaving around this magic term ‘intelligence.’ If you fail to define your terms, it is very easy to divide by zero and reach infinite exponential speed.” It was catnip for the intelligentsia: “Wow, if we smart guys were more like we already are, we’d be godlike.”
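Sterling's jab about dividing by zero maps onto a real piece of calculus. Ordinary exponential growth never reaches infinity at any finite time, but if the rate of progress is assumed proportional to the square of progress already made (the kind of self-amplifying loop singularity rhetoric appeals to), the math blows up at a finite date. A minimal sketch of that contrast, with hypothetical function names, not anything from the talk:

```python
import math

def exponential(t: float) -> float:
    """Solution of dx/dt = x with x(0) = 1: finite at every finite time."""
    return math.exp(t)

def hyperbolic(t: float) -> float:
    """Exact solution of dx/dt = x**2 with x(0) = 1: x(t) = 1/(1 - t),
    which diverges as t approaches 1 -- a finite-time "singularity"."""
    return 1.0 / (1.0 - t)

# Just before t = 1, exponential growth is still modest...
print(exponential(0.999))   # a bit under e, about 2.7
# ...while the self-amplifying curve has already exploded.
print(hyperbolic(0.999))    # on the order of 1000
```

The whole forecast hinges on which differential equation you assume, which is exactly the definitional handwaving Sterling is pointing at: pick the feedback law, and the graph gives you any apocalypse date you like.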
Can we find any previous Singularity-like events in history? Sterling identified three: the atomic bomb, LSD, and computer viruses. The bomb was sudden and world-changing and hopeful, a new era! LSD does FEEL like it's world-changing. Viruses proliferate exponentially on the net. LSD is pretty much gone now. Mr. Atom turned out to be not our friend and has blended in with other tools and problems.
Singularity proponents, Sterling observed, are organized pretty much like virus writers—loose association, passionate focus, but basically gentle. (They’d be easily rounded up.) “They don’t have to work very hard because they are mesmerized by the autocatalyzing cascade effect. ‘Never mind motivating voters, raising funds, or persuading the press; we’ve got a mathematician’s smooth line on a 2D graph! Why bother, since pretty soon we’ll be SUPERHUMAN. It’s bound to happen to us because we are EARLY ADAPTERS.’”
Vernor Vinge wrote: “For me, superhumanity is the essence of the Singularity. Without that we would get a glut of technical riches, never properly absorbed.” Said Sterling, “A glut of technical riches never properly absorbed sounds like a great description of the current historical epoch.”
Sterling listed five kinds of reactions to the Singularity. 1) Don't know and don't care (most people). 2) The superbian transhumanists. 3) The passive singularitarians—the Rapture of the Nerds. 4) The terrified handflapping apocalypse millennialists (a dwindling group, too modish to stay scared of the same apocalypse for long). 5) The Singularity resistance—Bill Joy killjoys who favor selective relinquishment. Sterling turned out to be a fellow traveler of the Resistance: "Human cognition becoming industrialized is something I actually worry about."
Vinge did a great thing, said Sterling. The Singularity has proved to be a rich idea. “In the genre of science fiction it is more important to be fruitfully mistaken than dully accurate. That’s why we are science fiction writers, not scientists.”
Suppose some kind of Singularity does come about. Even though it is formally unthinkable to characterize post-Singularity reality, Sterling proposed you could probably be sure of some things. The people there wouldn’t feel like they are “post”-anything. For them, most things would be banal. There wouldn’t be one Singularity but different ones on different schedules, and they would keep on coming. It would be messy. Death would continue as the great leveler.
Suppose humanity elected to slow an approaching Singularity to a manageable pace: what could we actually do? How do you stop a runaway technology?
—You could reverse the order of scientific prestige from honoring the most dangerous new science (such as nuclear physics) to honoring the most responsible restraint—a Relinquishment Nobel. It would be scientific self-regulation.
—You could destroy scientific prestige through commercialization. Scientists diminish into mercenaries—”put-upon Dilberts and top-secret mandarins.” It would still be dangerous, though.
—Have a few cities leveled by a Singularity technology and you could bounce into world government with intense surveillance and severe repression of suspect technologies and technologists. Most societies are already anti-science; this would fulfill their world view. “You can run but you can’t hide! You will be brought to justice or justice will be brought to you! Into the steel cage, Mr. Singularity. Into Guantanamo till you tell us who your friends are. Then they join you in there.” Or maybe it’s not that fierce, and it’s all done by benevolent non-governmental organizations.
Sterling concluded: “It does come down to a better way to engage with the passage of time. The loss of the future is becoming acute. The most effective political actors on the planet right now are guys who want to blow themselves up—they really DON’T want to get out of bed in the morning and face another day. People need a motivating vision of what comes next and the awareness that more will happen after that, that the future is a process not a destination. The future is a verb, not a noun. Our minds may reach the ends of their tethers, but we’ll never stop futuring.”--Stewart Brand
We would also like to recognize George Cowan (1920–2012) for being the first to sponsor this series.
Seminars About Long-term Thinking is made possible through the generous support of The Long Now Membership and our Seminar Sponsors. We offer $5,000 and $15,000 annual Sponsorships, both of which entitle the sponsor and a guest to reserved seating at all Long Now seminars and special events. In addition, we invite $15,000 Sponsors to attend dinner with the speaker after each Seminar, and $5,000 Sponsors may choose to attend any four dinners during the sponsored year. For more information about donations and Seminar Sponsorship, please contact email@example.com. We are a public 501(c)(3) non-profit, and donations to us are always tax deductible.
The Long Now Foundation • Fostering Long-term Responsibility • est. 01996