What Should We Be Worried About? Real Scenarios That Keep Scientists Up at Night. This is the title of the new book edited by the literary agent John Brockman, founder of the website Edge.org, a discussion forum that includes novelist Ian McEwan, musician Brian Eno, physicists Frank Wilczek and Freeman Dyson, as well as yours truly, 13.7's Tania Lombrozo, and a couple of hundred other academics and intellectuals. Every year, Brockman asks this group a "question." Answers in the form of short essays are compiled in paperback volumes with the intention of providing food for thought, as well as showcasing some of the cutting-edge ideas in science and technology and how they shape our culture. This year, no fewer than 150 answers were published, of which I provide a meager sample here.
Although I tend to contribute every year, this time I'm absent: not because I don't have anything to worry about (who doesn't?) but because I was really busy with my own book (due out in June) when Brockman's email with the question arrived.
Evolutionary psychologist Steven Pinker places the threat of future wars in the minds of unstable leaders or groups who believe their values stand morally above all others. He quotes the UNESCO slogan: "since wars begin in the minds of men, it is in the minds of men that the defenses of peace must be constructed." To Pinker, wars are more the result of pathologies and tribal impulses than of the scarcity of resources or economic expansionism.
Vernor Vinge, the mathematician and science-fiction writer who coined the term "singularity" to designate the point when machines will be able to think and outperform us, fears that the increasing automation of nuclear weapons, in particular their control systems, makes them vulnerable to terrorist groups. Martin Rees, astronomer royal of Great Britain, agrees, painting a cataclysmic scenario based on the social and economic chaos that global warming will unleash as millions migrate from coastal zones and the financial system collapses. For good measure, Rees adds bioterrorism, cyberterrorism and nanoterrorism as further threats for the future. Philosopher Daniel Dennett and historian of science George Dyson agree, especially about cyberterrorism, stressing that the world would essentially collapse if someone managed to sabotage the Internet. Many of the contributions touch on similar themes.
Another cluster of fears surrounds the question of consciousness and its relation to machines. Timo Hannay, managing director of Digital Science and co-organizer of the Sci Foo conferences, eloquently describes how our ignorance of what consciousness is and how it manifests itself in other species can lead to a growing indifference toward living beings and the future of the biosphere. Max Tegmark, a physicist at MIT, worries that artificial intelligence will make us superfluous, a topic we have discussed here before. Others disagree, doubting that such a possibility is viable or that the singularity will ever come.
On the other hand, what is improving at a tremendous rate is the efficiency of the virtual realities created in video games. Will people a couple of generations from now be unable to distinguish between real and virtual reality? What would it even mean to experience false realities that feel true? Welcome to the world of holographic pink flamingos!
Given that I don't have much space left, I'll mention only one more fear: that of the growing polarization between two social groups, the Engineers and the Druids, as suggested by Paul Saffo of Stanford University. The Engineers believe that science will solve all the troubles of the world, present and future: hunger, illness, scarcity of water and energy. The Druids believe the opposite, that an excess of science will lead to even worse crises and that no solutions will be found. The way out, Saffo suggests, is to find some kind of balance between the two, free of tribal fanaticism. After all, everything starts in our minds. And perhaps that's the biggest worry of them all: that our minds are both our freedom and our prison.