Superintelligence: Paths, Dangers, Strategies is an astonishing book with an alarming thesis: Intelligent machines are “quite possibly the most important and most daunting challenge humanity has ever faced.” In it, Oxford University philosopher Nick Bostrom, who has built his reputation on the study of “existential risk,” argues forcefully that artificial intelligence might be the most apocalyptic technology of all.
In its Voices of Tomorrow feature, the Bulletin of the Atomic Scientists invites graduate students, undergraduates, and high school scholars to submit essays, opinion pieces, and multimedia presentations addressing at least one of the Bulletin's core issues: nuclear weapons, nuclear energy, climate change, biosecurity, and threats from emerging technologies.
Touring Alaska this week to shine a spotlight on global warming, President Obama warned that “climate change is no longer some far-off problem. It is happening here; it is happening now. Climate change is already disrupting our agriculture and ecosystems, our water and food supplies, our energy, our infrastructure, human health, human safety. Now. Today.”
In September, Congress will vote on a bill to rescind US adoption of the Joint Comprehensive Plan of Action (JCPOA), the agreement negotiated by the P5+1 with Iran that limits the main elements of Iran’s nuclear program for the next 15 years and other elements for the next 25 years.
The Iran nuclear deal, more formally known as the Joint Comprehensive Plan of Action, may be the most consequential, comprehensive, and politically fraught diplomatic agreement in modern history.
President Obama has staked his foreign policy legacy on the agreement’s ability to prevent a nuclear-armed Iran and avert another Middle East war. But in the next two weeks or so, the deal’s fate will be decided by a Congress that has been hell-bent on blocking the president’s agenda.