Is artificial intelligence really an existential threat to humanity?

Superintelligence: Paths, Dangers, Strategies is an astonishing book with an alarming thesis: Intelligent machines are “quite possibly the most important and most daunting challenge humanity has ever faced.” In it, Oxford University philosopher Nick Bostrom, who has built his reputation on the study of “existential risk,” argues forcefully that artificial intelligence might be the most apocalyptic technology of all.

