Is artificial intelligence really an existential threat to humanity?

Superintelligence: Paths, Dangers, Strategies is an astonishing book with an alarming thesis: Intelligent machines are “quite possibly the most important and most daunting challenge humanity has ever faced.” In it, Oxford University philosopher Nick Bostrom, who has built his reputation on the study of “existential risk,” argues forcefully that artificial intelligence might be the most apocalyptic technology of all.


Voices of Tomorrow and the Leonard M. Rieser Award

In its Voices of Tomorrow feature, the Bulletin of the Atomic Scientists invites graduate students, undergraduates, and high school students to submit essays, opinion pieces, and multimedia presentations addressing at least one of the Bulletin's core issues: nuclear weapons, nuclear energy, climate change, biosecurity, and threats from emerging technologies.


Letter from Kazakhstan: Why we believe in the nuclear fuel bank

With the US Congress soon to vote on the nuclear deal between Iran and six world powers, opinion polls suggest that the American public is divided over whether to accept it. One new factor to consider is a development that could reinforce the agreement while minimizing any incentive for Iran to pursue uranium fuel production capabilities.


Europe won't bow to an anti-Iran-deal US Congress

As the US Congress debates the deal struck between Iran and six world powers to curb Tehran’s nuclear program, many of those who oppose the agreement expect Europeans to fall into line if Washington rejects it. Some US senators, claiming unrealistically that a better deal is possible, think they will be able to persuade or coerce European allies into renegotiating. That assumption is unlikely to hold, though, and acting on it could have damaging repercussions for trans-Atlantic relations.


Register Now for Our Teleconference with Henry Sokolski on “Our Not So Peaceful Nuclear Future”
