Title: Superintelligence: Paths, Dangers, Strategies
Author: Nick Bostrom
Hardcover: ISBN 978-0-19-967811-2, £18.99. Also available as an eBook.
Imprint: Oxford: Oxford University Press, 2014 (to be published in the United States in September by Oxford University Press, USA)
Superintelligence asks the questions: What happens when machines surpass humans in general intelligence? Will artificial agents save or destroy us? Nick Bostrom lays the foundation for understanding the future of humanity and intelligent life.
The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. If machine brains surpassed human brains in general intelligence, then this new superintelligence could become extremely powerful – possibly beyond our control. As the fate of the gorillas now depends more on humans than on the gorillas themselves, so would the fate of humankind depend on the actions of the machine superintelligence.
But we have one advantage: we get to make the first move. Will it be possible to construct a seed Artificial Intelligence, to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation?
This profoundly ambitious and original book breaks down a vast tract of difficult intellectual terrain. After an utterly engrossing journey that takes us to the frontiers of thinking about the human condition and the future of intelligent life, we find in Nick Bostrom’s work nothing less than a reconceptualization of the essential task of our time.
Reviews
“Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era.” — Stuart Russell, Professor of Computer Science, University of California, Berkeley
“Those disposed to dismiss an ‘AI takeover’ as science fiction may think again after reading this original and well-argued book.” — Martin Rees, Past President, Royal Society
“a magnificent conception … it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs and by physicists who think there is no point to philosophy.” — Brian Clegg, Popular Science
“There is no doubting the force of [Bostrom’s] arguments…the problem is a research challenge worthy of the next generation’s best mathematical talent. Human civilisation is at stake.” — Clive Cookson, Financial Times
“This superb analysis by one of the world’s clearest thinkers tackles one of humanity’s greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn’t become the last?” — Professor Max Tegmark, MIT