If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI

Random House
Ebook · 272 pages
5.0 · 3 reviews

About this ebook

AN INSTANT NEW YORK TIMES BESTSELLER

'The most important book of the decade'
MAX TEGMARK, author of Life 3.0

‘A loud trumpet call to humanity to awaken us as we sleepwalk into disaster – we must wake up’ STEPHEN FRY

‘The best no-nonsense, simple explanation of the AI risk problem I've ever read’ YISHAN WONG, former Reddit CEO

AI is the greatest threat to our existence that we have ever faced.

The scramble to create superhuman AI has put us on the path to extinction – but it’s not too late to change course. Two pioneering researchers in the field, Eliezer Yudkowsky and Nate Soares, explain why artificial superintelligence would be a global suicide bomb and call for an immediate halt to its development.

The technology may be complex, but the facts are simple: companies and countries are in a race to build machines that will be smarter than any person, and the world is devastatingly unprepared for what will come next.

Could a machine superintelligence wipe out our entire species? Would it want to? Would it want anything at all? In this urgent book, Yudkowsky and Soares explore the theory and the evidence, present one possible extinction scenario and explain what it would take for humanity to survive.

The world is racing to build something truly new – and if anyone builds it, everyone dies.
A Guardian Biggest Book of the Autumn

Ratings and reviews

5.0 · 3 reviews
Hussein Azmy
October 7, 2025
A fantastic read so far that is more or less in layman's terms (which is great since I am not a technical person). It is unfortunate that AI companies are risking all of our lives for a quick buck, but at least this book will prove, after the AI apocalypse, that there were those among us who had the wisdom to easily foresee the outcome of developing a super smart AI.

About the author

Eliezer Yudkowsky (Author)
Eliezer Yudkowsky is a founding researcher of the field of AI alignment, with influential work spanning more than twenty years. As co-founder of the non-profit Machine Intelligence Research Institute (MIRI), Yudkowsky sparked early scientific research on the problem and has played a major role in shaping the public conversation about smarter-than-human AI. He appeared on Time magazine’s 2023 list of the 100 Most Influential People In AI, and has been discussed or interviewed in the New York Times, New Yorker, Newsweek, Forbes, Wired, Bloomberg, The Atlantic, The Economist, Washington Post, and elsewhere.

Nate Soares (Author)
Nate Soares is the president of the non-profit Machine Intelligence Research Institute (MIRI). He has been working in the field for over a decade, after previous experience at Microsoft and Google. Soares is the author of a large body of technical and semi-technical writing on AI alignment, including foundational work on value learning, decision theory, and power-seeking incentives in smarter-than-human AIs.


Reading information

Smartphones and tablets
Install the Google Play Books app for Android and iPad/iPhone. It syncs automatically with your account and allows you to read online or offline wherever you are.
Laptops and computers
You can read books purchased on Google Play using your computer's web browser.
eReaders and other devices
To read on e-ink devices like Kobo eReaders, you'll need to download a file and transfer it to your device. Follow the detailed Help Center instructions to transfer the files to supported eReaders.