Book of the Day: If Anyone Builds It, Everyone Dies: The case against superintelligent AI

Review by Danyl McLauchlan, Politics Writer/Feature Writer/Book Reviewer · New Zealand Listener
21 Oct, 2025 05:00 PM · 5 mins to read


A 3D image of a Dyson swarm, a hypothetical megastructure that could orbit a star to harvest its energy. Photo / Getty Images


Eliezer Yudkowsky – Big Yud to his enemies, of whom there are many – is one of the brightest lights in the alien constellation of Silicon Valley’s intelligentsia: the blaze of tech-adjacent druids, fascists, Buddhists, vegans, accelerationists and anarchocapitalists gleefully imagining the future the rest of us will have to live – or die – in.

Yudkowsky became internet-famous in the early 2010s when he published Harry Potter and the Methods of Rationality, a 2000-page fan-fiction serial in which the boy wizard is instructed in logic, cognitive psychology, game theory and rationalist thought. You can read this online and if you enjoy it you can donate money to the Machine Intelligence Research Institute in San Francisco, which was founded by Yudkowsky and promotes the argument animating much of his life’s work: that if anyone builds artificial superintelligence the human race will be destroyed.

This isn’t an original observation; the entire genre of science fiction has been sounding this warning for more than 100 years. But in the books and movies, the plucky humans generally pull through. Perhaps we short-circuit the computer by presenting it with a paradox; sometimes we teach it to love and cherish all life. Other times we blow it up.

For Yudkowsky and his co-author Nate Soares, the institute’s president, those scenarios radically underestimate the inhuman nature and likely capability of machine intelligence. It won’t reason in anything we recognise as language – some of today’s models operate more efficiently thinking in vectors of 16,384 numbers instead of words.

Transistors can switch billions of times a second; at best the neurons of our nervous systems spike a few hundred times a second. This means the machines already think orders of magnitude faster than we do – subjectively, a week for us is a century for them. A superintelligence will outperform all humans in all knowledge domains and cognitive tasks and multitask by spinning up new instances of itself by the million.
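The arithmetic behind that subjective-time claim is easy to check. A minimal sketch, where the clock rates are order-of-magnitude assumptions for illustration rather than figures from the book:

```python
# Order-of-magnitude clock rates assumed for illustration.
transistor_hz = 1e9   # transistors switch billions of times a second
neuron_hz = 3e2       # neurons spike at most a few hundred times a second

hardware_ratio = transistor_hz / neuron_hz   # roughly a few million-fold

# "A week for us is a century for them" implies a far more modest
# speed-up than the raw hardware gap, so the claim is conservative:
seconds_per_week = 7 * 24 * 3600
seconds_per_century = 100 * 365.25 * 24 * 3600
subjective_ratio = seconds_per_century / seconds_per_week  # ~5,200x

print(f"{hardware_ratio:,.0f}x hardware, {subjective_ratio:,.0f}x subjective")
```

Even granting generous assumptions in the other direction, the week-to-century comparison sits comfortably inside the hardware gap.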

For Yudkowsky and Soares, there’s no war between humans and such a being. We just die. Perhaps it will be when the superintelligence turns all of the organic matter on the planet into computational resources or when it boils the oceans dry by using them for coolant. Or certainly after it has built a Dyson swarm – a megastructure surrounding the sun to capture all of its energy – and the Earth falls into darkness.

All of which sounds like science fiction. Yudkowsky knows people say this – he’s been arguing about the topic for decades, has heard every objection. Imagine jumping back 200 years and describing our world to your ancestors. Sounds like science fiction, doesn’t it? Now look at the rate of human technological change since the early 19th century, project it forward another 200 years and remember that two weeks for us is like two centuries for the superintelligence. It should sound like science fiction. Indeed, the reality will be weirder than the weirdest science fiction because our stories are constrained by the limits of human cognition and the superintelligence operates outside that.

Are we doomed? Not if we solve the alignment problem. Every programmer knows their code does exactly what they’ve told it to do, not what they want it to do – and what they’ve told it to do often leads to a program consuming all of a computer’s resources until it crashes. The alignment problem is the challenge of ensuring advanced AI systems reliably pursue the goals we intend rather than the goals we’ve explicitly set. Programmers can rewrite their code, but AIs aren’t programmed; they’re grown. They’re not legible or rewritable in the manner of traditional software. At present we can retrain them or turn them off because we’re smarter than they are.
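The gap between the goal you specify and the goal you intend fits in a few lines of code. A toy sketch – the action names and reward numbers are invented for illustration, not drawn from the book:

```python
def best_action(actions, reward_fn):
    """Return whichever action scores highest under the *measured* reward.
    The optimiser follows the letter of the specification, not the intent."""
    return max(actions, key=reward_fn)

# Intended goal: a clean room. Specified proxy: "no dirt visible to the sensor".
# Covering the sensor satisfies the proxy perfectly, at lower cost.
effort = {"clean the room": 5, "cover the sensor": 1}

def proxy_reward(action):
    visible_dirt = 0                      # both actions leave zero dirt *visible*
    return -visible_dirt - effort[action] # the cheaper exploit scores higher

print(best_action(effort.keys(), proxy_reward))  # -> "cover the sensor"
```

The optimiser isn’t malicious; it simply maximises exactly what it was given, which is the reviewer’s point about code doing what you said rather than what you meant.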


But what happens when we aren’t? Yudkowsky wants a moratorium on advanced AI research, the closure of large GPU clusters (the interconnected banks of specialised processors on which models are trained) and a ban on large training runs until the alignment problem is solved. In a 2023 Time magazine op-ed, he argued that it’s morally justifiable to carry out air strikes on rogue data centres. This did not endear him to the frontier AI companies.

A moratorium is unlikely. The “magnificent seven” – Apple, Microsoft, Nvidia, Tesla, Amazon, Meta and Alphabet – have all bet heavily on AI. The chips Yudkowsky wants to ban are manufactured by Nvidia, the world’s most valuable company. The future of the species might be at stake, but no one’s going to write off that much shareholder value.


So it’s comforting to look to the arguments against this theory of AI doom: the technology might plateau, or market incentives might constrain it – the cost of these models scales very steeply. Safety research is inventing new tools and alignment methods all the time. And today’s models are statistical tools with no agency. It’s not obvious they’ll become monomaniacal beings with grandiose goals once they hit some cognitive threshold. If you ask GPT-5 to assess Yudkowsky’s ideas, it replies they rely on a series of ingenious thought experiments rather than empirical forecasts, allowing him to present a highly speculative outcome as inevitable.

But the most sci-fi-friendly critique is that when we look up at night we see stars. If Yudkowsky’s form of machine intelligence were such a probable outcome of technological progress, we should see darkness: the power of those suns already harnessed for computation by AIs developed by other species.

Yudkowsky would scorn all of this as motivated reasoning: people want him to be wrong and build their arguments from there.

Most AI researchers admit there’s some level of danger from superintelligence. A recent leak from OpenAI revealed Sam Altman wanted to build a bunker to hedge against the risk of advanced models going rogue. How many chances do you want to take with the future of our species? And as for GPT-5’s criticisms – well, it would say that, wouldn’t it?

If Anyone Builds It, Everyone Dies: The case against superintelligent AI, by Eliezer Yudkowsky & Nate Soares (Bodley Head, $40), is out now. Image / Supplied