Eliezer Yudkowsky and Nate Soares are putting out a book titled:
If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
I’m sure they’ll put out a full post, but go give a like and retweet on Twitter/X if you think they are deserving. They make their pitch to consider pre-ordering earlier in the X post.
Blurb from the X post:
Above all, what this book will offer you is a tight, condensed picture where everything fits together, where the digressions into advanced theory and uncommon objections have been ruthlessly factored out into the online supplement. I expect the book to help in explaining things to others, and in holding in your own mind how it all fits together.
Sample endorsement, from Tim Urban of Wait But Why, my superior in the art of wider explanation:
“If Anyone Builds It, Everyone Dies may prove to be the most important book of our time. Yudkowsky and Soares believe we are nowhere near ready to make the transition to superintelligence safely, leaving us on the fast track to extinction. Through the use of parables and crystal-clear explainers, they convey their reasoning, in an urgent plea for us to save ourselves while we still can.”
If you loved all of my (Eliezer’s) previous writing, or for that matter hated it… that might *not* be informative! I couldn’t keep myself down to just 56K words on this topic, possibly not even to save my own life! This book is Nate Soares’s vision, outline, and final cut. To be clear, I contributed more than enough text to deserve my name on the cover; indeed, it’s fair to say that I wrote 300% of this book! Nate then wrote the other 150%! The combined material was ruthlessly cut down, by Nate, and either rewritten or replaced by Nate. I couldn’t possibly write anything this short, and I don’t expect it to read like standard eliezerfare. (Except maybe in the parables that open most chapters.)