Personally, I would steer away from titles that make it clear the book is about the dangers posed by strong AI. Why? Because the audience it's most useful to target is people who are already interested in AI but not already afraid of it, and for most readers, fear of AI is more likely to pattern-match to Luddism or vague "human dignity" concerns than to anything resembling MIRI's position. You can bridge the gap of inferential distance in the space of a book, but not in the space of a subtitle.
Vote cast.