I agree it fits well here. However, it has a very different tone from other posts on the MIRI blog, where it has also been posted.
Laziness. Though I note Stuart_Armstrong had the same opinion as me, and offered even fewer means of improvement, and got upvoted. I should have also said I agree with all points contained herein, and that the message is an important one. That would have reduced the bite.
This article is very heavy with Yudkowsky-isms and repeats of material he has posted before; it needs a good summary and editing to pare it down. I’m surprised they posted it to the MIRI blog in its current form.
Edit: As stated below, I agree with all the points of the article, and consider it an important message.
Any RSS feeds?
Eliezer thinks it’s a big deal.
Even in that case, whichever actor has the most processors would have the largest “AI farm”, with commensurate power projection.
That interview is indeed worrying. I’m surprised by some of the answers.
Great news! I’ve been waiting for this kind of thing.
More likely, he also “always thought that way,” and the extreme story was written to provide additional drama.
Thank you for replicating the experiment!
Somewhat upper-middle-class job; low cost of living; inexpensive hobbies; making donations a priority.
I donated $5000 today and continue my $1000 monthly donations.
I feel, and XiXiDu seems to agree, that his posts require a disclaimer or official counterarguments. I think it’s appropriate to point out when someone has made it a major part of their life to collect and spread every negative aspect of a community they can find.
So MIRI and LW are no longer a focus for you going forward?
Note XiXiDu preserves every potential negative aspect of the MIRI and LW community and is a biased source lacking context and positive examples.
Skin reacts to light, too.
tl;dr: buy index funds, like the Vanguard Total Stock Market Index Fund, because money can be turned into a great many utilons after holding it for a long time.
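The “holding it for a long time” point is just compound growth. A minimal sketch of the arithmetic (the 7% annual real return and the dollar amounts are illustrative assumptions, not figures from the comment, and this is not financial advice):

```python
# Illustrative compound-growth arithmetic under an ASSUMED 7% annual real return.

def future_value(principal, annual_return, years):
    """Value of a lump sum after compounding once per year."""
    return principal * (1 + annual_return) ** years

def future_value_monthly(monthly, annual_return, years):
    """Value of a stream of monthly contributions, compounded monthly."""
    r = annual_return / 12   # approximate monthly rate
    n = years * 12           # number of contributions
    return monthly * (((1 + r) ** n - 1) / r)

# A hypothetical $5,000 lump sum held 30 years roughly 7.6x's in real terms:
lump = future_value(5_000, 0.07, 30)

# Hypothetical $1,000/month contributions over the same 30 years:
stream = future_value_monthly(1_000, 0.07, 30)

print(f"lump sum: ${lump:,.0f}, monthly stream: ${stream:,.0f}")
```

Under these assumptions the lump sum ends up around $38k and the monthly stream around $1.2M, which is the sense in which deferred money buys many more utilons later.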