Prize idea: Transmit MIRI and Eliezer’s worldviews

Motivation

Claim (80% confidence): At least 50% of the disagreement between people who align more with MIRI/Eliezer and those who align more with opposing clusters of views (Christiano, Garfinkel, etc.) is caused not by rational disagreement, but by more subconscious emotional stuff like tastes, vibes, culture, politics, etc.

In particular, I think Eliezer’s writing tends to appeal more to people who:

  1. Enjoy reading fiction, and are okay with points primarily being made via parables

    1. Similarly, don’t mind long pieces with no summaries and little clear organization of the claims being made and the evidence for them

  2. Aren’t turned off by perceived arrogance (I’m not taking a stance in this post on whether that arrogance is justified)

Past attempts to communicate their worldview, such as the MIRI conversations, have helped some, but I think they mostly weren’t attacking the core issue, which I currently guess is large differences in communication style.

For people who aren’t high on the above axes, I think Eliezer’s writing often tends to be fairly hard to read and off-putting, which is unfortunate. It leads to people not taking the points as seriously as they should (and perhaps has the opposite effect on those the style resonates with). While I disagree with MIRI/Eliezer on a lot of topics, I agree with them to some extent on a lot, and I think it’s very valuable to understand their worldview and build a “MIRI-model”.

I’ve written briefly about my personal experience taking Eliezer and MIRI less seriously than I should have here. I still haven’t read most of the sequences and don’t intend to read HPMOR, but I now take MIRI and Eliezer’s worldviews much more seriously than I used to.

Proposal

An idea to mitigate this is to give out substantial prizes for write-ups which transmit important aspects of the MIRI/Eliezer worldview in ways which are easier for people with different tastes to digest.

A few possible candidates for this “transmission” include (not an exhaustive list!):

  1. The Sequences

  2. Eliezer’s and Nate’s portion of the MIRI conversations

  3. 2022 MIRI Alignment Discussion

    1. AGI Ruin: A List of Lethalities may be a good candidate, though, while extremely important, I think it’s already decently structured and argued relative to some other pieces

I propose prizes of ~$1-10k (weighted by importance) for pieces that do a good job at this, judged by some combination of how clear they are to those with different tastes from MIRI/Eliezer and how faithfully they represent MIRI/Eliezer’s views. I’d be able to commit ~$1-5k myself, and would welcome commitments from others.

But won’t the write-ups do a poor job conveying MIRI’s intuitions?

If a write-up is getting popular but contains serious mistakes, MIRI/Eliezer can chime in and say so.

Isn’t MIRI already trying to communicate their view?

I think so, but it seems great to also have outsiders working on the case. I’d guess there are advantages and disadvantages to both strategies.

Have I talked to MIRI about this?

Yes. Rob suggested going for it and reiterated the above, saying MIRI’s involvement shouldn’t be a bottleneck to investigating MIRI’s worldview.

Next steps

I’d love to get:

  1. Feedback on whether this is a good idea, and suggestions for revision if so

  2. Volunteers to help actually run the prize and judge it, since I’m not sure I want to do that myself

  3. Commitments to the prize pool

  4. Other ideas for good pieces to transmit