Low Conviction Opinions

NOTE: The original location of this event was Emily Murphy Park, but it has been changed to Dogpatch due to the possibility of rain.

The August meetup will be on Thursday August 17th at 7:00 PM.

Location: Dogpatch

Discussion Topic: Low Conviction Opinions

Tyler Cowen frequently points out that the AI risk crowd has strong opinions about AI risk but has generally failed to publish meaningful evidence of its concerns in reputable journals. I am inclined to agree, and I think this represents a larger problem for EA and rationalism in general. We already have rigorous scientific processes for assessing truth. EA—insofar as it is concerned with AI risk—forgoes these in favor of thought experiments and hypotheticals. Why should I believe that this is a good way to spend resources?

Okay, well, actually, I don’t really believe this. Well, okay, I do kinda sorta think that it might be true, I guess, but I haven’t been particularly invested in this topic. I’m certain that many others have better thoughts about this than I do.

What are your low conviction opinions? Do you have any intuitions about the world that you think have a high chance of being wrong?

Bring us one, and let us tell you why you are wrong.

Readings:

None! Just come equipped with any belief you hold that you think might be subject to change. Anything will do—it doesn’t have to be a rationalist-adjacent topic.