An aspiring rationalist who picked a lousy time to move to Boston for the social scene, I came for the fanfic and stayed for the principle of charity. This led to me growing interested in another sort of effective charity, x-risk, and in meeting people who are weird in ways I find familiar and comfortable. If you’re ever in the Boston area and there isn’t a pandemic on, feel free to say hi.
I’d like to read things about turning rationality into money and/or power as well as things about introducing people to rationality. I commit to reading and commenting on the first ten examples of either that are brought to my attention.
Feedback and suggestions for improvement are very welcome!
It’s true that someone can easily get an excellent calibration score at the cost of getting no points. This tends to be very obvious when you read out the leaderboard. A quick patch is to turn all the questions into statements and have people estimate how likely it is that each statement is true. “What is the element with atomic number 29?” becomes “The element with atomic number 29 is Copper.” Then there is no easy path to excellent scores of either kind.
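To make the statement-style patch concrete, here is a minimal sketch of how those answers might be scored. It assumes a Brier-style (squared-error) scoring rule, which the comment above doesn’t actually specify; the statements and confidence values are hypothetical examples.

```python
# Sketch of the statement-style variant, assuming a Brier-style scoring rule
# (the original comment does not specify the exact scoring method).

def brier_penalty(confidence: float, is_true: bool) -> float:
    """Squared error between stated confidence and the truth (0 = perfect)."""
    outcome = 1.0 if is_true else 0.0
    return (confidence - outcome) ** 2

# Hypothetical statements in the patched format: a claim plus its truth value.
statements = [
    ("The element with atomic number 29 is Copper.", True),
    ("The element with atomic number 29 is Zinc.", False),
]

# A player who always answers at the extremes is penalized heavily whenever
# they are wrong, so there is no cheap route to a good score.
for claim, truth in statements:
    for confidence in (0.5, 0.9, 0.99):
        penalty = brier_penalty(confidence, truth)
        print(f"{claim!r} @ {confidence:.2f} -> penalty {penalty:.4f}")
```

Because this kind of rule is proper, the best long-run strategy is to report your honest probability, which is what removes the easy path to an inflated calibration score.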
That version is a little less fun, and I don’t think the change is necessary. I’m curious whether that patch would satisfy your objection. It might be relevant that I don’t view the goal as measuring calibration but as training it. When I’ve run this, I often see a rapid change in confidences over the course of the first dozen questions, as some people who hadn’t previously practiced the skill begin to use numbers other than the highest and lowest available.