Ditto with “conspiracy.” I’d argue that giving LW the language and trappings of a 12-year-old boys’ club is ultimately detrimental to its mission, but it looks like I’m in the minority.
The business about the Bayesian Conspiracy is, I think, more an in-joke than anything else. Eliezer’s written various bits of fiction set in a future world featuring an actual “Bayesian Conspiracy”, and he’s on record as saying that there’s something to be said for turning things like science and rationality into quasi-mystery-religions (though I expect he’d hate that way of putting it) -- but he’s not suggesting that we actually should, nor trying to do so.
Dunno whether such things help or hinder the mission of LW. I think it would be difficult to tell.
It just seems at odds with the scientific ethos of cutting out the bullshit whenever possible. Instead, Eliezer seems bent on injecting bullshit back into the mix, which I’d argue comes at the expense of clarity, precision, and credibility. That said, I do realize it’s a calculated decision intended to give normally dry ideas more memetic potential, and I’m not in a position to say the trade-off definitely isn’t worth it.
Deliberately so. The original OB posts started with it as a thought experiment, “what if we kept science secret, so people would appreciate its Awesome Mysteries?”
Despite that, I think that whole style is a tremendous mistake. It’s an interesting thought experiment, but we should be clear that it runs completely counter to the things that actually bring about accurate results.