There’s a lot I like about this post (I was mulling over a similar sort of post, spelling out what collection of norms I think would actually work best for a dedicated truthseeking space).
There are two crystallizations here that I like, which I’d been struggling to articulate: over the past year I’ve updated harder toward “yes, it’s really important for LessWrong’s highest value to be truthseeking, and not to make any tradeoffs for other things.” But something about that still nagged at me. I grappled a bit with it in Tensions in Truthseeking but wasn’t satisfied with my ability to articulate it.
But:
“You can have sacred or meta-level values that don’t trade off against non-sacred values… but you still have to figure out how to trade off sacred values against each other.”
And
“You should expect value to be fragile, and picking a single value to optimize is likely to leave your space impoverished”
feel like they do a better job of articulating my fear.