If the wise are truly wise, they wouldn’t judge your honesty as a binary choice. They would allow you to run more complex algorithms that are scrupulously honest in some situations but dishonest in others, and see those algorithms as separate clusters in person-honesty-given-situation space (see the sketch below).
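To make the "clusters in person-honesty-given-situation space" picture concrete, here is a minimal sketch. The situation labels and honesty scores are invented purely for illustration: each person is a vector of honesty-per-situation scores, and people are grouped by their whole profile rather than by a single averaged honesty rating.

```python
# Minimal sketch (hypothetical data): cluster people by their honesty profile
# across situations instead of collapsing them to one honest/dishonest label.
import numpy as np
from sklearn.cluster import KMeans

situations = ["casual_talk", "protecting_someone", "under_oath", "negotiation"]

# Rows are people, columns are estimated honesty in each situation
# (0 = always lies, 1 = always truthful). Values are made up for illustration.
honesty = np.array([
    [0.9, 0.2, 1.0, 0.6],   # honest, except when protecting someone
    [0.9, 0.3, 1.0, 0.5],
    [0.4, 0.4, 0.5, 0.3],   # uniformly unreliable
    [0.3, 0.5, 0.4, 0.2],
    [1.0, 1.0, 1.0, 0.9],   # scrupulously honest everywhere
])

# Two people land in the same cluster only if their whole
# honesty-given-situation profile is similar, not just their average honesty.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(honesty)
for person, label in enumerate(labels):
    print(f"person {person}: cluster {label}")
```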
I agree. The wise ought to recognise when you were forced into telling a lie because you valued something more highly than your reputation, and that, in an oddly self-nullifying way, should enhance your reputation. At least among the wise.
EDIT: Gods! I just noticed the accidental similarity to Newcomb’s problem (Omega, here “the wise”, is allocating reputation instead of money). I’ve been reading Yudkowsky for too long.