Over-analysing unimportant decisions can make you better at analysing. If you want to learn how to use a hammer well, it can be useful to solve all kinds of problems with the hammer even if the hammer isn’t the best tool to solve them.
By your logic you could call Bezos's decision to have a desk made from a door pseudo-frugality. That doesn't change the fact that making decisions like that is how he became one of the richest people on the planet.
Fighting for the truth, even when you're burning more social capital than the argument is worth, is a symbolic act that shows you value truth. This means you shift the cultural norm in the direction of valuing truth. It also shifts your own norms toward being more truthful and makes you more likely to focus on truth in other cases where it actually matters.
There's nothing irrational about valuing the symbolic value of an act at more than zero.
I suppose there are some game-theoretic considerations here. If people can silence you by imposing more than a certain amount of disutility on you, then they have an incentive to hurt you whenever you say something they don't like.
And I also agree that there is value in building the virtue of truthfulness. And that the symbolic act may inspire more than if you were just being pragmatic.
Hmm… but at the same time, I don't think that social forces can be completely ignored. They are just too powerful. Always fighting for the truth will significantly compromise your ability to achieve your other objectives. I guess I'm against uncritical acceptance of this more than anything. Perhaps some people will find that their comparative advantage actually is being a public figure who is always forthright, though I suspect that is a rather small proportion of the population.
(Adding more detail: it's actually much easier for people who aren't as good at thinking for themselves to declare themselves in favour of always telling the truth no matter the cost, because they don't hold any truly controversial or damaging views outside of their social group. Or if they do, hypocrisy plus cognitive dissonance can solve that problem. True rationalists don't have this out, so the cost is much higher for them.)
Hmm… but at the same time, I don’t think that social forces can be completely ignored. They are just too powerful.
I don't think doing something for its cultural meaning is ignoring social forces. Saying that things shouldn't be done for their cultural meaning looks to me much more like ignoring social forces.
Commitments to strategies and cultural values can be useful.
On the personal level, having clear values helps reduce akrasia. On the organisational level, cultural values lead organisations to use shared heuristics for decision-making.
If a new employee sees the Amazon door desk and asks other employees about it, they will get a speech about frugality and see that Amazon takes frugality seriously.
Making symbolic decisions like that is a best practice for how startups create a company culture that's more than a strategy document nobody reads.
It’s actually much easier for people who aren’t as good at thinking for themselves to declare themselves in favour of always saying the truth no matter the cost
That wasn't what we were talking about. We weren't talking about declaring oneself in favour of truth but about actually fighting for it.
There are people who profess to hold values but don't follow them. Seeing skepticism as a value and then acting skeptically is a simple expression of living one's values.
We can argue about whether skepticism or fighting for truth are good values to have, and that might differ from person to person, but there's no a priori reason to argue that holding either of those values isn't rational.