If you believe true things, then whatever moral framework you’re working within will be more effective, improving outcomes and making you a better person (for some definition of better person).
This feels self-congratulatory (obviously what I believe is true, so I’m also just morally better than the people who don’t believe what I do), but is it accurate? Do you guys think that believing true things helps make you moral?
Believing true things (and more importantly, not believing false things or depending on inappropriate models for prediction) will make your decisions more effective in meeting your goals.
It won’t necessarily make your goals more moral, nor give you the ability to act on more of the topics that matter (for instance, if you have severe enough impulse-control problems, knowledge itself won’t let you follow up on your belief that stabbing this barista for misspelling your name is a bad idea).
It may make you somewhat better able to see the connections from your behaviors to your goals, and this could make you think on longer time-scales and about larger groups of moral patients. If so, this makes you more morally focused.
There are, unfortunately, cases where knowing the truth tends to make people less moral. If you discover the truth that the bureaucracy you work for tends to reward loyalty over hard work, this will probably not make you a better worker.
In fact, most of the people we consider highly moral (Gandhi, Mother Teresa, MLK) come across as pretty nutty to ordinary people. Of course, you could argue they were following a higher truth. So perhaps the truth makes you more moral, but simply increasing the number of true things you know will not necessarily make you more moral.
Looking at the extremes of the situation:
If I am omniscient, that doesn’t make me omnibenevolent. I could surely see every consequence of my actions, know exactly what would be the moral choice, and still decide to act in an evil or selfish way. Knowing the truth makes it easier to be moral, should I choose to do so, but does not make me more moral.
If I am completely unable to foresee the consequences of my actions, then my “morality” from a consequentialist viewpoint can be no better than random chance. Faced with complete ignorance, I cannot choose to be moral or immoral; I can only do things and see what happens. Therefore some level of belief in true things is necessary to be a moral agent.
Interpolating from these endpoints, it seems that believing true things correlates not with morality so much as with moral agency.
As an aside, you can imagine a situation where you are omniscient and omnibenevolent, but live in a world without moral realism. If “truth” only includes information about what will happen, and not what moral theory is “correct”, then you’re still unable to make a moral choice.
FWIW, the philosopher William Wollaston’s magnum opus is devoted to defending the thesis that truth and morality completely overlap with one another: that to adhere to truth and to be moral are identical.
Here’s a free ebook version of his argument: https://standardebooks.org/ebooks/william-wollaston/the-religion-of-nature-delineated
And my summary of his argument: https://www.lesswrong.com/posts/P75rzmpJ62E2Qfr3A/truth-reason-the-true-religion