[Question] How to estimate a pre-aligned value for a common discussion ground?

All sorts of pseudoscience circulate in the wild, such as "the earth is flat" and "the pandemic is a government-fabricated lie", and they keep extending their influence even as the rationalist community struggles to promote the rational-thinking paradigm, which I believe is a strong prerequisite for the Alignment question to be solved.

So the questions are:

  • How do we define "absolute truth" under the rational-thinking paradigm?

  • How do we forcefully teach this "absolute truth" to the general public, so that a common consensus can be presumed before any meaningful discussion takes place?

Edit: I find "forcefully" particularly unsuitable in this context, but I can't come up with a better expression at the moment. This question is mainly about the first point: not educating or convincing, but estimating a pre-aligned value.

Second edit: This question might not be well posed, since it merges a long-standing question with a manipulative intention I did not mean to express. I've changed the title to a more suitable one. Thanks for replying.

*Background: I've recently been struggling to explain generative ML models to some fellow artists with no ML background, but with little luck.*