Noting that I don’t think pursuing truth in general should be the main goal: some truths matter way, way more to me than other truths, and I think that prioritization often gets lost when people focus on “truth” as the end goal rather than e.g. “make the world better” or “AI goes well.” I’d be happy with something like “figuring out what’s true specifically about AI safety and related topics” as a totally fine instrumental goal to enshrine, but “figure out what’s true in general about anything” seems likely to me to be wasteful, distracting, and in some cases counterproductive.
I think the more precise thing LW was founded for was less plainly “truth” but rather “shaping your cognition so that you more reliably attain truth”, and even if you specifically care about Truths About X, it makes more sense to study the general Art of Believing True Things rather than the Art of Believing True Things About X.