I’ve heard enlightenment described as separating oneself from a mindset you are fused with.
An analogy I’ve seen is being so immersed in a video game or story that you’re emotionally invested as if it were real. Then you take a step back, realize it isn’t your entire existence, and its salience returns to a reasonable baseline for fiction. The deep emotional investment is gone, though you may still appreciate the story.
Is that analogy accurate in your opinion? Am I mischaracterizing it?
LessWrong is very good at taking known facts to their predictable conclusions, even when there isn’t a society-wide consensus on them, and especially when the conclusion is outside the norm. Examples include:
Predicting the extent of COVID by following the exponential
‘Feeling the AGI’ a few years before the mainstream, based on rapidly doubling compute and earlier developments that weren’t yet mainstream (like GPT-2)
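“Following the exponential” is just extrapolating a doubling time forward. A minimal sketch, with illustrative numbers rather than real case data:

```python
# Toy extrapolation: fit nothing, just project a known doubling time forward.
# The inputs (1,000 cases, 3-day doubling) are invented for illustration.

def project(cases_now: float, doubling_days: float, days_ahead: int) -> float:
    """Extrapolate exponential growth days_ahead into the future."""
    return cases_now * 2 ** (days_ahead / doubling_days)

# 1,000 cases doubling every 3 days becomes ~a million in a month.
print(round(project(1000, 3, 30)))  # → 1024000
```

The point of the exercise is that the conclusion (a thousand cases becoming a million in a month) follows mechanically from numbers that were publicly available, even before there was consensus on it.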
Currently unresolved examples might include:
Global population will not hit the predicted peaks because birthrates continue to fall, which the most prominent projections don’t account for (they assume birthrates will hold steady at their current low levels rather than keep declining).
IABIED (If Anyone Builds It, Everyone Dies)
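The birthrate point above can be made concrete with a toy model. This is not a real demographic projection; the rates and the fixed cohort size are invented assumptions, just to show how much the “holds steady” vs. “keeps declining” assumptions diverge over decades:

```python
# Toy comparison (invented numbers, not a real demographic model):
# cumulative births over 50 years if the birthrate holds at its current
# level versus continuing to fall ~1.5% per year.

def total_births(initial_rate: float, annual_change: float, years: int,
                 cohort: float = 1.0) -> float:
    """Sum births, assuming a fixed reproductive-age cohort size."""
    total, rate = 0.0, initial_rate
    for _ in range(years):
        total += cohort * rate
        rate *= 1 + annual_change
    return total

steady = total_births(1.5, 0.0, 50)       # assumption: rate holds steady
declining = total_births(1.5, -0.015, 50) # assumption: rate keeps falling
print(declining / steady)  # → roughly 0.71
```

Under these made-up parameters, the declining-rate scenario produces about 30% fewer births over the period, which is the kind of gap that determines whether the projected peak is ever reached.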