I think I have a case of rationalization paranoia. But I might just be rationalizing . . . I think the problem may be that when I try to find solutions to problems that have a big impact on how I view the world on a fundamental level, i.e. things I think are important, I get emotional. Then I start to want a specific worldview/theory to be true, but since I know that I want a specific theory to be right, I become very suspicious of every argument in favor of my preferred theory. In the end my judgment becomes severely clouded and I can’t tell what merely feels like a plausible argument from what actually is a plausible argument. Even if I conclude that theory A is right (the preferred theory), I still feel like I just rationalized (I might have).
What’s extra problematic is that this approach has borne fruit before, during my childhood (e.g. thinking about supernatural stuff and the like).
I’m still reading EY’s sequence “How To Actually Change Your Mind”, so I might yet resolve this meta-confusion . . .