I agree with this. I find that roughly 50% of the time (a very rough estimate), when I say “I think this is what is going on in my head” and my OH disagrees, he’s right and I’m wrong. I have a strong tendency to rationalise, and I don’t think I’d be close to how successful I am with Alicorn-style luminosity without that sort of outside input (though admittedly I’m still pretty bad—that stuff is hard!). I reciprocate when he introspects as well.
I do still find it annoying and instinctively argue back, but the results spoke for themselves when I turned out to be wrong, and now I welcome it as an overall positive-utility interaction even though it still annoys me on an instinctive level.
I don’t think I’d be close to how successful I am with Alicorn-style luminosity without that sort of outside input
This nicely dovetails with Alicorn’s luminosity origin story: people in her life refused to believe claims about her own mental states, and this experience was so intolerable that she resolved to become an obvious expert on mental states. Now the circle is… complete?