Rationality, Transhumanism, and Mental Health

My name is Brent, and I’m probably insane.

I can perform various experimental tests to verify that I do not perform primate pack-bonding rituals correctly, which is about half of what we mean by “insane”. This concerns me from a utilitarian perspective (separation from the pack makes ego-depletion problems harder, makes resources harder to come by, and simply sucks to experience “from the inside”), but it is not what concerns me most.

The thing that concerns me most is this:

What if the very tools that I use to make decisions are flawed?

I stumbled upon Bayesian techniques as a young child. I was lucky enough to have the opportunity to do a lot of self-guided artificial intelligence “research” in junior high and high school, because I grew up in a time and place when computers were utterly mysterious, so no one could really tell me what I was “supposed” to be doing with them. So I started making simple video games, had no opponents to play them against (due to the aforementioned failure to correctly perform pack-bonding rituals), decided to create my own, became dissatisfied with the quality of my opponents, and suddenly found myself chewing on Hofstadter and Wiener and Minsky.

I’m filling in that bit of detail to explain that I have been attempting to operate as a rational intelligence for quite some time, so I believe I’ve become very familiar with the kinds of “bugs” I tend to exhibit.

I’ve spent a very long time attempting to correct for my cognitive biases, to edit out tendencies to seek comfortable-but-misleading inputs, and otherwise to “force” myself to be rational, and often the result is that my “will” will crack under the strain. My entire utility-table will suddenly flip on its head and attempt to maximize my own self-destruction rather than allow me to continue torturing it with endlessly recursive, unsolvable problems that all tend to boil down to “you do not have sufficient social power, and humans are savage and cruel no matter how much you care about them.”

Most of my energy is spent attempting to maintain positive, rational, long-term goals in the face of some kind of regedit-hack of my utility-table itself, coming from somewhere in my subconscious that I can’t seem to gain write-access to.

Clearly, the transhumanist solution would be to identify the underlying physical storage where the bug is occurring, and replace it with a less-malfunctioning piece of hardware.

Hopefully, someday, someone with more self-control, financial resources, and social resources than I have will invent a method to do that, and I can get enough of a partial personectomy to create something viable with the remaining subroutines.

In the meantime, what is someone who wishes to be rational supposed to do, when the underlying hardware simply won’t cooperate?