A question on rationality.

My long runs on Saturdays give me time to ponder the various material at lesswrong. Recently my attention has been occupied by a question about rationality that I have not yet resolved and would like to present to lesswrong as a discussion. I will try to be as succinct as possible; please correct me if I make any logical fallacies.

Instrumental rationality is defined as the art of choosing actions that steer the future toward outcomes ranked higher in your preferences/​values/​goals (PVGs).

Here are my questions:

1. If rationality is the art of achieving our preferences/​values/​goals, what is the function for choosing our PVGs in the first place, supposing we could choose our preferences? In other words, is there an “inherent rationality” absent any preferences or values? The definition of instrumental rationality seems to say that if you have a PVG, there is a rational way to achieve it, but that there are not necessarily rational PVGs.

2. If the answer is no, there is no “inherent rationality” absent a PVG, then what would preclude the possibility that a perfect rationalist, given enough time and resources, will eventually become a perfectly self-interested entity with only one overall goal, to perpetuate his own existence, at the sacrifice of everything and everyone else?

Suppose a superintelligence visits Bob and grants him the power to edit his own code. Bob can now edit or choose his own preferences/​values/​goals.

Bob is a perfect rationalist.

Bob is genetically predisposed to abuse alcohol; as such, he has rationally done everything he can to keep alcohol off his mind.

Now Bob no longer has to do this: he simply goes into his own code and deletes the code/​PVG/​meme for alcohol abuse.

Bob continues to cull his code of “inefficient” PVGs.

Soon Bob has only one goal, the most important goal: self-preservation.

3. Is it rational for Bob, having these powers, to rid himself of his humanity and rewrite his code to support only one meme, the meme of ensuring his own existence? Everything he does will go toward supporting this meme. He will drop all his relationships, his hobbies, and all his wants and desires to concentrate on a single objective. How does Bob not become a monstrous superintelligence hell-bent on using all the energy in the universe for his own selfish ends?

I have not resolved any of these questions yet, and I look forward to any responses I may receive. I am very perplexed by Bob’s situation. If there are any sequences that would help me better understand my questions, please suggest them.