Another great post, thanks Eliezer!
But if rationality is for you to win, shouldn’t you try to keep it a secret from others? Like, if you knew a way to make money in the stock market, would you spread it if that nullified your advantage?
Two things:

1. Advantage over others is not the only thing people care about.
2. A “rationality” developed in secret is unlikely to grow more powerful than whatever technology a single farmer from the Dark Ages could develop in a lifetime, which is to say, not impressive at all.
Regarding the second point: that’s why Guilds were created, and they were quite powerful in their day. Why do you think they’re called ‘trade secrets’?
From Newcomb’s Problem and Regret of Rationality (with emphasis added):

“Don’t mistake me, and think that I’m talking about the Hollywood Rationality stereotype that rationalists should be selfish or shortsighted. If your utility function has a term in it for others, then win their happiness. If your utility function has a term in it for a million years hence, then win the eon.”
So yes, if for you winning means making money, and your best strategy to do that is to take advantage of irrationality in the stock market, then you will be motivated to keep your methods of rationality secret.
If, on the other hand, your utility function has a term for others, then you will want to teach them to be rational and win.
Eliezer is trying to win by creating a Friendly AI. If he gets more people to help him, this will help him win. If he spreads rationality, this will get more people to help him. Thus, he is spreading rationality to help him win.
Winning isn’t necessarily zero-sum.
But modern technological civilisation didn’t take off until the guild system (keep it secret) was replaced by the patent system (publish it).