Note: A main reason I expounded on this idea is that if somebody else has the same idea but doesn’t realize its pitfalls, that could turn out badly. If you vote this down until it’s hidden, then people searching to see whether anyone has had this idea before won’t be able to find my description of the pitfalls (I use ctrl-F on pages this long myself, and it doesn’t find text in hidden comments).
Patent every conceivable method of making an AGI in every country where it is possible to hold a patent, and refuse to let anyone produce an AGI similar to your patented design until safety is assured. This idea has various flaws and could potentially backfire (a TL;DR is in the conclusion section), but I am posting it anyway for a variety of reasons, explained in the last section. I have also posted several ideas that I think are better than this one.
This idea has the following flaws:
Patent protection is temporary: in the US, a utility patent expires 20 years after its filing date.
Patenting something doesn’t physically keep people from building it; it just gives you the right to sue them if they do. At best, it is a deterrent, not a guarantee.
This idea could backfire in the following ways:
Patenting an idea releases some information about it to the public. If someone decided to learn from the ideas described in your patent while ignoring the fact that you patented them (say, a government that wants AI for greater military power, or a country like China that doesn’t view intellectual property the same way), you would only be giving them an information advantage, not slowing them down.
Patenting an AGI plan under your name would also be an effective way of notifying advanced hackers who want to steal your data that you are an interesting target.
If somebody goes out of their way to patent the best AGI ideas they can think of, anyone who wants to build an AGI will have to top them. If nobody is capable and motivated enough to top the patented ideas, that offers some protection for the best ones. If some AGI researchers are both capable and motivated enough to top them, however, this could contribute to and even expedite an arms race rather than slowing it down. On the other hand, if all of the best AGI researchers formed a group that files patents for the purpose of preventing unsafe AGIs from being used, that would likely influence whether other researchers are legally allowed to use those ideas.
I’ve been told (though I am not a lawyer) that only the best of several competing designs is patentable. If this is true, then even if you manage to come up with a design better than all your competitors’ — and therefore patentable — you haven’t stopped anyone from making an inferior AGI. In fact, all you’ve done is ensure that the inferior variations on the design remain legal for everybody else to use. The net effect may be that you’ve forbidden the safest design while encouraging people to use less safe ones. If I am incorrect and any idea can be patented regardless of whether it is the best, this flaw does not apply.
Because patenting AGI plans could create an environment in which the AGIs that do get built are much more likely to be inferior designs, or to be built by people who don’t care about laws, it may make AGI more dangerous rather than delaying it.
Conclusion:
Patenting AGI ideas has a chance to slow AGI progress, but it also has a chance to make the following outcomes more likely, each of which is pretty bad on its own and worse in combination:
People who don’t heed laws will still be able to make AGI.
People who don’t heed laws may glean information from the filed patents and become more likely to make an AGI.
Law-abiding people will be less likely to make AGI, which lowers the chance that AGIs created by law-abiding people will counteract the effects of bad AGI.
AGIs with inferior designs become more likely.
Why am I posting this idea if I can see its flaws?
What if someone else has the same idea but does not know the ways in which it might backfire? If they search the internet to see whether someone has posted the idea before, finding this description may keep them from triggering those consequences.
Often a flawed idea is very useful as part of a comprehensive strategy to solve a problem. Some big problems require several solutions, not just one.
Often, ideas are inspired by other ideas. The more ideas are posted, the more opportunity there is for this type of inspiration.
Sometimes an idea that’s risky can be made safer in a way not obvious to the idea’s originator.
Some people have a hard time imagining how a problem as difficult as this one could be solved. Simply seeing that there are a bunch of ideas that could influence things, or that somebody is working on it, may help them avoid feeling that the problem is hopeless. This matters because people do not try to solve problems they believe are hopeless.
Patents also cost money, especially when filed in many countries: http://askville.amazon.com/international-patent-application-cost/AnswerViewer.do?requestId=3793448