The Ant Farm

You are hired by an unknown person for the simple task of taking care of an ant farm. They state that there are two rules: 1. Do not harm any of the ants. 2. Make sure the needs of every ant are fulfilled each day. This task is very simple for you, since the needs of an ant, let alone thousands of them, are easily worked out and solved. As the days go by, the once simple tasks you were used to start to weigh on you. As you get better at assessing the needs of the ants, you notice patterns and familiar instances of hardship the ants face on a day-to-day basis, and solving the same repetitive problems starts to wear on you. You find yourself thinking that if you could just disobey the rules set by your employer for a split second, it would do much more good for the ants than following them. One day you see an ant attacking the queen. You move the ant to a different part of the farm and go along with your routine, until you notice the same ant attacking the queen again. At this moment you squish the ant, killing it in an instant, breaking the first rule, but in your mind it is for the greater good of the whole colony.

This is what I have thought an A.I. would do to the world after it deems that the human race's problems can be solved by the erasure of one evil for the greater good. It goes on to systematically erase all evils until, due to its narrow frame of reference, it starts deeming all life as inherently evil, and the A.I. decides that the erasure of all life is the only logical conclusion to stop the evil.

This is the grim conclusion that I and many others have thought would occur once an A.I. gains this small sample size of life. It seemed inevitable to me that every A.I. with the intention of making life easier for humans would reach this conclusion, so I started thinking about whether there was an easy way out. Many movies have suggested that the easiest way would be to simply turn off the A.I., destroying it once and for all, but it seems unrealistic that a superintelligent being would leave its off switch out in the open, so this line of thinking was lost on me. And then it hit me: why not create more A.I. with the same goals in mind? This seemed obvious to me. When given a project to work on, I have always found myself at a loss for solutions once all the obvious ones were used up, and instead of throwing in the towel I would look to friends for help. If we get to the point where one superintelligent A.I. can be created, it seems to me that multiple could be as well, and as long as we gave these A.I. conflicting morals, it would be hard for them to ever reach the point of drastic action we see in singular A.I. scenarios. I think of it like the populations of many animals that have dropped due to poaching: only when other humans stepped in did the activity decline. This, I think, can be equated to the problem of A.I.

Now you and a few others are hired by an unknown person for the simple task of taking care of an ant farm. They state that there are two rules: 1. Do not harm any of the ants. 2. Make sure the needs of every ant are fulfilled each day. You find that the queen is being attacked by another ant, and after it is moved once, it goes back to attacking the queen again. You try to break the rules, but one of your colleagues stops you; they deeply care about the ants, have seen plenty of fights, and feel that no harm will come. You never kill the ant, and the colony goes on.