Could the lines in the system prompt just be a result of a tester encountering the model referring to bugs in the code as “gremlins” a few times, simply because the phrase was a common way of referring to mysterious problems in older technical writing, rather than reflecting any unusual fixation on goblin-like creatures?
Maybe the tester and whoever wrote the system prompt were unfamiliar with the history of the word “gremlin” and assumed the model was referring to mischievous creatures as some kind of bizarre stylistic flourish or hallucination, then tried to correct that via the system prompt?
That doesn’t really explain the pigeons, owls, or raccoons.
Could just be an easter egg.
It might explain those as well. The idea is that the person in charge of writing the prompt heard a few reports from testers of the model mentioning “gremlins” (maybe occasionally mis-reported as “goblins”) and, not being familiar with the old aviation-industry meme, thought: “the model has a bizarre habit of referring to goblin-like creatures when discussing bugs; while discouraging that in the prompt, I’d better mention a range of similar creatures so that it doesn’t just switch to a slightly different variation of the inexplicable behavior.”
Or alternatively: maybe the prompt was written by the LLM itself, and it came up with that list in response to a user request along the lines of “make sure it doesn’t mention gremlins or similar creatures”.