Mainly because the claims in this post weren’t presented as claims about what was present in the system prompt. The post presents information as “GPT-4 claims” in a way that’s consistent with getting fooled by hallucinations, and not consistent with having done the sort of work required to figure out what the prompt actually was.
Yup, to be clear: I never directly accessed the code interpreter’s prompt, so GPT-4’s claims about its constraints could be (and I expect at least a third of them to be) hallucinated.