Zach Furman comments on "How does GPT-3 spend its 175B parameters?"