This post uses the example of GPT-2 to highlight something that’s very important in general: if you’re not concentrating, you can’t tell GPT-2-generated text, which is known to be gibberish, apart from non-gibberish.
And hence it teaches an important lesson, one that may be hard to learn on your own if you’re not concentrating: you can’t really get away with not concentrating.