The example is supposed to sound cultish because the people think alike. But I have a hard time seeing how a non-cultist anti-creationist group would produce different arguments against creationism.
Of course, the non-cultist group wouldn't all use the same welcome phrase, but that's not really the heart of what the example is supposed to illustrate.
There are multiple anti-creationist arguments out there, so if they all immediately jump to the same one, I'd be suspicious. But even beyond that, it's natural for humans to disagree about stuff, because we're not perfect Bayesians. If you see a bunch of humans agreeing completely, you should immediately think "cult", or at the very least "these people don't think for themselves". (I'd be much less suspicious if we replaced the humans with Bayesian superintelligences, however, since those actually satisfy Aumann's Agreement Theorem.)