I think the term means something like “you demonstrated truth-seeking character/virtue”.
Example: Someone (I forget who it was, sorry) came up with a novel AI alignment theory, and then they wrote a long post about how their own theory was deeply flawed. That post earned them Bayes points.
I’ve always interpreted it more literally.
Like, if we’ve just seen some evidence which Hypothesis A predicted with twice as much probability as Hypothesis B, then the probability of Hypothesis A grows by a factor of two relative to Hypothesis B. This doubling adds one bit in logspace, and we can think of this bit as a point scored by Hypothesis A.
By analogy, if Alice predicted the evidence with twice as much probability as Bob, we can pretend we’re scoring people like hypotheses and give Alice one ‘Bayes point’. If Alice and Bob each subscribe to a fixed hypothesis about How Stuff Works then this is not even an analogy, we’re just Bayesian updating about their hypotheses.
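The arithmetic in the two paragraphs above can be checked with a tiny sketch (the 0.5 / 0.25 likelihoods and equal priors are made-up numbers, just to make the ratio come out to 2):

```python
import math

# Alice and Bob each back a fixed hypothesis. Suppose Alice's hypothesis
# assigned the observed evidence probability 0.5 and Bob's assigned it 0.25,
# so Alice predicted the evidence with twice as much probability.
p_alice, p_bob = 0.5, 0.25

# Start from equal prior credence in the two hypotheses.
prior_alice = prior_bob = 0.5

# Bayes' rule: posterior is proportional to prior times likelihood.
posterior_alice = prior_alice * p_alice / (prior_alice * p_alice + prior_bob * p_bob)
posterior_bob = 1 - posterior_alice

# Alice's hypothesis grew by a factor of two relative to Bob's: 2/3 vs 1/3.
print(posterior_alice, posterior_bob)

# In logspace that doubling is exactly one bit -- one 'Bayes point' for Alice.
print(math.log2(p_alice / p_bob))
```

Running it gives posteriors of 2/3 and 1/3, and a log-likelihood-ratio score of exactly 1 bit.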
Maybe I’ve been misusing it or seeing it misused, but I thought it meant something more like “called a thing ahead of time” or “made a good prediction”, and that the person is therefore treated as more credible in the future?
Or maybe I’m the one who’s been misunderstanding it! I don’t think I have a great understanding of the term tbh so you’re probably right.
If that’s what it means then instead of “Bayes points”, Quinn could call it “credibility” or “predictive accuracy” or something.