I don’t think this is generally correct. Induction is about moving probability mass to both parameters and models that best explain the evidence
Is it? That isn’t the classic definition. The classic definition is fairly limited, more like:
Example: “For the past 7 days it has been raining. Therefore, tomorrow it will probably also rain.”
I.e., just more of the same, not an infinite variety of models.
It does sound like Bayes... but Bayes could be a superset of induction.
And where are you getting your models from? If you are creating them, that’s abduction, even if you are calling it induction. If they are already there, in some oracular database, that’s uncomputable ideal reasoning.
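To make the narrow, “more of the same” reading of induction concrete: the rain example has a classical quantitative form in Laplace’s rule of succession. This is my own illustration, not something in the thread, and it assumes a single fixed Bernoulli model with a uniform prior over the rain rate — notably, the model is supplied up front, with no hypothesis generation involved:

```python
from fractions import Fraction

def rule_of_succession(successes: int, trials: int) -> Fraction:
    """Laplace's rule of succession: the posterior predictive probability
    of one more success after observing `successes` in `trials`,
    under a uniform prior on the Bernoulli success rate."""
    return Fraction(successes + 1, trials + 2)

# 7 rainy days out of 7 observed days:
print(rule_of_succession(7, 7))  # 8/9
```

So “it rained 7 days, therefore it will probably rain tomorrow” comes out as 8/9 under these assumptions — but only because the single model was fixed in advance.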
This is a Machine-Learning-friendly way to see induction.
Or it’s something in ML that has been mislabelled “induction”, like “hallucination”
Bayes is complete: there is no other theorem or formula required to achieve the most accurate probability estimates for beliefs.
Complete what? It isn’t a complete epistemology, as I have shown.
(Also, Bayesian statistics is considered a subset of inferential statistics, which is the formal mathematics associated with induction.)
What does “associated with” mean? If inferential stats is a superset of induction it’s pretty unsurprising that it could contain abduction. If your models are being created on the fly, it actually does.
I was originally considering the standpoint of an “optimal Bayesian” who simultaneously evaluates all hypotheses at once by shifting probability mass, but this is far from the human experience
Indeed. If ideal Bayesians don’t need abduction, that doesn’t mean humans don’t.
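For concreteness, the “simultaneously evaluates all hypotheses at once by shifting probability mass” picture amounts to a normalized Bayes update over a pre-enumerated hypothesis set. The sketch below is mine, with invented toy numbers; the point to notice is that the hypothesis list itself must be supplied before any mass can be shifted:

```python
def bayes_update(priors: dict, likelihoods: dict) -> dict:
    """Shift probability mass across a fixed set of hypotheses.
    priors: hypothesis -> prior probability
    likelihoods: hypothesis -> P(evidence | hypothesis)"""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Toy hypotheses and likelihoods (invented for illustration).
# Where the list of hypotheses comes from, Bayes itself cannot say.
priors = {"wet season": 0.5, "dry season": 0.5}
likelihoods = {"wet season": 0.9, "dry season": 0.1}  # P(7 rainy days | h)
posterior = bayes_update(priors, likelihoods)
print(posterior)  # wet season -> 0.9, dry season -> 0.1
```

The update itself is mechanical; generating the entries of `priors` in the first place is the step the thread is calling abduction.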
focus on generating hypotheses/explanations/models
Why is that a bad thing?
That implies different permissible levels of making and breaking assumptions, choosing and changing models. It’s more fluid, less rule-bound, more willing to accept being knowingly wrong in some ways, less tied to formalisms and precise methods.
Yes. Hypothesis generation isn’t mechanical or algorithmic. That may be “bad”, but there’s not much alternative—you can’t actually use Solomonoff Induction, or whatever.
@AnthonyC
Why is that a bad thing?
Yes. Hypothesis generation isn’t mechanical or algorithmic. That may be “bad”, but there’s not much alternative—you can’t actually use Solomonoff Induction, or whatever.
I don’t think I implied it was a bad thing? I certainly didn’t intend to imply that.