I understand some. But I think you’re mistaken and I don’t see a lot to like when judged by the standards of good philosophy. Philosophy is important. Your projects, like inventing an AI, will run into obstacles you did not foresee if your philosophy is mistaken.
Of course I have the same criticism about people in all sorts of other fields. Architects or physicists or economists who don’t know philosophy run into problems too. But claiming to have an epistemology, and claiming to replace Popper, those are things most fields don’t do. So I try to ask about it. Shrug.
I think I figured out the main idea of Bayesian epistemology. It is: Bayes’ theorem is the source of justification (this is intended as the solution to the problem of justification, which is a bad problem).
But once you start doing the math, that claim gets ignored, and you get things right (at least given the premises, which are often unrealistic, following the proud tradition of game theory and economics). So I should clarify: that’s the main philosophical claim. It’s not very interesting. Oh well.
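For context, the math in question is just Bayes’ theorem applied to update a prior probability. A minimal sketch, with hypothetical numbers chosen purely for illustration (none come from this discussion):

```python
# Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)
# All numbers below are hypothetical, for illustration only.
prior = 0.01              # P(H): prior probability of hypothesis H
likelihood = 0.9          # P(E|H): probability of evidence E if H is true
false_positive = 0.05     # P(E|~H): probability of E if H is false

# Total probability of observing E (law of total probability)
evidence = likelihood * prior + false_positive * (1 - prior)

# Posterior probability of H given the evidence E
posterior = likelihood * prior / evidence
print(round(posterior, 4))  # 0.1538
```

Note that the update rule itself works mechanically from the premises; whether those premises are justified is the separate philosophical question at issue here.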
No. See here, where Eliezer specifically says that this is not the case. (“But first, let it be clearly admitted that the rules of Bayesian updating, do not of themselves solve the problem of induction.”)
I had already seen that.
Note that I said justification, not induction.
I don’t want to argue about this. If you like the idea, enjoy it. If you don’t, just forget about it and reply to something else I said.