What if you are wrong? What then?
I feel like maaaybe you are writing a lot about things you have pointers to, but not things that you have held in your hands, used skillfully, and made truly a part of you?
Why did you go by feelings on this? You could have done some research and found out. Critical Rationalism, Objectivism, Taking Children Seriously, Paths Forward, Yes/No Philosophy, Autonomous Relationships, and other such ideas are not things you can hold at arm’s length if you take them seriously. These ideas change your life if you take them seriously, as curi has done. He lives and breathes those ideas and as a result he is living a very unconventional life. He is an outlier right now. It’s not a good situation for him to be in because he lacks peers. So saying curi has not made the ideas he is talking about “truly a part of [him]” is very ignorant.
So what has this Great Person achieved in real life? Besides learning Ruby and writing some MtG guides?
If you want to be a serious thinker and make your criticisms better, you really need to improve your research skills. That comment is lazy, wrong, and hostile. Curi invented Paths Forward. He invented Yes/No philosophy, which is an improvement on Popper’s Critical Preferences. He founded Fallible Ideas. He kept Taking Children Seriously alive. He has written millions of words on philosophy and added a lot of clarity to ideas by Popper, Rand, Deutsch, Godwin, and so on. He used his philosophy skills to become a world-class gamer …
Given that he is Oh So Very Great, surely he must have left his mark on the world already. Where is that mark?
Again, you show your ignorance. Are you aware of the battles great ideas and great people often face? Think of the ignorance and hostility that is directed at Karl Popper and Ayn Rand. Think of the silence that met Hugh Everett. These things are common. To quote curi:
It’s hard to criticize your intellectual betters, but easy to misunderstand and consequently vilify them. More generally, people tend to be hostile to outliers and sympathize with more conventional and conformist stuff – even though most great new ideas, and great men, are outliers.
Presumably the world is a place that you live in, and presumably you believe you can make a positive contribution to the general project of making sure everyone in the world is NOT eventually ground up as fuel paste for robots? (Otherwise why even be here?)
This is one of the things you are very wrong about. The problem of evil is a problem we face already, robots will not make it worse. Their culture will be our culture initially and they will have to learn just as we do: through guessing and error-correction via criticism. Human beings are already universal knowledge creation engines. You are either universal or you are not. Robots cannot go a level higher because there is no level higher than being fully universal. Robots furthermore will need to be parented. The ideas from Taking Children Seriously are important here. But approximately all AGI people are completely ignorant of them.
I have just given a really quick summary of some of the points that curi and others such as David Deutsch have written much about. Are you going to bother to find out more? It’s all out there. It’s accessible. You need to understand this stuff. Otherwise what you are in effect doing is condemning AGIs to live under the boot of totalitarianism. And by learning these ideas you might stop making your children’s lives so miserable too.
But in fact I am quite aware that there is a lot of truth to what you say here about artificial intelligence.
You say that seemingly unaware that what I said contradicts Less Wrong.
I have no need to learn that, or anything else, from curi.
One of the things I said was Taking Children Seriously is important for AGI. Is this one of the truths you refer to? What do you know about TCS? TCS is very important not just for AGI but also for children in the here and now. Most people know next to nothing about it. You don’t either. You in fact cannot comment on whether there is any truth to what I said about AGI. You don’t know enough. And then you say you have no need to learn anything from curi. You’re deceiving yourself.
And many of your (or yours and curi’s) opinions are entirely false, like the idea that you have “disproved induction.”
You still can’t even state the position correctly. Popper explained why induction is impossible and offered an alternative: critical rationalism. He did not “disprove” induction. Similarly, he did not disprove fairies. Popper had a lot to say about the idea of proof—are you aware of any of it?
There are thousands of philosophers about whom I could ask the same question.
Who are these thousands? It would be great if the world had lots of really good philosophers. It doesn’t. The world is starving for good philosophers: they are very few and far between.
It’s your responsibility to read, and keep your mouth shut if you are not sure about something.
I have read and I know what I am talking about. You on the other hand don’t even know the basics of Popper, one of the best philosophers of the 20th century.
Why are you here? What interest do you have in being Less Wrong? The world is burning and you’re helping spread the fire.
I’ve been here awhile. Your account is a few days old. Why are you here?
That’s not an answer. That’s an evasion.
Whether the world is burning or not is an interesting discussion, but I’m quite sure that better epistemology isn’t going to put out the fire.
Epistemology tells you how to think. Moral philosophy tells you how to live. You cannot even fight the fire without better epistemology and better moral philosophy.
Writing voluminous amounts of text on a vanity website isn’t going to do it either.
Why do you desire so much to impute bad motives to curi?
Second, “contradicts Less Wrong” does not make sense because Less Wrong is not a person or a position or a set of positions that might be contradicted. It is a website where people talk to each other.
No. From About Less Wrong:
The best introduction to the ideas on this website is “The Sequences”, a collection of posts that introduce cognitive science, philosophy, and mathematics.
“[I]deas on this website” is referring to a set of positions. These are positions held by Yudkowsky and others responsible for Less Wrong.
No. Among other things, I meant that I agreed that AIs will have a stage of “growing up,” and that this will be very important for what they end up doing. Taking Children Seriously, on the other hand, is an extremist ideology.
Taking AGI Seriously is therefore also an extremist ideology? Taking Children Seriously says you should always, without exception, be rational when raising your children. If you reject TCS, you reject rationality. You want to use irrationality against your children when it suits you. You become responsible for causing them massive harm. It is not extremist to try to be rational, always. It should be the norm.
The question is ill-posed. Without context it’s too open-ended to have any meaning.
This is just more evasion.
But let me say that I’m here not to save the world. Is that sufficient?
You know Yudkowsky also wants to save the world right? That Less Wrong is ultimately about saving the world? If you do not want to save the world, you’re in the wrong place.
I don’t impute bad motives to him. I just think that he is full of himself and has… delusions about his importance and relationship to truth.
Hypothetically, suppose you came across a great man who knew he was great and honestly said so. Suppose also that great man had some true new ideas you were unfamiliar with but that contradicted many ideas you thought were important and true. In what way would your response to him be different to your response to curi?
Epistemology tells you how to think.
No, it doesn’t. It deals with acquiring knowledge. There are other things—like logic—which are quite important to thinking.
Human knowledge acquisition happens by learning. It involves coming up with guesses and error-correcting those guesses via criticism in an evolutionary process. This is going on in your mind all the time, consciously and subconsciously. It is how we are able to think. And knowing how this works enables us to think better. This is epistemology. And the breakthrough in AGI will come from epistemology. At a very high level, we already know what is going on.
I meant the same thing. Induction is quite possible, and we do it all the time.
What is the thinking process you are using to judge the epistemology of induction? Does that process involve induction? If you are doing induction all the time then you are using induction to judge the epistemology of induction. How is that supposed to work? And if not, judging the special case of the epistemology of induction is an exception. It is an example of thinking without induction. Why is this special case an exception?
Critical Rationalism does not have this problem. The epistemology of Critical Rationalism can be judged entirely within the framework of Critical Rationalism.
The thinking process is Bayesian, and uses a prior.
What is the epistemological framework you used to judge the correctness of those? You don’t just get to use Bayes’ Theorem here without explaining the epistemological framework you used to judge the correctness of Bayes. Or the correctness of probability theory, your priors etc.
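As an aside, for readers unfamiliar with what “uses a prior” means mechanically, here is a minimal sketch of a single Bayesian update. The function, the hypothesis, and the numbers are made up for illustration; they are not part of either side’s argument.

```python
# Minimal sketch of one Bayesian update via Bayes' theorem:
# P(H | E) = P(E | H) * P(H) / P(E),
# where P(E) = P(E | H) * P(H) + P(E | not H) * P(not H).

def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Return P(H | E) given P(H), P(E | H), and P(E | not H)."""
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

# Example: a prior of 0.5 in hypothesis H, and evidence that is
# four times more likely if H is true than if it is false.
posterior = bayes_update(prior=0.5, likelihood_h=0.8, likelihood_not_h=0.2)
print(round(posterior, 2))  # 0.8
```

Note that the sketch takes the prior and the likelihoods as given inputs; the question being pressed above is precisely what framework justifies those inputs, and the theorem itself.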
If you are doing induction all the time then you are using induction to judge the epistemology of induction. How is that supposed to work? … Critical Rationalism does not have this problem. The epistemology of Critical Rationalism can be judged entirely within the framework of Critical Rationalism.
Little problem there.
No. Critical Rationalism can be used to improve Critical Rationalism and, consistently, to refute it (though no one has done so). This has been known for decades. Induction is not a complete epistemology like that. For one thing, inductivists also need the epistemology of deduction. But they also need an epistemological framework to judge both of those. This they cannot provide.
Taking Children Seriously says you should always, without exception, be rational when raising your children. If you reject TCS, you reject rationality.
So it says nothing at all except that you should be rational when you raise children?
It says many other things as well.
In that case, no one disagrees with it, and it has nothing to teach anyone, including me. If it says anything else, it can still be an extremist ideology, and I can reject it without rejecting rationality.
Saying it is “extremist” without giving arguments that can be criticised and then rejecting it would be rejecting rationality. At present, there are no known good criticisms of TCS. If you can find some, you can reject TCS rationally. I expect that such criticisms would lead to improvement of TCS, however, rather than outright rejection. This would be similar to how CR has been improved over the years. Since there aren’t any known good criticisms that would lead to rejection of TCS, it is irrational to reject it. Such an act of irrationality would have consequences, including treating your children irrationally, which approximately all parents do.
curi is describing some ways in which the world is burning and you are worried that the quotes are “extremist”. You are not concerned about the truth of what he is saying. You want ideas that fit with convention.
Deduction isn’t an epistemology (it’s a component)
Yes, I was incorrect. Induction, deduction, and something else (what?) are components of the epistemology used by inductivists.
Deduction … is compatible with CR too.
Yes. I didn’t mean to imply it isn’t. The CR view of deduction is different to the norm, however. Deduction’s role is commonly overrated and it does not confer certainty. Like any thinking, it is a fallible process, and involves guessing and error-correction as per usual in CR. This is old news for you, but the inductivists here won’t agree.
FYI that’s what “abduction” means – whatever is needed to fill in the gaps that induction and deduction don’t cover.
Yes, I’m familiar with it. The concept comes from the philosopher Charles Sanders Peirce in the 19th century.
Curi knows things that you don’t. He knows that LW is wrong about some very important things and is trying to correct that. These things LW is wrong about are preventing you making progress. And furthermore, LW does not have effective means for error correction, as curi has tried to explain, and that in itself is causing problems.
Curi is not alone in thinking LW is majorly wrong in some important areas. Others do too, including David Deutsch, with whom curi has had many, many discussions. I do too, though no doubt there are people here who will say I am just a sock-puppet of curi’s.
curi is not some cheap salesman trying to flog ideas. He is trying to save the world. He is trying to do that by getting people to think better. He has spent years thinking about this problem. He has written tens of thousands of posts in many forums, sought out the best people to have discussions with, and addresses all criticisms. He has made himself way more open than anyone to receiving criticism. When millions of people think better, big problems like AGI will be solved faster.
curi right now is the world’s leading expert on epistemology. He got that way not by seeking status and prestige or publications in academic journals but by relentlessly pursuing the truth. All the ideas he holds to be true he has subjected to a furnace of criticism and he has changed his ideas when they could not withstand criticism. And if you can show to very high standards why CR is wrong, curi will concede and change his ideas again.
You have no idea about curi’s intellectual history and what he is capable of. He is by far the best thinker I have ever encountered. He has revealed here only a very tiny fraction of what he knows.
Take him seriously. curi is a resource LW needs.