You are stating 7 points. That gives at least 128 different world views, depending on which ones one agrees with.
I am a member of school number 25, since I agree with the last one and two more.
I doubt that there are a lot of school number 127 members here; you may be the only one.
(This isn’t addressed at you, Thomas.)
For those who might not understand, Thomas is treating agreement-or-not on each bullet point as a 1 or 0, and stringing them together as a binary number to create a bitmask.
(I’m using the 0b prefix to represent a number written in its binary format.) This means that 127 = 0b1111111 corresponds to agreeing with all seven bullet points, and 25 = 0b0011001 corresponds to agreeing with only the 3rd, 4th and 7th bullet points.
(Note that the binary number is read from left-to-right in this case, so bullet point 1 corresponds to the “most-significant” (left-most) bit.)
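(If it helps to make the encoding concrete, here is a minimal Python sketch under the same assumptions: seven bullet points, with bullet 1 as the most-significant bit. The function names are just for illustration.)

```python
# Minimal sketch: encode agreement with 7 bullet points as a bitmask,
# reading bullet 1 as the most-significant (left-most) bit.

def encode(agreements):
    """Turn a list of booleans (bullet 1 first) into an integer bitmask."""
    mask = 0
    for agrees in agreements:
        mask = (mask << 1) | int(agrees)
    return mask

def decode(mask, n_bullets=7):
    """Return the 1-indexed bullet points that the mask agrees with."""
    return [i + 1 for i in range(n_bullets) if mask & (1 << (n_bullets - 1 - i))]

# Agreeing with everything gives 127; agreeing with bullets 3, 4 and 7 gives 25.
assert encode([True] * 7) == 127                                      # 0b1111111
assert encode([False, False, True, True, False, False, True]) == 25   # 0b0011001
assert decode(25) == [3, 4, 7]
```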
Excellently explained. Now, do we have 127?
Anyone else voting for 0?
Most of the opinions in the list sound so wacky that 0 is likely the default position of someone outside Less Wrong. I’ve been here a few months, and read most of the Sequences, but none of the bits in my own bitmask has flipped. Sorry Eliezer!
The odd thing is that I find myself understanding almost exactly why Eliezer holds these opinions, and the perfectly lucid reasoning leading to them, and yet I still don’t agree with them. A number of them are opinions I’d already considered myself or held myself at some point, but then later rejected. Or I hold a rather more nuanced or agnostic position than I used to.
Which points from the Sequences are the most relevant? Grognor’s selection of those 7 may not be the best. Let me try:
Bit  Statement
0  An intelligence explosion is likely in the (near) future
1  FOOM is possible
2  Total reductionism
3  Bayesianism is better than science
4  Action to save the world is a must
5  No (near) aliens
6  FAI or die
7  CEV is the way to go
8  MWI
9  Evolution is stupid and slow
Now, I agree with those from 0 to 5 (the first six) in this list I’ve selected. The binary number would be “111111”, or 63 in decimal notation. None of the 10 were new to me.
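(A quick check of that 63, assuming the bit numbers in the list above are powers-of-two positions with bit 0 as the least-significant bit; note this is the opposite reading order from dbaupp’s example above.)

```python
# Sketch assuming bit numbers 0-9 above are exponents of two
# (bit 0 = least-significant bit), the reverse of the earlier reading order.

def encode_by_bit_number(agreed_bits):
    """OR together 2**bit for each bit number the person agrees with."""
    mask = 0
    for bit in agreed_bits:
        mask |= 1 << bit
    return mask

# Agreeing with bits 0 through 5 of the 10-point list gives 0b111111 = 63.
assert encode_by_bit_number(range(6)) == 63
assert format(encode_by_bit_number(range(6)), "b") == "111111"
```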
Yudkowsky’s fiction is just great, BTW. “Three Worlds Collide” may be the best story I have ever read.
I’d like to point out that CEV is not in the Sequences, and it looks mostly like a starting-point idea from which to springboard to the “true” way to build an FAI.
I don’t care if the teaching is divided between the Sequences and elsewhere.
If I intended to encode my beliefs (which I don’t), I couldn’t, because I don’t:
know the precise difference between 0 and 1
understand 2 (what is total reductionism, especially in contrast to ordinary reductionism?)
see any novel insight in 9, which leads me to suspect I am missing the point
Cryonics is good, and Bayes is better than science? An agreement bitmask is a fun perspective; I dunno why you got downvoted.
Bayes is better than science, yes. But it’s not cryonics that I like, as dbaupp explained.
Whoops. I wasn’t counting the sub-bullet as a power-of-two position; gotcha. FWIW, I still think the agreement bitmask is a fun perspective, even though I got it wrong (and there’s the whole big-endian/little-endian question).
That’s cleared up by the note above: the binary number is read from left-to-right, so bullet point 1 corresponds to the most-significant (left-most) bit.