Rational Animations’ main writer and helmsman
Writer
Introducing Rational Animations
If You Want to Find Truth You Need to Step Into Cringe
I’ve thought about your comment, and I think you are basically correct about everything, but my personal experience differs on how easy it is to actually hide the truths you know (either about yourself or the external world).
“If you honestly seek truth, and if you decide to tell 100% of the truth you’ve found, and if you decide to tell 100% of the truth you’ve found to anyone you can, to become the leader of truth-tellers, to make telling the truth your reason for existence, to make Telling All Of The Truths your livelihood, then you will appear cringe to the eyes of most people”.
From this paragraph it seems like you're making the case that you could say and do 90% of what you want to say and do and get away with it (remember that truth doesn't only influence what you say, but also what you do, and that might be very difficult to hide). In my experience it was more like 20%. I guess it varies depending on who you are and the social context you are in.
It seems likely to me that for most people, most of the time, a better ideal to aspire to is: you can honestly seek truth and yet decide to tell the truth judiciously, selectively (without a requirement of lies!) and thus appear non-cringeworthy to most people. (Probably cringe to some small subset.)
I feel like this applies to those truths that are absolutely impossible to say without alienating everyone. In real life you can get away with everything else by just being confident (I believe this with roughly 70% probability).
I think your definition of “reality” conveniently excludes the society we are embedded within, but the society we are embedded within is perhaps the most important part of the reality we need to navigate and influence to meet our goals.
I actually didn’t want to define reality in this way. The distinction between social reality and the rest of reality is useful in this context, but clearly social reality is reality. I don’t know if social reality is “the most important part” (for some people it surely is; I don’t think it is for me), but it is definitely something that has value and factors into decisions. I may have been too extreme in my script and given the wrong impression on a few points. My script-writing skills have room for improvement.
“When speaking truth, you shouldn’t worry about if the truths you are revealing will be laughed at.”
As described above I do not think this is absolutely true.
You wouldn’t be like a swordsman who keeps glancing away to see if anyone might be laughing at him; you’d be like a swordsman weighing which stance to take. And/or building the muscle memory to let the correct stances and moves “flow”. (Or something like that... I’m not a swordsman!)
You are definitely correct here. It is not absolutely true. I think I have made the mistake of generalizing my experience too much.
I think that cringe is probably a particular case of ugh fields. Good catch.
A lateral way of thinking about Cause X: barely visible but potentially enormous categories of value
An animated introduction to longtermism (feat. Robert Miles)
Well, there was some love for the person-affecting view at the end of the video. Note that one who subscribes to the totalist view might mourn not only every sperm but every potential worthwhile mind.
When beliefs become identities, truth-seeking becomes hard
There are transparent monsters in the world—part 1
Answering both of you: I agree that the topic is not as simple as it may seem. We may not like some things about future morality. I also think that if you hear large portions of the population screaming about something, then you’re probably not dealing with a transparent monster, or even necessarily something very important.
Transparent monsters at some point may become visible, and people will be loud about them. But the overwhelming majority of situations in which people are loud about something are not about transparent monsters.
We are failing to see how much better off humanity could be (transparent monsters part 2)
Sensory organs are just a necessary part of what would be needed to get a new kind of sensory experience that correlates with something in the outside world. Otherwise, if you don’t care about the outside world, they shouldn’t be necessary.
But I don’t understand 90% of your comment. Seriously confused about what this means: “When I for example feel where the feet of my dance partner happen to be it doesn’t feel like there’s a new sensory organ. The same goes for other cases where information that’s somewhere in my brain gets integrated in what I perceive.”
Robin Hanson’s Grabby Aliens model explained—part 1
The only way to explain that is some hypothesis where he’s not actually that early amongst the total humans to ever exist, which means we turn out not to be “grabby”.
You’ve rediscovered the doomsday argument! Fun fact: according to Wikipedia, this argument was first formally proposed by Brandon Carter, the author of the hard-steps model. He also gave the anthropic principle its name.
Edit: note that us not becoming grabby doesn’t contradict the model. There’s a chance that we will not. Plus, the model tells us that hearing alien messages or discovering alien ruins would be terrible news in that regard. I’ll explain the reason in the next part.
Robin Hanson, you know nothing about Robin Hanson. You first wrote the paper in 1996 and then last updated it in 1998.
… or so says Wikipedia; that’s why I wrote 1996. I’ve just made this clear in the video description anyway. Tell me if Wikipedia got this wrong.
Btw, views have nicely snowballed from your endorsement on Twitter, so thanks a lot for it.
Robin Hanson’s Grabby Aliens model explained—part 2
Congrats on finally finishing it!!
One way to preserve the pictures and the general format while making the story more easily readable is simply converting the books into PDFs. At least you could read them on a tablet, which is easier and less distracting than a PC. Tablets’ screens are also much better for reading than PC monitors, even if they aren’t as good as a Kindle’s. It would be a good compromise, I think.
I tried to convey a similar sentiment here, in the pinned comment:
Many people say that it’s impossible to predict the future. And yet predictions are at the core of science. If a belief says something about the world, then it makes predictions. The more accurate your models are, the better your predictions will be. Under this view, everyone can do magic. Your models of the world are your clairvoyance powers. Your actions, informed by your models, are your spells. The better your predictions are, the more powerful your spells.
Err… the comments here and on the EA Forum seem neutral to positive. So why am I being downvoted so much? Is it the post or the channel? If it’s the channel, I swear that better stuff is coming D: Maybe I should have just made a post after that stuff. Too bad, I guess. I didn’t expect this reception.