“Go back and not have children”
Ehh, I don’t think that’s a valid question to ask someone with kids. It’s effectively asking, “would you prefer your children not be alive right now?” or “do you consider your children mistakes now that you’ve raised them?”
I’m not sure what the optimal way to phrase the question would be but maybe:
“If your biological age was reset to 20, would you start another family?” Or
“If you could give advice to the parallel universe you who is 25 years younger, would you tell him to have kids?”
Hmm, those still aren’t great.
Hmm, yeah, I thought I remembered that quote having such a clause.
Typo just above “Basal Ganglia” section.
For example infants are born with a simple versions of a fear response, with is later refined through reinforcement learning.
“with is later” should be “which is later”
You’re right, “solution” has too much finality to it. How about “approach” as a replacement word that doesn’t break the grammar above?
True, but you do need a platform that promises, at the very least, a direction that your policies are taking you. If, during your term, you completely neglect everything you talked about while running, you’ll take a hit in the next election (unless you’ve miraculously been so effective during those 4–5 years that everyone is convinced you know better than them).
And if the point is to be the best liar and then do what you want in office, uh, why even have elections?
“solution to government” means “solution to the problem of how to organise society”.
If “except for all the others” only includes those that have been tried, then I mostly agree. But if it includes all possible forms of social organisation, I strongly disagree. The idea that we’ve reached the best solution and it barely works is similar to the idea that we will never solve death. Either of those could be true, but there is not nearly enough evidence to stop us from trying.
The complexity of politics that these arguments demonstrate (and the “error of the crowds” itself) makes democracy a seemingly futile solution to government. It would take an enormously skilled tactician to win the vote by selling actually useful policies to a population that prefers simple rhetoric aligning with their color.
They would need:
Knowledge and skill at creating policies.
Sufficient background in all areas that the policies affect (weighted by importance and enough to make proper use of their advisors).
Ability to raise money without making promises that severely limit them once elected.
Excellent rhetorical abilities. Skilled enough to convince people of varying degrees of intelligence and differing allegiances to side with them despite their lack of focus on the “sexy” (but meaningless) topics.
Excellent negotiating abilities. Fair representation means you will always have significant opposition once elected. Getting anything done will require tactical negotiating and efficient compromises.
...lots of other things.
But someone who wants power really only needs rhetoric and a PR team that can find them the correct issues to align with. There is something wrong here.
Teenage me, with rather too much confidence, would say that we need a benevolent dictator. Now, with rather less confidence in my world-organizing abilities, I prefer voluntarism in some form. It is… less of a lottery and far more elegant. I just need to figure out if it’s too idealistic to work.
Yes. This. And the details aren’t trivial. They make a huge difference in policy. From “do nothing” to “reduce all growth and progress immediately or we go extinct”.
They disagree in exactly the way gjm mentions below.
Experts are climate scientists and scientists in related fields. Some politicians may be included as ‘experts’ in terms of solutions, too, I suppose. They disagree about the severity, cause, timeline and solution. And not by some trivial amount, but by enough to drastically shift priorities.
Also, while this is a reply to Eliezer’s 2007 comment, I’m aware the situation has changed. I really just want to know how to begin to form a rational belief about climate change as of now.
I find climate change a strange issue. Not the situation itself but the public response and political tactics that are used.
On the surface, it looks like the vaccination controversies where one side goes “you guys are stupid for completely ignoring science”. The difference is, the science for vaccines is rock solid. There is a negligible chance of vaccines hurting you. And we have an extremely large amount of evidence. Not evidence from computer models or something theoretical, but actual data from millions of people being vaccinated.
Climate change, no matter your opinion, cannot be said to be this sure of a thing. Yet the tactics used are the same. “If you don’t take up our cause, you are the enemy of Science.” Science isn’t some deity. I don’t obey out of some appeal to authority. It’s useful because it can convince me with reason.
If the issue is trivial or unanimous, I may just accept the scientific consensus at face value. But for a possible existential threat that could either kill us or cost an unimaginable amount of money to prevent… And there are experts that disagree! And the consensus has changed several times in the last few decades! And politicians are pushing a certain direction! …I’m not being ideological here, am I? This isn’t a black and white issue is it?
I’m not sure what emotion it is, but I would hypothesize that it comes from tribal survival habits. Group cohesion was existentially important in the tribal prehuman/early-human era. Being accurate and correct with your beliefs was important, but not as important as sharing the same beliefs as the tribe.
So we developed methods of fitting into our tribes even when that required believing paradoxical and irrational things that should cause cognitive dissonance.
Link to Orwell’s paper is broken. New one: http://www.orwell.ru/library/essays/politics/english/e_polit/
So, when trying to form an opinion or position on climate change, what is a rational approach?
As far as I can tell the experts don’t agree and have all taken political positions (therefore irrational positions).
The “overcomplicating the question” link is broken and I can’t find the article on that site anymore. But this looks like the same one: http://www.yudkowsky.net/singularity/simplified/
And the next link is here, I think: http://www.yudkowsky.net/singularity/ai-risk/
“Technical explanation of technical explanation” link is broken.
Here’s a working one: http://www.yudkowsky.net/rational/technical/