Tyler Cowen reviews The Gridlock Economy here. Alex Tabarrok reviews Against Intellectual Monopoly here. I was sympathetic to the point the latter was making, but I liked Alex’s contrasting point about finding an optimum.
teageegeepea
I like Eliezer’s writing, but I think he himself has described his work as “philosophy of AI”. He’s been a great popularizer (and kudos to folks like him and Dawkins), but that’s different from having “produced significant insights”. Or perhaps his insight is supposed to be “We are really screwed unless we resolve certain problems requiring significant insights!”.
If you believe morality is impossible without God, you have a strong disincentive to become an atheist.
That idea is distinct from whether or not God exists. In arguments about evolution I have made the point that it is compatible with both the existence and non-existence of God. I did not say that to “leave a line of retreat” for the Christian anti-Darwinist I was talking to, but because I believed he held an incorrect notion of what Darwinian evolution is. The notion of leaving a line of retreat for others seems less rationalist, for it implies valuing certain beliefs themselves rather than the truth per se. The reason to leave a line of retreat for yourself is because it is always possible you are wrong about something and so it is wise to be prepared.
I’m also sympathetic to logical positivism. I mentioned in the OB thread on behaviorism that I inclined toward the theory being scoffed at, but the experiments regarding mental rotation that Pinker has written about show it to be an inferior theory of cognitive science compared to a computational one. Still better than psychoanalysis, though!
Eliezer’s posts on entangled lies and the pebble make me skeptical of whether the cake-in-the-sun example qualifies. The photons-outside-light-cone example is better. Quine’s Two Dogmas of Empiricism convinced a number of people that pragmatism was better than positivism, and on a pragmatic level we could say we just don’t care about those photons.
As a Stirnerite too apathetic and unsociable to pursue even a Union of Egoists, I have no helpful advice to give other than nitpicks.
It seemed very odd to me that Eliezer seemed to imagine hunter-gatherer bands as intentional communities (which I admit to also being interested in on an abstract level) rather than tribes of related individuals, a sort of proto-clan. More like the ideal of the National Anarchists than Seasteaders, however less appealing we may find the former. Eliezer seems to endorse something like antinatalism, which runs contrary to successful tribalism. The Shakers disappeared pretty quickly, because you can’t just rely on converts and people are naturally going to be attracted to more pro-natalist institutions.
I agree with Brad Taylor that certain factors we might consider irrational are integral to the success of religious institutions. Using one of Hopefully Anonymous’ favorite phrases, successful institutions are non-transparently about self-perpetuation and will sacrifice other ideals (thus seeming irrational from that idealistic perspective) to serve that purpose.
The idea of infiltrating an institution to take it over is known as “entryism” and is most closely associated with Trotskyites.
“and India the acknowledged master of the third definition” I thought that was more a pop-culture cliché than actual conventional wisdom, and even in pop culture East Asian Buddhists might come out ahead. Measured simply in terms of being poor, India is fortunately ahead (or behind, if we consider poverty good) of a number of countries.
“Therefore, there is minimal possibility that any Indian people ever discovered interesting mental techniques” You’re turning a generalization into an absolute claim. A possible belief is that only “western/enlightenment” thinking produces interesting mental techniques, and so any Indians who did (of whom there are a very large number) would be part of the minority that were westernized. The majority of Indians should generally look westward, even though there are also many westerners who are unenlightened (though perhaps spiritual creationists).
“These lesions of the mind will in time also give lesions of the brain” Part of Szasz’s objection to much of psychiatry was that it didn’t look for brain lesions, and without those he claimed there was no “disease”. I would be interested to know if there is evidence of disorders of the mind leading to lesions of the brain rather than the other way round. The best case for that sort of thing is discussed here.
I also knew someone whose family won the lottery, though I don’t remember how much.
This problem seems even to afflict Mencius Moldbug. His ideology of formalism seems to be based on ensuring absolute unquestionable authority in order to avoid any violence (whether used to overthrow an authority or cement the hold of an existing one). At the same time, he tries to build the appeal of his reactionary narrative by highlighting how reactionaries are “those who lost” (in the terms of William Appleman Williams, whom Mencius would rather not mention), while the strong horse is universalism/antinomianism.
I wonder if Dan McCarthy read this post.
The me of the future is a different person. But screw that guy, I’m sticking up for the me of now!
Would that make them “not even wrong”?
Pete Boettke, one of Hanson’s colleagues, has a quick reaction to this post here. He links to a summary of “what they [Austrians] are claiming”, though I imagine his Austrian-critical GMU colleagues are already familiar with it.
Models are used for prediction in all sorts of domains. Each of us has a mental model (or “theory of mind”) of how others behave to a significant degree of accuracy. Economics often covers situations well outside the range of the evolutionary adaptive era for which our intuitive mental models don’t work as well. If modeling were truly useless, it wouldn’t matter if it was used “for the purpose of control” because it wouldn’t get you anywhere.
I wish Matthew Mueller’s Post-Austrian Economics blog were still up, because he made a good point about the unfortunate entanglement of Austrian economics with political libertarianism since Rothbard. This results in some of its adherents viewing people who think their method is flawed as political enemies. For the record, I still read sites like mises.org & Lew Rockwell (though to a lesser extent recently, due to all the competing distractions on the internet and my banning from the comments section of the former) and appreciate the work they do in bringing economics to a wider audience, even if they can exhibit the same flaws they point out in Rand’s circle.
What communities actually die in that way? If they don’t actually end but instead continue differently, then it’s like saying science fiction died because new authors with their newfangled take on the genre changed things (disclaimer: I don’t really know anything about science fiction).
In the case of spam there is a problem of high volume (raivo pommer estee is a good counter-example, as there’s generally no more than 1 per thread and it’s short), but otherwise I don’t really see the harm in idiots posting. Anybody is free to skip past stuff they don’t care about (I do it all the time), and people get value out of even reading stupid comments, so I don’t see what is so terrible that it outweighs that. I’m with Hopefully Anonymous on rating blogs by their comment policies.
It’s my impression that 4chan is about anime and lolcats. I hate both. It is also my impression that more people choose 4chan over here than choose here over 4chan. I think 4chan was set up to be just what it is. Is there a Less Wrong analogue that got turned into a 4chan?
Surprised no one linked to this yet on Somalia.
Sounds like what I’ve termed the apocalyptic imperative.
You seem to be taking the opposite tack as in this video, where rationality was best for everyone no matter their cause.