I’m not confident I know what you mean by “social truth”. Can you break that apart?
fiddlemath
Meetup : Madison: Prospect Theory
Meetup : Madison: Generating More Ideas
It wouldn’t be “barging in”, new folks are welcome!
On the other hand, if it’s uncomfortable for you to first show up when someone’s hosting it at their apartment, that’s pretty understandable. For exactly that reason, some of the next few weeks’ meetups are at a public place, usually Michelangelo’s coffee shop on State St. Next week’s, for instance.
Also, go ahead and sign up for our mailing list; some local stuff is posted there that doesn’t make its way to the main LW page.
Meetup : Madison: Reading Group, Seeing with Fresh Eyes
Oops, yes. Edited in original; thanks!
it has no syntax.
I’ve usually heard that as the reason to give Lisp to a new programmer. You don’t want them thinking about fine details of syntax; you want them thinking about manipulations of formal systems. Add further syntax only when syntax helps, instead of hinders.
What’s the argument for preferring a more syntax-ful language?
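To make the “no syntax” point concrete, here’s a toy s-expression evaluator, sketched in Python. The entire grammar is atoms and parenthesized lists, and every compound form has the same shape, so the learner’s attention goes to the formal manipulation rather than the notation. The names `tokenize`, `parse`, and `evaluate` are my own; this is an illustration, not any particular Lisp.

```python
import math

def tokenize(src):
    """Split a source string into '(' , ')' and atom tokens."""
    return src.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Read one expression: an atom, or a parenthesized list."""
    token = tokens.pop(0)
    if token == "(":
        expr = []
        while tokens[0] != ")":
            expr.append(parse(tokens))
        tokens.pop(0)  # discard the closing ")"
        return expr
    try:
        return int(token)
    except ValueError:
        return token  # a symbol

OPS = {
    "+": lambda *args: sum(args),
    "*": lambda *args: math.prod(args),
}

def evaluate(expr):
    """Every compound form is (operator operand ...); nothing else."""
    if isinstance(expr, int):
        return expr
    op, *args = expr
    return OPS[op](*(evaluate(a) for a in args))

print(evaluate(parse(tokenize("(+ 1 (* 2 3))"))))  # prints 7
```

Notice that the parser fits in a dozen lines: that uniformity is the whole argument for handing Lisp to a beginner.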
Certainly! If so, we should figure out how to turn geekdoms into ask cultures, when they aren’t already. Putting even marginally socially awkward people in situations where they have to guess other people’s intentions, while everyone is deliberately avoiding making those intentions common knowledge, is sort of cruel.
So this becomes a problem we can actually try to solve. In a relatively small environment, like a group of a dozen or so, what can one do to induce an “ask culture” instead of a “guess culture”?
(This should probably be a discussion post of its own… hm.)
Understood, but essentially no humans treat their own status hits as being of extremely low importance. This effect is so strong that directing other people to lower their status, even when it’s in their best long-term interest, is only rarely practical advice.
To try to answer the title’s question, rather than directly answer the post’s problem:
For the general problem of discerning pseudo-science from science, there’s Massimo Pigliucci’s Nonsense on Stilts. What I’ve read (and heard) by him seems like pretty sound stuff, but I haven’t read the book itself. Does anyone have strong opinions about this book?
Yes, I think so. It surely depends on exactly how I extrapolate to my “transhuman self,” but I suspect that its goals will be like my own goals, writ large.
Not quite so! We could presume that value isn’t restricted to the reals plus infinity, and instead let values range over the ordinals. Then you could totally say that a life has infinite value, but that two lives have twice that value.
But this gives non-commutativity of value: in ordinal arithmetic, ω + 1 > 1 + ω = ω, so saving a life and then getting $100 is better than getting $100 and then saving a life, which I admit seems really screwy. It also violates the von Neumann–Morgenstern axioms.
In fact, if we claim that a slice of bread has finite value while a human life has infinite value on any definition, then we violate the continuity axiom… which is probably a stronger counterargument, and tightly related to the point DanielLC makes above.
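To spell out the continuity violation, here is a sketch, using $\infty$ loosely for “infinite value” (the particular utilities assigned to bread and nothing are arbitrary choices for illustration):

```latex
% Continuity (vNM): if A \succ B \succ C, then some p \in (0,1) satisfies
%   pA + (1-p)C \sim B.
% Take u(A) = \infty (a life), u(B) = 1 (a slice of bread), u(C) = 0 (nothing).
% For every p \in (0,1):
%   p \cdot \infty + (1-p) \cdot 0 = \infty > 1 = u(B),
% so no lottery over A and C is ever indifferent to B, and continuity fails.
```

Any lottery with positive probability of the infinite outcome is itself infinitely valuable, so the bread can never sit strictly between the lottery and certainty.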
Meetup : Madison: Cached Selves
In that case, it sounds very, very similar to what I’ve learned to deal with—especially as you describe feeling isolated from the people around you. I started to write a long, long comment, and then realized that I’d probably seen this stuff written down better, somewhere. This matches my experience precisely.
For me, the most important realization was that the feeling of nihilism presents itself as a philosophical position, but is never caused or dispelled by philosophy. You can ruminate forever and find no reason to value anything; philosophical nihilism is fully internally consistent. Or, you can get exercise, and spend some time with friends, and feel better due not to philosophy, but to physiology. (I know this is glib, and that getting exercise when you just don’t care about anything isn’t exactly easy. The link above discusses this.)
The post above, and Alicorn’s sequence on luminosity (effective self-awareness), probably lay out the right steps to take, if you’d like to most effectively avoid these crappy moods.
Also, if you’d like to chat more, over Skype sometime, or via PM, or whatever, I’d be happy to. I’m pretty busy, so there may be high latency, but it sounds like you’re dealing with things that are very similar to my own experience, and I’ve partly learned how to handle this stuff over the past few years.
It has also been depressing, though, because I’ve since realized many of the “problems” in the world were caused by the ineptitude of the species and aren’t easily fixed. I’ve had some problems with existential nihilism since then and if anyone has any advice on the matter, I’d love to hear it.
You describe “problems with existential nihilism.” Are these bouts of disturbed, energy-sucking worry about the sheer uselessness of your actions, each lasting between a few hours and a few days? And did you have similar bouts of worry about other important-seeming questions before getting into LW?
I did, at first; and rethought it before I posted. And I figured that the same response was also roughly correct if it was a “dig at Alicorn.” Doing useful drudgery despite bystander effects is remarkable and surprising, so arch comments about someone not doing so would be silly.
Given that everyone around here is usually pretty reasonable, if prone to fallacies of transparency, I therefore assume that Eliezer’s actually giving straightforward applause, rather than being ironic. (If I’m wrong … well, that’d be useful to learn.)
If it is a dig, it ought not be. Doing useful drudgery despite bystander effects is remarkable and surprising, and should be applauded!
Wear comfortable shoes and, if you have one, a watch!
I get all this, I think. I didn’t realize you were equating “socially useful” and “socially true.”
I guess those might feel very similar; that one’s experience of the social use of a belief could feel a lot like truth. In fact, a belief seeming socially useful, a belief seeming not to cause cognitive dissonance, and a belief seeming epistemically true might be the same experience in other people’s heads—say, a belief feeling “right.”