Name: Alex Hedtke Age: 27
Came here via HPMOR, stayed for the rationality. Organizer for the Kansas City Rationalists. Founder and co-organizer for Kansas City Effective Altruism. Founder and co-CEO of ‘The Guild of the ROSE’.
A strong correlation between adopting the virtues and established methods of rationality, and an increased quality of life; but yeah, it's more hand-wavy than that. I don't even know what calculations could be made. That's sort of why I'm here.
Yes, but they could all be explained by the fact I just sat down and bothered to think about the problem, which wouldn’t exactly be an amazing endorsement of rationality as a whole.
I also don’t look at rationality as merely a set of tools; it’s an entire worldview that emphasizes curiosity and a desire to know the truth. If it does improve lives, it might very well simply be making our thinking more robust and streamlined. If so, I wouldn’t know how to falsify or quantify that.
I ask the question this way to hopefully avoid stepping on toes. I’m fully open to the idea that the answer is “we have none”. Also, I am primarily addressing the people who are making a claim. I am not necessarily making a claim myself.
I’m convinced mostly due to its effects on my own life, as stated in the opening paragraph. But I’m unsure of how to test and demonstrate that claim. My question is for my benefit as well as others’.
I just realized that I work tomorrow, so we are not doing the hike. Instead, we are doing our usual 6pm meetup at the Johnson County Central Resource Library. We will do our hike next week (June 25th, 9am, Shawnee Lake Dog Park).
If I find that it does have actual impact on the podcast’s effectiveness, then I absolutely will seriously consider changing it. Your criticism has updated me marginally in that direction, but it’s not quite enough for me to act on it, particularly since you’re the only person to mention it. Thank you for your feedback!
I’m sure that there are Street Epistemologists who are guilty of this, but that’s literally the opposite of what I encourage or practice.
At its core, SE is merely coaching people in asking the Fundamental Question of Rationality. As an SE-er, it’s my way of Raising the Sanity Waterline. It’s excellent at circumventing the Backfire Effect.
There are as many motivations for SE as there are practitioners.
The Bayesian Conspiracy needs to be updated with Jess as a new host. :)
I didn’t even know about this resource. Thanks!
I will be sure to include a transcript in all future episode descriptions/show notes.
Done! Link to the transcript has been posted in the description, and also here: https://docs.google.com/document/d/1MjTM4revF1upDvO00y0v8jF8G6HUbcABtFDxVYiLyPc/edit?usp=sharing
Is there a different venue/format for the notes you had in mind?
I’ll do that tonight!
Due to the nature of assembling a bug list, there might not be a whole lot to discuss or do. If we finish significantly early, we may simply move on to Day 2: ‘Yoda Timers’.
The Kansas City Rationalists are putting together a dojo, for the purpose of improving our cognitive abilities, happiness, and efficiency. For content, we will be using the ‘Hammertime’ sequence. Attendees are expected to read the introduction (‘Hammers and Nails’) and Day 1 (‘Bug Hunt’), as well as put together their bug list. The meeting will consist of meta-discussion about the content, and discussion about our experience putting together our bug lists. Bonus points if you are willing to share the bugs you found!
We will be meeting weekly, at the same time and location.
It works perfectly. Thanks again!
That looks perfect! I’ll test it out later.
My IRL rationality group is preparing to test that sequence. It looks promising, although we do have some quibbles with it. If we successfully finish testing, we’ll publish the details.
This may seem stupid, but I didn’t even think about doing odds calibration on such a small scale. That’s a great idea. Making a pinned Google Keep note now.
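For anyone keeping a small running prediction log like that, one standard way to score it is the Brier score (mean squared error between stated probabilities and outcomes). A minimal sketch; the log entries here are invented purely for illustration:

```python
def brier_score(predictions):
    """Mean squared error between stated probabilities and actual outcomes.

    predictions: list of (probability, outcome) pairs, outcome 0 or 1.
    Lower is better; always saying 0.5 scores exactly 0.25.
    """
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

# Hypothetical log: four predictions, three of which came true.
log = [(0.8, 1), (0.6, 0), (0.9, 1), (0.7, 1)]
score = brier_score(log)  # 0.125 for this example log
```

Even a handful of entries in a pinned note is enough to start computing this; the score becomes meaningful as the log grows.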
I see Bayesian Rationality as a methodology as much as a calculation: being aware of our own prior beliefs and the confidence we place in them, keeping those priors as close to the base rates as possible, staying cognizant of how our biases can influence our perception of all this, trying to mitigate the effects of those biases, and updating based on the strength of evidence.
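The updating step in that description can be made concrete with a few lines of Python. This is a generic illustration of Bayes' rule, not anything from the original comment; the numbers are made up for the example:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior probability of hypothesis H after observing evidence E.

    prior:           P(H) before seeing the evidence (the base rate)
    p_e_given_h:     P(E | H), how likely the evidence is if H is true
    p_e_given_not_h: P(E | not H), how likely it is if H is false
    """
    numerator = prior * p_e_given_h
    denominator = numerator + (1 - prior) * p_e_given_not_h
    return numerator / denominator

# Hypothetical example: a 10% base rate, and evidence that is
# 90% likely under H but only 20% likely otherwise.
posterior = bayes_update(0.10, 0.90, 0.20)  # ~0.333
```

The strength of the evidence is the ratio of the two likelihoods (here 0.90 / 0.20 = 4.5), which is why the same observation moves a low prior much less than it would move an even one.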
I’m trying to get better at math so I can do better calculations. It’s a major flaw in my technique that I acknowledge and am trying to change.
But as you noted earlier, none of this answers my question. If I am not currently practicing your art, and you believe your art is good, what evidence do you have to support that claim?