To what extent does LessWrong expect knowledge in the STEM fields? My understanding is that rationalist thinking is based on Bayesian probability calculation and being self-aware of cognitive biases to reach the truth. The problem is I’m more on the philosophical side if anything (I suppose I can do fuzzy logic as well as classical logic), as I’ve been trying to read as much of the philosophical canon, starting from Plato, for a couple of years now. And to be honest, I have to say that mathematics is arguably the subject I’m worst at. Given this, how appropriate is it for me to enter this community?
> To what extent does LessWrong expect knowledge in the STEM fields?
I mean, it helps? I wouldn’t say it’s required.
> My understanding is that rationalist thinking is based on Bayesian probability calculation
It’s less to do with Bayes as in actually-doing-the-calculation and more to do with Bayes as in recognizing-there’s-an-ideal-to-approximate. Letting evidence shift your position at all is the main thing. (If you do an explicit Bayesian calculation about something real, you’ll have done about as many explicit Bayesian calculations as the median LW user has this year.)
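For concreteness, an explicit Bayesian calculation of the kind mentioned above can be this small. This is just an illustrative sketch; the probabilities are invented:

```python
def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)."""
    # Total probability of seeing the evidence at all.
    p_evidence = (p_evidence_given_h * prior
                  + p_evidence_given_not_h * (1 - prior))
    return p_evidence_given_h * prior / p_evidence

# Prior belief of 0.3 in a hypothesis; the evidence is twice as
# likely if the hypothesis is true (0.8 vs 0.4).
print(posterior(0.3, 0.8, 0.4))  # -> 0.4615...
```

The point of the ideal is the direction of the shift (evidence twice as likely under the hypothesis moves 0.3 up to about 0.46), not that anyone routinely plugs in numbers like this.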
> The problem is I’m more on the philosophical side if anything
If you pick two dozen or so posts at random, I’d expect you’ll get more Philosophical ones than STEMmy ones. (AI posts don’t count for either column imo; also, they usually don’t hard-require technical background other than “LLMs are a thing now” and “inhuman intellects being smarter than humans is kinda scary”.)
> and being self-aware of cognitive biases to reach the truth

I mean, we were. Then we found out that most published scientific findings, including those in the psychology literature (which covers the bias literature), don’t replicate. And AI, the topic most of us were trying to de-bias ourselves to think about, went kinda crazy over the last five years. So now we talk about AI more than biases. (If you can find something worthwhile to say about biases, please do!)

> Given this, how appropriate is it for me to enter this community?
Extremely. Welcome aboard!