I am surprised by this, for example. Can you give examples of some of your controversial takes on any issues? I am wondering if you just do not have very controversial takes.
Controversial is obviously relative to the audience, but I have lots of opinionated beliefs that might make various audiences mad at me. Some different flavors include:
I am roughly a total utilitarian, which involves lots of beliefs about what actions are moral that all kinds of people might strongly disagree with. For example, I don’t agree that inequality is intrinsically bad.
I roughly agree with (my understanding of) Zack Davis’s arguments about the superiority of cluster-of-traits-based definitions of gender words, rather than self-ID based definitions, which I am sure would make many trans people mad.
I think it’s ridiculous for suicide to be illegal and marginal efforts to increase the availability of suicide seem great.
I frequently criticize my coworkers’ ideas of what to work on as being bad or not worth doing.
Stuff like the correlation between IQ and ethnicity is a bit controversial, but my takes are usually much more controversial than that. I often wonder what would have happened if the US had wiped out the USSR's main cities post-WW2 and established global hegemony (wipe out any nation that doesn't submit, maintain a nuclear monopoly). I have genuine respect and admiration for people like Hitler or the Unabomber, more than for a lot of the people I see around me, despite disagreeing with their object-level opinions (I'm not a Nazi or an anarcho-primitivist).
I am not very knowledgeable about or interested in history or social science, so I have weaker opinions about things like this and don't talk about them very often. For example, my opinion about IQ and ethnicity is that the obvious group differences seem to suggest some kind of genetic difference, but I know psychologists have some complicated statistical arguments for why that may not be the case, so I don't know.
I note, however, that I can't think of the last time before now that I have ever been in a conversation where it seemed like my views on IQ and ethnic groups were relevant, so I don't run into the problem of pissing people off by expressing them. Is this different for you? How do you end up in discussions about it with people who will then be offended when you say your opinion? Is it some kind of thing where you participate in social media conversations about it which then broadcast your opinions to basically random people? (I don't use any platform like that.)
Do you expect to ever become at all famous in your life?
Definitely not. It sounds very annoying. I am not altruistic enough to want to do something that involves being substantially famous.
I can send a list of examples of people whose lives have been ruined by this. Do you claim I am misjudging the probability this happens to me personally?
Probably, if it’s a big consideration to you. I think it seems like a tail risk that isn’t very substantial, unless your life depends on the approval of others in a somewhat atypical way. (Perhaps it does, if your life involves being famous.)
Do you have actual experience in biosecurity? I doubt most people in EA circles, or even many academics, would provide you with any of the funding or connections required to work in biosecurity if this is your current stance on the matter.
No, I just quoted this because it was the example you gave. I know little about biosecurity and I don’t intend my remarks to extend to “infohazard” kinds of information. Perhaps you know things about the biosecurity nonprofit world that I don’t. However, I know something about the kinds of things that some EA grantmakers like SFF consider, and I don’t see why being the kind of person who speaks their mind about controversial beliefs would make them less likely to fund you.
For example, my opinion about IQ and ethnicity is that the obvious group differences seem to obviously suggest some kind of genetic difference, but I know psychologists have some complicated statistical argument for why that may not be the case, so therefore I don’t know.
Yeah this seems fair.
Is this different for you? How do you end up in discussions about it with people who will then be offended when you say your opinion?
In the past I have actively brought up political topics to discuss with my close circle of trust. If you discuss enough political topics, you can easily end up hitting on this particular topic (IQ and group differences). I have distanced myself from people over similar topics, though not this exact one. I can imagine the stakes being much higher once I am in a position of influence (which I aspire to be in).
Multiple such experiences are part of what made me realise there are pros and cons to having even an innocent discussion with your closest friend.
I am curious how you navigate this in discussions with people close to you.
unless your life depends on the approval of others in a somewhat atypical way. (Perhaps it does, if your life involves being famous.)
Most big ways of influencing the world route through acquiring the approval of others. Maybe not 100 million people, but at least 1000 people.
I know little about biosecurity and I don’t intend my remarks to extend to “infohazard” kinds of information.
Yes, my point in bringing up that example was infohazardous information. And not just, say, knowing the DNA sequences of unreleased pathogens worse than COVID, but lots of lower-stakes information: knowledge of protocols and operating equipment, how to procure cultures and equipment anonymously, who knows what inside the biosecurity world, etc. Even one sufficiently agentic and trusted PhD going rogue could cause meaningful damage IMO.
The resulting secrecy-focussed culture has implications for the personality traits of the senior people in the space, how big their circles of trust are, how they look at and treat other people, and so on. (I don’t know as much as I’d like about what the implications are, but I know they’re non-trivial.)
Also, not everyone who is quite open in the biosecurity world should necessarily be as open as they are; that is a whole other discussion. It's not obvious to me that anyone has figured this stuff out well enough to conclusively say that for biosecurity, openness policy X is good and Y is bad, end of discussion. Which is why I want to discuss it.
that some EA grantmakers like SFF consider, and I don’t see why being the kind of person who speaks their mind about controversial beliefs would make them less likely to fund you.
I think this is generally true, UNTIL you hit one of the big red flags they have secretly written down in their Google Doc.
(Or them just not liking you as a person; EA leadership sometimes claims to be high-trust, which means its implicit reliance on "vibes" and friends-of-friends as a shortcut for trust is non-zero. But this is a less important point, so I won't argue it much.)
More importantly though, doing important stuff in the world requires gaining approval of more people than just SFF grantmakers.
I guess this comes back to my earlier point: do you want to blindly execute the implications of whatever culture SFF grantmakers want to propagate (which is ultimately traceable back to Yudkowsky in the year 2000), or do you want to create your own culture? Culture is billion-dimensional.
This is kinda vague and I haven’t explained it very well, but it is something I think about a lot.