Yeah, to be clear, I don't think that, and I think most people didn't think that, but Eliezer has sometimes said stuff that made it seem like he thought people think that. I was remembering a quote from 2:49:00 in this podcast:
...effective altruists were devoting some funding to this issue basically because I browbeat them into it. That's how I would tell the story. And a whole bunch of them, like, their theory of AI three years ago was that we probably had about 30 more years in which to work on this problem, because of an elaborate argument about how large an AI model needed to be, by analogy to human neurons, and that it would be trained via the following scaling law, which would require this many GPUs, which, at the rate of Moore's Law and this attempted rate of software progress, meant 30 years. And I was like:
this entire thing falls apart at the very first joint, where you're trying to make an analogy between the AI models and the number of human neurons. This entire thing is bogus. It's been tried before in all these historical examples, none of which were correct either. And the effective altruists, like, can't tell that I'm speaking sense and that the 30-year projection has no grasp on reality. If they can't tell the difference between a good and a bad argument there until, you know, stuff starts to blow up,
now how do you tell who's making progress in alignment? I can stand around being like: no, no, that's wrong, that's wrong too, this is particularly going to fail, you know, like, this is how it will fail when you try it. But as far as they know, they're inventing brilliant solutions...
This makes it sound like the bio-anchors report makes a stronger claim than it actually does, and like EAs are much more dogmatic about that report than most EAs actually are. Although, to be fair, he did say "probably" here.