I was just watching this Andrew Huberman video titled “Train to Gain Energy & Avoid Brain Fog”. The interviewee was talking about track athletes and stuff their coaches would have them do.
It made me think back to Anders Ericsson’s book Peak: Secrets from the New Science of Expertise. The book is popular for discussing the importance of deliberate practice, but another big takeaway from the book is the importance of receiving coaching. I think that takeaway gets overlooked. Top performers in fields like chess, music and athletics almost universally receive coaching.
And at the highest levels the performers will have a team of coaches. LeBron James is famous for spending roughly $1.5 million a year on his body.
And he’s like, “Well, he’s replicated the gym that whatever team — whether it was Miami or Cleveland — he’s replicated all the equipment they have in the team’s gym in his house. He has two trainers. Everywhere he goes, he has a trainer with him.” I’m paraphrasing what he told me, so I might not be getting all these facts right. He’s got chefs. He has all the science of how to sleep. All these different things. Masseuses. Everything he does in his life is constructed to have him play basketball and to stay on the court and to be as healthy as possible and to absorb punishment when he goes into the basket and he gets crushed by people.
This makes me think about AI safety. I feel like the top alignment researchers—and ideally a majority of competent alignment researchers—should have such coaching and resources available to them.
I’m not exactly sure what form this would take. Academic/technical coaches? Writing coach? Performance psychologists? A sleep specialist? Nutritionist? Meditation coach?
All of this costs money of course. I’m not arguing that this is the most efficient place to allocate our limited resources. I don’t have enough of an understanding of what the other options are to make such an argument.
But I will say that providing such resources to alignment researchers seems like it should meaningfully improve their productivity. And if so, then we are in fact funding constrained after all. I recall (earlier?) conversations claiming that funding wasn’t a constraint, and that the real constraint was a lack of good places to spend the money.
Also relevant is that this is perhaps an easier sell to prospective donors than something wackier. It seems like a safe bet to have a solid impact, and there’s a precedent for providing expert performers with this kind of coaching, so it may well appeal to prospective donors.
Finally, I recall hearing at some point that in a field like physics, the very top researchers—people like Einstein—have a very disproportionate impact. If so, I’d think that it’s at least pretty plausible that something similar is true in the field of AI alignment. And if it is, then it’d probably make sense to spend time 1) figuring out who the Einsteins are and then 2) investing in them and doing what we can to maximize their impact.
Top performers in fields like chess, music and athletics almost universally receive coaching.
I wonder how much of that is actually based on science, and how much is just superstition / scams.
Do you know whether these coaches are somehow trained / certified themselves? Like, is there a body of scientific research that a wannabe coach needs to study and an exam they need to pass? Or is it more like some random person decides “I feel smart, I am going to be a coach”, and the rest depends only on their charisma and marketing?
If I somehow happen to be a top athlete, is there some organization where I can go and tell them “give me a list of coaches you recommend”, or do I have to search online and make a guess about what is a scam and what is not?
One reason I am asking is that if there is a coach certifying body and a list of scientific literature, it might be interesting for someone to look at the literature and maybe write some summary on LW.
I would expand your suggestion; I think it would be interesting to have something like “coaching for intellectuals” in general, not just for AI alignment researchers. Sleep, sport, nutrition, meditation, writing: that applies to many professions. Well, the way I said it, I guess it applies to all humans, but let’s say that the coaching for intellectuals would also specifically address things like written communication or the risks of a sedentary lifestyle, and it could give you articles to read.
The cheapest version could consist of a website with articles on different topics; a coach who would meet you once every few months to talk, give you a high-level summary, and link to more details; and maybe some hand-holding such as “you recommended me to get a blood test, so… what specifically should I tell the doctor I want… ok, now I got this result, how do I interpret it?”. The more expensive versions would include more time spent with the coach, and maybe some other services, like buying the healthy food / supplements / home gym equipment / sleeping masks / whatever. Maybe with some group discounts, if people at the same place subscribe, so the coach can meet them as a group.
What is specific, from this perspective, for AI alignment researchers? Maybe the feeling of great responsibility, higher chance of burnout and nightmares?
I wonder how much of that is actually based on science, and how much is just superstition / scams.
In basketball there isn’t any certification. Coaches/trainers usually are former players themselves who have had some amount of success, so that points towards them being competent to some extent. There’s also the fact that if you don’t feel like you’re making progress with a coach you can fire them and hire a new one. But I think there is also a reasonably sized risk of the coach lacking competence and certain players sticking with them anyway, for a variety of reasons.
I’m sure similar things are true in other fields, including athletics, and also in fields like chess where there isn’t a degree you could get. In fields with certifications and degrees it probably happens less often, but I know I’ve dealt with my fair share of incompetent MDs and PhDs.
So ultimately, I agree with the sentiment that finding competent coaches might involve some friction, but despite that, it still feels to me like a very tractable problem. Relatedly, I’m seeing now that there has been some activity on the topic of coaching in the EA community.
What is specific, from this perspective, for AI alignment researchers? Maybe the feeling of great responsibility, higher chance of burnout and nightmares?
I don’t expect that the needs of alignment researchers are too unique when compared to the needs of other intellectuals. I mention alignment researchers because I think they’re a prototypical example of people having large, positive impacts on the world, as opposed to intellectuals who study string theory or something.