Mustachian Grants

I remember previous discussions that went something like this:
Alice: EA has too much money and not enough places to spend it.
Bob: Why not give grants to anyone and everyone who wants to do, for example, alignment research?
Alice: That sets up bad incentives. Malicious actors would seek out those grants and wouldn’t do real work. And that’d have various bad downstream effects.
But what if those grants were minimal? What if they were only enough to live out a Mustachian lifestyle?
Well, let’s see. A Mustachian lifestyle costs something like $25k/year, IIRC. But it’s not just this year’s living expenses that matter. I think a lot of people would turn down the grant and go work for Google instead if it only lasted a few years, because they want to set themselves up financially for the future. So what if the grant was $25k/year indefinitely? That could work, but it also starts to get large enough that people might try to exploit it.
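As a rough illustration of what "indefinitely" costs, here's a back-of-the-envelope sketch using the Mustachian 4% safe-withdrawal heuristic. The 4% rate is an assumption borrowed from that community's standard rule of thumb, not a figure from this discussion:

```python
# Back-of-the-envelope: endowment needed to fund one grant "indefinitely",
# using the Mustachian 4% safe-withdrawal-rate heuristic (an assumption).
annual_grant = 25_000          # $/year, the figure discussed above
safe_withdrawal_rate = 0.04    # classic 4% rule of thumb

# Endowment whose 4% annual draw covers the grant in perpetuity.
endowment_per_grantee = annual_grant / safe_withdrawal_rate
print(endowment_per_grantee)   # 625000.0
```

So each indefinite grantee looks more like a ~$625k commitment than a $25k one, which is part of why "indefinitely" changes the exploitation calculus.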
What if there was some sort of house you could live at, commune-style? Meals would be provided, there’d be other people there to socialize with, health care would be taken care of, you’d be given a small stipend for miscellaneous spending, etc. I don’t see how bad actors would be able to take advantage of that. They’d be living at the same house, so if they were taking advantage of it, it’d be obvious enough.
I think that only addresses a branch concern, not the main problem. It filters out some malicious actors, but certainly not all—you still get those who seek the grants IN ADDITION to other sources of revenue.
More importantly, even if you can filter out the bad actors, you likely spend a lot on incompetent actors, who don’t produce enough value/progress to justify the grants, even if they mean well.
I don’t think those previous discussions are still happening very much—EA doesn’t have spare cash, AFAIK. But when they were, the situation was nearly identical to that of a lot of for-profit corporations: capital was cheap, interest rates were extremely low, and the difficulty was in figuring out which marginal investments brought future returns. EA (18 months ago) had a lot of free/cheap capital and no clear models for how to use it in ways that actually improved the future. Lowering the bar for grants likely wouldn’t have convinced people that it would actually have benefits.
I think that only addresses a branch concern, not the main problem. It filters out some malicious actors, but certainly not all—you still get those who seek the grants IN ADDITION to other sources of revenue.
Meaning that, now that they’re living in the commune, they’ll be more likely to seek more funding for other stuff? Maybe. But you could just keep the barriers as high as they currently are for the other stuff, which would just mean slightly(?) more applicants to filter out at the initial stages.
More importantly, even if you can filter out the bad actors, you likely spend a lot on incompetent actors, who don’t produce enough value/progress to justify the grants, even if they mean well.
My model is that the type of person who would be willing to move to a commune and live amongst a bunch of alignment researchers is pretty likely to be highly motivated and slightly less likely to be competent. The combination of those two things makes me think they’d be pretty productive. But even if they weren’t, the bar of e.g. $20k/year/person is pretty low.
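To put a number on "the bar is pretty low," here's a quick sketch. The commune size and the fraction of residents who turn out to be productive are both made-up illustrative assumptions, not claims from the discussion:

```python
# Rough annual cost of a commune housing N alignment researchers,
# at the ~$20k/year/person figure mentioned above.
cost_per_person = 20_000   # $/year: food, housing, healthcare, stipend
n_residents = 20           # hypothetical commune size (an assumption)

total_annual_cost = cost_per_person * n_residents
print(total_annual_cost)   # 400000

# Even if only a quarter of residents do useful work (another assumption),
# the effective cost per productive researcher stays modest:
productive_fraction = 0.25
cost_per_productive = total_annual_cost / (n_residents * productive_fraction)
print(cost_per_productive)  # 80000.0
```

Under those assumptions, even a heavily "wasted" commune costs less per productive researcher than a single industry salary.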
I don’t think those previous discussions are still happening very much—EA doesn’t have spare cash, AFAIK.
Thanks for adding some clarity here. I get that impression too but not confidently. Do you know if it’s because a majority of the spare cash was from FTX and that went away when FTX collapsed?
EA (18 months ago) had a lot of free/cheap capital and no clear models for how to use it in ways that actually improved the future.
That’s always seemed really weird to me. I see lots of things that could be done. Finding the optimal action, or even a 90th-plus-percentile action, might be difficult, but finding an action that meets some sort of minimal threshold doesn’t seem like a very high bar. And letting the former get in the way of the latter seems like making the perfect the enemy of the good.
Ah, I see—I didn’t fully understand that you meant “require (and observe) the lifestyle,” not just “grants big enough to do so, and no bigger.” That makes it quite a bit safer from fraud and double-dipping, and a LOT less likely (IMO) to attract anyone particularly effective who isn’t already interested.