I think both duration and funding are important.
I agree that increasing duration has a greater impact than increasing funding. But increasing duration is harder than increasing funding.
AI safety spending is only $0.1 billion while AI capabilities spending is $200 billion. Increasing funding by 10x is comparatively attainable, while increasing duration by 10x would require more of a miracle.
Even if you believe that funding today isn’t very useful and funding in the future is more useful, increasing funding now moves the Overton window a lot. It’s hard for a government that has traditionally spent only $0.01 billion to suddenly spend $100 billion; it will use the previous budget as an anchor point when deciding the new one.
My guess is that 4x funding ≈ 2x duration.[1]
For inventive steps, having twice as many “inventors” reduces the time to invention by half, while for engineering steps, having twice as many “engineers” doesn’t help very much.
(Assuming the time it takes each inventor to think of the invention is an independent, exponentially distributed random variable)
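The "twice as many inventors halves the time" claim follows from a standard property of exponentials: the minimum of n i.i.d. Exponential(λ) variables is itself Exponential(nλ), so the expected time to the first success scales as 1/n. A quick simulation (a sketch, with an assumed mean time of 10 units per inventor) confirms this:

```python
import random

def time_to_invention(n_inventors, mean_time=10.0, trials=100_000):
    """Average time until the FIRST of n inventors succeeds,
    where each inventor's time is i.i.d. Exponential with the given mean."""
    total = 0.0
    for _ in range(trials):
        # Invention happens as soon as any one inventor gets the idea.
        total += min(random.expovariate(1 / mean_time) for _ in range(n_inventors))
    return total / trials

# Minimum of n i.i.d. Exponential(rate) variables is Exponential(n * rate),
# so doubling the inventors should roughly halve the expected time.
t1 = time_to_invention(1)  # ≈ 10.0
t2 = time_to_invention(2)  # ≈ 5.0
```

This is why, for inventive steps, funding (more inventors working in parallel) can substitute for duration in a way it can't for serial engineering work.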
I’m not sure if it means much, but I’d be very happy if AI safety could get another $50B from smart donors today.
I’d flag that [stopping AI development] would cost far more than $50B. I’d expect that we could easily lose $3T of economic value in the next few years if AI progress seriously stopped.
It seems to me that duration is dramatically more expensive to get than funding, for the amounts of funding people would likely want.
I do think that convincing the government to pause AI in a way which sacrifices $3,000 billion of economic value is relatively easier than directly spending $3,000 billion on AI safety.
Maybe spending $1 is about as hard as sacrificing $10–$100 of future economic value via preemptive regulation.[1]
But $0.1 billion of AI safety spending is so ridiculously little (2,000 times less than the $200 billion of capabilities spending) that increasing it may still be the “easiest” thing to do. Of course we should still push for regulation at the same time (it doesn’t hurt).
PS: what do you think of my open letter idea for convincing the government to increase funding?
Maybe “future economic value” is too complicated. A simpler guesstimate would be “spending $1 is about as hard as sacrificing $10 of company valuations via regulation.”