Solving the AGI alignment problem demands a herculean level of ambition, far beyond what we’re currently bringing to bear. Dear Reader, grab a pen or open a Google Doc right now and answer these questions:
1. What would you do right now if you became 5x more ambitious?
2. If you believe we all might die soon, why aren’t you doing the ambitious thing?
This comment has been on my mind a lot the past week—not because I’m not ambitious, but because I’ve always been ambitious (intellectually, at least) and frustrated in my ambitions. I’ve always had goals that I thought were important and neglected, and I pursued them directly from a socially marginal position rather than trying to make money first (or whatever people do when they put off their real ambitions), but I can’t say I ever had a decisive breakthrough, and certainly not one that brought recognition. So I have only partial progress on a scattered smorgasbord of unfulfilled agendas, and meanwhile, after OpenAI’s “o3 Christmas” and the imminent inauguration of an e/acc administration in the USA, it looks more than ever like we are out of time. I would be deeply unsurprised if it’s all over by the end of the year.
I’m left with choices like (1) concentrate on family in the final months, (2) patch together what I have and use AI to quickly make the best of it, or (3) throw myself into AI safety. In practice they overlap and I’m doing all three, but there are tensions between them, and I feel the frustration of being badly positioned while also thinking I have no time for the meta-task of improving my position.
I think you are right that most people suffer from this lack of big ambition; I tend to fail in the other direction. I consistently come up with big plans that could have big payoffs if I could manage them, and then fail partway through. I’m a lot more effective (on average) working in a team where my big ideas get curtailed and my focus is kept on the achievable. I do also think I bring value to under-aimers by encouraging them to think bigger.
Sometimes people literally laugh out loud at me when I tell them about my current goals. Overall, I am not poorly calibrated: I tell them that I know my chance of succeeding at my current goal is small, but that the payoff would really matter if I did manage it. In the course of pursuing a long-term goal, though, I do tend to actively set aside my realistic estimate of success, in order to let myself be buoyed by a sense of impending achievement. Being too realistic in the ‘doing’ phase, rather than the ‘planning’ phase, sharply reduces my chance of sticking with a project long enough for it to succeed.