Because if you people truly believed that the world is ending, you would be ready to show something more than cheap words, to make the kind of great sacrifices that only people in such desperation would make.
This seems reasonable. Signalling. One would hope that actions like dedicating one’s career to AI alignment and AI ethics, or leaving AI companies over ethical concerns, would count as such a signal, and there are many people doing such things. But I don’t know how compelling these actions actually are to people. I could be making way more money if I were just trying to make money instead of trying to figure out how to work on AI ethics. That’s a pretty significant sacrifice, maybe not as significant as cutting off a leg, but in some ways it might actually be more significant. It seems hard to quantify.
But to be more visible, I have considered doing that “human statue” thing buskers do, with a sign saying “if I can pause everything, you can pause AGI development”.
Maybe I wasn’t clear enough, but changing jobs is a good gesture of political expression if you are proposing that people vote for Blue. If you are claiming something as extreme as the world ending, you should be ready to make a signal just as extreme, and changing jobs isn’t that.
To be clear: by ‘signaling’ I don’t mean that people are calculating how much hard-to-fake Bayesian evidence changing jobs provides; it’s that changing jobs doesn’t feel anywhere near as world-shattering as a belief that the world is ending. Eh, I don’t feel I’m conveying this well. I think of something like a person getting a cancer diagnosis: people expect that if the belief is genuine, Walter White’s reaction is the maximum level of calmness allowed (that is, breaking bad into a drug dealer).