Plea Addressed to the Hypebusters:
Here’s what [Insert evil tech CEO here] wants:
--He wants potential investors to think that AI is the next internet or electricity—like, he wants them to think that it’ll automate half the jobs in the economy and make them trillionaires if they invest now.
--He wants his loyal lieutenants to think AI will reach superintelligent levels (notably, a much higher level than mere ‘the next electricity’) in the next few years, so they can help him plan out his moves.
--He wants his researchers to think that too, because they are the ones who need to do the research to get there, and also the research to make sure the superintelligences are obedient to the company (i.e. to him).
--He wants everyone else (the public, Congress, etc.) to think it’s all just hype, so that they don’t interfere.
You might think you are Fighting the Good Fight by loudly shouting “it’s all hype.” However, the AI company employees don’t give a shit what you say, and neither do the investors. So you are playing right into [Insert evil tech CEO here]’s hands.
Imo your plea, as currently written, moderately anti-contributes to truth-seeking norms; there’s a missing mood. I would have phrased it as, e.g.:
First, if it is all hype, then it is good that you are saying it is all hype compared to not saying anything, even if various people ignore you. That said, you should treat your time/thought/writing as a valuable resource: it makes sense to assess what true things you are best positioned to help others understand and are most important for others to understand, and to focus on helping others understand those. And I think you might be wrong about this being such a truth, for the following reasons: …
(This could have been a preamble, or a post-preamble for attentional reasons.)
I think the version I suggest is significantly less norm-eroding, though it should still be viewed with some suspicion if one says this sort of thing selectively in response to positions one disagrees with.
One could claim that many of these people are themselves already corrupt consequentialists in their talking, i.e. already not in some sort of truth-seeking community, and it is just fine to “advise these criminals on how to do crime better” if that has good local consequences? I don’t think that’s fine — I think that would be a mistaken assessment of these people, and I think one should be trying more to bring even the people about whom this assessment is correct into the fold.
Yeah I should probably have a disclaimer somewhere that’s like “I respect people for following the policy of deciding what is true and then saying what they think is true. Insofar as you are saying it’s all hype because you really deep down are confident of that, then this isn’t addressed to you. This is addressed to the many people I know who are loudly saying ‘it’s all hype’ in significant part because they think that doing so is a way to Resist the evil tech companies.”
I agree, but I think this is how the Hypebusters see reality:
There’s definitely a bubble, it’s definitely going to pop, and it’s just a matter of when. I’m warning the wise would-be investors, including governments and the public. The foolish investors will be left holding the bag regardless, but if I can get the bubble to pop sooner, maybe the evil tech CEO hucksters won’t have retired to private islands yet.
Idk, probably true for many of them—BUT I have talked to at least one prominent hypebuster who told me words to the effect of “yeah, you may be right about superintelligence, Daniel, but you shouldn’t talk about it the way you do, because even if ASI is feasible soon, talking about it publicly helps them raise capital.”
I’m warning the wise would-be investors, including governments and the public.
you’re ascribing too much consequentialism to people who are generally not consequentialists. the mindset is more like:
AI is personally annoying to me and is obviously a scam/illusion. the only reason people are excited about it is because there are scammers flooding the zone with hype. me and my friends can’t do anything about it, but it’s at least gratifying to complain about in solidarity; the world may be insane, but i may as well point out the insanity. the wheel keeps turning and in a few years some other scam will come along.
I’d say that depends on which side of the industry/government demarcation line you’re closest to. Assuming you’re neither a key player at a major institutional investor nor a congressional staffer, and don’t directly help to make decisions about where the big money goes or how the government reacts to LLMs, you’re probably a more minor player along one or both of those axes.
For example, if you’re a middle-manager at some midwestern industrial microcontroller programming company, you’ll be making decisions about tech adoption. <Evil tech CEO> wants you to be an optimist, because if your team doesn’t adopt, his bottom line shrinks slightly, and if your non-adopting team does well, your bosses (and theirs, and so on) might reconsider spending $200 per employee to get the other teams Ultra AI Max subscriptions from <Evil tech CEO>’s company.
On the flip side, if you’re a prolific Twitter account within the landscape of your statewide <Democrat/Republican> political machine, maybe <Evil tech CEO> would slightly prefer that you consider it all hype, and advise your followers to take the tax money for some data centers here and there, and quietly plan out what you’re going to do with them “when the bubble pops”.
I’m of the opinion that “automate half the jobs in the economy” is certainly on the table:
https://canaryinstitute.ai/research/task-exposure/
This is part of what I’m using to talk to policymakers about why they need to act NOW.