I think I replied before reading your edit, sorry about that.
I’d say that Tyson does have incentives for popularizing a threat that’s right up his alley as an astrophysicist, though maybe not to the same degree as MIRIans. However, assuming the latter may be uncharitable, since people joined MIRI before they had that incentive. If the financial incentive had played a crucial part, they wouldn’t have dedicated their professional lives to AI as an x-risk in the first place.
As for “(AI takeoff) likely isn’t possible”: even if you throw that into your probability calculation, it may still (in my opinion, will) beat out a “certain threat but with a very low probability”.
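To make that concrete, here’s a minimal back-of-the-envelope sketch of the comparison. Every number in it is invented purely for illustration (they are not claims about the actual probabilities); the point is only the shape of the arithmetic:

```python
# Expected-harm comparison; all numbers are hypothetical placeholders.

harm = 1.0  # normalize: "extinction-level harm" = 1

# Speculative threat (AI takeoff): discount by the chance that takeoff
# is possible at all, then by the chance it goes badly if it is.
p_takeoff_possible = 0.1    # assumption: "likely isn't possible"
p_bad_given_possible = 0.5  # assumption
ev_ai = p_takeoff_possible * p_bad_given_possible * harm  # 0.05

# "Certain" threat (asteroid impact): the mechanism definitely exists,
# but the probability per relevant time window is very low.
p_asteroid = 1e-4  # assumption: a placeholder, not an astronomical estimate
ev_asteroid = p_asteroid * harm  # 0.0001

print(f"AI expected harm:       {ev_ai}")
print(f"Asteroid expected harm: {ev_asteroid}")
# Even after a 90% discount for "probably impossible", the speculative
# threat dominates under these made-up numbers.
```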
However, assuming the latter may be uncharitable, since people joined MIRI before they had that incentive.
I don’t think appeals to charity are valid here. Let’s imagine some known obvious cult, like Scientology. Hubbard said: “You don’t get rich writing science fiction. If you want to get rich, you start a religion.” So he declared what he was doing right away. However, the folks who joined, perhaps even including Mr. Miscavige himself*, may well have had good intentions. Perhaps they wanted to “Clear the planet” or whatever. But so what? Once Miscavige found himself in a situation with the appropriate incentives, he happily went crooked.
Regardless of why people joined MIRI, they have incentives to be crooked now.
*: Apparently Miscavige was born into Scientology. “You reap what you sow.”
To be clear: I am not accusing them of being crooked. They seem like earnest people. I am merely explaining why they have a perception problem in a way that Tyson does not. Tyson is a well-known personality who makes money partly from his research gigs and partly from speaking engagements. He has an honorary doctorate list half a page long. I am sure existential threats are one of his topics, but he will happily survive without asteroids.
Thanks for your thoughts, upvotes all around :)