If what Sam Altman says is true, then I would agree: the more it's talked about, the more it pushes capabilities forward and heightens interest in advancing them.
In that sense, the real-world second-order effects seem to be the opposite of the expressed intentions of notable personalities like Eliezer. It's ironic, but it matches my sense of how common unintended effects are.
“A machine smarter than humans could kill us all!”
“Are you saying ‘a machine smarter than humans’? That actually sounds like a business plan! If it is strong enough to kill us all, it is certainly also strong enough to make us billionaires!”
“But… what about the ‘killing us’ part?”
“Meh, if we don’t build it, someone else will. Think about the money, and hope for the best!”