I suspect you’re misinterpreting EY’s comment.
Here was the context:
“I think controlling Earth’s destiny is only modestly harder than understanding a sentence in English—in the same sense that I think Einstein was only modestly smarter than George W. Bush. EY makes a similar point.
You sound to me like someone saying, sixty years ago: “Maybe some day a computer will be able to play a legal game of chess—but simultaneously defeating multiple grandmasters, that strains credibility, I’m afraid.” But it only took a few decades to get from point A to point B. I doubt that going from “understanding English” to “controlling the Earth” will take that long.”
It seems clear to me that EY was saying something more like “ASI will arrive soon after natural language understanding”, rather than making a claim about alignment specifically.