We do not know what it takes to build a digital intelligence. Because of this, we do not know what groundwork will be needed to understand intelligence, nor how long it may take to get there.
This sentence doesn’t cover all evolution-based approaches, but that may not be critical here.
Should you plan as though there are 50/50 odds of reaching digital intelligence in the next 30 years? Are you 99% confident that digital intelligence won’t arrive in the next 30 years? Or is it somewhere in between?
People don’t understand value of information (VoI), opportunity cost, etc. You need to be more explicit about how different beliefs about what is likely imply different optimal actions.
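To make the point above concrete, here is a minimal sketch of how different subjective probabilities can flip which action has the highest expected value. All payoffs and action names here are hypothetical, chosen only for illustration:

```python
# Toy expected-value calculation (hypothetical numbers): different beliefs
# about P(digital intelligence within 30 years) imply different best actions.

def expected_value(p_soon, payoff_if_soon, payoff_if_not):
    """Expected value of an action under a subjective probability p_soon."""
    return p_soon * payoff_if_soon + (1 - p_soon) * payoff_if_not

# Hypothetical actions:
#   "prepare": costly groundwork that pays off only if AI arrives soon.
#   "business_as_usual": a modest payoff regardless of outcome.
actions = {
    "prepare": lambda p: expected_value(p, payoff_if_soon=100, payoff_if_not=-10),
    "business_as_usual": lambda p: expected_value(p, payoff_if_soon=0, payoff_if_not=5),
}

for p in (0.01, 0.5):
    best = max(actions, key=lambda name: actions[name](p))
    print(f"P(AI in 30y) = {p}: best action is {best}")
```

With a 1% belief the modest default wins; at 50/50 the costly preparation dominates, which is exactly why readers with different timelines should not be expected to converge on the same plan.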
“gut”
Many neuroscientists think this estimate is far too optimistic [cite], but the basic approach has promise [cite].
But what if 20 more diggers join him, and they are all given steroids?
Or food, even.
Include an alternative analogy, such as a shovel breaking, to correspond to the difficulty of writing threaded programs, etc.