It seems that an isomorphic argument ‘proves’ that computer programs will crash—since “almost any” computer program crashes.
More obviously, an isomorphic argument ‘proves’ that books will be gibberish—since “almost any” string of characters is gibberish. An additional argument is required: that non-gibberish books are very difficult to write, and that naively attempting to write one will almost certainly fail on the first try. The analogous argument exists for AGI, of course, but is not given there.
Right—so we have already had 50+ years of trying and failing. A theoretical argument that we won’t succeed the first time does not tell us very much that we didn’t already know.
What is more interesting is the track record of engineers of not screwing up or killing people the first time.
We have records about engineers killing people for cars, trains, ships, aeroplanes and rockets. We have failure records from bridges, tunnels and skyscrapers.
Engineers do kill people—but often it is deliberate—e.g. nuclear bombs—or with society’s approval—e.g. car accidents. There are some accidents which are not obviously attributable to calculated risks—e.g. the Titanic, or the Tacoma Narrows bridge—but they typically represent a small fraction of the overall risks involved.