K1 wants to write a novel because she calculated that a novel would be the best thing to work on, given many environmental factors as input to a reflectively stable and emotionally integrated theory of axiology.
The novel is completed if at least 300 future Ks agree.
However, K1 mostly ignores “other people” in favor of thinking of herself as something like a local/momentary snapshot of a Turing machine’s read/write head in operation…
She has obvious inputs and an obvious place for outputs, plus some memory and awareness of the larger program, and an ability and interest in fixing the program she is executing when definite errors are detected… and just trusting the system otherwise.
K1 writes 1/300th of a novel.
Since K1’s value estimates were very reasonable, the estimates are replicated by many future Ks, and 753 days later a novel is finished.
It took more than 300 days, but during those 753 days many other plausibly valuable things were also done. The whole time, K has been more or less safely interruptible, and it would have been pretty weird if K had ignored surprising issues that were more important than the novel when those things actually came up.
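The process described above can be caricatured as a loop, purely for illustration (nothing here is from the original post; the interruption rate and the `best_project` heuristic are invented assumptions):

```python
# Illustrative sketch: each day's K is a fresh "read/write head" that
# re-evaluates priorities from scratch and contributes one 1/300th
# chunk of the novel only if the novel still looks like the best project.

def best_project(environment):
    # Hypothetical value estimate: the novel wins unless something
    # more important has come up today.
    return "interrupt" if environment.get("surprise") else "novel"

def run_days(days, chunks_needed=300):
    progress = 0
    for day in range(days):
        # Assumption for the sketch: 1 day in 5 brings a higher-priority issue.
        environment = {"surprise": day % 5 == 4}
        if best_project(environment) == "novel":
            progress += 1  # write 1/300th of the novel
        # else: K is safely interrupted; the novel simply waits a day
        if progress >= chunks_needed:
            return day + 1  # number of days elapsed when the novel is done
    return None  # never finished -- an acceptable outcome, per the story
```

With interruptions on one day in five, the 300 chunks take noticeably longer than 300 days to accumulate, analogous to the 753 days in the story; and if the horizon is too short, the novel is simply never finished, which the story treats as OK.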
If the novel had somehow never been finished, that would have been OK. It probably would mean it was an omniscient-perspective error to have worked on it, but that’s OK because humans aren’t omniscient.
Lesson: stop worrying about other people (who are often mostly crazy anyway) and instead pay attention to efficiently and reliably knowing what is actually good.