it’s telling that you equate “being rational agents” with “more intelligence”, but as long as this comes in the context of denying the very possibility of yudkowskian asi, I’ll vibe with it.
edit: your entire reply suffers from the local pathology of equating intelligence with “thinkiness”. “a more detailed world model, thinking for longer” are only symptoms of more intelligence if they get you closer to a goal. you want the capacity to do that if/when necessary, not the habit of doing it constantly, even when the only effect is a more pointlessly verbose reply.
re: jessi and my understanding: that is known as “a joke”, borne of the fact that someone was smugly opining on my lack of understanding of a concept for which I’ve been Jessi’s sounding board and beta tester as she fleshed it out.