HCH is more like an infinite bureaucracy. You have some underlings who you can ask to think for a short time, and those underlings have underlings of their own who they can ask to think for a short time, and so on. Nobody in HCH thinks for a long time, though the total thinking time of one person and their recursive-underlings may be long.
(This is exactly why factored cognition is so important for HCH & co: the thinking all has to be broken into bite-size pieces, which can be spread across people.)
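To make the recursive structure concrete, here is a minimal sketch of HCH as a tree of short-budget workers. This is my own illustration, not anyone's actual implementation; `ask_human` and the toy decomposition are hypothetical stand-ins for what a single human does in one short sitting:

```python
def hch(question, ask_human, depth=0, max_depth=10):
    """Answer `question` via a tree of workers, each thinking only briefly.

    `ask_human(question)` models one short sitting: it returns either
    ("answer", result) or ("split", [subquestions], combine_fn).
    No single call thinks for long; `max_depth` bounds the bureaucracy.
    """
    if depth >= max_depth:
        return "[out of budget]"
    kind, *rest = ask_human(question)
    if kind == "answer":
        return rest[0]
    subquestions, combine = rest
    # Each subquestion goes to a fresh underling, who may split further.
    sub_answers = [hch(q, ask_human, depth + 1, max_depth) for q in subquestions]
    return combine(sub_answers)


# Toy worker: a "question" here is just a list of numbers to sum.
# It splits the list in half until a piece is small enough to add directly.
def toy_worker(nums):
    if len(nums) <= 2:
        return ("answer", sum(nums))
    mid = len(nums) // 2
    return ("split", [nums[:mid], nums[mid:]], sum)


print(hch(list(range(10)), toy_worker))  # 45
```

The factored-cognition assumption is exactly what `toy_worker` bakes in: every hard question must admit a decomposition into pieces that each worker can finish in one short sitting, with a cheap way to combine the sub-answers.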
Yes, sorry — I'm aware that in the HCH procedure no one human thinks for a long time. I generally abstract HCH (or whatever scheme fits that slot) as something that could "effectively replicate the benefits you could get from having a human thinking a long time," in terms of the role it plays in an overall scheme for alignment. This isn't guaranteed to work out, of course. My position is similar to Rohin's above:
I just personally find it easier to think about “benefits of a human thinking for a long time” and then “does HCH get the same benefits as humans thinking for a long time” and then “does iterated amplification get the same benefits as HCH”.