More about why CFAR would be non-functional if it weren’t dogfooding:
As I said, my thoughts aren’t really in such a state that I know how to communicate them coherently. But I’ve often found that going ahead and communicating incoherently can nevertheless be valuable; it lets people’s implicit models interact more rapidly (both between people and within individuals), which can lead to developing explicit models that would otherwise have remained unarticulated.
So, when I find myself in this position, I often throw a creative prompt to the part of my brain that thinks it knows something, and don’t bother trying to be coherent, just to start to draw out the shape of a thing. For example, if CFAR were a boat, what sort of boat would it be?
If CFAR were a boat, it would be a collection of driftwood bound together with twine. Each piece of driftwood was yanked from the shore in passing when the boat managed to get close enough for someone to pull it in. The riders of the boat are constantly re-organizing the driftwood (while standing on it), discarding parts (both deliberately and accidentally), and trying out variations on rudders and oars and sails. All the while, the boat is approaching a waterfall, and in fact the riders are not trying to make a boat at all, but rather an airplane.
The CFAR techniques are first of all the driftwood pieces themselves, and are also ways of balancing atop something with no rigid structure, of noticing when the raft is taking on water, of coordinating about which bits of driftwood ought to be tied to which other bits, and of continuing to try to build a plane when you’d rather forget the waterfall and go for a swim.
Which, if I had to guess, is an impressionistic painting of my sense of an organization that wants to bootstrap an entire community into being equal to the maybe-impossible task of thinking well enough to survive x-risk.
This need to quickly bootstrap patterns of thought and feeling, not just of individual humans but of far-flung assortments of people, is what makes CFAR’s problem so hard, and its meager success thus far so impressive to me. It doesn’t have the tools it needs to efficiently and reliably accomplish the day-to-day tasks of navigation and not sinking and so forth, so it tries to build them by whatever means it can manage in any given moment.
It’s a shitty boat, and an even shittier plane. But if everyone on it were just passively riding the current, rather than constantly trying to build the plane and fly, the whole thing would sink well before it reached the waterfall.