It’s almost certainly a reference to this theory, which essentially describes extremely difficult-to-predict events that have a disproportionate impact. Unfortunately, the very nature of these events makes them difficult to control for.
I’m familiar with the term, and it’s not the same thing as obscure edge cases that require special handling. Rather, it refers to unforeseeable events that have a disproportionately large impact; being an edge case or requiring special handling has nothing to do with it per se. An unknown unknown that never has an impact is not a black swan. And something that requires special handling isn’t a black swan if that “special handling” is known in advance (even if the fact that it will happen is not).
I think an unforeseeable edge case or bug that requires deep refactoring and severely cuts into allotted development time fits the bill for a black swan dead on.
Understood. For some reason I misread your original post as saying that you weren’t sure what a black swan was. I agree with your analysis here.