What for? There aren’t any stick-and-stones cultures around.
Do you assign significant probability to the need for such a book in humanity’s future? I don’t. It would require that:
No technological human societies survive
Adults who know the relevant things don’t survive
Technological artifacts and particularly sources of knowledge (e.g., copies of encyclopedias or entire libraries-on-disk) don’t survive
But also that:
Some people survive all this
Such a book will survive all this and there will be a high chance of a copy being found by survivor groups
Survivors will be able to use the book (this requires resources like surplus food/manpower to sink into the rebuilding project, and the organization/government to provide them) - in fact, knowledge will have to be what the survivors mostly lack
There’s a huge difference between having the raw knowledge available and having simple step-by-step instructions.
A book created for this express purpose would be an order of magnitude more useful than any number of encyclopedias or even entire libraries. A big challenge would be even knowing what to research—if you don’t have the next technology, you may not even know what it will be.
The biggest obstacle is really distribution. What you’d need is a government, church, or NGO to put a copy in every branch or something.
Maybe you could donate a copy to every prison library. Prisons would actually be a really defensible location to stay post-societal collapse . . .
We can imagine a handbook that is written to be useful for a broad spectrum of possible disastrous situations.
The handbook could be written for post-disaster survivors finding themselves in many possible situations. For example, your first bullet “No technological human societies survive” could be expanded to “(No|Few|Distant|Hostile) technological human societies survive”. Indeed, given both a civilization-destroying disaster and some survivors, the survivors would quite probably be uncertain about which of these possibilities actually holds.
To some extent, the Long Now’s Rosetta project (to build sturdy discs inscribed with examples of many languages) is an example of this sort of handbook.
http://rosettaproject.org/
I agree a knowledge repository would be very useful for survivors right after the disaster. But I don’t think any scenario is probable that involves a society with a reasonably stable level of technology and food production existing and profiting from such a book.
BTW, the Rosetta project seems to be purely about describing languages so future people can understand them.
If a few distant technological societies survive, even just one with some reasonable shipping and industry, then I expect they will quickly establish contact with most of the world, if only to exploit natural resources and farming. Most or all technological economies today rely on many imports of minerals, food, etc. And knowledge and technology would be dispersed more quickly with the assistance of this society than by means of such a book.
If a ‘hostile’ society survives—well, hostile towards whom? Towards all other, non-high-tech survivors? I don’t see this as the default attitude of a surviving society that’s the most powerful country left on Earth, so without knowing more I hesitate to try to empower whoever they’re hostile towards. What did you have in mind here?
Your first point is that the handbook is not likely to be useful for the purpose of helping reconstruction after a disaster, because the chance of a disaster being total enough to destroy technology, but not total enough to destroy humanity, is small. I agree completely—you have a very strong argument there.
However, you go on to argue that IF a technology-destroying-humanity-sparing disaster occurred, THEN technological societies would quickly establish contact, disperse knowledge, et cetera. In this after-the-disaster reasoning, you’re using our present notions of what is likely and unlikely to happen.
Reasoning like this beyond the very very unlikely occurrence seems fraught with danger. In order for such an unlikely occurrence to occur, we must have something significantly wrong with our current understanding of the world. If something like that happened, we would revise our understanding, not continue to use it. Anyone writing the handbook would have to plan for a wild array of possibilities.
Instead of focusing on the fact that the handbook is not likely to be used for its intended purpose, consider:
Might it have side benefits, spin-offs from its officially-intended purpose?
Is it harmful?
Is it neat, cool, and fun?
If we assume that there is “something significantly wrong with our current understanding of the world” but don’t know anything more specific, we can’t come to any useful conclusions. There’s a huge number of things we could do that we think aren’t likely to be useful but where we might be wrong.
So is writing this book something we should do (as the original comment seemed to suggest)? No. But I agree it’s something we could do, is very unlikely to be harmful, and is neat and fun into the bargain.
With that said, I’m going back to working on my cool, neat, fun, non-humanity-saving project :-)
Actually, for serious problems all you would need is for none of the relatively few people who know the essential details of a critical piece of support technology to survive - or at least not in your group, or anywhere you otherwise have access to. Since you can’t know ahead of time which bits of information you might lose, having references to everything possible only makes good sense, especially given how relatively inexpensive references are now. Cheap insurance against a very unlikely result (and of course, they can also be helpful day-to-day).
There’s a mixup of two different scenarios here.
What you seem to be talking about is a group of people a few years to a few decades post-collapse, who want to operate or rebuild preexisting tech and need a reference work. If they had a copy of Wikipedia plus a good technical and reference library, it would probably answer most of their needs. A special book isn’t essential.
What I was talking about is a group of people completely lacking pre-collapse knowledge and experience. You can’t give them instructions for building a radio because they tend to ask questions like “what’s a screwdriver?” and “how can I avoid being burnt as a witch?” That’s what a real stones-and-sticks to high-tech guide book needs to address.
You might think of “my book” as a subset of yours. My book would be more likely to be useful (though hopefully it never is) and could be expanded with the material necessary for yours. And your book would be a library in itself; there is no possible way such a “book” would not span many volumes.
A single long “book” would have high-quality cross-links, well-ordered reading sequences, a uniform style, no internal contradictions, etc. In that sense it’s a book as opposed to a library collection.
Black Swan.
Just saying “black swan” isn’t enough to give higher probability. If you think I can’t assign any meaningful probability at all to this scenario, why?
I don’t believe anyone can assign meaningful very small or very large probabilities in most situations. It is one of my long-running disagreements with people here and on OB.
There are indeed many known human biases of this kind, plus general inability to predict small differences in probability.
But we can’t treat every low probability scenario as being e.g. of p=0.1 or some other constant! What do you suggest then?
I don’t know of a unified way of handling extremely small risks, but there are two things that can be helpful. The first, as suggested by Marc Stiegler in “David’s Sling”, is simply to recognize explicitly that they are possible; that way, if they do occur, you can get on with dealing with the problem without also having to fight disbelief that it could have happened at all. Second, different people have different perspectives and interests and will treat different low-probability events differently; this dispersion of views and preparation will help ensure that someone is at least somewhat prepared. As I said, neither of these is really enough, but I simply can’t see any better options.
I’m saying “Black Swan” to compress the following message: We cannot assign probability at all because we don’t have statistics. Nevertheless, the stakes are so high that we should be overly cautious. We need the book “just in case”. It’s a very specific, actionable step in existential threat mitigation. Unlike other measures it requires no new discoveries but just a modest investment of money and time.
You have to assign probabilities anyway. See the amended article: considering some event a black swan doesn’t give you leave to avoid assigning probabilities, since making decisions that depend on the plausibility of such an event is still equivalent to assigning the probabilities that would make the expected-utility calculation yield those decisions.
Okay, okay! How much is our civilization worth? Say, 10^20 dollars. If I had the money, I would be willing to part with 10^6 dollars to develop, manufacture, and distribute the book. Therefore, the probability of the book serving its primary purpose is 10^(-14).
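The arithmetic here is just break-even expected value: if a payment is worthwhile exactly when it is no more than probability times value saved, a stated willingness to pay implies a probability estimate. A minimal sketch, using only the hypothetical figures from the comment above:

```python
# Break-even reasoning: paying is worth it exactly when
# willingness_to_pay <= p * value_saved, so the stated willingness to pay
# implies an estimate of p. All numbers are the commenter's hypothetical
# figures, not real estimates.

civilization_value = 1e20  # hypothetical dollar value of civilization
willingness_to_pay = 1e6   # dollars the commenter would spend on the book

# Implied probability that the book ends up serving its primary purpose:
implied_probability = willingness_to_pay / civilization_value
print(implied_probability)  # 1e-14
```

Note this only recovers the probability the commenter is implicitly acting on; it says nothing about whether that number is well calibrated, which is the point of the reply below.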
That’s meaningless. You can’t assign a value in dollars to the continued existence of our civilization. Dollars are only useful for pricing things inside that civilization. (Some people argue for using utilons to price the civilization’s existence.)
The amount you’re willing to pay is a fact about you, not about the book’s usefulness. You’re saying you estimate its probability of usefulness at 10^-14. But why?
Clearly the market for civilization creation books is efficient.
Nice point. Maybe we should instead talk about scenarios where humanity (including us) no longer suffers aging but a collapse still occurs.
Incidentally, I wonder what the market price for writing a civilization-destroying book might be?
I believe the going rate is 45 virgins in the afterlife.
I see scenarios like the following as not impossible.
90% of the human population dies from a plague or meteor, along with the knowledge and sufficient numbers to maintain things like power plants, steel mills, and the trappings of modern life. The people left with the knowledge have to spend all their time subsistence farming just to survive.
A few generations later, when the population has increased a bit and subsistence farming has improved in yield due to experience, people want to recreate technology with just the knowledge passed down by word of mouth.