Dr. David Denkenberger co-founded and is a director at the Alliance to Feed the Earth in Disasters (ALLFED.info) and donates half his income to it. He received his B.S. from Penn State in Engineering Science, his master's from Princeton in Mechanical and Aerospace Engineering, and his Ph.D. from the University of Colorado at Boulder in the Building Systems Program. His dissertation was on an expanded microchannel heat exchanger, which he patented. He is an associate professor at the University of Canterbury in mechanical engineering. He received the National Merit Scholarship, the Barry Goldwater Scholarship, and the National Science Foundation Graduate Research Fellowship, is a Penn State distinguished alumnus, and is a registered professional engineer. He has authored or co-authored 134 publications (>4400 citations, >50,000 downloads, h-index = 34, second most prolific author in the existential/global catastrophic risk field), including the book Feeding Everyone No Matter What: Managing Food Security After Global Catastrophe. His food work has been featured in over 25 countries and over 300 articles, including Science, Vox, Business Insider, Wikipedia, Deutschlandfunk (German public radio online), Discovery Channel Online News, Gizmodo, Phys.org, and Science Daily. He has given interviews on the 80,000 Hours podcast (here and here), Estonian Public Radio, WGBH Radio in Boston, and WCAI Radio on Cape Cod, USA. He has given over 80 external presentations, including ones on food at Harvard University, MIT, Princeton University, University of Cambridge, University of Oxford, Cornell University, University of California Los Angeles, Lawrence Berkeley National Lab, Sandia National Labs, Los Alamos National Lab, Imperial College, and University College London.
The substantive complaint was that they [ALLFED] did an invalid calculation when calculating the annual probability of nuclear war. They did a survey to establish a range of probabilities, then they averaged them. One could argue about what kinds of ‘average them’ moves work for the first year, but over time the lack of a nuclear war is Bayesian evidence in favor of lower probabilities and against higher probabilities. It’s incorrect to not adjust for this, and the complaint was not merely the error, but that the error was pointed out and not corrected.
Tl;dr: ALLFED appreciates the feedback. We disagree that it was a mistake; there were smart people on both sides of this issue. Good epistemics are very important to ALLFED.
Full version:
Zvi is investigating the issue. I won't name names, but suffice it to say, there were smart people disagreeing on this issue. We have been citing the fault tree analysis of the probability of nuclear war, which we think is the most rigorous study because it uses actual data. Someone did suggest that we should update the probability estimate based on the fact that nuclear war has not yet occurred (excluding World War II). Taking a look at the paper itself (see the top of page 9 and equation (5) on that page), for conditional probabilities of occurrence for which effectively zero historical occurrences have been observed out of n total cases when they could have occurred, the probability in the model was updated according to a Bayesian posterior distribution with a uniform prior and binomial likelihood function. Historical occurrences updated in this way were (A) the conditional probability that Threat Assessment Conference (TAC)-level attack indicators will be promoted to a Missile Attack Conference (MAC), and (B) the conditional probability of leaders deciding to launch in response to mistaken MAC-level indicators of being under attack. Based on this methodology, it would be double-counting to update their final distribution further based on the historical absence of accidental nuclear launches over the last 76 years.
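For readers who want to see concretely what that kind of update looks like, here is a minimal sketch (the number of opportunity-years is an illustrative placeholder, not a figure from the paper): with a uniform Beta(1, 1) prior and a binomial likelihood, observing zero occurrences out of n opportunities gives a Beta(1, n + 1) posterior.

```python
def beta_posterior_zero_events(n):
    """Posterior for an event probability after observing 0 occurrences in n
    independent opportunities, starting from a uniform Beta(1, 1) prior.
    The posterior is Beta(1, n + 1)."""
    mean = 1.0 / (n + 2)                   # Laplace's rule of succession
    pct95 = 1.0 - 0.05 ** (1.0 / (n + 1))  # Beta(1, b) CDF is 1 - (1 - p)^b
    return mean, pct95

# Illustrative only: 60 is a placeholder for the number of opportunity-years
# in the data, not a number taken from the fault tree paper.
mean, pct95 = beta_posterior_zero_events(60)
print(f"posterior mean: {mean:.4f}, 95th percentile: {pct95:.4f}")
```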
But what we do agree on is that if one starts with a high prior, one should update. That is what one of our coauthors did for his model of the probability of nuclear war, and he got similar results to the fault tree analysis. Furthermore, the fault tree analysis was only for inadvertent nuclear war (one side thinking it is being attacked, and then "retaliating"). However, there are other mechanisms for nuclear war, including intentional attack, and accidental detonation of a nuclear weapon with escalation from there. Furthermore, though many people consider nuclear winter possible only for a US-Russia nuclear war, now that China's GDP at purchasing power parity is greater than that of the US, we think there is comparable combustible material there. So the possibility of a US-China or Russia-China nuclear war further increases the probabilities. So even if there should be some downward updating on inadvertent US-Russia nuclear war, I think the fault tree analysis still provides a reasonable estimate. I also explained this on my first 80k podcast.
Also, we say in the paper, “Considering uncertainty represented within our models, our result is robust: reverting the conclusion required simultaneously changing the 3-5 most important parameters to the pessimistic ends.” So as Zvi has recognized, even if one thinks the probability of nuclear war should be significantly lower, the overall conclusion doesn’t change. We have encouraged people to put their own estimates in.
Again, we really appreciate the feedback. Good epistemics are very important to us. We are trying to reach the truth. We want to have maximum positive impact on the world, so that’s why we spend a significant amount of time on prioritization.
I have estimated global vitamin D3 production to be a few tons per year, so at the US RDA of 600 IU, we could provide for only about 3% of the global population. At your suggestion of 5000 IU/day, it would be only about 0.3% of people. This is why I looked into quickly scaling up vitamin D production. The most promising option appeared to be seaweed, but we could not get anyone excited about doing it before there was a shortage. Fortunately, mega-dosing just those testing positive appears to be within our global D3 production capability at the current infection rate. However, if we let it run through the population, I don't think we would have sufficient supplies at current production.
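For anyone who wants to check or adjust the arithmetic, here is a rough sketch; the production figure is a placeholder standing in for "a few tons" (the fractions scale linearly with it), and 40 IU per microgram of D3 is the standard conversion.

```python
IU_PER_MICROGRAM = 40          # 1 microgram of vitamin D3 = 40 IU
POPULATION = 7.8e9             # approximate global population
PRODUCTION_KG_PER_YEAR = 1300  # placeholder for "a few tons" (1.3 tonnes)

def fraction_covered(iu_per_day, production_kg_per_year=PRODUCTION_KG_PER_YEAR):
    """Fraction of the global population whose daily dose could be met."""
    grams_per_person_year = iu_per_day / IU_PER_MICROGRAM * 1e-6 * 365
    kg_needed = grams_per_person_year / 1000 * POPULATION
    return production_kg_per_year / kg_needed

print(f"US RDA (600 IU/day): {fraction_covered(600):.1%}")   # roughly a few percent
print(f"5000 IU/day:         {fraction_covered(5000):.1%}")  # a few tenths of a percent
```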
Great post! I've been mentioning for years that volunteering can be an effective way of making a contribution. Though many people think of volunteering as being for a specific organization, I don't think it has to be, so a hobby could be an example. I think there are not enough volunteer opportunities in EA, and we've worked hard at ALLFED on our volunteer program. Not only have we had dozens of volunteers skill up, but they have also made significant contributions, often co-authoring journal articles and becoming full-time staff. Thanks for the shout-out! I'm actually still volunteering for ALLFED (and donating).
Zvi has now put a postscript in the ALLFED section above. We have updated the inadvertent nuclear war fault tree model result based on there having been no nuclear war since the data stopped coming in, and we have also reduced the annual probability of nuclear war further going forward. And then, so as to not overclaim on cost-effectiveness, we did not include a correction for non-inadvertent US/Russia nuclear war nor conflict with China. Resilient foods are still highly competitive with AGI safety according to the revised model.
Thank you, Jennifer, for the introduction. Some more background on me: I have read the sequences and the foom debate. In 2011, I tried to do cost-effectiveness scoping for all causes inspired by Yudkowsky’s scope and neglectedness framework (the scope, neglectedness, and tractability framework had not yet been invented). I am concerned about AI risk, and have been working with Alexey Turchin. I am primarily motivated by existential risk reduction. If we lose anthropological civilization (defined by cooperation outside the clan), we may not recover for the following reasons:
• Easily accessible fossil fuels and minerals exhausted
• Don’t have the stable climate of last 10,000 years
• Lose trust or IQ permanently
• Endemic disease prevents high population density
• Permanent loss of grains precludes high population density
Not recovering is a form of existential risk (not realizing our potential), and we might actually go extinct because of a supervolcano or asteroid after losing civilization. Because getting prepared (planning, plus research and development of non-sunlight-dependent foods such as mushrooms and natural-gas-digesting bacteria) is so cost-effective for the present generation, I think it will be a very cost-effective way of reducing existential risk.
Sure looks like we’re past the peak [in South Africa], and the peak was remarkably low there, so low that it doesn’t make sense. Why would behaviors adjust this much this fast for so few cases, which were on average much milder?
You can see Google mobility data here at the bottom, and indeed, the response to this wave is much smaller than other waves.
When asked on Lex's podcast to give advice to high school students, Eliezer's response was "don't expect to live long."
Not to belittle the perceived risk if one believes in a 90% chance of doom in the next decade, but even if one has a 1% chance of an indefinite lifespan, the expected lifespan of teenagers now is much higher than that of previous generations.
For the one paper, it is too early to tell. For the other, there just has not been very much engagement. Mainly the public debate has been between the Robock team, which is highly confident that full-scale nuclear war would cause nuclear winter, and the Los Alamos team, which is highly confident that full-scale nuclear war would not cause nuclear winter. We find the truth is likely somewhere in between. I talked about this in one of my 80k podcasts. Our analysis is quite similar to Luisa Rodriguez’ analysis that cubefox links to below.
Thanks, Peter. That draft assumes global cooperation, which is likely too optimistic, so we have submitted another draft that also analyzes the case of breakdown of trade (hopefully public soon). We also have this paper that looks at the US specifically and takes into account food storage (and uncertainty of whether nuclear war would result in nuclear winter).
I am happy to do an AMA.
Denkenberger posted two papers he wrote in regards to a 150 Tg nuclear exchange scenario (worst case scenario, total targeting of cities). As far as I can tell, although the developed world doesn't come close to famine and there is theoretically enough food to feed everyone on Earth
To clarify, the world would have enough food if trade continues and if we massively scale up resilient foods. Trade continuing is very uncertain, and making it likely that we scale up resilient foods would require significantly more planning and piloting.
If you want to geek out on this you can use a psychrometric chart. For instance, if outdoor air is 85°F and 50% relative humidity (RH), that's an enthalpy of about 35 BTU/lb of dry air. Typical exit air conditions on the cool side of an air conditioner are ~50°F and 100% RH, so ~20 BTU/lb of dry air. The dehumidification (latent) portion corresponds to going from the outdoor state to 85°F and ~30% RH (roughly the exit air's moisture content), i.e. ~29 BTU/lb of dry air, so ~6 of the ~15 BTU/lb removed, or ~40% of the heat removed, is in the form of condensing water (latent). This means you would take the sensible part and multiply by about 1.7 to get the total load on the air conditioner. If you were not drawing in outdoor air, the latent load would be much lower. So overall I think you're right that in CA the humidity correction is not as big as the other factors.
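For anyone who wants to reproduce those chart readings numerically, here is a rough script using the Magnus approximation for saturation vapor pressure; the state points are the ones from my example, and small differences from chart values are expected.

```python
import math

P_ATM_PSIA = 14.696

def sat_pressure_psia(temp_f):
    """Saturation vapor pressure over water (Magnus approximation)."""
    t_c = (temp_f - 32) / 1.8
    e_hpa = 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))
    return e_hpa * 0.0145038  # hPa -> psia

def humidity_ratio(temp_f, rh):
    pv = rh * sat_pressure_psia(temp_f)
    return 0.622 * pv / (P_ATM_PSIA - pv)  # lb water / lb dry air

def enthalpy_btu_per_lb(temp_f, rh):
    w = humidity_ratio(temp_f, rh)
    return 0.240 * temp_f + w * (1061 + 0.444 * temp_f)

h_outdoor = enthalpy_btu_per_lb(85, 0.50)  # ~35 BTU/lb
h_exit    = enthalpy_btu_per_lb(50, 1.00)  # ~20 BTU/lb
h_dry     = enthalpy_btu_per_lb(85, 0.30)  # ~29 BTU/lb (about the exit air's moisture)

total    = h_outdoor - h_exit
latent   = h_outdoor - h_dry
sensible = h_dry - h_exit
print(f"latent fraction ~ {latent / total:.0%}")     # ~40%
print(f"total / sensible ~ {total / sensible:.2f}")  # ~1.7
```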
I must admit I was surprised by the statistics here. It is true that if you only use the air conditioner a few days a year, the energy efficiency is not important. However, the cooling capacity is important, and I think many people above are using "efficiency" to mean cooling capacity.

Anyway, let's say the incremental cost of going from one hose to two hoses is $30. From working on Department of Energy energy efficiency rules, typically the marginal markup of a more efficient product is less than the markup on the product overall (meaning that the incremental cost of just adding a hose is less than the ~$20 of buying one separately). It is true that with a smaller area for the air to come into the device through a hose, the velocity has to be higher, so the fan blades need to be bigger (typically one motor powers two different fan blades on the two sides, at least for window units). But then you could save money on the housing because the port is smaller, and the incremental cost of motors is low. If the air conditioner cost $200 to start with, that $30 would be a 15% incremental cost. Then let's say the cooling capacity increases by 25% (I would say it actually does matter that a T-shirt was used, which would let room air in instead of just outdoor air, so the true gain would probably be higher than this). What this means is that the two-hose unit actually has greater cooling capacity per dollar, so you should choose a smaller two-hose unit even if you don't care about energy use at all.

Strictly this is only true with no economies of scale, which is not a great assumption, but I think overall it will hold. Another case where this would break down is if a person were plugging and unplugging many times, but I don't think that's the typical person. So I suspect what is going on is that people don't realize that the cooling capacity of the one-hose unit is reduced by more than the cost, so they should just be getting a smaller-capacity two-hose unit (at lower initial cost and energy cost).
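To put the comparison in numbers (using the illustrative figures above, which are assumptions rather than measured prices):

```python
one_hose_price = 200.0       # illustrative base price
two_hose_price = 200.0 + 30  # assumed $30 increment for the second hose
one_hose_capacity = 1.00     # normalized cooling capacity
two_hose_capacity = 1.25     # assumed 25% capacity gain from the second hose

one_hose_value = one_hose_capacity / one_hose_price
two_hose_value = two_hose_capacity / two_hose_price
print(f"capacity per dollar, two-hose vs one-hose: "
      f"{two_hose_value / one_hose_value:.2f}x")  # ~1.09x in favor of two hoses
```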
There is a broader question here of whether there should be energy efficiency regulations. If people were perfectly rational and had perfect information, we would not need them. But not only has the US Department of Energy found energy efficiency regulations to be economically beneficial (basically a good return on the incremental cost), but a retrospective study found that the actual incremental cost of meeting the efficiency regulations was about an order of magnitude lower than the Department of Energy had predicted! So I think there's a very strong case for energy efficiency regulations.
Portable units have to meet a much weaker standard. I actually pushed for a more stringent standard on these products when I was consulting for the Appliance Standards Awareness Project.
Note that that statistic is how long people have been in their current job so far, not how long they will stay in their current job in total. If everyone stayed in their jobs for 40 years and you did a survey of how long people have been in their current job, the median would come out to 20 years. I have not found hard data for the number we actually want, but this indicates that the median time that people stay in their jobs is about eight years, though it would be slightly shorter for younger people.
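A toy illustration of that survey effect, using the 40-year example from above (the numbers are made up, not real tenure data):

```python
import random

random.seed(0)

TOTAL_TENURE = 40  # suppose everyone stays in their job exactly 40 years

# A worker surveyed at a random moment is at a uniformly random point within
# their 40-year stint, so "time in current job so far" is uniform on [0, 40]
# and its median is ~20 years, half the total tenure.
elapsed = sorted(random.uniform(0, TOTAL_TENURE) for _ in range(100_000))
print(f"median elapsed tenure: {elapsed[len(elapsed) // 2]:.1f} years")  # ~20
```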
The thermal time constant of a building is around a day, so you should really be running each of these tests for more than a day (and correcting for differences in ambient conditions). Basically, the control should exceed the average ambient temperature because of solar and internal (e.g. electricity consumption) gains. And see my other comment about doing something about humidity removal. Then we might actually have something rigorous (when I did an experiment with fairly expensive equipment, I still had error bars of around ±1°C, so I don't think you can have very much confidence at this point).
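To see why a test much shorter than a day is hard to interpret, here is a toy first-order model (all parameter values are invented for illustration): with a time constant of about a day, a few hours of cooling only moves the indoor temperature a fraction of the way to its steady-state value.

```python
import math

TAU_HOURS = 24.0  # assumed building thermal time constant (~1 day)
T_START = 30.0    # indoor temperature at the start of the test, deg C (assumed)
T_STEADY = 24.0   # steady-state indoor temperature with the AC running, deg C (assumed)

def indoor_temp(hours):
    """First-order (RC) response toward the steady-state temperature."""
    return T_STEADY + (T_START - T_STEADY) * math.exp(-hours / TAU_HOURS)

for h in (3, 6, 12, 24, 48):
    frac = 1 - math.exp(-h / TAU_HOURS)
    print(f"after {h:2d} h: {indoor_temp(h):.1f} degC "
          f"({frac:.0%} of the way to steady state)")
```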
this gives a paltry annual return on investment of 0.075%
which seems large until we note that it implies an annualized rate of return of 0.08%; far more than our estimate above, but a tiny rate of return.
Am I comparing the right numbers? It doesn’t seem like far more to me.
This is a tricky thing to define, because by some definitions we are already in the 5-year countdown on a slow takeoff.
Some people advocate for using GDP, so the beginning would be when you can see the AI signal in the noise (which we can't yet).
I studied the impact of infiltration caused by clothes dryers when I was doing energy efficiency consulting. The nonobvious thing that is missing from this discussion is that the added infiltration flow rate does not equal the flow rate of the hot air out the window. Absent the exhaust flow, there is an equilibrium in which infiltration through the cracks in the building equals exfiltration through the cracks in the building. When you depressurize the building, this increases the infiltration but also decreases the exfiltration. If the exhaust flow is a small fraction of the initial infiltration, the net impact on infiltration is approximately half the exhaust flow. The rule of thumb is that natural infiltration produces about 0.3 air changes per hour, but it depends on the temperature difference to the outside, the wind, and the leakiness of the building. I would guess that if you did this in a house, the exhaust flow would be relatively small compared to the natural infiltration, so roughly the impact due to the added infiltration is about half of what the calculations indicate. But if you were in a tiny, tight house, then the exhaust flow would overwhelm the natural infiltration and the increase in infiltration would be close to the exhaust flow.
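Here is a rough sketch of that flow balance; the crack flow exponent, leakage coefficient, and driving pressure are assumed values, and the envelope is idealized as one inlet path and one outlet path with flow proportional to ΔP^0.65.

```python
N = 0.65      # typical crack flow exponent
C = 100.0     # flow coefficient per path, cfm per Pa^N (assumed)
DP_NAT = 2.0  # natural (stack/wind) driving pressure across each path, Pa (assumed)

def net_extra_infiltration(q_exhaust_cfm):
    """Extra infiltration caused by an exhaust fan, from a two-path crack model."""
    def imbalance(dp):
        inflow  = C * (DP_NAT + dp) ** N          # infiltration path
        outflow = C * max(DP_NAT - dp, 0.0) ** N  # exfiltration path
        return inflow - outflow - q_exhaust_cfm
    lo, hi = 0.0, 100.0
    for _ in range(60):  # bisection on the extra depressurization dp
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if imbalance(mid) < 0 else (lo, mid)
    dp = (lo + hi) / 2
    return C * (DP_NAT + dp) ** N - C * DP_NAT ** N

baseline = C * DP_NAT ** N
for q in (20, 50, 200, 1000):  # exhaust flow, cfm
    extra = net_extra_infiltration(q)
    print(f"exhaust {q:4d} cfm -> extra infiltration {extra:6.1f} cfm "
          f"({extra / q:.0%} of exhaust; natural baseline {baseline:.0f} cfm)")
```

For small exhaust flows the extra infiltration comes out near 50% of the exhaust, and it approaches the full exhaust flow only once the exhaust dominates the natural infiltration, matching the reasoning above.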
Another factor is the dehumidification load on the air conditioner. This is a really big deal in the southeastern US, though it would be less of a deal in the Bay Area. Basically, if it is very humid outside, the additional infiltration air has to be de-humidified, and that can double how much heat the air conditioner needs to remove from the infiltration air. So this could counteract the benefit of the net infiltration being smaller than the exhaust flow.
The exhaust temperature of 130°F sounds high to me for a regular air conditioner, but heat pumps designed to heat hot water and dry clothing do go even higher than that. So it is possible they raise it above what a regular air conditioner would in order to increase the overall efficiency (because the fan energy is significantly larger with the hose as compared to a typical window unit). Still, I am confident that the reduction in efficiency of one hose versus two hoses is less than 50%, unless it is very hot and humid outside.