Great post! I’ve been mentioning for years that volunteering can be an effective way of making a contribution. Though many people think of volunteering as being for a specific organization, I don’t think it has to be; a hobby could be an example. I think there are not enough volunteer opportunities in EA, and we’ve worked hard at ALLFED on our volunteer program. Not only have we had dozens of volunteers skill up, but they have also made significant contributions, often co-authoring journal articles and becoming full-time staff. Thanks for the shout-out! I’m actually still volunteering for ALLFED (and donating).
I’m probably a bit more concerned about monkeypox than you are, mainly because it has an alarmingly long incubation period (up to 14 days) and then a punishingly long infectious period (3-4 weeks).
So with doubling every 10.5 days, that would seem to mean a high R0 - what’s your estimate? And really because some people are still being cautious about COVID, the true R0 (with normal behavior) would be even higher than what is measured now.
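To make the implied R0 concrete: assuming simple exponential growth and a fixed generation interval, a rough estimate falls out of the doubling time. The 21-day generation interval here is my assumption (motivated by the long incubation plus long infectious period), not a measured value.

```python
import math

# Back-of-envelope R0 from a doubling time, assuming exponential growth
# and a fixed generation interval (the interval is an assumed value).
doubling_time_days = 10.5        # from the observed case counts
generation_interval_days = 21.0  # assumption: long incubation + infectious period

growth_rate = math.log(2) / doubling_time_days      # per-day exponential rate
# Simple approximation for a fixed generation interval: R ~ exp(r * Tg)
r0_estimate = math.exp(growth_rate * generation_interval_days)
print(f"r = {growth_rate:.3f}/day, R0 ~ {r0_estimate:.1f}")
```

With these numbers the generation interval spans two doublings, so R0 comes out around 4; a shorter assumed generation interval would give a lower R0.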
I would say that is basically right. AC exhaust is about as humid as indoor air. The fraction of the summer cooling load due to infiltration really does depend on how tight your building construction is; with the numbers Jeff was assuming for a very old house, infiltration would be a much larger percentage. There are some other sources of heat in a house that come with humidity, such as people and showers, but overall they bring much less humidity than drawing in outdoor air (other gains, like heat conduction through the walls and electricity use of lighting and appliances, are dry). So that might mean humidity takes you from a 25% efficiency loss (ignoring humidity) up to a 35% efficiency loss, which is still a big deal. But I’m not sure that 85°F in California typically corresponds to 50% relative humidity.
If you want to geek out on this you can use a psychrometric chart. For instance, if outdoor air is 85F and 50% relative humidity (RH), that’s an enthalpy of about 35 BTU/lb of dry air. Typical exit air conditions on the cool side of an air conditioner are ~50F and 100% RH, so ~20 BTU/lb of dry air. The dehumidification portion would be going to 85F and ~30% RH or ~29 BTU/lb of dry air, so ~40% of the heat removed is in the form of condensing water (latent). This means you would take the sensible part and multiply by about 1.7 to get the total load on the air conditioner. If you were not drawing in outdoor air, the latent load would be much lower. So overall I think you’re right that in CA the humidity correction is not as big as the other factors.
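The chart readings above reduce to a few lines of arithmetic. The enthalpy values are the ones quoted (read off a psychrometric chart, so approximate to ±1 BTU/lb or so):

```python
# Back-of-envelope check of the psychrometric numbers above
# (enthalpies in BTU per lb of dry air, read off a chart).
h_outdoor = 35.0   # 85°F, 50% RH
h_supply = 20.0    # ~50°F, 100% RH at the coil exit
h_dry = 29.0       # 85°F, ~30% RH: same temperature, moisture removed

total_load = h_outdoor - h_supply          # 15 BTU/lb removed in total
latent_load = h_outdoor - h_dry            # 6 BTU/lb spent condensing water
sensible_load = total_load - latent_load   # 9 BTU/lb of actual temperature drop

latent_fraction = latent_load / total_load   # ~0.40 of the heat is latent
multiplier = total_load / sensible_load      # ~1.7x the sensible-only load
print(f"latent fraction = {latent_fraction:.2f}, multiplier = {multiplier:.2f}")
```

This reproduces the ~40% latent share and the ~1.7 multiplier on the sensible load.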
The thermal time constant of a building is around a day, so you should really be running each of these tests for more than a day (and correcting for differences in ambient conditions). Basically, the control should exceed the average ambient temperature because of solar and internal (e.g. electricity consumption) gains. And see my other comment about doing something about humidity removal. Then we might actually have something rigorous. (Even doing an experiment with fairly expensive equipment, I still had error bars around ±1°C, so I don’t think you can have very much confidence at this point.)
I must admit I was surprised by the statistics here. It is true that if you only use the air conditioner a few days a year, the energy efficiency is not important. However, the cooling capacity is important, and I think many people above are using “efficiency” to mean cooling capacity. Anyway, let’s say the incremental cost of going from one hose to two hoses is $30. From working on Department of Energy energy efficiency rules, typically the marginal markup of an efficient product is less than the markup on the product overall (meaning that the incremental cost of just adding a hose is less than the $20 of buying one separately). It is true that with a smaller area for the air to come into the device with a hose, the velocity has to be higher, so the fan blades need to be made bigger (it typically is one motor powering two different fan blades on two sides, at least for window units). But then you could save money on the housing because the port is smaller, and the incremental cost of motors is low. Then if the air conditioner cost $200 to start with, that would be a 15% incremental cost. Then let’s say the cooling capacity increased by 25% (I would say it actually does matter that a T-shirt was used, which would allow room air in instead of just outdoor air, so the increase probably would be higher than this). What this means is that the two-hose unit actually has greater cooling capacity per dollar, so you should choose a small two-hose unit even if you don’t care about energy use at all. Strictly this is only true with no economies of scale, which is not a great assumption, but I think it holds overall. Another case where this would break down is if a person were plugging and unplugging many times, but I don’t think that’s the typical user. So I suspect what is going on is that people don’t realize that the cooling capacity of the one-hose unit is actually reduced by more than the cost, so they should just be getting a smaller-capacity two-hose unit (at lower initial cost and energy cost).
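The capacity-per-dollar comparison is easy to check with the hypothetical numbers in the argument above ($200 base unit, $30 incremental cost, 25% capacity gain):

```python
# Hypothetical numbers from the argument above: a $200 one-hose unit vs.
# a $230 two-hose unit with 25% more effective cooling capacity.
one_hose_cost, one_hose_capacity = 200.0, 1.00   # capacity in arbitrary units
two_hose_cost, two_hose_capacity = 230.0, 1.25

one_hose_value = one_hose_capacity / one_hose_cost   # 0.0050 units per dollar
two_hose_value = two_hose_capacity / two_hose_cost   # ~0.0054 units per dollar
advantage = two_hose_value / one_hose_value - 1      # ~9% more cooling per dollar
print(f"two-hose advantage: {advantage:.1%}")
```

So under these assumptions the two-hose unit delivers about 9% more cooling per dollar of purchase price, before counting any energy savings.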
There is a broader question here of whether there should be energy efficiency regulations. If people were perfectly rational and had perfect information, we would not need them. But not only are the incremental costs of energy efficiency regulations found to be economically beneficial by the US Department of Energy (basically a good return on investment), but a retrospective study found that the actual incremental cost of meeting the efficiency regulations was about an order of magnitude lower than predicted by the Department of Energy! So I think there’s a very strong case for energy efficiency regulations.
I overlooked a crucial consideration raised by denkenberger here that reduces the efficiency loss ~2x.
Thanks! It looks like you are referring to the net infiltration flow rate impact on the building. But there was also the consideration of humidity, and I did not see any humidity measurements in the data, so we are not able to resolve that one. Humidity sensors are fairly cheap but notoriously unreliable. However, one could actually measure the amount of water condensed pretty accurately to get an idea of how much of the air conditioner’s cooling is going to condensing water versus cooling air (sensibly).
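To illustrate the condensate approach, here is a sketch; the condensate rate and rated capacity are made-up illustrative values, and only the latent heat of water is a physical constant:

```python
# Sketch: infer the latent share of cooling from measured condensate.
# The condensate rate and rated capacity are made-up illustrative values.
condensate_liters_per_hour = 1.0    # e.g. weigh the drain bucket over an hour
latent_heat_kj_per_kg = 2260.0      # heat of vaporization of water
rated_cooling_watts = 3000.0        # unit's total cooling capacity (assumed)

# 1 L of water ~ 1 kg; convert kJ/hour to watts.
latent_watts = condensate_liters_per_hour * latent_heat_kj_per_kg * 1000 / 3600
latent_fraction = latent_watts / rated_cooling_watts
print(f"latent cooling ~ {latent_watts:.0f} W ({latent_fraction:.0%} of total)")
```

So a liter an hour of condensate would already be roughly a fifth of a 3 kW unit’s capacity going to moisture removal rather than temperature drop.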
What is your estimate of the Metaculus question “Will there be a positive transition to a world with radically smarter-than-human artificial intelligence?” It sounds like it is much lower than the community prediction of 55%. Do you think this is because the community has significant probability mass on CAIS, ems, or Paul-like scenarios? What probability mass do you put on those (and are there others)?
Yes, 0.35 ACH is for the whole house. Most houses do not have active ventilation systems, so that’s all you would get for the bedroom. It is true that if you are worried about CO2, you should have higher ACH in bedrooms. But this recommendation is not just about CO2; it also covers things like formaldehyde. Also, it is roughly the amount that houses get on average. I have seen studies showing that the cost of sick building syndrome makes higher ventilation rates well worth it, so probably more houses should have active ventilation. But if you don’t have active ventilation in a house, I think 0.35 ACH is a reasonable average. Apartment buildings will have active ventilation and higher occupant density, so the ACH will generally be higher, as you point out.
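As a rough illustration of why a bedroom might want more than 0.35 ACH for CO2, here is a well-mixed steady-state sketch; the room size, CO2 generation rate, and outdoor concentration are all assumed values for illustration:

```python
# Well-mixed steady state: indoor CO2 = outdoor + generation / ventilation flow.
# Room size, CO2 generation, and outdoor level are illustrative assumptions.
def steady_state_co2_ppm(ach, volume_m3=30.0, gen_m3_per_hr=0.013,
                         outdoor_ppm=420.0):
    airflow_m3_per_hr = ach * volume_m3
    return outdoor_ppm + gen_m3_per_hr / airflow_m3_per_hr * 1e6

for ach in (0.35, 1.0, 2.0):
    print(f"{ach:.2f} ACH -> {steady_state_co2_ppm(ach):.0f} ppm")
```

With one sleeping adult in a closed 30 m³ bedroom, 0.35 ACH lands around 1,700 ppm, while 2 ACH keeps it in the 600s, which is the intuition behind wanting higher bedroom ventilation.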
Yes, it is quite leaky: the rule of thumb from the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) for low-rise residential is more like 0.3 ACH. This would make your filtration look a lot better.
The infiltration caused by a well-functioning woodstove is far less than that of a one-hose air conditioner, because the air is heated to much higher temperatures (so far less airflow is needed to carry the same heat). However, infiltration can be significant for fireplaces.
I studied the impact of infiltration because of clothes dryers when I was doing energy efficiency consulting. The nonobvious thing that is missing from this discussion is that the infiltration flow rate does not equal the flow rate of the hot air out the window. Basically absent the exhaust flow, there is an equilibrium of infiltration through the cracks in the building equaling the exfiltration through the cracks in the building. When you have a depressurization, this increases the infiltration but also decreases the exfiltration. If the exhaust flow is a small fraction of the initial infiltration, the net impact on infiltration is approximately half as much as the exhaust flow. The rule of thumb for infiltration is it produces about 0.3 air changes per hour, but it depends on the temperature difference to the outside and the wind (and the leakiness of the building). I would guess that if you did this in a house, the exhaust flow would be relatively small compared to the natural infiltration. So roughly the impact due to the infiltration is about half as much as the calculations indicate. But if you were in a tiny tight house, then the exhaust flow would overwhelm the natural infiltration and the increase in infiltration would be close to the exhaust flow.
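The “about half” result can be checked numerically with a toy crack-flow model (my construction for illustration, not from any specific study): inflow and outflow cracks each follow Q = k·ΔP^n, starting balanced at a natural driving pressure, and the exhaust fan shifts the building pressure until the flows rebalance.

```python
# Toy check that a small exhaust flow raises net infiltration by only
# about half the exhaust flow. Model parameters are illustrative.
k, n, p0 = 1.0, 0.65, 4.0           # leakage coeff., flow exponent, natural Pa
natural_infiltration = k * p0**n    # balanced inflow = outflow at baseline

def net_imbalance(delta, exhaust):
    infil = k * (p0 + delta) ** n   # depressurization pulls more air in...
    exfil = k * (p0 - delta) ** n   # ...and pushes less air out
    return (infil - exfil) - exhaust

exhaust = 0.1                       # small vs. natural_infiltration (~2.46)
lo, hi = 0.0, p0
for _ in range(60):                 # bisection for the pressure shift delta
    mid = (lo + hi) / 2
    if net_imbalance(mid, exhaust) < 0:
        lo = mid
    else:
        hi = mid
delta = (lo + hi) / 2
extra_infiltration = k * (p0 + delta) ** n - natural_infiltration
print(f"extra infiltration / exhaust = {extra_infiltration / exhaust:.2f}")
```

For a small exhaust flow the ratio comes out very close to 0.5 regardless of the flow exponent, which is the half-effect described above; as the exhaust grows relative to natural infiltration, the ratio climbs toward 1.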
Another factor is the dehumidification load on the air conditioner. This is a really big deal in the southeastern US, though it would be less of a deal in the Bay Area. Basically, if it is very humid outside, the additional infiltration air has to be de-humidified, and that can double how much heat the air conditioner needs to remove from the infiltration air. So this could counteract the benefit of the net infiltration being smaller than the exhaust flow.
The exhaust temperature of 130°F sounds high to me for a regular air conditioner, but heat pumps designed to heat water and dry clothing go even higher than that. So it is possible they raise it above a regular air conditioner’s to increase the overall efficiency (because the fan energy is significantly larger with the hose as compared to a typical window unit). Still, I am confident that the reduction in efficiency of one hose versus two hoses is less than 50% unless it is very hot and humid outside.
Portable units have to meet a much weaker standard. I actually pushed for a more stringent standard on these products when I was consulting for the Appliance Standards Awareness Project.
Zvi has now put a postscript in the ALLFED section above. We have updated the inadvertent nuclear war fault tree model result to account for there being no nuclear war in the years since the model’s data ends, and we also reduced the annual probability of nuclear war going forward. Then, so as not to overclaim on cost-effectiveness, we did not include a correction for non-inadvertent US/Russia nuclear war, nor for conflict with China. Resilient foods are still highly competitive with AGI safety according to the revised model.
Remember the things that ALL have to be true for a “nuclear winter” to happen at all. I’m not gonna say it’s a completely debunked myth, but to me the probability is clearly low enough that I mostly ignore it in my planning.
It is conjunctive, but I’ve run probability distributions in a Monte Carlo model in a journal article and got about 20% chance of agricultural collapse given full scale nuclear war. So I think it is important for planning, as the consequences are far larger than the direct effects.
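A conjunctive chain can still carry substantial probability once the uncertainty in each link is sampled rather than point-estimated. This is only a minimal sketch of that style of Monte Carlo; the stage names and distributions are hypothetical placeholders, not the actual model from the paper:

```python
import random

# Minimal conjunctive Monte Carlo sketch. Stage names and distributions
# are hypothetical placeholders, not the published model.
random.seed(0)

def sample_chain():
    # Each conditional probability drawn from a wide (uncertain) range.
    soot_reaches_stratosphere = random.uniform(0.3, 0.9)
    climate_disruption_severe = random.uniform(0.3, 0.9)
    agriculture_collapses = random.uniform(0.2, 0.8)
    return (soot_reaches_stratosphere * climate_disruption_severe
            * agriculture_collapses)

samples = [sample_chain() for _ in range(100_000)]
mean_estimate = sum(samples) / len(samples)
print(f"mean P(collapse | full-scale war) ~ {mean_estimate:.2f}")
```

Even with three uncertain conjunctive stages, the mean lands near 20% under these placeholder ranges, illustrating how conjunctiveness alone doesn’t push the result to negligible levels.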
Sure looks like we’re past the peak [in South Africa], and the peak was remarkably low there, so low that it doesn’t make sense. Why would behaviors adjust this much this fast for so few cases, which were on average much milder?
You can see Google mobility data here at the bottom, and indeed, the response to this wave is much smaller than other waves.
I believe your calculation was 70% chance of not having it given a negative test, so if you have two independent negative tests, that would be 91% chance of not having it (1 − 0.09), or 9% chance of having it. But in reality, false negatives are very common. And you need to start with a prior probability to update from. From the paper I referenced, if you have some symptoms and were exposed, the prior probability of having COVID might be 91%, but after one negative result, you are still at 77-80% probability of having COVID. However, if your symptoms don’t match the common ones for COVID or if you don’t know you were exposed, then the prior probability of having COVID is much lower to start with. Then a negative test result would update downward slightly from that prior.
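The prior-plus-update logic can be written out explicitly. The sensitivity and specificity below are assumed illustrative values, chosen so that one negative test takes a 91% prior to roughly the 77–80% range the paper reports; they are not from the paper itself:

```python
def posterior_after_negatives(prior, n_negatives,
                              sensitivity=0.65, specificity=0.995):
    """P(covid | n negative tests), assuming independent tests.

    Sensitivity/specificity are assumed illustrative values, not from
    the referenced paper.
    """
    p_neg_if_covid = (1 - sensitivity) ** n_negatives
    p_neg_if_healthy = specificity ** n_negatives
    num = prior * p_neg_if_covid
    return num / (num + (1 - prior) * p_neg_if_healthy)

print(posterior_after_negatives(0.91, 1))   # ~0.78: still probably covid
print(posterior_after_negatives(0.05, 1))   # ~0.018: low prior drops further
```

The contrast between the two calls is the whole point: the same negative test leaves a symptomatic, exposed person probably infected, while only slightly lowering an already low prior.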
*If the rapid test had some probability of success, like 70%, then if you took two test you might figure 1-(1-.7)^2 = 1-.3^2 = 1-.09 = 01% you have covid. But are the rapid tests independent?*
You need to start with a prior for this calculation. This paper also discusses independence of tests. And I think you meant to write 91%.
The substantive complaint was that they [ALLFED] did an invalid calculation when calculating the annual probability of nuclear war. They did a survey to establish a range of probabilities, then they averaged them. One could argue about what kinds of ‘average them’ moves work for the first year, but over time the lack of a nuclear war is Bayesian evidence in favor of lower probabilities and against higher probabilities. It’s incorrect to not adjust for this, and the complaint was not merely the error, but that the error was pointed out and not corrected.
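The update Zvi describes can be sketched numerically: treat each surveyed annual-probability estimate as a hypothesis, then reweight each by the likelihood of observing no nuclear war over n years. The estimates and equal prior weights below are hypothetical:

```python
# Sketch of updating a survey average on n years without a nuclear war.
# The estimates and equal prior weights are hypothetical.
estimates = [0.001, 0.005, 0.01, 0.02, 0.05]   # surveyed annual probabilities
weights = [1.0] * len(estimates)                # equal prior weight per expert

naive_average = sum(p * w for p, w in zip(estimates, weights)) / sum(weights)

years_without_war = 76
posterior_weights = [w * (1 - p) ** years_without_war
                     for p, w in zip(estimates, weights)]
updated_average = (sum(p * w for p, w in zip(estimates, posterior_weights))
                   / sum(posterior_weights))

print(f"naive average: {naive_average:.4f}, updated: {updated_average:.4f}")
```

The decades without a war penalize the high estimates most, so the updated average (~0.6%) ends up well below the naive average (~1.7%), which is exactly the adjustment the complaint says was missing.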
Tl;dr: ALLFED appreciates the feedback. We disagree that it was a mistake; there were smart people on both sides of this issue. Good epistemics are very important to ALLFED.
Zvi is investigating the issue. I won’t name names, but suffice it to say, there were smart people disagreeing on this issue. We have been citing the fault tree analysis of the probability of nuclear war, which we think is the most rigorous study because it uses actual data. Someone did suggest that we should update the probability estimate based on the fact that nuclear war has not yet occurred (excluding World War II). Taking a look at the paper itself (see the top of page 9 and equation (5) on that page): for conditional probabilities of occurrence for which effectively zero historical occurrences have been observed out of n total cases when they could have occurred, the probability in the model was updated according to a Bayesian posterior distribution with a uniform prior and a binomial likelihood function. The probabilities updated in this way were (A) the conditional probability that Threat Assessment Conference (TAC)-level attack indicators will be promoted to a Missile Attack Conference (MAC), and (B) the conditional probability of leaders deciding to launch in response to mistaken MAC-level indications of being under attack. Based on this methodology, it would be double counting to update the final distribution further based on the historical absence of accidental nuclear launches over the last 76 years.
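For concreteness, the uniform-prior-plus-binomial-likelihood update has a closed form: a uniform prior is Beta(1, 1), and observing zero occurrences in n opportunities gives a Beta(1, n + 1) posterior. The n = 76 here just mirrors the 76 years mentioned above; the paper applies this update to its sub-events, not to the top-level war probability.

```python
# Uniform (Beta(1,1)) prior + binomial likelihood with zero observed
# occurrences in n opportunities -> Beta(1, n + 1) posterior.
def posterior_mean_zero_events(n):
    """Mean of Beta(1, n + 1), i.e. (0 + 1) / (n + 2)."""
    return 1.0 / (n + 2)

print(posterior_mean_zero_events(76))   # ~0.0128 per opportunity
```

This is the sense in which the historical absence of occurrences is already baked into the model’s conditional probabilities.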
But what we do agree on is that if one starts with a high prior, one should update. That is what one of our coauthors did in his model of the probability of nuclear war, and he got similar results to the fault tree analysis. Furthermore, the fault tree analysis was only for inadvertent nuclear war (one side thinking they are being attacked, and then “retaliating”). There are other mechanisms for nuclear war, including intentional attack, and accidental detonation of a nuclear weapon with escalation from there. Furthermore, though many people consider nuclear winter only possible for a US-Russia nuclear war, now that China has a greater GDP at purchasing power parity than the US, we think there is comparable combustible material there. So the possibility of US-China or Russia-China nuclear war further increases the probabilities. So even if there should be some downward updating on inadvertent US-Russia nuclear war, I think the fault tree analysis still provides a reasonable estimate. I also explained this on my first 80k podcast.
Also, we say in the paper, “Considering uncertainty represented within our models, our result is robust: reverting the conclusion required simultaneously changing the 3-5 most important parameters to the pessimistic ends.” So as Zvi has recognized, even if one thinks the probability of nuclear war should be significantly lower, the overall conclusion doesn’t change. We have encouraged people to put their own estimates in.
Again, we really appreciate the feedback. Good epistemics are very important to us. We are trying to reach the truth. We want to have maximum positive impact on the world, so that’s why we spend a significant amount of time on prioritization.
It’s hard to pin down a threshold of a specific time of exposure because it depends on the minimum infectious dose, which varies widely among people, at least for lots of diseases. Also, the rate of shedding varies widely based on the progression of the disease, whether the person is talking, how far away the person is, etc. Furthermore, the HVAC system causes additional variation. So I think when you add all these uncertainties, a 16 times reduction in emission/inhalation would correspond to very roughly a 16 times reduction in infection, but I would be very interested to see if someone has run the math on this.
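One common way to run that math is an exponential dose-response model, P(infection) = 1 − exp(−dose/k); this is my assumed model for illustration, not something established in the comment. The dose scale k is arbitrary; what matters is the dose regime:

```python
import math

# Exponential dose-response sketch: P(infection) = 1 - exp(-dose / k).
# The model choice and dose scale k are illustrative assumptions.
def p_infect(dose, k=1.0):
    return 1.0 - math.exp(-dose / k)

for dose in (0.05, 1.0, 5.0):      # low, moderate, high doses (arbitrary units)
    reduction = p_infect(dose) / p_infect(dose / 16)
    print(f"dose={dose}: risk ratio after a 16x dose cut ~ {reduction:.1f}")
```

In the low-dose regime the risk ratio is close to 16, matching the “very roughly 16 times” intuition, but at high doses (where infection is nearly certain either way) cutting the dose 16x buys much less.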