“Mandates continue to make people angry”
True for some people, but also worth noting that they’re popular overall. Looks like around 60% of Americans support Biden’s mandate, for instance (which is pretty high for a culture war issue).
“Republicans are turning against vaccinations and vaccine mandates in general… would be rather disastrous if red states stopped requiring childhood immunizations”
Support has waned, and it would be terrible if they stopped them, but note that:
Republicans are now split roughly 50:50, so it’s not like they have a consensus either way
Republicans being split and others (including independents) being in favor means that a clear majority is in favor, even in red states
Republican support has already recovered somewhat, and I’d expect it to continue reverting toward pre-COVID levels as we get further out (especially years from now). We might not fully reach pre-COVID levels, but I’d be surprised if the general Republican view was against these immunizations several years from now (though, OTOH, perhaps those who are against are more strongly against, so you could wind up with a single-issue-voter type of problem)
“3%”
This seems to be at the 12-week mark, which is somewhat arbitrary. Even according to the same study, it looks like long COVID rates are closer to 1% after 19 weeks.
“To be blunt, they cheated (intentionally or otherwise)”
Flagging that I don’t like this language, for a couple reasons:
I think it’s inaccurate/misrepresentative. “Cheating”, in my mind, implies some dishonesty. Yes, words can obviously be defined in any way, but I’m generally not a fan of redefining words with common definitions unless there’s a good reason. If, on the other hand, your claim is that they were indeed being dishonest, then I think you should come out and say that (otherwise what you’re doing is a little motte-and-bailey-ish).
I think it’s unnecessarily hostile. People make mistakes, including scientists making dumb mistakes. It’s good that they corrected their mistake (which is not something many people, including scientists, do). The fact that none of us caught it shows just how easy it is to make these sorts of mistakes. (Again, this point doesn’t stand if you are trying to imply that it was intentional, but then I think you should state that.) I similarly don’t think it’s apt to call it “fessing up” when they correct their mistake.
Looks like this dropped after your post here, so you wouldn’t have been able to incorporate it – advisors to the FDA are recommending Moderna boosters for the same groups of people that are getting Pfizer boosters (65+, at risk for health reasons, or at risk due to their job), and also this will be at half dose. They should make a recommendation on J&J tomorrow.
Great post!
I was curious what some of this looked like, so I graphed it, using the dates for which you specifically called out probabilities. For simplicity, I assumed a constant probability within each range (though I know you said this doesn’t correspond to your actual views). Here’s what I got for cumulative probability:
And here are the corresponding probabilities of TAI being developed in each specific year:
The dip between 2026 and 2030 seems unjustified to me. (I also think the huge drop from 2040 to 2050 is too aggressive: even if we expect a plateauing of compute, another AI winter, etc., I don’t think we can be super confident about exactly when that would happen. But that drop seems more defensible to me than the one in the late 2020s.)
If we instead put 5% for 2026, here’s what we get:
which seems more intuitively defensible to me. I think this difference may be important, as even a shift of a small number of years like this could be action-relevant when we’re talking about very short timelines (of course, you could also get something reasonable-seeming by shifting up the probabilities of TAI in the 2026–2030 range).
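For concreteness, here’s a minimal sketch of that piecewise-constant conversion in Python. The 35%-by-2036 and 50%-by-2040 cumulative values come from the numbers discussed here; the starting year and the earlier milestone values are placeholder assumptions for illustration, not your actual probabilities:

```python
# Convert cumulative P(TAI by year) at a few milestone years into per-year
# probabilities, assuming constant probability within each interval.
# 2036/2040 values are from this thread; the rest are illustrative placeholders.
milestones = {2026: 0.05, 2030: 0.10, 2036: 0.35, 2040: 0.50}

per_year = {}
prev_year, prev_cum = 2022, 0.0  # hypothetical starting point
for year, cum in sorted(milestones.items()):
    # Spread the interval's probability mass evenly over its years.
    annual = (cum - prev_cum) / (year - prev_year)
    for y in range(prev_year + 1, year + 1):
        per_year[y] = annual
    prev_year, prev_cum = year, cum

print(per_year[2037])  # (0.50 - 0.35) / 4 = 0.0375 per year in 2037-2040
```

With these placeholder milestones there’s no dip in the late 2020s; the dip in the original graph comes from the specific cumulative values in the post, not from the interpolation method itself.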
I’d also like to point out that your probabilities would imply that if TAI is not developed by 2036, there would be an implied 23% conditional chance of it then being developed in the subsequent 4 years ((50%−35%)/(100%−35%)), which also strikes me as quite high from where we’re standing now.
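That conditional probability falls out directly from the cumulative numbers; a quick check in Python, using the 35% and 50% figures above:

```python
# Conditional probability of TAI in 2037-2040, given no TAI by 2036,
# from the cumulative probabilities: 35% by 2036, 50% by 2040.
p_by_2036 = 0.35
p_by_2040 = 0.50

p_conditional = (p_by_2040 - p_by_2036) / (1 - p_by_2036)
print(round(p_conditional, 3))  # 0.231, i.e. about 23%
```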