I didn’t downvote it, but I did find the writing style mildly grating. (Relatedly: I cannot exceed your set-point of smug, it is over 9000 :) )
To be clear: I think the underlying point was pretty good, and I mostly had issues with the delivery. I still feel it was probably something worth writing, although I also think I’m not the target audience for this particular bit of advice.
Some of it was probably a tone thing, which I won’t go into. But here are some things that seem tractable:
My experience of it was a bit better as soon as I switched out almost all the “You”s for “I”s. I have something of a distaste for the puppet-y feeling of someone trying to put words into my mouth that don’t fit my actual experience. This set it off pretty badly. There are a lot of specifics, and it’s clearly your personal account; own it.*
This got really stark for me at around...
First, you just end up being an asshole pretending to be honest
...which instantly broke my immersion. My experience of being painfully honest with myself, and then others, was radically different.**
It probably also could have used more short paragraphs, and some variety in presentation. Some of the goals you’ve acted on and then forgotten, such as Real Analysis or Mandarin, could have been better presented as single bullet-points after going into only one of them in-depth. The Etsy section could have used a header, and been broken into more than one paragraph. That kind of thing.
*A lot of bad advice on persuasive essay writing encourages the formation of habits like this. One cannot list the number of times one has been told to make that unnatural substitution of “One,” where “I” would have been better, and more honest. Teachers who do this are just… wrong. Technical writing is a real thing, but this way of teaching it is crap, and can ruin otherwise-decent writers.
** My experience was close to painful self-consciousness (for self-honesty), and weird social penalties (for honesty with others). Real honesty is often distinctly un-charming, but in my case… bluntness leaned closer to “overly-invested*** eccentric” than “asshole.” If it had been framed as a self-account, this jarring wouldn’t have been an issue.
*** Exhibit A: This overgrown write-up.
I’ve personally found that just stapling shut the outer edge of the first fold on either side of a surgical mask results in a mask that mostly sucks to my face when I breathe in. It doesn’t stick well when I breathe out, though.
This is much easier to implement than fancy adhesives; all it takes is two staples and a stapler, or a needle and thread. It struck me as a plausible 80-20. (Well, less than 80%. Obviously, this is no N95.)
(It might only work for some face-shapes, though.)
This generates a new problem, which would also apply to taped/glued solutions:
The front is made of a softer fabric than N95s. After an hour of wear, it will suck to my lips if I breathe in with my mouth. And since it doesn’t stick on out-breaths, air still gets out the sides and sometimes the top. I don’t know how much additional risk this presents, but I would be curious to hear someone weigh in.
(It also has uncomfortably-high humidity, but that’s even more true of N95s.)
Despite the virus being characterized in pangolins, after looking into this, I now think it is basically incorrect to think of this as primarily a “pangolin virus.” The pangolins were a dying canary in a coal mine, and probably caught it from something else that serves as the real reservoir species for this nCOV precursor*.
These pangolins were being smuggled when they were seized by the authorities in Guangxi. They were dying, probably of several diseases: they had skin lesions, intense congestion, and were in generally atrocious condition when their viromes got sequenced. They turned up positive for all manner of things (herpes out the wazoo, but also a Sendai virus most closely related to a sequence from a human-taken sample, a paramyxovirus, and yes, several coronaviruses).
Here’s the original article on the pangolins whose virome they sequenced, and the article noting its relatedness to nCOV.
Given that so many of the pangolins died, the pangolins look to me more like a highly-susceptible secondary species than a mostly-asymptomatic primary reservoir species**.
* GD/P1L or GD/P2S is thought of as a possible nCOV progenitor, alongside bat-virus RaTG13. GD/P1L’s receptor-binding motif (RBM) in particular is identical to SARS-2’s, although nCOV otherwise appears more closely related to RaTG13.
** On educated priors, I think the true reservoir is probably rats, bats, or (less likely) humans in the Guangxi, Hunan, and/or Hubei provinces.
Personally, I assign >90% on either rats (strong priors + skin lesions) or bats (strong priors + simplest story). But these were exotic animal smugglers; there is a small chance that the original reservoir species could be any animal.
I think it’s probably a virus that was merely identified in pangolins, but whose primary host is probably not pangolins.
The pangolins they sequenced weren’t asymptomatic carriers at all; they were sad smuggled specimens that were dying of many different diseases simultaneously.
I looked into this semi-recently, and wrote up something here.
The pangolins were apprehended in Guangxi, which shares some of its border with Yunnan. Neither of these provinces are directly contiguous with Hubei (Wuhan’s province), fwiw. (map)
True, and I’ve seen lab work cultivate something similar.
(I’m pretty sure this particular skill is the inverse of programmer-style “laziness,” funnily enough. In one field, seeing repetition is reassuring. In the other, it can be evidence that your code is not as elegant and modularized as it could be.)
I always thought you’d automatically learn the gait if you just did the work often enough, though. It’s definitely a coping skill, but I read its origins as more cultivated than culturally-induced or taught.
It mostly follows the natural incentive gradients of the work. This can be in contrast to things like separation of self and client in psychology, which seems to feel actively un-natural for many people. Of course, there’s something of a spectrum here, with heavy individual variation.
Thanks! Sounds like a promising lead.
It’s more “Ugh, I hate pissing people off on the internet” than “Oh Noes the Governments.” Whether I have good or bad things to say, it’s a contentious and semi-political topic. That said, I’m still probably overreacting.
(I’m more worried that I’ll be wrong about something, that people will badly misinterpret me or misconstrue my beliefs, or that I’ll rub people’s personal issues the wrong way, than that I’ll Awaken the Powers That Be by… armchair philosophizing about the influence of culture on PTSD?)
The set of cultures I most want to poke this lens at, and yet want to write up not at all, are the various military ones.
High-stress environment, with a strong culture and close-to-explicit transmission to young people. And the type of stress varies considerably depending on whether you’re in the Army, Navy, Air Force… it seems like an ideal case-study. But it’s also a Whole Can of Worms. I suspect it’s a bad idea for me to try to publicly analyze subsets of American military culture with thoughts that are half-cocked.
At minimum: I kinda suspect the style of Boydian thought was an excellent fit to the challenges and culture of fighter pilots at the time. Agile judgements that take uncertainty into account, done within a competitive environment where “outmaneuvering your opponent without overextending yourself” is the name of the game.
I plan to flesh out more examples, but this had languished untouched in my Drafts folder for close to a month. So I settled on “publish first, flesh out more later.”
The idea mostly struck me when examining differences between good biology culture and rationalist culture, rather than from a particularly new cultural exposure on my part.
(The Catalyst conference may have highlighted the differences a bit.)
See: further explanation here
While we should at least ask and assess that question...
Yes, it would be good to rule out those things that we do know to expect. And I think animal results* could check this one somewhat. But corners are already being cut, and I still expect some degree of surprises.
I do feel like there’s a lot we don’t know with this virus. I don’t know that the problems will be limited to the things we currently know to look for, and I’d be a little surprised if timing was not at least a bit of an influencing factor.
* Apparently they haven’t found/developed an easy animal model that catches the virus, but they are doing animal testing in parallel to check the type of immune response. The vaccine test with the 4 macaques at least seemed promising, now followed up by another 10 macaques tested with that same inactivated-virus vaccine.
While it definitely helps that we have some experience with SARS-1, we can’t totally rely on what we know about SARS-1 and trust that it’ll apply to SARS-2.
(I think SARS-1 and SARS-2’s genetic similarity was said to be only ~80%? That’s about as much as we share in common with cows. There can be meaningful differences between the two.)
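To make that “~80% similarity” figure concrete: it’s roughly the fraction of matching positions in an aligned pair of genomes. Here’s a toy sketch with made-up ten-letter strings (real genome comparisons use proper alignment tools, and these sequences are purely illustrative):

```python
# Toy illustration of percent identity between two aligned sequences.
# The sequences below are invented placeholders, not real viral genomes.

def percent_identity(a: str, b: str) -> float:
    """Percentage of positions that match between two equal-length
    aligned sequences."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

seq1 = "ATGGCCTAAC"
seq2 = "ATGACCTGAC"  # differs at 2 of 10 positions
print(percent_identity(seq1, seq2))  # → 80.0
```

Real comparisons also have to handle insertions, deletions, and alignment ambiguity, which is why published similarity numbers can vary a bit between papers.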
Here’s just one example. Did you see the “UPDATE” I added to my answer above? It says something like “Oh hey, I guess it probably does make immune cells apoptose?” SARS-1 doesn’t do that. The article specifically mentions that they tried with both SARS-1 and SARS-2, and only SARS-2 gets into T-cells like this. And they weren’t sure which receptor was responsible.
It’s great to see a lot of science happening on this, but it’s also something of a marker of our vast uncertainties paired with its high priority.
I’m normally on team “the FDA is making everything too slow,” but in this case I actually think there’s good reason to be really careful with those Phase-1 vaccine development trials. You don’t need to be using a live vaccine to actively make things worse!
How long does testing for this problem actually take? I’m not sure.
If we were sure this was the only thing we needed to worry about (but I don’t think that it is)… getting a line of evidence where some vaccine takers are exposed deliberately some time after vaccination could maybe speed up weeding out vaccines that trigger worse Th-2 reactions. But if it turned out that time-since-vaccination or current antibody-titer are major factors underlying outcome, we may genuinely need the full period of safety-testing.
And asking people to deliberately expose themselves doesn’t strike me as a… safe, cheap, or easy thing to do.
This looks like a complicated immune effect, and it seems to be under-characterized. Overall, it looks hard to test in-vitro or in cell culture. I’m pretty sure you’d need animals or humans to do it. I’m not sure which is faster.
The article I summarize here goes over some of the specific bad vaccine reactions for SARS-1. I expect similar challenges for SARS-2.
In situations where someone takes the vaccine and then gets infected by the pathogen, they can have a bad reaction where the course of the disease is more severe than if they had not been vaccinated at all.*
Here’s some of what we know about those bad reactions**
- Th-2-type immune activation definitely happens
  - This is an allergy-like immune response
  - Th-2 reactions happen in severe cases of COVID-19 generally
- There might also be a bad complement-system-related reaction
  - The complement system consists of protein complexes that kill cells by poking holes in their membranes
- Antibody-Dependent Enhancement (ADE) might be possible, but is not likely
  - This is when imperfect antibodies are used as an anchor for the virus to infect white blood cells.
UPDATE: A related thing now strikes me as somewhat likely. It might be fusing with some white blood cells (at least T-cells) and ordering them to apoptose (activate cell-death). Article, h/t CellBioGuy.
* Going off of some other bits of research on this, these individuals probably have lower virus-titer, but higher severity and lethality. A damaging immune response, basically.
** Which vaccine types cause this bad reaction? For SARS-1, whole-S-protein vaccines were more prone to it. Some smaller S-protein fragments didn’t have this issue; hopefully the same fix works for SARS-2. I heard of at least one case where an N-protein-only vaccine attempt also resulted in the Th-2 reaction, though. It’s not totally clear how to avoid triggering it.
So maybe the speed-up you really want is to vaccinate, then deliberately expose to the live virus, and monitor what happens?
This is the type of test I’d rather we do on animal models than humans, to be frank. It seemed that you could test this phenomenon just fine with SARS-1 in animal models.
Vaccines are still our best shot in the long-term.
I wouldn’t phrase it as “vaccines do not look promising,” but more as “SARS is relatively hard to vaccinate well.” I do think we’ll have a vaccine that works reliably, eventually. No other antiviral method has their price-to-effectiveness ratio.
We were able to find fixes to the problems with some SARS-1 vaccines, and I think we’ll be able to route around these problems for SARS-2 as well.
This just means that I don’t expect vaccine development to be quite as fast as it would be for viruses without these known problems. Additionally, I suspect animal-testing could be crucial to the development of a safe vaccine, unless we’re willing to risk a few human lives in their stead (which, maybe we are).
And speaking personally, until the clinical trial results are in, I’m inclined to be cautious about taking vaccines that use large swathes of the viral S-protein, although I suspect some with smaller fragments will turn out to be fine.
As for why blood clots are a problem in the first place… one of the hypotheses I’ve seen floating around is that it might be tied into complement system malfunction?
Warning that this is pretty speculative...
The complement system is an immune response that uses C-protein complexes to poke holes in membranes to kill cells and fight large infections.
This paper used results from 5 lung autopsies and tried to draw a link between the prolonged procoagulant state in the lungs with excessive activity of the complement system. I could barely follow it beyond that.
I had also heard before that complement system malfunctions were thought to be connected to bad vaccine response for SARS-1.
I don’t feel certainty in this at all. But it comes up semi-consistently, and I don’t have a better theory yet.
Here’s a paper that situationally agrees with you on anticoagulants… Anticoagulant treatment is associated with decreased mortality in severe coronavirus disease 2019 patients with coagulopathy
449 people. Specifically, they observed no difference in survival between heparin users/non-users overall, but in the very-high-D-dimer subset (or in people with lots of sepsis‐induced coagulopathy), survival seemed to be better with heparin.
This link carries no new information yet, but seems to be a placeholder for a future review paper on this topic.
The “blood coagulation as a major contributor to death” bit generally matches pretty well with some early results where high D-dimer predicted worse rates of mortality fairly reliably, since D-dimer is basically a problematic-blood-clot indicator.
There’s a potential complicating factor for the elderly, which is that many of them are already on anticoagulants (to mitigate stroke-risk). And going on some experiences of my grandparents, it seems to be hard to navigate the risks mitigated with anticoagulants with the risk of bleeding out unless you’re pretty careful. All the same, it looks like a promising line of improvement to treatment of severe COVID-19.
Nitpick: __ice9 has 2 underscores, not 1.
micpie’s answer here is good! I don’t have a lot to add to it.
Here are a few related reference-links* and a tiny bit of commentary.
COVID-19 vaccine development landscape as of April 9th (post)
ETA: In the Pipeline blog post providing commentary and description of the vaccines furthest along in the pipeline (Apr 23)
ChristianKl’s LW Q on mRNA vaccine development for COVID-19 in particular (something of a spin-off of the aforelinked thread)
Bit of context: Some of the fastest vaccines to hit human trials were mRNA-based vaccines**, although at this point the field is more balanced.
Good article on additional vaccine development challenges for SARS-2
Clickbaitier take: Whole S-Protein Vaccines Are Dangerous (The Solution May Surprise You!)
It gives a good overview of the additional challenges to developing COVID-19 vaccines, such as Th-2 immunopathology (allergy-like immune overreaction, seen for some SARS-1 vaccines). This doesn’t cover the details of clinical trials generally (which are hard enough on their own).
It’s quite readable, but if you want bullet-points, here
2016 Review paper on the typical vaccine development pipeline
This one is technically-oriented, and about typical vaccine-development
...as one might guess, people are Cutting Corners for COVID. Not all of the best-practice guidelines are going to be stringently applied.
*...some of which I heard about via other micpie comments. Go figure.
** 60 days from the nCOV genetic sequence to mRNA vaccines starting clinical trials! mRNA is a relatively flexible/easy-to-modify platform, but new and a bit untested.
I don’t really know, I wouldn’t do this. Here are a couple of possibilities that ran through my mind.
COVID’s symptoms are basically “see: undefined flu-like symptoms.” This might just be an equivalent of “I looked up my symptoms on WebMD and it’s definitely cancer,” only with COVID.
There was that revelation that Washington got it earlier than expected. Maybe they’re pattern-matching blindly to this. It’s really easy to do so, especially if there was another nasty flu or cold going around back then (which there probably was).
- People want an excuse to go about their life as normal (or to complain if they’re not)
- People especially hate taking the possibility of their own death seriously
- Nobody wants to deal with the guilt of knowing that their “normal” actions may be endangering others (cough, asymptomatic transmission), and they would rather believe something potentially-false than contend with that
It’s probably a mix of all three, or even more.
With all due affection, I’ve heard that New Yorkers as a whole are fairly prone to contrarianism. So the frequency with which you’re hearing this might also partially be local variance.
TL;DR: No. The earliest I’d buy for pandemic-track COVID is early-to-mid December, and in China or maybe Australia. Otherwise, it’d have to be a non-pandemic substrain that died out early and left no descendants behind except the first Wuhan strain. That theory loses an Occam’s-Razor fight with “your friends probably had something else back then.”
ETA: This post mentions a second independent line of evidence on the matter (using antibodies), and also dates the first COVID-19 cases to no earlier than December.
I’m going to be basing most of this on nextstrain’s COVID-19 phylogeny data and their accompanying chart.*
The earliest sequenced US case we have came from Washington. The Washington strain’s earliest sequenced sample was 5 weeks of mutation out from the Wuhan strain at the time, leading to the inference that it arrived (or at least split off from the Wuhan gene-pool) in about mid-January. Australia seems to have a divergent strain that might have broken off even earlier, possibly as far back as mid-December.
Going on their graph, they dated the Wuhan last common-ancestor (LCA) strain to roughly mid-December, and the all-strain LCA is the same one.
(I’m not going to detail all of how this works, but LUCA is a similar concept.)
So, cases much earlier than mid-December (and in anyplace other than China or Australia) seem really unlikely to me**.
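For intuition on how those divergence dates get estimated, here’s a back-of-envelope molecular-clock sketch. The substitution rate and mutation counts below are illustrative placeholders I’ve chosen, not nextstrain’s actual fitted parameters:

```python
# Toy molecular-clock estimate: roughly how long ago did two strains
# diverge, given how many mutations separate them? The ~25
# substitutions/genome/year default is an assumed, illustrative rate.

def divergence_weeks(mutation_count: int, subs_per_year: float = 25.0) -> float:
    """Estimate weeks since divergence from a mutation count and an
    assumed whole-genome substitution rate."""
    subs_per_week = subs_per_year / 52.0
    return mutation_count / subs_per_week

# If an early sample sits ~2-3 substitutions away from the Wuhan
# strain, a ~25 subs/year clock puts the split at roughly a month or
# so earlier -- consistent with the "5 weeks of mutation" ballpark.
print(round(divergence_weeks(2), 1))  # → 4.2 weeks
print(round(divergence_weeks(3), 1))  # → 6.2 weeks
```

Real tools (e.g. nextstrain’s pipeline) jointly fit the clock rate and the tree with uncertainty estimates, rather than just dividing; this sketch only shows why small mutation counts translate into week-scale dating.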
*Note that old phylogenies are often very inaccurate and rough, and rely heavily on your starting assumptions (I’ve played around with them, and it is wild how different the trees can be). But this is a fresh phylogeny, and should be something of a best-case scenario: it covers a very recent series of mutations and splits, and on top of that, unlike in paleontology, we can access, date, and sequence old blood samples just fine. I expect this phylogeny to get the occasional detail wrong, but to get the broad strokes largely right.
**If an earlier strain existed, it would have had to leave no lingering sequenced sub-strains in the present day. If even one of those theoretical highly-divergent sub-strains had persisted, gotten sequenced, and been added to the phylogeny calculation… it would have introduced a new early-stage split into the phylogenetic tree, pushing the probable LCA much further back into the past.***
*** I mention this in part because… HIV did have these. A non-pandemic ancestor that we found in some very old blood samples. It was nowhere near as contagious and persistent at the time, and even seemed to be a transient illness for those who caught it. But being a pretty unsuccessful virus at the time, only a very small pool of people had it back then.