I love this feature! It's really the one I missed most on LessWrong. All the features you're considering sound good to me.
Even without finding the exact person who's sick, this can work well alongside community-based approaches (as described here, for example), where you try to find communities that have infections and isolate them from other communities to stop the spread from getting everywhere. It's especially helpful when tests are scarce.
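One concrete way to stretch scarce tests at the group level (not something the linked post necessarily proposes; this is an illustrative sketch with made-up numbers) is Dorfman-style pooled testing: test a whole pool at once, and only retest individuals when the pool comes back positive.

```python
import random

random.seed(0)  # fixed seed so the simulation is repeatable

def dorfman_tests(population, group_size, infection_rate):
    """Simulate Dorfman pooled testing and return the number of tests used:
    one test per pool, plus one test per member of each positive pool."""
    people = [random.random() < infection_rate for _ in range(population)]
    tests = 0
    for i in range(0, population, group_size):
        group = people[i:i + group_size]
        tests += 1                 # one test for the pooled sample
        if any(group):
            tests += len(group)    # retest everyone in a positive pool
    return tests

# At 1% prevalence, pooling 10 people per test needs far fewer tests
# than the 1000 individual tests that would otherwise be required.
print(dorfman_tests(population=1000, group_size=10, infection_rate=0.01))
```

With low prevalence, most pools test negative and are cleared with a single test, which is why the savings grow as infections get rarer (and shrink as they get more common).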
I think the answer mainly depends on how "fat" the tail is, which you addressed.
I am wondering, though, how much of the risk to animals in animal trials applies to humans. Not because of differences in biology, but because of whether we can know how much we don't know about a vaccine before we give it, and only give the least uncertain ones to humans (since I assume we're being less careful with animals). I guess we can't know much, and my prior on testing vaccines was "dangerous on average, with a really fat tail."
If anyone here is in or near Seattle -
This vaccine trial is recruiting volunteers in that area: https://corona.kpwashingtonresearch.org/ It's testing the safety of a SARS-CoV-2 vaccine that uses a new vaccine technology: mRNA in lipid nanoparticles. The idea is a few decades old, I think, but no vaccine using this technology has been approved for human use yet.
Is there an easy way to follow these daily updates, or the coronavirus tag, that I can recommend to people who don't frequent LessWrong?
Eliezer mentioned Korzybski here (though it’s not part of the sequences).
Overall, this felt more like a book review than a history or biography essay. But maybe that was the purpose.
I heard there was a trial on monkeys of a SARS vaccine that, instead of making them immune, only made the disease worse when they contracted it.
Related question: how dangerous is it to test a vaccine without animal trials?
Hey, I didn't know where to tell the mod team this, so I'm just going to write it here. I think the tag feature you used for the coronavirus is great, and it'd be great if it were used for more things :)
This is a website by the New England Complex Systems Institute (NECSI), founded by Yaneer Bar-Yam. The main thing this website offers is comprehensive and specific guidelines for individuals, families, businesses, governments, and more.
I like the term "memetic ancestors" that you used (coined?).
“even if Korzybski gets lets himself get sucked into”
I agree. I love LessWrong (and its surroundings), but I think it hasn't yet lived up to its promise. To me it seems the community/movement suffers somewhat from focusing on the wrong stuff and from premature optimization.
It also seems that the Sequences suffer from the same halo effect as the author's project (Origin, which I'm not familiar with). They were written more than 10 years ago, ending on the note that there's still much to be discovered and improved about rationality; even with their release as a book, Eliezer noted in the preface his mistakes with them. Since there seems to be agreement on the usefulness of a body of information everybody is expected to read (e.g., "read the Sequences"), I'd expect there to at least be work or thought on some sort of second version.
Just to be clear, since intentions sometimes don't come through in text: I'm saying this out of love for the project, not spite. I came across this site a bit more than a year ago and have read a ton of content here; I both love it and am somewhat disappointed -
In short, I feel there’s still a level above ours.
Yes, thanks :)
I added your suggestion here
I have a hypothesis, but first I'll write what I see.
I don't live in the US (I live in Israel, which is a cultural mimic of the US in many ways, with some delay), so, for what it's worth, I have an outsider's perspective. Most of the media I consume and the online forums I participate in are in English, so in that sense I hear a lot of what's going on there. I'm aware this means I might have a biased view, since normality is rarely reported, but even so it seems that the US is far more extreme in this ideology than Israel, including in the epistemic conditions. So yes, I think what's happening now in the US is something special, and you're not being an alarmist.
My hypothesis is this:
Instead of a degradation of epistemic conditions, we've seen a polarization. Whether or not the epistemic conditions of the far left are some new low, there are also many more rationalists/skeptics and people with a strong sense of epistemics and epistemic arguments.
Maybe part of the explanation is that since this new ideology had to fight far better epistemics than past ideologies did, its adherents simply had to throw epistemics out the window.
The process is similar to a "backfire effect": whenever they got objections of an epistemic nature, keeping their beliefs intact required polarizing against those epistemic intuitions. Since this sort of opposition was strongest in this era, the backfire was strongest in this era too.
On the other end of the spectrum, their degrading epistemic conditions might have pushed the other side to develop better epistemics (see, for example, the IDW's focus on norms of conversation and reasoning).
Hopefully this side of epistemology wins in the end.
P.S. I recall seeing some graph/article showing that these problems on campuses aren't equally distributed across the US, but are very much concentrated in certain geographic areas (which, if I lived in the US, I would surely remember; but you can probably guess them anyway).
I'm trying to find the article in which Eliezer explains the last paragraph (strange loops through the meta level). I remember reading it, but now I can't find it. Does anyone remember which one it is?
Here too, as long as I hover over the article (not just the black boxes), but not on the sides.