There is not one person who has ever taken the time to read and understand cryonics claims in any detail, still considers it pseudoscience, and has written a paper, article, or even a blog post to rebut anything that cryonics advocates actually say.
This is an implicit appeal to an intuition about a missing dataset. So let me repeat my plea to make this dataset formal! Collect disputes from the past and dig for similar data on them—how many tech arguments by who in their favor, how many against, and so on. And especially—who was eventually right?
How could we do this? It seems to me that even if you were trying hard to avoid it, you’d have a very hard time not biasing your search in favour of examples that suited what you wanted to prove. Is there some external source of examples you could lean on that would unbias your survey, the way that people publish search terms in public databases when doing surveys of scientific consensus?
This is an issue with data collection on any interesting topic. It doesn’t prevent such efforts from being useful. You choose a criterion and collect data that way; others who think your criterion is off complain and choose a subset of your data to see if that makes a difference, or go collect more data according to criteria they prefer.
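The collect-then-let-critics-subset workflow described above can be sketched in a few lines. Everything here is a hypothetical illustration: the dispute records, field names, and figures are invented placeholders standing in for the real historical data someone would have to gather.

```python
# Hypothetical sketch of the proposed dispute dataset: each record tallies
# arguments for and against a past contrarian claim, plus who was
# eventually right. All topics and counts below are invented placeholders.
disputes = [
    {"topic": "continental drift", "args_for": 12, "args_against": 30, "contrarian_right": True},
    {"topic": "cold fusion",       "args_for": 8,  "args_against": 40, "contrarian_right": False},
    {"topic": "H. pylori ulcers",  "args_for": 10, "args_against": 25, "contrarian_right": True},
]

def base_rate(records):
    """Fraction of disputes where the contrarian side was eventually right."""
    return sum(r["contrarian_right"] for r in records) / len(records)

# Anyone who dislikes the original selection criterion can re-run the same
# analysis on a subset picked by their own criterion, as suggested above:
lopsided = [r for r in disputes if r["args_against"] > 3 * r["args_for"]]

print(base_rate(disputes))  # base rate over the whole (toy) collection
print(base_rate(lopsided))  # base rate over the filtered subset
```

The point of the structure is that the dataset and the filtering criteria are separate, so disagreements about criteria become reproducible re-analyses rather than arguments about the data itself.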
How would you start on a project like this? Serious, not rhetorical question.
Hence TakeOnIt, a database of expert opinions. Over the last few hours I’ve been entering all the expert opinions on cryonics that people have been posting links to:
Cryonics debate: http://www.takeonit.com/question/318.aspx
FYI—Robin Hanson’s opinions on TakeOnIt: http://www.takeonit.com/expert/656.aspx
Sorry, that’s just not the same thing at all.
My point is that the same infrastructure can be used to capture any debate, whether it’s the current cryonics debate or various debates in the past. The good thing about having a database of expert opinions is that it makes questions like the one you asked easier.
Infrastructure is really not anything like the limiting factor. I’d donate pencils and paper too if that would help, but it won’t. Beware overrating the stone in stone soup.
It’s a start. If it became popular and scaled up, it would provide that dataset.
I misread the last line, and briefly imagined you had a page for “Is TakeOnIt a useful resource?”.