The basic problem here is that the education necessary to follow Less Wrong would not only teach me to be wary of the arguments on Less Wrong but would also prevent me from acting on its suggestions. How so? The main consensus here seems to be cryonics and the dangers of AGI research. If it isn’t, then at least the top rationalist on Less Wrong isn’t as rational as suggested, which undermines the whole intention of the original post. So I’ll assume for now that those two conclusions are the most important ones you can arrive at by learning from Less Wrong. Consequently, someone like me should focus on earning enough money to support friendly AI research and to buy a cryonics contract. But this is directly opposed to what I would have to do to arrive at those conclusions and be reasonably sure of their correctness. Among other things, I would have to study, which would prevent me from earning enough money for many years.
This seems nonsensical. In most of the cultures relevant to this readership, one need not devote overwhelming amounts of one’s time to acquiring the resources for a cryonics membership.
There is always going to be a trade-off between spending time deciding what is the best thing to do and actually doing it. If you think you are best served by personal development and educating yourself so that you can best direct your other efforts, then do so. If not, don’t. This doesn’t seem to be a Less Wrong-specific problem.