I have only read a small fraction of Yudkowsky’s sequences (I printed the 1800 pages two days ago and have only read about 50), so maybe I think I am discussing interesting stuff when in reality EY has already discussed it at length.
Mostly this. Other things too, but they are mostly caused by this one. I am one of the few who commented on one of your posts with links to some of his writings for exactly this reason. While I’m guilty of not having given you any elaborate feedback and of downvoting that post, I still think you need to catch up on the basics. It’s praiseworthy that you want to engage with rationality and with new ideas, but by doing so without becoming familiar with the canon first, you are not just (1) probably going to say something silly (because rationality is harder than you think), and (2) probably going to say something old (because a lot has been written), but also (3) wasting your own time.
Hi, full-time content developer at RAISE here.
The overview page you are referring to (is it this one?) contains just some examples of subjects we are working on.
1. One of the main goals is to make a complete map of what is out there regarding AI Safety, and then recursively create explanations for the concepts it contains. That could fit multiple audiences, depending on how deep we are able to go. We have started doing that with IRL and IDA. We are also trying a bottom-up approach with the prerequisite course, because why not.
2. Almost the same as reading papers, but with clear pointers to references for quickly integrating any missing knowledge. Whether this holds in the average case or only in the best case is something we are currently testing.
3. I don’t know about the absolute amount of time required for that. Keep in mind that this remains to be confirmed, but we have recently started collecting statistics suggesting that reading RAISE material is at least comparatively quicker than searching for the right papers and then reading and understanding them. This would be the second main goal.
Thanks :)