Most of my posts and comments are about AI and alignment. Posts I’m most proud of, which also provide a good introduction to my worldview:
- Without a trajectory change, the development of AGI is likely to go badly
- Steering systems, and a follow-up on corrigibility.
I also created Forum Karma, and wrote a longer self-introduction here.
PMs and private feedback are always welcome.
NOTE: I am not Max Harms, author of Crystal Society. I’d prefer for now that my LW postings not be attached to my full name when people Google me for other reasons, but you can PM me here or on Discord (m4xed) if you want to know who I am.
What specifically is this referring to? The Mere Goodness sequences?
I read your recent post about not rolling your own metaethics as addressed mostly to current AGI or safety researchers who are trying to build or align AIs today. I had thought you were saying that those researchers would be better served by stopping what they are doing with AI research and instead spending their time carefully studying / thinking about / debating / writing about philosophy and metaethics. If someone asked me, I would point to Eliezer’s metaethics sequences (and some of your posts and comments, among others) as a good place to start with that.
I don’t think Eliezer got everything right about philosophy, morality, decision theory, etc. in 2008, but I don’t know of a better / more accessible foundation, and he (and you) definitely got some important and basic ideas right, which are worth accepting and building on (as opposed to endlessly rehashing or recursively going meta on).
Is your view that it was a mistake to even try writing about metaethics while also doing technical alignment research in 2008? Or that the specific way Eliezer wrote those particular sequences is so bad / mistaken / overconfident that it’s a central example of what you want to caution against with “rolling your own metaethics”? Or merely that Eliezer did not “solve” metaethics sufficiently well, and therefore he (and others) were mistaken to move ahead and / or turn their attention elsewhere? (Either way, I still don’t really know what you are concretely recommending people do instead, even after reading this thread.)