The stuff in the Sequences will just be obvious thinking habits.
I don’t understand how we are to achieve this. How plausible is it that some of this stuff was repeatedly discovered, succeeded in infecting a group of people, and then died out once it was no longer novel or once a broader tradition was lost? I remember a Soviet popular-science book, “Axioms of Biology”, which took a jab at mysterious answers: both to the question of the origin of life and to the claim that opium puts its users to sleep because of its sleep-inducing properties.
without subscribing to stereotypical rationalist/EA beliefs on AI
I am sure that the part about AI-related beliefs will either be forgotten (if alignment is solved) or enshrined in a global ban (if the doomers succeed in convincing enough others). How plausible is it that making donations to effective charities popular ends up requiring each new generation to keep getting smarter, which they fail to do, instead becoming stupider?
UPD: I encountered the term ‘Scout Mindset’ used by a right-wing author, who cited Julia Galef’s book on the scout mindset(!)