If you haven’t already, you should take a look at surveys of the population-ethics, meta-ethics, and normative-ethics literature.
Could I get some links, please? I’m a CS grad student by trade, so academic literature on ethics is an Outside Context thing to me.
And I don’t understand your second question; could you elaborate?
Imagine that I know what I want very well, but do not possess a “blank” AGI agent (or “blank genie” as such things were previously described) into which my True Wishes can be inserted to make them come true. What other modes of implementation might be open to me for implementing my True Wishes?
As I currently understand it, your plan is to write an essay arguing that humanity should devote lots of resources to figuring out how to steer the far future? In other words, “We ought to steer the near future into a place where we have a better idea of where to steer the far future, and then steer the far future in that direction.”
Certainly this is how I want to describe our steering towards the near future. “Far future” depends on the definition: without some kind of Blank Optimization Process like an AI, it’s hard to claim that we can steer our future 100 million years ahead of time, say, rather than being randomly wiped out by an asteroid one day 200,000 years from now, or reverting to such normal, predictable patterns of human behavior that history degrades into non-progressing cycles, etc.
Ah, but that’s a thesis in itself! “If we don’t truly know what we want, even the mid-term future, let alone the far future, risks becoming mere chaos! Progress itself is at risk!”
Sorry, no links—I’m an outsider to current meta-ethics too. But I’m sure people have thought about e.g. the question “How am I supposed to go about figuring out what is good / what I value?” Rawls, for example, famously introduced the notion of “Reflective Equilibrium,” which I’d say is a good first pass at the problem.
Imagine that I know what I want very well, but do not possess a “blank” AGI agent (or “blank genie” as such things were previously described) into which my True Wishes can be inserted to make them come true. What other modes of implementation might be open to me for implementing my True Wishes?
It sounds like you are asking how we could get our goals accomplished without AI. The answer to that is “The same way people have accomplished their goals since the dawn of time—through hard work, smart economic decisions, political maneuvering, and warfare.” If you are asking what the most effective ways for us to implement our True Wishes are, after AGI, then… it depends on your Wishes and on your capabilities as a person, but I imagine it would have to do with influencing the course of society at large, perhaps simply in your home country or perhaps all around the world. (If you don’t care so much what other people do, so long as you have a nice life, then the problem is much simpler. I’m assuming you have grand ideas about the rest of the world.)
Making lots of money is a really good first step, given capitalism and stable property rights. And given increasing democracy and decreasing warfare in the world, getting many people to listen to your ideas matters more than it used to. In fact, if you know what your True Wishes are, then you are probably going to be pretty good at convincing other people to follow you, since most people aren’t even close to that level of self-awareness, and since people’s True Wishes probably overlap a lot.
Edit: P.S. If you haven’t read Nick Bostrom’s stuff, you definitely should. He has said quite a bit on how we should be steering the far future. Since you are on LW it is highly likely that you have already done this, but I might as well say it just in case.
In fact, if you know what your True Wishes are, then you are probably going to be pretty good at convincing other people to follow you, since most people aren’t even close to that level of self-awareness, and since people’s True Wishes probably overlap a lot.
I don’t believe the Fascist Dictator route is ethically desirable. This is not to accuse you of encouraging me to become a fascist dictator, but to point out that mind-killed, group-thinking, affective-death-spiralling cults driven by the rhetoric and charisma of even one Truly Clever, Truly Inspired Leader are a sociological attractor we must proactively avoid. Otherwise, they happen on their own the instant some poor sod with more intelligence, charisma, and inspiration than wisdom starts preaching his ideas to the masses.
True, we ought to proactively avoid such cults. My suggestion was merely that, in the event that you figure out what your True Wishes are, persuading other people to follow them is an effective way to achieve them. It is hard to get something accomplished by yourself. Since your True Wishes involve not being a fascist dictator, you will have to find a balance between not telling anyone what you think and starting a death cult.
Perhaps I shouldn’t have said “convincing other people to follow you” but rather “convincing other people to work towards your True Wishes.”