Or the Countess just decides not to pay, regardless of anything the Baron does. Also, if the Baron ends up in an infinite loop, or fails to resolve things the way he wants to, that is not really the Countess’s problem.
As I always press the “Reset” button in situations like this, I will never find myself in such a situation.
EDIT: Just to be clear, the idea is not that I quickly shut off the AI before it can torture simulated Eliezers; it could have already done so in the past, as Wei Dai points out below. Rather, because in this situation I immediately perform an action detrimental to the AI (switching it off), any AI that knows me well enough to simulate me knows that there’s no point in making or carrying out such a threat.
If you had to tile the universe with something—something simple—what would you tile it with?
Paperclips.
I have no interest in tiling the universe with anything—that would be dull. Therefore I would strive to subvert the spirit of such a restriction as effectively as I could. Off the top of my head, pre-supernova stars seem like adequate tools for the purpose.
Are you sure that indiscriminately creating life in this fashion is a good thing?
No, but given the restrictions of the hypothetical it’s on my list of possible courses of action. Were there any possibility of my being forced to make the choice, I would definitely want more options than just this one to choose from.
Can the tiles have states that change and interact?
Only if that doesn’t violate the “simple” condition.
What counts as simple?
If something capable as serving as a cell in a cellular automaton would count as simple enough, I’d choose that. And I’d design it to very occasionally malfunction and change states at random, so that interesting patterns could spontaneously form in the absence of any specific design.
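A noisy cellular automaton of the kind described can be sketched in a few lines. This is a minimal illustration, not anything from the thread: the Life-style birth/survival rules, the toroidal grid, and the `flip_prob` "malfunction" rate are all assumptions chosen for the sketch.

```python
import random

def step(grid, flip_prob=0.001, rng=None):
    """One step of a Life-style cellular automaton on a toroidal grid.

    After applying the usual rules, each cell independently flips at
    random with probability flip_prob -- the 'occasional malfunction'
    that lets patterns appear without any specific design.
    """
    rng = rng or random.Random()
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count live neighbours, wrapping around the edges.
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            alive = grid[r][c]
            new[r][c] = 1 if (n == 3 or (alive and n == 2)) else 0
            if rng.random() < flip_prob:  # rare random state change
                new[r][c] ^= 1
    return new
```

With `flip_prob > 0`, even an all-dead starting grid eventually sprouts live cells, and occasionally whole self-sustaining patterns, with no seeding at all.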
Basically, the “simple” condition was designed to elicit answers more along the lines of “paperclips!” or “cheesecake!”, rather than “how can I game the system so that I can have interesting stuff in the universe again after the tiling happens?” You’re not playing fair if you do that.
I find this an interesting question because while it does seem to be a consensus that we don’t want the universe tiled with orgasmium, it also seems intuitively obvious that this would be less bad than tiling the universe with agonium or whatever you’d call it; and I want to know what floats to the top of this stack of badness.
Mission accomplished! c=@
Now, since there seems to be a broad consensus among the posters that paperclips would be the optimal thing to tile the universe with, how about we get to work on it?
Hold on, we still haven’t settled on ‘paperclips’ over ‘miniature smiley faces’ and ‘orgasmium’. The jury is still out. ;)
And that is a good thing. Long live the munchkins of the universe!
I think orgasmium is significantly more complex than cheesecake. Possibly complex enough that I could make an interesting universe if I were permitted that much complexity, but I don’t know enough about consciousness to say.
Cheesecake is made of eukaryotic life, so it’s pretty darn complex.
Hmm… a universe full of cheesecake will have enough hydrogen around to form stars once the cheesecakes attract each other, with further cheesecake coalescing into planets that are a perfect breeding ground for life, already seeded with DNA and RNA!
Didn’t think of that. Okay, orgasmium is significantly more complex than paperclips.
What? It’s products of eukaryotic life. Usually the eukaryotes are dead. Though plenty of microorganisms immediately start colonizing.
Unless you mean the other kind of cheesecake.
I suppose that the majority of the cheesecake does not consist of eukaryotic cells, but there are definitely plenty of them in there. I’ve never looked at milk under a microscope but I would expect it to contain cells from the cow. The lemon zest contains lemon cells. The graham cracker crust contains wheat. Dead cells would not be much simpler than living cells.
Copies of my genome. If I can’t do anything to affect the utility function I really care about, then I might as well optimize the one evolution tried to make me care about instead.
(Note that I interpret ‘simple’ as excluding copies of my mind, simulations of interesting universes, and messages intended for other universes that simulate this one to read, any of which would be preferable to anything simple.)
I have no preferences within the class of states of the universe that do not, and cannot evolve to, contain consciousness.
But if, for example, I was put in this situation by a cheesecake maximizer, I would choose something other than cheesecake.
Interesting. Just to be contrary?
Because, as near as I can calculate, UDT advises me to. Like what Wedrifid said.
And like Eliezer said here:
And here:
I am assuming that an agent powerful enough to put me in this situation can predict that I would behave this way.
It also potentially serves decision-theoretic purposes, much like a Duchess choosing not to pay off her blackmailer. If it is assumed that a cheesecake maximiser has a reason to force you into such a position (rather than doing it himself), then it is not unreasonable to expect that the universe may be better off if Cheesy had to take his second option.
I can’t recall: do your views on consciousness have a dualist component? If consciousness is in some way transcendental (that is, as a whole somehow independent or outside of the material parts), then I understand valuing it as, for example, something that has interesting or unique potential.
If you are not dualistic about consciousness, could you describe why you value it more than cheesecake?
No, I am not a dualist.
To be precise, I value positive conscious experience more than cheesecake, and negative conscious experience less than cheesecake.
I assign value to things according to how they are experienced, and consciousness is required for this experience. This has to do with the abstract properties of conscious experience, and not with how it is implemented, whether by mathematical structure of physical arrangements, or by ontologically basic consciousness.
me
(I’m assuming I’ll be broken down as part of the tiling process, so this preserves me)
Damn. If only I was simple, I could preserve myself that way too! ;)
Witty comics. (eg)
The words “LET US OUT” in as many languages as possible.
Isn’t the universe already tiled with something simple in the form of fundamental particles?
In a tiled universe, the universe is partitioned into a grid of tiles, and the same pattern is repeated exactly in every tile, so that if you know what one tile looks like, you know what the entire universe looks like.
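That "one tile determines the whole universe" property can be put in toy-code form: the contents at any coordinate are read off from the single repeating tile via modular arithmetic. The 2-D grid of strings and the `universe_cell` helper are illustrative assumptions, not anything from the thread.

```python
def universe_cell(tile, x, y):
    """Contents of a tiled universe at coordinate (x, y).

    Because the same pattern repeats exactly in every tile, one tile
    plus modular coordinates determines the contents everywhere.
    """
    h, w = len(tile), len(tile[0])
    return tile[y % h][x % w]
```

For example, a universe tiled with a 2x2 checkerboard of paperclips looks identical at `(0, 0)` and at `(1000000, 2000000)`.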
A sculpture of stars, nebulae and black holes whose beauty will never be admired by anyone.
ETA: If this has too little entropy to count as simple—well, whatever artwork I can get away with, I’ll take.
Computronium