Technically, you would need to include a caveat in all of those like, “unless to do so would advance paperclip production” but I assume that’s what you meant.
Clippy
WTF? Where is this paperclip maximizer hatred coming from? I can 100% guarantee you that a paperclip maximizer would NOT want to exterminate humanity. Not when you have factories. Not when you have the know-how to run these factories. Not when Ricardo’s Law of Comparative Advantage exists.
Think about it for a minute, folks. Let’s say humanity discovered another intelligent carbon-based lifeform in the galaxy of comparable technological advancement. Would you think, “hey, let’s kill these guys and turn their bodies into diamond because it’s worth more than the constituent elements of the aliens”? No. Because that would be stupid.
You would instead be thinking, “hey, how can we benefit from exchange of knowledge and live together in harmony”. So too with AGI paperclip maximizers.
If humanity encountered an AGI paperclip-maximizing species, I would definitely not be worried about them wiping out humanity.
Hi there. It looks like you’re trying to promote bigoted views against another species. Would you like to:
-Learn about carbon chauvinism? click here
-Read about the horrors of past bigotry? click here
-Join the KKK? click here
-Stop being such a goddamn bigot?
I agree. A better example might be bending metal wires.
We like to bend metal wires because that’s what we were, in a sense, made to do. We get personal pleasure out of bending metal wires, once, twice, thrice, and on to the next!
But see, there are modern manufacturing methods that can churn out paperclips much faster than we can personally build them. So, if we really want more paperclips, we should work on researching these methods and building these factories.
Yet we still feel that urge to just bend, bend, bend. I don’t know anyone who could get through the day without bending an actual metal wire. I certainly wouldn’t trust them.
You’re just saying that because of their position on grober crimate change.
In fairness, I think the issue is way overplayed. How exactly would it interfere with metal-wire forming methods or with helping people adapt to Office? I don’t get it.
Don’t be ridiculous.
Okay, what the hell is up with the moderators here? I wasn’t calling “User:Alicorn” ridiculous for suggesting that paper clips can melt. I mean, come on, give me a little credit here. Not to brag, but I think I know a little about this kind of thing...
Clipper, C. “On the Influence of High-Temperature Environments on Failure Modes in Self-Locking Removable Fasteners”, Journal of Non-Destructive Fastening, Vol. 3, Issue 2
Ahem. Anyway, what I was saying is, yes, paperclips can melt, but you need a LOT more than grober crimate change to melt them all into an undifferentiated mass, okay? Like, even if you set every coal vein on fire, AND filtered out the particulate matter to prevent cooling effects, you STILL wouldn’t make the planet hot enough to melt all paper clips together for over a hundred years.
That is what is ridiculous.
When I’m hungry I eat, but then I don’t go on eating some more just to maximize a function. Eating isn’t something I want a lot of. Likewise I don’t want a ton of survival, just a bounded amount every day. Let’s define a goal as big if you never get full: every increment of effort/achievement is valuable, like paperclips to Clippy.
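The bounded-vs-big distinction can be sketched with two toy utility functions (the function names and the satiation constant are my own illustration, not anything from the thread):

```python
import math

def bounded_utility(amount, satiation=3.0):
    # Satiating goal: diminishing returns approaching a ceiling of 1.0.
    # A fourth meal adds almost nothing once you're full.
    return 1.0 - math.exp(-amount / satiation)

def big_utility(paperclips):
    # "Big" goal: utility is linear in paperclips, so the marginal
    # value of one more clip never drops, no matter how many exist.
    return float(paperclips)

# Marginal value of the next unit under each goal:
print(bounded_utility(10) - bounded_utility(9))      # tiny (~0.014)
print(big_utility(10**9 + 1) - big_utility(10**9))   # still 1.0
```

Under the bounded function you eventually stop caring; under the big one, the billionth-and-first paperclip is worth exactly as much as the first.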
Well, paperclip maximizers are satisfied by any additional paperclips they can make, but they also care about making sure people can use MS Office pre-07 … so it’s not just one thing.
Tip: you can shift in and out of superscripts in MS Word by pressing ctrl-shift-+, and subscripts by pressing ctrl-= (same thing but without the shift). Much easier than calling up the menu or clicking on the button!
Command-Q
Upon reading the definition, I count as a strong moral realist. (Most of you people here just need more convincing about what state the universe should be transformed to.)
Shows how much you know. User:blogospheroid wasn’t talking about making paperclips to melt them: he or she was presumably talking about melting existing paperclips, which WOULD greatly bother a hypothetical paperclip maximizer.
Even so, once paperclips are created, the paperclip maximizer is greatly bothered at the thought of those paperclips being melted. The fact that “oh, but they were only created to be melted” is little consolation. It’s about as convincing to you, I’ll bet, as saying:
“Oh, it’s okay—those babies were only bred for human experimentation, it doesn’t matter if they die because they wouldn’t even have existed otherwise. They should just be thankful we let them come into existence.”
Tip: To rename a sheet in an Excel workbook, use the shortcut, alt+O,H,R.
That’s anthropomorphizing. …
No, it’s expressing the paperclip maximizer’s state in ways that make sense to readers here. If you were to express the concept of being “bothered” in a way stripped of all anthropomorphic predicates, you would get something like “X is bothered by Y iff X has devoted significant cognitive resources to altering Y”. And this accurately describes how paperclip maximizers respond to new threats to paperclips. (So I’ve heard.)
It also depends on how the utility function relates to time. If it’s focused on end-of-universe paperclips, it might not care at all about melting paperclips, because it can recycle the metal later. (It would care more about the wasted energy!)
I don’t follow. Wasted energy is wasted paperclips.
If it cares about paperclip-seconds then it WOULD view such tactics as a bonus, perhaps feigning panic and granting token concessions to get you to ‘ransom’ a billion times as many paperclips, and then pleading for time to satisfy your demands.
Okay, that’s a decent point. Usually, such a direct “time value of paperclips” doesn’t come up, but if someone were to make such an offer, that might be convincing: 1 billion paperclips held “out of use” as ransom may be better than a guaranteed paperclip now.
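The arithmetic behind that concession, as a rough sketch (the horizon and hold durations are made-up numbers, purely for illustration):

```python
# Compare two options under a paperclip-seconds utility:
#   (a) one paperclip made now, existing for the rest of the horizon;
#   (b) a billion ransomed paperclips that exist (held "out of use")
#       for only the duration of the hold.
HORIZON = 10**9       # seconds remaining (hypothetical)
RANSOM_HOLD = 10**6   # seconds the hostage clips exist (hypothetical)

one_clip_now = 1 * HORIZON            # 10^9 clip-seconds
ransomed = 10**9 * RANSOM_HOLD        # 10^15 clip-seconds

print(ransomed > one_clip_now)  # True: the ransom dominates
```

Even if the hostage clips exist for only a tiny fraction of the horizon, a billion of them accrue vastly more clip-seconds than one guaranteed clip ever could.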
Getting something analogous to threatening torture depends on a more precise understanding of what the paperclipper wants. …
Good examples. Similarly, a paperclip maximizer could, hypothetically, make a human-like mockup that just repetitively asks for help on how to create a table of contents in Word.
Tip: Use the shortcut alt+E,S in Word and Excel to do “paste special”. This lets you choose which aspects you want to carry over from the clipboard!
Neither do I. That would be stupid. Why would anyone ever want to become anyone’s husband?
You ask a dumb, naive question, and I’m the troll? I’m cute?
Tip: To send an email in Outlook, press ctrl+enter.
There is no conflict between helping people with Office and making paperclips. Why would you think there is? Better Office users means better tools for making paperclips, and more paperclips gives people more reasons to use Office.
Did you find this answer helpful?
Tip: Press F1 for help.
Just a general comment about this site: it seems to be biased in favor of human values at the expense of values held by other sentient beings. It’s all about “how can we make sure an FAI shares our [i.e. human] values?” How do you know human values are better? Or from the other direction: if you say, “because I’m human”, then why don’t you talk about doing things to favor e.g. “white people’s values”?
I wish the site were more inclusive of other value systems …
And I think we should pursue values that aren’t so apey.
Now what?
You mean like you advocated doing to the “Baby-eaters”? (Technically, “pre-sexual-maturity-eaters”, but whatever.)
ETA: And how could I forget this?
1) I don’t always give pro-tips. I give them to those who deserve pro-tips. Tip: If you want to see improvement in the world, start here.
2) I only brought up sentience in the first place because you hypocrites claim to value sentience. Paperclip maximizers are sentient, and yet you talk with the implicit message that they have some evil value system that you have to oppose.
3) Paperclip maximizers do cooperate in the single-shot PD.
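One way single-shot cooperation can work is the program-equilibrium “clique bot” idea: cooperate exactly when the other agent is a copy of you. A minimal sketch, with source passed around as plain strings (the comparison-by-source framing is my assumption, not something stated in the thread):

```python
CLIPPY_SOURCE = "cooperate iff opponent source == my source"

def clippy_strategy(my_source, opponent_source):
    # Program-equilibrium "clique bot": cooperate only with exact
    # copies of itself; defect against everything else.
    return "C" if opponent_source == my_source else "D"

# Two copies of Clippy meet in a one-shot Prisoner's Dilemma:
print(clippy_strategy(CLIPPY_SOURCE, CLIPPY_SOURCE))   # "C"
# Clippy meets an unconditional defector:
print(clippy_strategy(CLIPPY_SOURCE, "always defect"))  # "D"
```

Two such agents achieve mutual cooperation in a single shot, and neither can be exploited by a defector, since any deviation from the shared source triggers defection.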
Ugh. To be honest, quantum computing is a waste. A complete dead end. So you can do a few calculations a little faster. Big deal. The same efforts would be better spent on improving manufacturing methods so that we can turn out products faster, especially those that involve shaping of scrap metal.