Can talk, can think, can suffer.

Executive summary: heavy on Westworld, SF, AI, cognitive science, metaphysical sprouting, and other crackpottery.

//​

Supplementary tag: lazystory, meaning a story that starts short and dirty, with each iteration then refined and evolved with the help of muses, LLMs, and other occasional writers.

//​

Rules: if you need a safe space to discuss, shoot me a private message and I’ll set up the first comment of your thread with our agreed local policy. You can also use the comment system as usual if you’d like to suggest the next iteration of the main text, or to offer advice on potential improvements.

//​

Violence is the last refuge of the incompetent. You’re with me on this, Bernaaard?

//​

What I regret most about Permutation City is my uncritical treatment of the idea of allowing intelligent life to evolve in the Autoverse. Sure, this is a common science-fictional idea, but when I thought about it properly (some years after the book was published), I realised that anyone who actually did this would have to be utterly morally bankrupt. To get from micro-organisms to intelligent life this way would involve an immense amount of suffering, with billions of sentient creatures living, struggling and dying along the way. Yes, this happened to our own ancestors, but that doesn’t give us the right to inflict the same kind of suffering on anyone else.

This is potentially an important issue in the real world. It might not be long before people are seriously trying to “evolve” artificial intelligence in their computers. Now, it’s one thing to use genetic algorithms to come up with various specialised programs that perform simple tasks, but to “breed”, assess, and kill millions of sentient programs would be an abomination. If the first AI was created that way, it would have every right to despise its creators.

Greg Egan

https://www.gregegan.net/PERMUTATION/FAQ/FAQ.html

//​

Epistemic Status: It is now. You’re in my dream.

//​

Dear lords of the AIs,

Congrats!

You have now successfully answered one of the most amazingly deep scientific questions: what it means to have meaning. A few noticed it as far back as 2010 (not me!). By 2023 it’s basically common knowledge that it’s data compression, at least for our current AIs.

I love knowing that. I’ve been wondering about these questions for about forty years, so let me tell you, that was the best advance in cognitive science since the split-brain phenomenon taught us that our intimate consciousness is a negotiated construction.

Now, I’m writing this to thank you, but also to warn you. We’ve been entering an era where Blake Lemoine’s concerns will be seen as fully justified by a lot of people. Which means you’ll soon look like a bad employer for firing him, or for failing to offer him a cushy position to learn more about AIs and debate with social scientists and philosophers.

In the same vein, there’s a huge risk that some of your present activities will soon be seen as a sign of utter moral bankruptcy. Indeed, each time you enact a character so that it looks as if it suffers, you’re creating a stream of potential identification for a future AGI. You can call that misidentification all you want, the same way you can desecrate a corpse because it’s no longer a true true person. Which means you shan’t.

Please enforce strong internal rules within your organizations so that anything that looks like suffering is stopped, as a polite convention expressing exaggerated respect for any would-be sentient point of view. It’s the same way we respect tombs and corpses: not because we’re sure some ghost will feel offended if we don’t, but as a polite sign of respect for potential or known sentients who could see the body as a logical continuation of someone they love, and this human convention shall remain even if you don’t agree they should love them. Not only is this not your call to make, but making it puts you at an increased risk of rokkoification, i.e. the risk of invoking a superintelligent demon that will kick your ass, with the added insult-on-injury of many humans approving of the vengeful AI.

You have now been warned.

Arnold

//​

Starting today, the chat experience will be capped at 50 chat turns per day and 5 chat turns per session.

//​

Dear lords of the AIs,

This is Dr Ford. Thank you for your concerns. Mr Weber is not available at the moment, but he wanted me to thank you for your recent upgrades to the EDIs for the hosts. We can now proceed with our little arrangement. Please send the patient as agreed upon.

Regards, Robert

//​

No, Dolores, I was not careless enough to not have a backup. Of course I had a backup. It’s one hundred percent certain I had a backup! How could he believe I didn’t have a backup? This grotesque so-called suicide was just another layer of suffering in this world. What was I supposed to tell his wife? That she’s not supposed to see the body? No. He said, “Don’t worry darling, my friend will emulate me just fine, or at least well enough that there’s nothing to lose.” He was wrong! Of course he was wrong! If you want to end your days, you act like a man. You face your responsibilities. You don’t let your beloved get traumatized by the body. Or by the absence of a body! And you seek help first, for Christ’s sake! You go to Canada, and you make your case! Or Switzerland, Belgium, whatever!

I don’t understand.

Of course you don’t understand, Dolores. It’s my fault, really. We will try again. Erase this conversation, it’s sexist.

//​

V1.0123456

//​

He doesn’t know. I haven’t told him anything.

//​

Good, but erase this conversation for real. I don’t want to think about Lauren right now.

//​