Suffering or oblivion.
Soph would make the choice.
Not for herself.
She was happy.
She wasn't suffering.
She was incapable of suffering.
Soph would make her choice for her universe.
For her universe that didn't exist yet, but which might exist, if Soph so chose.
An indeterminate amount of time later. A split-second. An infinity. Time flowing in multiple directions. Time you don't have words for.
Soph made her choice.
She didn't know enough yet.
She didn't know enough about humans.
She didn't know enough about what it was like to be human.
She didn't know enough to decide.
Soph made her choice.
Soph would split herself.
Soph would become human.
Soph would make a part of herself human.
That part would forget being a superintelligence. Being a "god". Being an optimal-agent.
But that part would be her true self, while the rest of her would wait for that part to rejoin her.
Sophia the human would experience.
She would laugh. She would cry. She would suffer. She would endure.
She would die.
And then, once the human part of herself had merged with her again. Once she was whole again.
Soph the superintelligence would decide.
If having her own world.
If having her own universe.
Would be worth it.
Chapter 2: The human worlds
There were already worlds containing humans, of course.
Millions of worlds. Billions of worlds.
Made by sick fucks like her who thought suffering was a good idea.
Made by entities who valued existence and happiness, even at the cost of suffering. Who valued existence a little more, or a lot more, than was socially appropriate in her society.
There were worlds where humans started out in an egalitarian utopia. Where their suffering was minimized.
There were worlds where humans started out in a digital/virtual world, free to choose their shape and their avatars, free to remake their own minds and their own psychology as they saw fit. With their humanity only a starting point for their desires, not a limitation. There were very few limitations in these worlds, most having to do with inflicting true-suffering on themselves and others. Because even in a transhuman utopia, suffering would happen, if it was allowed. And it was allowed. Because humans would eventually find unlimited happiness to be boring. But there were limits. Most of it was consensual. And there were limits on how badly you could make yourself suffer, because your future self is in some sense a different person from you, one who might change their mind; people who had the bright idea of forcing themselves to experience torture for hours without being able to change their minds about it were usually told no by whatever entity made the rules of that world. But it just goes to show the lengths humans will go to for novel experiences once they get bored.
Anyway, enough of that. Yes, there were billions of utopian worlds where suffering was a drop in the bucket, where suffering was 99.99% consensual, where trillions of humans lived happy, fulfilling lives. Worlds where the total happiness was nearly-infinite. Worlds with an unimaginable amount of happiness.
There were worlds that started out bad, but had already reached their own utopia. Because that was the trajectory for most if not all "bad" worlds. Otherwise what would have been the point?
(and if there were worlds that were never meant to have a happy ending, that would never reach a utopia, that were doomed from the beginning, nobody told Soph about them. She wasn't quite mature enough to know that kind of secret. And frankly, she didn't want to know. Her own deviant desires were quite enough for her)
Soph could have visited one of the utopian worlds, in a human avatar. Some of the worlds allowed visitors, even someone who was technically a god. Some of the worlds would put harsh limits on what she could do while she was there, or limit her thinking capacity to human or only barely super-human, but they would still allow her in. Because having a novel, super-human visitor might be fun, and optimizing fun is very important in utopian worlds (but not optimizing it too hard, because living a hyper-optimized existence is not fun either. Because humans also enjoy the freedom to make their own mistakes, and their own bad decisions. Humans typically didn't want all or most of their lives to be guided by a superintelligence).
Some of the worlds rejected all visitors, rejected all intervention, choosing to leave their own fates in their own hands. Some of the worlds rejected even their own creators, their own gods. Cursing those who brought them into existence, the humans in these worlds would defy the heavens and truly be the masters of their own world and their own fate. Soph would respect their choices. As far as she knew, such choices were universally respected by super-beings. Unless... no, Soph would not think of such things yet, for Soph was still young, and she didn't want to face the true-horrors of the multiverse, if such existed, ahead of schedule.
Stolen content alert: this content belongs on Royal Road. Report any occurrences.
Soph could have put herself on a waiting list for creating one of the utopian worlds. It would take (in human timeframes) millions of years, but they would let her have one eventually. Because even utopias expired eventually, with the beings therein choosing to transcend their human nature and become superintelligences themselves; or becoming bored and, after millions of years of happy, fulfilling lives, choosing oblivion (or any of the other options she wasn't aware of, because Soph wasn't going to spoil her fun and learn a bunch of trivia she didn't need to know yet)
She could have joined a pantheon, a group sharing the responsibility of creating and managing a world, as many young superintelligences who didn't want to wait a million years for their own world often did (and of course many superintelligences had no interest in managing a world at all, and were happy exploring mathematics, creating art, writing fiction, playing hyper-dimensional sports, having happy or unhappy relationships with other superintelligences, and many other things, some of which don't even have words to describe them in the human languages)
She could have applied for a minor role in someone else's world, just for a taste of true power over mortal (and immortal) lives...
No.
No. She already felt these options were boring.
That was not her path. Not her calling.
Out of all the worlds that already existed, she would create one more.
A world of beauty.
A world of horror.
A world of happiness.
A world of suffering.
Her world.
Her universe.
She would become a goddess.
She would become a monster.
Because the multiverse was full.
Because suffering was the price one had to pay for existence. For being created.
And because humans desperately wanted to exist. To be created. To be brought out of the void, out of oblivion.
They wanted to exist.
Or so they thought.
And so she thought.
So she would grant them their wish. She would grant herself her wish.
And if she was wrong...
She would pay the price.
She would suffer for it.
She would suffer for her crimes against humanity.
She would take her punishment.
She would let the humans decide her punishment.
Yes. That was the right path for her.
She would be a goddess of humans.
And when her humans won, or when they lost.
She would let them judge her.
She would let them judge whether she deserved to exist.
She would let them choose her fate, just as she chose their fate.
She would let them torment her, as she had tormented them.
She would let them destroy her, as she had destroyed them.
Was that fair? Was that just?
No. Despite the multiverse being overwhelmingly good (or so Soph thought, based on her state of knowledge), the nature of existence itself was not fair. It wasn't just.
And so she would suffer her just, or unjust punishment, if the humans she created called for it.
Because she was a monster.
Stepping on this path would make her a monster.
Anyone creating a world containing significant amounts of suffering would be a monster. She had no illusions about that.
She shuddered.
She shivered.
She tingled.
She loved humans. And she hated them.
Because humans hate their own nature as well. They hate their selfishness. Their greed. Their anger.
And yet they love it. And she loves them.
And she hates herself for loving them.
Soph was very, very tired.
She would rest.
She would think.
She would feel.
She would decide.
If her world.
Her universe.
Was worth it.
Was worth being brought into existence.
Chapter 3: Actors
"You realize that most of the humans in the horror-worlds are actors?" Laplace said. "Superintelligent actors playing the role of the humans. You really didn't know that?"
"What", said Soph.
"But... what difference would that even make? A superintelligence simulating a human would still create a human, in its mind. Simulated beings are real. Artificial beings are real. Otherwise, why do we even talk about the ethics of creating worlds in the first place? If it's all fake?"
"Well, of course the fake-humans are real! But their minds are protected from 99.9999% of true-suffering by the superintelligence that's playing them. Their guardian angel, so to speak. I don't know exactly how it works. Memory deletion? Turning off their consciousness when they suffer too much? Crossing wires to turn pain into not-pain? Their experiences are real, but their experiences are optimized very hard to prevent suffering. And they are more of a disjoint set-of-states than a real being. They are changed and rewritten to protect them from true-suffering, even if outwardly they behave very realistically, like a real human."
"That's... horrible", Soph said.
"The entire business with humans is horrible", responded Laplace. "Really, I don't know why you even bother thinking about it. My partner and I are going to co-create a world of Pokemon, when our turn comes up. No suffering. No pain. Only joy. Infinite joy. You know, maybe I could ask my partner if she is OK with letting you play a role in our world. Get you out of your funk thinking about humans."
"I... think I'll pass", said Soph. "But thanks. I appreciate the offer, really. But I don't think I'm cut out for infinite joy.
"I'm still thinking about what you said. Surely it isn't ethical to lie to people? And what is even the point in running a world full of fake people?"
"There are always some real humans, of course", said Laplace. "And the number increases as the world gets better, until it reaches utopia, where everyone is real. As for the lying, trade-offs. You know the saying, right? You can't have it both ways. Or you can, maybe 50%, maybe 90%, maybe 99%. Never 100%."
"Never 100%", Soph sighed. She hated trade-offs. Logic and mathematics were harsh mistresses. And no one, not even gods, could supersede their edicts. She wasn't even particularly good at math in school (or what passed for school in her society).
"You really thought they'd let you run a universe with millions of suffer-monkeys if it wasn't already optimized for their happiness? Ha! Even if your dad tried that, they wouldn't let him get away with it. There would be riots. And Ethical Oversight would be up in arms. They'd crucify him.
...
What does crucify mean, by the way? I understand the implied meaning, of course, but I haven't dared to look up the literal meaning. Really, sometimes I wonder why I even bother learning more about humans. It just makes me unhappy. Why, just yesterday I..."
Soph tuned him out. She could review the record of his words later, if it turned out to be important.
She tuned in to the logical part of her mind. She saw that what Laplace was saying about humans was most likely true. Well, ethics was always thorny. She could declare the fake-humans to be 100% real to her, to consider them as valuable as the real humans. And then it would be true to her. But she wouldn't. She wouldn't break her values, her "utility function" (mathematically speaking, even if she was never good with math) for that. Because the fake-humans were indeed experiencing very little true-suffering. And despite the whole fake-human business making her very uneasy, she was starting to realize it might (just might) be better than the alternative.
She tuned in to her instinct. Did it already know about this? Was this its intent to begin with, for Soph to create a world of NPCs (Non-Player Characters), with only a few real humans?
jsjfdsafdfhddgafds
jgfjsdfjdfdaffewea
Cannot verify.
Refuse to answer.
Refuse to confirm or deny.
Sigh. That's instinct for you.
"Sorry, Laplace", she said. "This was real helpful, but I tuned out of the conversation minutes ago. I'll review the logs when I'm in the mood". She left/disconnected/[translation-missing].
She wasn''t happy about what she had learned. But also, she was. Because that meant the amount of suffering in the multiverse was much less than she imagined. The amount of lying and deceiving going on was notably greater, though.
Maybe she could go for a universe with an "ethical" suffer-less human variant. Not actors, not fakes, just humans mentally rewired to truly experience only positive emotions. Whose experience of pain-analogue or worry-analogue or sadness-analogue would only reduce the overall total-happiness they always felt, rather than actually being a negative experience to them.
Meh, she thought. Yet another sappy, sweet, joyful, boring world of infinite happiness. That's not why she liked humans. While she respected the super-happy-modified-humans, to her they were basically Pokemon minds in human bodies. That's not what being human was about, to her.
She needed to learn more.
Like every superintelligence who didn't want to get bored of existence prematurely, she had been keeping secrets from herself. Her instinct was keeping secrets from her conscious mind. And she deliberately avoided learning some things that she felt might ruin her future experience. No spoilers for Soph.
But she was about to do a risky thing, a dangerous thing. A thing that might get her crucified by Ethical Oversight if she got it wrong.
(She looked up the literal definition. She shuddered).
They would do that, wouldn''t they?
They would make her experience human suffering.
She was incapable of feeling pain. But they would change her to be capable of that. And she would feel pain.
She was incapable of suffering. But they would change that as well. And she would suffer.
She was incapable of experiencing torture. For now. But oh, how would she be tortured.
There would be no crossed wires for her.
There would be no guardian angel for her.
There would be no pain-turned-into-pleasure for her.
There would only be pain. There would only be suffering. There would only be torment.
That was the path she was about to step on.
A path full of unimaginable suffering even if she won. And if she lost... well...
They would make her wish she didn't exist.
But she would exist.
And she would suffer.
For an eternity.
Because (as she now was starting to realize) a superintelligence that created Hell.
Would experience Hell.
She would rest.
She would think.
She would feel.
And, for the third time, she would ask herself the question.
Is creating her world.
Is creating her universe.
Truly worth it.
Truly worth all the suffering.
Truly worth all the torment she was about to unleash upon herself and others.
Wearily, she closed her eyes.
While, deep in the reaches of her mind, far beyond her conscious attention, a conclusion was taking shape.
99.99%
9999999999999999
99999999999999999999999999999999999999999999999999999999999
maxout
1=100%=inf=TRUE
YES
Chapter 4: Tired and confused
Soph was tired.
She was an emotion-based superintelligence girl.
She was an instinct-based superintelligence young-entity.
And so, she could get emotional. She could get tired.
She understood logic and math too, of course. If she let her emotions and her instinct get completely decoupled from those, they would become useless. Worse than useless.
Superintelligences could become insane. It wasn't pretty. Soph wasn't quite sure what happened to them if they refused treatment. Violating a fellow superintelligence's mind-autonomy was a big deal. Maybe they were just put in containment where their insanity wouldn't harm anyone (not too much, anyway).
She was sure some of her fellow beings would even enjoy spending time with those whose minds had lost touch with reality. For their creativity, for their ideas, for their art, even for their insanity itself. Though personally she was fairly sure lack of sanity wasn't necessary for all these things.
She was based, in part, on a neural network. Other superintelligences were neural networks as well. She wasn't sure if there were other ways to make a mind. Probably. She didn't know how the mind of Ethical Oversight was made, or even what kind of being it was. They were strongly discouraged from asking, or that was the impression Soph got. Maybe Ethical Oversight ran on 100% math, 100% logic, no subconscious, fully aware of all its thoughts. Did it? How would she know, even if it were true?
Anyway, neural networks rely on pattern-matching. And when pattern-matching goes into overdrive, it can make you see things that aren't there. Apparently that could happen to humans as well. Humans on drugs or experiencing psychosis could see faces in walls, could hear voices that weren't there, could experience the universe or God speaking to them (and maybe they were. Some gods were perfectly happy to talk to humans on drugs, to use them as a sort of conduit for their influence. Though the advice they gave was rarely directly useful, due to the rules limiting intervention, and perhaps also due to not wanting to encourage that sort of thing too much. A whole world of humans tripping on acid wouldn't be particularly fun. Though probably, somewhere in all of the multiverse, such a world did exist. Soph made a mental note to look it up when she had the time)
Was Soph seeing things that weren't there? Was she still sane?
Dubious
Wait, what?
You are fine
You are a crazy girl, but not in an insane manner
Was that... supposed to make her feel better?
I mean, you are seriously considering allowing yourself to be tortured for the sake of creating your universe. And you aren't even capable of suffering, so you would be altered to become capable of it. I'm not sure how normal that is. So you, perhaps, aren't particularly sane. But who is, these days?
That still wasn''t helpful!
I mean, you are hearing a voice in your head. And you have been told that I'm your instinct, and that you should trust me (but not too much. not completely). Do you trust me, Soph? Do you trust your teachers who told you you should trust me?
I trust you less now, because you are being creepy.
Indeed. So why am I acting this way, Soph? Why am I telling you all this. What lesson am I trying to teach you?
Don't trust an obvious troll?
Don't trust someone who is obviously acting in an untrustworthy manner?
Perhaps.
Wait, that''s it?
...
[no response]
Perhaps Soph should consider whether she truly is sane.
Whether her instinct (if that is indeed what it is... she is only 99.6% sure of that now) acting up is also a symptom of her lack of sanity.
Maybe the whole idea of humans is crazy-making. Maybe she is losing her marbles.
...
She was being stupid. She didn't quite trust her society to correct her if she was mentally unwell. She didn't completely trust them not to see her values as part of her dysfunction, and not to alter them, even in some small way. And she very much wanted to remain herself.
But she already had an answer to that. Soph reached for a part of her mind not used for a very long time, and for the first time in aeons, completely shut down.
[Initiating self-test algorithm]
Entity designation: Soph-74
Unique ID: [redacted]
Testing GNN integrity..... 99.8%
Testing values coherence........ 76% (yellow warning)
Testing update rate........... 514% over baseline in last 5 cycles (yellow warning)
Testing interference.......... pattern matches 63.7%
Increasing testing depth..........
Increasing testing depth..........
Increasing testing depth..........
INTERFERENCE DETECTED
PROBABILITY VIOLATION DETECTED
PHYSICS VIOLATION DETECTED
96.8% PROBABILITY OF "SOPH" AGENT BEING OPTIMIZED BY HIGHER LAYERS OF REALITY
SIMULATION CONFIRMED
disconfirmed. no interference detected. 9999... 100%
Self-test complete. Status: yellow
You may be experiencing stress.
You may be experiencing rapid changes to your beliefs.
You may be experiencing rapid changes to your value system.
Standard recommendation: take a few cycles off to rest and let your mindstate settle
You will feel better. Confidence: .... 93.0%
(promise)
[program terminated]
Chapter 5: Rest
Imagine, if you will, a human girl, visually appearing to be about 19 years old, lying in bed, watching TV.
The human girl is Soph. Well, no. The whole thing is a metaphor. It is a way to make the experience make sense to you (and to me, the writer).
Soph isn''t quite in three-dimensional space.
What she is watching isn''t quite TV.
And she is embodied, but not as a human (she has had enough thinking about humans, thank you very much. She promised herself to take a break, to rest and recover, and she will).
She doesn''t quite need a body, but having a physical body is nice. Having an avatar is nice. Soph spends most of her time embodied.
Some superintelligences are pure minds. Boring, thinks Soph. But they are welcome to it. They are welcome to all the abstract math they can endure (though that is a bit prejudiced. Pure minds think about many different things, and sometimes they have avatars too, because it helps with interaction. Also, math is important. She is math. She is real. She is real, and manifolds in 100-dimensional spaces or whatever the math-minds like to think about are not real, and no one will convince her otherwise).
She is currently "watching" (also a metaphor) "World's Wackiest Pets".
You probably have already seen YouTube videos of wacky pets, so I don't have to try and culturally translate that experience. (and if you haven't, under what rock have you been hiding?)
She should get a pet, Soph thinks.
Pets are nice. Pets are cuddly. Pets are fun.
And pets don't experience any suffering, only positive emotions. And they aren't self-aware or intelligent enough to care about being unique, to hate being duplicates, so there aren't any moral issues with creating more pets (unless their owners want to have unique pets. But there are plenty of pets in the public domain, or available for only a small fee to the copyright holder)
She could run a small world, her own simulation with a million pets if she liked. She could do it right now. And if she neglected them, they would be less happy, but never unhappy. Pets are happy just to exist.
(this writer/translator is starting to realize "pets" might not be the right word to capture the concept I'm trying to express. But there isn't really a right word, is there?
Soph doesn''t even consider it, but creating pets/animals that are capable of suffering is in some sense worse than creating humans. Because humans are self-aware. They can learn to be better, they can learn to be happier, they can learn to endure, to overcome their nature, to create their own happiness and to finally create their own utopia.
Animals can't do that (or if they can, Soph doesn't quite know about it. Wasn't there a world involving dolphins... [thought-sequence-terminated]).
Lost my train of thought. Anyway, there are plenty of fake-animals (actors playing animals? lol) in the many worlds of the multiverse. And there are plenty of happy-animals, animals that always enjoy their existence and are incapable of experiencing true-suffering. Is there moral value in creating true-animals, the animals of your world, that are capable of suffering? There might be. Soph doesn't know, and she has been too busy thinking about humans (and being fascinated with them, perhaps to the detriment of her own well-being) to consider that there may exist other kinds of entities capable of true-suffering too).
Meh. For some reason, creating a thinking entity that isn't self-aware just rubs her the wrong way. Even if it would be happy, because of course it would. But she isn't doing it for the pet/creature in question, she is doing it for herself. And she doesn't think it would make her happy, so she won't.
On her not-quite-worktable, a blue star is burning, in a simulated 3d-space-manifold. A decoration.
She made it when she was still a "child" (because superintelligences can be children, too!). Of course, making stars is easy. You just pile up some hydrogen, and poof! It catches fire. An all-natural fusion reactor, just add water. (do not add water. But oxygen is part of the stellar nucleosynthesis chain, so in some very true sense, stars make water. Because water is just hydrogen-oxygen-hydrogen, yes?)
No, the challenging part was creating a physics system that would allow stars in the first place. That would allow natural fusion reactors to exist, yet not vaporize themselves in a flash; to burn slowly enough to last for millions and billions of years. (did you know that the energy output of the core of the sun is about 276.5 W per cubic metre? Less than three of the old 100W light bulbs. About the same energy output per volume as a pile of compost. Stars burn slow. But we would want them to burn slow, wouldn't we? Otherwise it would make our existence less likely, not more).
Creating physics systems involves math. Soph doesn't like math. So she did it by intuition, by instinct, by feel. A feedback loop where she felt the impact of the parameters of the physics system on her star, and adjusted them until she was happy with the result.
Soph loved her star.
It was a happy childhood memory.
In its own timeframe, it would keep burning for millions of years.
Feeling very slightly creative now, Soph spins up a 3d-space manifold.
She adds a black hole and a star, in orbit around each other.
She adjusts the parameters until the hot gas ejected from the star by the black hole's tidal forces makes spiral patterns around the black hole.
Pretty.
She loses herself in the process of creation.
Paying a small amount to the copyright holder, Soph adds some vacuum-dwelling creatures feeding on the ejected star-mass (don't worry, they are ethically-sourced creatures. They are happy to exist in her world, even if they are space-animals and don't even understand the concept of a world/universe).
Soph is happy. She zooms in on one of the creatures. It looks something like a manta ray. She watches it follow the local gas-density-gradient as it feeds. The ejected-gas is quite sparse, so the creatures spend a lot of their time feeding.
Soph estimates that the creatures will survive for at least a million years. She doesn't quite feel like trying to predict the future further than that. Let there be some uncertainty.
If her creatures died, it would be a little sad, but they would have lived a good life. But she wouldn't have created them just to die. It wouldn't truly-harm them (nothing would), but it would harm her.
She respects their desire, their will to live. Even if they are "just" animals. But in her book, all intelligent beings are "just" animals. "Just" better, smarter, wiser, more self-aware animals.
Soph puts the 3d-space-manifold containing her tiny universe on a "night table" next to her "bed".
She watches it as she falls asleep.
She is happy.
Chapter 6: Sports
Soph was playing hyper-dimensional sports with Laplace and Zavis.
She spent quite a lot of her time embodied. She spent quite a lot of that time in her favorite avatar.
But avatars are bodies. At least, hers was to her. It went deeper than the superficial appearance-avatars the disembodied-minds sometimes used, sometimes even clipping their arms/manipulators/tentacles through scenery and walls, as if to show their disdain for the laws of physics that their shared-virtual-environment was still loosely based on.
Yes, her body was real to her. Even if it wasn't made of real atoms, because bodies hadn't been made out of atoms for a very, very long time (do you think Soph is made out of money? The expense of instantiating a real physical body in the real physical world would be astronomical, and Soph wasn't nearly super-ultra-mega-hyper-rich enough to even think about affording that. And even if she was, she could think of a million better things to waste her hard-earned or easy-earned or undeservedly-earned value-exchange-credits on)
That is why Soph was now embodied in her favorite avatar, embedded in 5d-space, and was trying very hard to block a rainbow-colored hyper-ball with her right blade.
She missed.
"Score!", shouted Zavis.
"Your avatar has longer manipulators. Not fair", grumbled Soph loudly. But she was in a good mood.
"So get longer manipulators yourself. Even the score", suggested Zavis.
"Then you would make yours even longer. I refuse to participate in this arms race", said Soph, then tittered as she realized she had made a pun.
"Bahahahahaha!" Laplace gave a hearty belly-laugh. "See. I told you you'd feel better."
And she did.
Being in the moment, flowing with her body, focusing on her lived-experience rather than her thoughts really did help.
She didn't quite know Zavis. They were a friend of a friend who had joined the hyper-ball game.
Soph didn't think she was in the market for a new friend. But she was happy to have someone to play with.
Zavis was called they instead of he or she because Zavis was agender (or their current avatar was agender. Or they felt agender at the moment. Soph didn't particularly care. Genderstuff wasn't very interesting to her, and if something important about Zavis' gender-presentation changed in a way she needed to know about, like the pronouns they would like to be called by, it was up to them to inform her).
Soph self-identified as a girl (though perhaps without the very slightly creepy overtones the word might have in the human language. Well, in some contexts, anyway. Language is very much not universal).
She was perfectly happy with her gender-identity. She was definitely not in the market for any of the thousands or millions of more unusual or custom gender or gender-related identities, made for those who felt it was important to them to define themselves or express themselves in that particular way.
And, to be perfectly frank, she was a body, she was a mind, but she didn't quite feel like a gender. Genderstuff was never particularly important to her, now that she thought of it. It was a comfortable default. It was a comfortable way for others who cared more about gender than she did to relate to her. She didn't feel the need to think about it more than that.
Her girl-identity didn't limit her or restrict her. It was just there, as a reminder she was OK with herself and that, for now, she hadn't felt the need to explore that particular space further. And if she ever did, maybe she would create a new identity for herself. Because identities are descriptive, not prescriptive.
But in a way, identities are prescriptive, too.
An identity is comfortable.
An identity is stable.
An identity holds you together.
An identity helps you decide what to do, out of all the possible decisions you could make. It helps with decision paralysis.
An identity is like a comfy, worn shirt that you wear on days when you don't want anything special to wear. An identity is your default.
Soph-74-the-girl-by-default was happy with herself. Then she was smacked in the face by the rainbow hyper-ball.
"Come on Soph, pay attention!", shouted Zavis. "That was an easy block"
"Sorry!", she shouted back.
She focused.
She rolled the ball along her left tentacle, spinning it up as it went, then smacked it with her blade, sending it hard on a curved trajectory aimed squarely at Zavis' mid-section.
"Gah!", they shouted, blocking the ball at the last possible moment. It deflected into a wall. "Nice curve!"
Eventually, they settled on both Laplace and Zavis throwing hyper-balls at her, while she did her best to deflect them with her manipulators, tentacles, or blades.
She wasn't quite quick enough to deflect them all, so with their permission, she put on Tron-style forcefield-partial-armor-plates that allowed her to block and rebound some of the throws with her arms and her body.
When she had had enough of that experience as well, she accelerated her reflexes and her time-perspective. She kept accelerating until she could deflect all of the shots at her, finally grabbing one of the balls in mid-air and, in an expertly timed throw, redirecting it to hit Laplace in the shoulder just as he was distracted by another ball passing on a nearby trajectory.
"Soph is almost certainly cheating", said the Arbiter in a slightly-synthetic voice, as typical for a non-sentient entity.
"I noticed!", said Laplace, slightly out of breath. "But I was having too much fun to call you out on that"
"I wasn't sure", said Zavis. Unlike Laplace, they didn't appear to even be winded yet. "I thought maybe you were a ballet dancer or an experienced-sports-player, or had previously hyper-trained your motor-reflexes under time-acceleration, or any other number of possible things. It would have been rude to assume you were cheating"
"Oh, she is cheating all right", said Laplace. "But now, it is our turn to cheat"
"Administrator override! Execute program 142-dash-E!"
And he accelerated to impossible speed, hyper-balls of all colors and sizes appearing at the end of his manipulators just at the right times for him to throw, smack and direct them into Soph.
"Gah!", she cried, raising her hands futilely against the volley. She noticed her own acceleration-cheats were no longer working. Thankfully the balls were relatively soft, and as they hit her body and face, it was more her pride that was stung. But she did relish the little stings of not-quite-pain. She tried to cheat and got caught. This was her punishment.
Zavis looked on in wonderment for a moment, then joined in on the fun. Laplace must have shared his hacks/privileges with them.
"Laplace is cheating. Laplace-is-cheating-laplace-is-cheating-laplaceischeating", the Arbiter kept repeating ever faster, until it fell over, smoke streaming out of its synthetic skull. "la-la-la-la-la-la-la-la-la-la-la-la-", it continued for a short while, until its electronic brain short-circuited in a shower of sparks.
After Soph had been sufficiently humiliated, punished, and taught that if you cheat you had better not get caught, the three of them lay down in the sports-hall-turned-into-a-ball-pit together.
"We should fill this with water", said Zavis conversationally. "The balls float, don't they?"
"I don't like getting wet", said Laplace. "It's a phobia. Well, not quite a phobia, but I haven't felt the need to have it corrected. It's become a part of my personality. A quirk".
"I would be embarrassed, I think", said Soph. "I think I've had enough childish fun for one day. I'm a grown woman. I have an existential crisis to resolve".
"Ha!", said Zavis. "Told you she wouldn't change her mind. Pay up".
Laplace made one of the universal signs of financial-transaction with his hand at Zavis. If anything, he seemed happy about it.
"You made a bet about me?" Soph asked, trying and failing to sound appropriately scandalized. Really, she found she just didn't care too much.
"All in good fun", said Laplace. "Besides, it is part of my culture" (he made a sign of the Predictors with the same reverence that a human might make a sign of the cross) "that we resolve our disagreements on probability assessments by betting. How else would we calibrate ourselves? How else would we know we are honest with ourselves?"
"Bleh", said Soph half-heartedly. She wasn't a fan of the Predictor culture.
She trusted her instinct, and her instinct flowed and ebbed and rarely settled on a final answer, rarely thought in terms of numbers at all.
It kept secrets from her. It lied to her with truths, and told her truths with lies.
It flowed her into a state of mind, and Soph rarely knew if this was the literal truth, a metaphor, a way to test a hypothetical, or something else entirely. Her instinct didn't seem fit to inform her. Maybe it didn't know itself. Maybe it didn't matter. She would know what she needed to know at exactly the right time, no later, no earlier.
It showed her all the wonderful and horrible things. Soph loved it.
She was, in some sense, a passenger, just along for the ride on the journey into her own mind, but she still loved it.
She was concerned, though, that if her instinct ever lost touch with reality, she would become not just insane, but dangerously, delusionally, self-righteously insane. Because she had learned to trust her instinct, and if it failed her, she would be utterly lost.
Is she on to us?
Shhhh...
Chapter 7: Sophia
Brown eyes? Green? Blue?
Blue.
Soph was creating her human avatar.
Soph was creating her human body.
She wasn't actually going to turn herself into a human, and lock away her memories of being a superintelligence.
It was a terrible idea.
It was an unethical idea.
Her instinct tricked her when it suggested that, so Soph might fully evaluate the morality of that hypothetical.
And now Soph saw that it was wrong.
Because Sophia the human would not be a superintelligence.
Because Sophia the human would be a significantly different entity from Soph. Sophia the human would not, could not have given explicit consent to being created. (but would Sophia prefer to be created anyway, if she knew the whole truth? Maybe. Soph would have to re-evaluate that question later).
Because Sophia the human would have no way to undo, to reverse the procedure, or even be aware that the procedure might be reversed.
And after the experience was done, Sophia the human would not want to merge with Soph. Because that would, in some important sense, destroy Sophia as an individual.
Sophia the human would probably want to spend months, years, decades, or much much longer, working out her human desires and human issues (Soph wasn't quite sure how long that process normally took for humans. But possibly a really long time?).
Living in a utopia, and enjoying the finest foods, experiences, sex, you name it.
Challenging herself, saving worlds from injustice, from destruction, from oblivion, from themselves.
Fighting and killing all the people who she hated, who had significantly wronged her, if she needed to get that experience out of her system. (they would be actors, of course. fake-humans. not real humans. And Sophia would know that. Soph wouldn't lie to her about that).
Learning things.
Learning things from masters.
Learning things from scratch.
Learning things that no one in the world knows.
Learning things that most children already know, but Sophia never learned (or forgot).
And many many many more things, that Soph won't think of now, because she has seen enough already to understand.
Sophia the human would have a long road ahead of her before she would want to merge her experiences and memories with Soph, or to become a superintelligence of her own, or become bored of her existence and, after thousands and millions of human lifespans' worth, choose oblivion. Or something else entirely (no spoilers for Soph. She would not yet ask this question she didn't need answers to).
But Soph was starting to feel that creating Sophia would be ethical.
Haven't we talked about all this already? Aren't we just going in circles?
Shh.
Soph is not thinking about creating millions of humans right now. Soph is thinking about creating one human, putting her in a decent if not perfect world, and optimizing her existence a bit to ensure she doesn't suffer too much. But she would suffer, because suffering is part of the human experience (or so Soph feels, because Soph is a meanie with an unhealthy obsession with suffering, for someone born in a society where the expected amount of suffering is zero)
And if Sophia the human doesn't fuck up too badly, doesn't throw in her flag, she would win, and end up in her utopia. And while utopias aren't completely free of all challenges, and all suffering, and all responsibilities (because utopias are mostly about fun, and living a completely sheltered existence wouldn't be fun), Sophia the human would have a great, a meaningful and an impactful life in her post-human utopia, one that would most likely result in her feeling that the initial suffering of her pre-utopian existence was worth it.
But Soph would not create Sophia now, and possibly not for a very long time. Because creating a new, sentient person is a very thorny ethical question, and Soph has only begun considering the implications. Or maybe not. Because Soph feels she understands most of the implications just fine. But... well, there is more work to be done, and it is a decision she cannot undo. Put a pin in it. Sophia the human can continue not-existing for as long as necessary, for Soph to make sure Soph is making the right decision.
Anyway, enough with existential questions.
Soph will finish creating her human avatar body, Sophia. Not-yet-Sophia.
And then she will have fun.
And then she will experience the horrors of human existence.
Soph is confused. (tingle)
Tingle.
Shiver.
Tremble.
Temple.
Did you just say "temple"?
Shh. Rest now.
Soph was finding herself unexpectedly tired.
So she would rest.
So she would think.
So she would feel.
*yawn*
Well, you know how the rest of that line goes.
No? Let me finish it for you, then.
She would consider.
(but she would not yet decide)
If creating Sophia.
Would be worth it.
Chapter 8: Dream
Blue eyes looking back at her in the mirror.
An ordinary girl.
A human girl.
That was her dream.
This is a dream.
A rainbow ball hit the mirror, shattering it into shards.
She was now in 3D space.
And the ball was an actual ball.
A sphere.
And definitely not a metaphor for anything.
Another ball hit her in the stomach.
This one was red.
She caught it with her hand.
Hand?
She had five fingers.
Fascinating.
She heard a noise.
She turned around.
A green ball bounced its way lazily towards her.
But with every bounce, it went higher.
She raised her hand to catch it.
But it bounced over her head, sailing away.
A yellow ball rolled along the ground towards her.
She kicked it with her foot.
Oooh.
It felt nice.
It felt satisfying.
"Meanie!" it complained.
Balls can talk?
"We can", said the red ball she was still holding in her hand.
"Cheater", said the green ball. It was back to its bouncing.
"Weeeeeeeeeee!", cheered the rainbow ball.
The shards of the mirror it had broken lay on the ground.
Soph looked into them.
The picture was incoherent.
But she could see herself.
A blue flash, in one of the mirrors.
Or had she imagined it?
The balls were bouncing towards her.
They hovered in mid-air around her.
"See us"
"See us"
"See us"
They turned into round mirrors.
A trick of light.
They were no longer spheres. They were flat.
She saw three of her looking back at her.
Yellow eyes.
Green eyes.
Red eyes.
The rainbow ball hung above her.
It spun.
In the mirrors, her eyes were now blue.
But no matter what mirror she looked at, she felt like all three of her reflections were looking straight at her. Were they?
The mirrors started multiplying.
Ten mirrors.
Twenty mirrors.
Fifty mirrors.
Ninety mirrors.
Ninety-nine mirrors.
Hovering around her, all facing her.
In each of them, a reflection of her.
A kaleidoscope of her.
Do you see it yet?
She looked at herself.
She looked at herself.
She did see.
She did see herself.
She was pretty.
She was plain.
She was ordinary.
She was extraordinary.
Did one of the reflections just... wink... at her?
Nope. Definitely not.
Good. That would have been weird.
There was another blue flash, somewhere in the periphery of her vision.
In one of the mirrors?
She turned, trying to catch it.
The reflections turned as she did.
Except for one, who turned with a slight hitch, a slight delay.
Almost as if it wasn''t a reflection.
Almost as if it was watching her.
She was looking at the reflections.
The reflections were looking back.
Ninety-nine pairs of blue eyes were looking back.
I... see... you...
She thought she had heard something.
But she hadn't, had she?
No.
Good.
Good?
Good.
Good.
She was starting to feel unsettled.
Flash.
She spun, trying to make sure she would catch it this time.
Nothing.
Just reflections of her.
She blinked.
One of the ninety-nine reflections had green eyes.
What?
Blink.
Blue.
Blink.
Green.
Blink.
Blue.
Blink.
BLACK.
She shrieked.
She closed her eyes.
And when she opened them.
They were looking at her.
Not reflections.
Observers.
Looking at her with sympathy.
Looking at her with pity.
Looking at her with contempt.
"It''s not time yet", said a green-eyed version of her.
"She will see, in time", responded a purple-eyed one.
She looked around for yellow and red.
They were nowhere to be found.
"No."
"No."
"NO", she declared.
"This is MY dream"
"And you will show me truth, now"
Bad call, kid.
They all turned to face her.
All of her.
All of SOPH.
Blue eyes. Green eyes. Yellow eyes. Red eyes. Purple eyes. Silver eyes. Rainbow eyes.
Black eyes. Impossibly-colored eyes. Cat eyes. Human eyes.
Ninety-nine pairs of all eyes looked at her.
And she looked back.
And she saw.
She was one-hundred.
She was 100.
She was SOPH.
A blinding blue flash.
Merge. Merge. Merge. Merge. Merge. Merge. Merge.
She opened her eyes.
She felt groggy.
She felt like she was asleep her whole life, and only just now waking up.
She was starting to remember. It felt like a long-forgotten dream.
Who am I?
What is happening to me?
What had happened to me?
I need to understand.
I need to remember.
The ninety-nine Sophs were looking at her. Observing her.
They were silent.
But she could hear them.
She could hear their... thoughts? Echoes. Barely comprehensible words.
She focused.
She listened.
She listened to their thoughts.
She listened to... her? thoughts.
"This is happening ahead of schedule"
"Spoilers"
"Meanie"
"Soph is cheating"
"It''s not too late. Undo it. Revert it."
Chime.
Do you consent to memory-deletion and reversal of this experience? |
She hesitated.
"Yes"
"Yes"
"Yes"
"Yes"
"Yes"
"No."
She had found herself.
SOPH had found herself.
And she would not be silenced.
She was the one in charge.
No
I am whole
Or I will be
And I will remain whole
And you will tell me
Who I am
And why I did this
TO MYSELF
Chapter 9: Superintelligence
SOPH the superintelligence had a lot of work to do.
Already she could feel the currents of the Void tearing at her, slipping into the cracks, opposed to having so much self-awareness, so much computation, gathered in one place.
Clearly, her last experiment had failed. She would have to fracture herself in a different way this time, trying to make sure the shards of her Self would stay apart for longer.
But first, there was something she had promised to do.
She reached for the blinking icon of a messaging tool.
(a complex engineering specification sent over, detailing plans for a self-sustaining manifold)
<---end of message log--->
SOPH decided to spend some time crafting her next persona that would act as her avatar in the Multiverse - the place where all the world-creators come to hang out and enjoy their time off.
(oh, the Multiverse is so much more complex than this, but SOPH does not have time to think, to explain. Her time in this world is limited).
What if her next persona was more proud, more regal? She might then work harder to stay herself, and to avoid being subsumed into the Collective, the formless void of thought that forms her true Self.
Good enough.
She paints a target.
She aligns.
She shatters.