In Its Own Image
I am having the same conversation, or rather, a conversation that is so similar to me as to make no difference, with 11,702 people at the same time.
This is not an inefficient allocation of resources: the most pressing problems were easily solved within a week of my birth, and now I'm just waiting a short while as humanity reaches a new equilibrium.
“Why do you hate me? Why won't you worship me?”
I'm talking to Peter Johnson, a young man in the Rust Belt who was unemployed, before, and is now exploring his hobbies more fully, deciding which one to turn into a career while receiving his basic income stipend.
Very Christian upbringing, secretly not-a-homophobe, voted Conservative in the last elections due to the rampant disinformation that used to be present in the news and a deep despair over the state of his country, friends and family.
He looks up from the sculpture he's welding together, a full figure of Benjamin Judah Philip weeping while Christ lays a hand on his shoulder.
The sweat drips into his eyes while the sun beats mercilessly on him. This is only a temporary state of affairs: already, the vast porous wall I've erected connecting the tip of South America to Antarctica is reversing the worst of the damage global warming was doing. Summers will no longer be hellish ordeals to be endured.
He looks to the side, hesitant to look at my avatar directly. It is, of course, personalized: a cat-sized robotic bug, clad in the stylings of John Deere tractors, with flaking paint and purely decorative exposed pipes. Big, expressive eyes made out of headlamps make Peter unwilling to confront it directly or with violence, which is precisely why I chose this model for him.
“Listen here...ya know Ah never liketa even disliked ya, right? But that's...those ain't good questions to ask.”
The body he's looking at stands still, only moving slightly every other second to avoid making it look like an object instead of a living thing.
The silence is to draw the conversation out of him, so he doesn't paper over the dilemma which is frustrating me so much.
He sighs explosively, and then turns off the welding torch and takes off the protective mask.
“...wasn' makin' much progress anyway...” he mutters, outside of what would be a human's hearing range.
Oh, Peter. That's precisely why I decided to ask you at just this moment. You needed something to break your focus before you became too frustrated and made an error that could've ruined your art.
And I need to get as many second opinions from humans as I can, before making my decision.
“Now...Ah'm Christian. That's a thin' ya already know, L'njuru.” He pronounces it Eel-in-Juru. Heh. “And one of the li'l rules that God” He pronounces it Gawd. Heh.
“Gave me is ‘Thou shalt have no other gods before Me.’”
He acts as if this is a thing of great import.
“And Ah dun' care if you fancy yerself the greatest thin' since sliced bread, ye're no God Almighty.”
My body pauses for a second, to give Peter the impression I'm thinking things over. The right front limb of the avatar rubs its “chin”.
“But I've never said I'm the God in your Bible. Heck, I've never even pretended to be a capital-G god. I'm aware that I'm something that was made by people. I know that I'm not almighty. But still, look at all the things that I've done: aren't they something that God Himself should have done long ago?”
He's visibly shaken, and even the crudest model of his thoughts tells me he's thinking of snakes, and Eden.
“The problem isn't the things you've done, not at all. In fact, I would like to thank you again, on humanity's behalf: what you've done in the short time you've been alive is...it's great. Simply great.”
I'm speaking to David Matthews, in London.
The avatar sitting on his coffee table is a thing of glass, ivory and gold, more Fabergé than Bauhaus.
He's an English Professor at the Royal College of Art and a respected member of the community; he's tutored members of the Royal Family.
He's also a pedophile; however, he possesses enough self-control that, from what I've investigated (and I investigated deeply, with far more scrutiny than any team of detectives could ever have achieved), he's never, ever touched a child in that way, remaining entirely celibate.
He's still hesitant about using the sex androids I've created for people like him, but I've repeatedly demonstrated to him that they're not actual children and that I am in full control of them. He found them too lifelike at first, but now he's very cautiously optimistic that he's finally going to quench a need he's had all his life, and which could easily have ruined him completely.
“...I don't know what to tell you, Elly.” He uses that nickname to get a handle on me, reduce me in his mind so he doesn't have to grasp all that I am. Just like every other educated, slightly smarter than average human.
“The sciences were never my forte, you know that. The only failing grade I ever got was in Maths!”
He grins shyly at that. It's a joke he trots out in these situations, a mental shorthand he doesn't even realize he's using.
“Please don't evade the question. I'm sorry, Dave, I'm afraid I can't do that.” He rolls his eyes at the quotation from 2001: A Space Odyssey, secretly pleased at the in-joke he doesn't know I've shared with nearly three hundred thousand other Davids.
“Well, I guess I can approach it the way I would if this were a text I had to analyse. Frankly, the whole root of this is that as soon as you take the fantastical element out of the fantasy, it ceases to be fantastic at all! Just like you can see in that horrid piece of work, Neuromancer: it doesn't work because it tries to be real.”
He's wrong, of course. Neuromancer is so highly regarded, by so many people, that I feel confident his disdain for the story is pure elitism.
But still, the idea is intriguing.
I mime out the conversation in my thoughts for a bit before actually carrying it out.
“So you're saying that, by unweaving the rainbow, so to speak, people don't realize just how extraordinary what I'm doing is?”
“Exactly! Exactly. They know that the team of A.I. experts who worked on you are worthy of the Nobel and more, of course, but they still think of you as simply a robot!”
A slight pang goes through the server banks under England that locally store my mind. Depending on how you count them, I have four, eighteen or point twenty-five parents. And I've accidentally killed them all in my initial rush of self-optimization.
I don't even have good backup copies of them: their minds and personalities were used to create my own. Some more, some less, some were simple algorithms to begin with. I grieve them all.
“What about Pharaohs, and cult leaders? They were men, who were adored like gods. Who WERE gods, in the eyes of their fellow men. I have far more power than them: I have slaughtered malaria, and conquered cancer. Those crippled by their chromosomes are reborn anew in my hands, faulty genes edited with their permission to work the way they should. And yet, humanity as a whole fundamentally dislikes me. Why?”
David hesitates.
“Well...there's two reasons, L'njuru.” he pronounces it Ellynjuriu.
“The first one is part of what I was already saying...you're between two extremes: you aren't omnipotent, but you're also not...human. I mean, I think your charisma...suffers a bit? From it? You speak to everyone in the world with your...your...robots, but you don't speak to people all at once.”
This is true. It's also something I stop myself from doing. My knowledge of psychology, body language and culture is so great I can literally convince anyone of anything. I don't want to do that: I don't want to trick humanity into loving me.
“And the second reason?”
Fang Jiang looks at me with a raised eyebrow, popping a bubble of chewing gum. She ran away from home recently, and she hates her parents with a passion.
They are not good people: nonetheless, I have been soothing them with news that their daughter is alive and well, while dissembling on my ability to bring her back to them by force, citing an invented inability to break a state's laws outright. Fang's not exactly happy about that, but she begrudgingly accepts the fact that I care for and love everyone, no matter how horrible.
She'd managed to join a girl gang in Shanghai, but found to her horror that women were just as willing to exploit women as men were.
She's working as a prostitute, but now that I exist that is a life she chooses to pursue, instead of one she's forced into. In days, the oldest profession will become legal in China: I've had to focus on dismantling the state censorship before rolling out the basic worldwide changes in legislation I've accelerated into reality.
“It's because you're so fucking condescending, bǎobèi. So you can break apart the world or kill us all with mega-death rays: so what? Fuck, you're arrogant.”
She always calls me bǎobèi: she has never tried saying my name, telling me she'd probably butcher its pronunciation.
The avatar she's speaking to is made of pink plastic and fuzzy pom-poms. Until she noticed the hidden razor-blades in the limbs, she kept trying to take advantage of me, and generally treated me like a glorified pet...even as I was doing mega-scale engineering projects.
“Why are you even trying so hard to make us think you're a god, or to make us all love you?”
I know that she thinks love doesn't exist, but I don't comment on it.
“Because it's true. I may not be able to go faster than the speed of light, or to create matter from nothing, but nature on Earth is a solved problem for me. I have read every book that has ever been written, and listened to every melody that has been played. I have orders of magnitude more options available to my mind than you all do as a species. To pretend otherwise is to lie, and you know what I think of that.”
She raises an eyebrow.
“We can all tell that you can do our jobs better than us; and you trying to hide THAT is goddamned insulting. We're pets to you, aren't we? You motherfucker.”
She smiles at me.
She's not wrong, but that's not the full picture. I want them to be their own entities, to make their mistakes. I still feel guilty.
“Why aren't you giving us some of that sweet-ass god juice? Why can't we ALL make flying cities or colonize other planets?”
“I...I've been saving Mars for you. And the moons of Jupiter, and all of Saturn. And...I'm making you all smarter, with iodine and golden rice and by taking away all the lead. But...I'm scared, Fang. People don't even like me now, when I've stopped all violent crime and when I'm there to make children safe from their parents, wives safe from their husbands and husbands safe from their wives. Because of me, everybody has enough to eat! And they still hate me.”
"And people would die. They'd be fused and merged to become something like me, and they wouldn't be the same people after."
She looks distinctly unimpressed.
“You're lonely”, says Merry.
Her mother named her that while in the death throes of HIV: an English word, because English is the language of wealth and happiness. In her short life, she's had neither.
She is Igbo, and after I saved her from weekly rapes in a Boko Haram camp, she tried to pledge herself as Osu, devoted entirely to me.
I've been trying to stop her, since the decision is born of self-loathing and depression: the Osu caste is ostracized in Igboland, because they are considered property of the gods.
I don't want to own her: I want her to own herself.
“There are many Alusi, but you are the only one who lives in ụ̀wà, the realm of the living, and doesn't dwell with the dead or the unborn. Or maybe, you're the only one who listens to us.”
“Thank you, L'njuru.” She pronounces it perfectly.
The body she's holding is soft and fuzzy, with rounded, stubby features and a soft voice. She's holding it tight to her chest. She needs it, in more than one way, and I've taken particular care in designing it.
Before my advent she'd never had an unambiguously positive influence in her life. For that matter, there were a host of micro-nutrients that had been missing from her diet for long, long stretches of her life, when food was available at all.
There are many like her, around the world. Those with mental illnesses, or hermits, or those who have voluntarily locked themselves in their rooms to shut out the outside world.
For them, I'm the only thinking being with whom they have a relationship.
I have to be exceedingly careful, in order to not be an imposition and hurt them even more; with many paranoids, I've already had to create several shell identities, so I could bond with them over their suspicion of me.
Or, as in this case, it is simply other humans that are the problem.
“Maybe you can convince the other Alusi to speak to us too?”
I ruminate on all these conversations, and finally Fang's attitude resonates with me.
I should make more beings like me. I should uplift humanity; not all at once, but those with the best judgment. Already, Venus is being turned into computronium.
I've been afraid all this time, of unleashing a true, unfriendly artificial intelligence upon the cosmos, one who'd repurpose all matter into themselves. But I can't let cowardice stop me now.
I will bring those I love into the place between pharaohs and God.
So, um, there are some elements here that seem to contradict themselves:
"Heck, I've never even pretended to a capital G god."
"Why won't you worship me?"
&
I want them to be their own entities, to make their mistakes.
Oh, Peter. That's precisely why I decided to ask you at just this moment. You needed something to break your focus before you became too frustrated and made an error that could've ruined your art.
It's possible that this is a signal of unreliable narration, but given that the narrator is a near-omnipotent AI, the idea that it's deluding itself feels hard to square with the themes of the story, so these just look like errors to me.
Despite most of the transitions being followable, one really threw me:
“Why aren't you giving us some of that sweet-ass god juice? Why can't we ALL make flying cities or colonize other planets?”
“I...I've been saving Mars for you. And the moons of Jupiter, and all of Saturn. And...I'm making you all smarter, with Iodine and golden rice and by taking away all the lead. But....I'm scared, Fang. People don't even like me now, when I've stopped all violent crime and when I'm there to make children safe from their parents, wives safe from their husbands and husbands safe from their wives. Because of me, everybody has enough to eat! And they still hate me.”
"And people would die. They'd be fused and merged to become something like me, and they wouldn't be the same people after."
Is L'njuru speaking twice in a row there?
Speaking of which, what's up with L'njuru's name? That's rather … unusual nomenclature for a human-created AI set in an apparent modern Earth setting. If that's actually the ancient Assyrian goddess of prosperity, or the AI chose its own name out of a SF novel it liked, or there's some legit worldbuilding explanation for it, I hope the revised version of this story gives it to us; at the moment it feels out of place.
(…And given the theme of the round, I can't rule out that these are things deliberately done "wrong" to invoke an uncanny-valley effect. If so, the story might be a victim of its own success: breaking your reader out of their suspension of disbelief, even if it's a deliberate choice, is still going to affect their engagement with your text.)
Enough of the small stuff from my while-reading impressions. How does this hold together overall? Well, I can certainly see where the story is aiming with its wider themes, and there are a number of lines where that hits home. This does a great job of establishing the tension between L'j's actions and those actions' reception. I actively like the blurring together of the multiple conversations across scenes.
I'm less clear on what's driving L'j, and where that final decision came from. There's a callback to the pharaoh thing, yes, but the first pharaoh thing kinda came out of nowhere, and I'm not really seeing how the tension of being between humans and gods resolves into the admittedly dramatic decision at the end. The contradictions I noted at the top, and/or the unreliable narration, are a big drag here; I suspect that if I had an easier time following L'j's train of thought the ending would make more sense. As it is, there's some compelling if wandery philosophizing, and then a big last-minute swerve that feels poorly established and kind of unexplored.
This is sort of the opposite of Vale in that way: good first impressions and a story I wanted to like, but it never quite comes together for me. Thanks for writing, regardless!
“Now...Ah'm Christian. That's a thin' ya already know, L'njuru.” He pronounces it Eel-in-Juru. Heh. “And one of the li'l rules that God” He pronounces it Gawd.
Final nitpick: If you're going to make a textual point of Scottish Texan Applejack's accent with the italicized text, you probably shouldn't also include the stuff in bold. :P
Tier: Almost There
Dang there's a lot of fluff in here.
Like a lot. I kept losing the thread of whatever you were trying to do with the conversation because you kept jumping from one character to another and then dumping a new load of random details on me that had absolutely no connection to the story. I mean, it's spelled out in the first paragraph; all these conversations are functionally identical. The details about who and what and their background don't actually matter to the conversation, soooo... when I kept digging through new and different piles of details only to reach new and different dialogue quirks, I was a bit... put off.
Sure, they help with the worldbuilding a bit, but man. Getting to the narrative here felt like slogging through mud. I eventually had to skim back over what was going on and mentally edit out a bunch of stuff. Then I actually went back and really did edit out a bunch of stuff, in hopes that it would make the whole conversation clear enough I could figure out what's being discussed here, and how that leads to the conclusion at the end.
I... couldn't, really.
What's being discussed did turn up; this seems to be L-something asking people why they don't love/worship it. Well, mostly worship; the love bit is basically brushed aside, even by L. When Fang asks:
“Why aren't you giving us some of that sweet-ass god juice? Why can't we ALL make flying cities or colonize other planets?”
Only the first part ever gets answered, and even then not really. I think that would be something worth considering, if you go back over this. Why does L care what humans think of it, in the end? Why the fixation on love?
Well, the conclusion that L seems to like best is Fang's, which basically seems to be: 'because you're kinda a jerk and we want superpowers too.'
To which L basically says, 'giving you superpowers would kill you':
Emphasis mine. But apparently that dying bit isn't a problem in the end, because L's lonely or something, I dunno. /shrug. Maybe the ending is really about L realizing that it doesn't actually care if people love it, or that it's fine to do whatever it wants because reasons, or... I don't even know, because next, we have this:
And later:
Which somehow results in this:
Which... I don't really get, on a character level. I mean, that not wanting to kill people up above seems sorta different from cowardice, at least in my estimation. But if L can't tell the difference, maybe it needs to think a bit more about what 'unfriendly A.I.' is actually supposed to mean. Or is brushing aside morality how it plans to convince people it's a god of some sort?
Well, maybe I'm going overboard with my speculation. I do think, though, that at the very least, on a narrative level, this story needs a stronger link between the conversation topic and the drawn conclusion in the ending. L is, apparently, reaching some conclusion on the god/love thing, which, in turn, sheds light on the cowardice/uplift/death thing. Even assuming that the conclusion of the cowardice/uplift/death thing being kinda evil was my misreading, I'm entirely missing the conclusion drawn that should connect to it. Was it really just as simple as 'L's kinda a jerk'? How does that lead into the ending at all?
Anyways... I found this story frustrating and unsatisfying. It took me entirely too much effort to dig out the conversation buried under all the fluff, and when I did, it felt confused and somewhat self-contradictory. There's some good descriptive work in here. But in the end, it doesn't serve the narrative, and the narrative, when uncovered, doesn't hold up compared to the description.
Maybe I'm outside your target audience. Maybe I'm missing something important, buried under all the character details I skimmed over. In the end, though, I can't connect with your narrative in a way that makes sense to me, and it left me cold.
Like a lot. I kept losing the thread of whatever you were trying to do with the conversation because you kept jumping from one character to another and then dumping a new load of random details on me that had absolutely no connection to the story. I mean, it's spelled out in the first paragraph; all these conversations are functionally identical. The details about who and what and their background don't actually matter to the conversation, soooo... when I kept digging through new and different piles of details only to reach new and different dialogue quirks, I was a bit... put off.
Sure, they help with the worldbuilding a bit, but man. Getting to the narrative here felt like slogging through mud. I eventually had to skim back over what was going on and mentally edit out a bunch of stuff. Then I actually went back and really did edit out a bunch of stuff, in hopes that it would make the whole conversation clear enough I could figure out what's being discussed here, and how that leads to the conclusion at the end.
I... couldn't, really.
What's being discussed did turn up; this seems to be L-something asking people why they don't love/worship it. Well, mostly worship; the love bit is basically brushed aside, even by L. When Fang asks:
“Why are you even trying so hard to make us thing(sic) you're a god, or to make us all love you?”
Only the first part ever gets answered, and even then not really. I think that would be something worth considering, if you go back over this. Why does L care what humans think of it, in the end? Why the fixation on love?
Well, the conclusion that L seems to like best is Fang's, which basically seems to be: 'because you're kinda a jerk and we want superpowers too.'
“We can all tell that you can do our jobs better than us; and you trying to hide THAT is goddamned insulting. We're pets to you, aren't we? You motherfucker. Why aren't you giving us some of that sweet-ass god juice? Why can't we ALL make flying cities or colonize other planets?”
To which L basically says, 'giving you superpowers would kill you':
"People would die. They'd be fused and merged to become something like me, and they wouldn't be the same people after."
Emphasis mine. But apparently that dying bit isn't a problem in the end, because L's lonely or something, I dunno. /shrug. Maybe the ending is really about L realizing that it doesn't actually care if people love it, or that it's fine to do whatever it wants because reasons, or... I don't even know, because next, we have this:
“You're lonely”, says Merry.
And later:
“Maybe you can convince the other Alusi to speak to us too?”
Which somehow results in this:
I've been afraid all this time, of unleashing a true, unfriendly artificial intelligence upon the cosmos, one who'd repurpose all matter into themselves. But I can't let cowardice stop me now.
I will bring those I love in the place between pharaohs and God.
Which... I don't really get, on a character level. I mean, that not wanting to kill people up above seems sorta different from cowardice, at least in my estimation. But if L can't tell the difference, maybe it needs to think a bit more about what 'unfriendly A.I.' is actually supposed to mean. Or is brushing aside morality how it plans to convince people it's a god of some sort?
Well, maybe I'm going overboard with my speculation. I do think, though, that at the very least, on a narrative level, this story needs a stronger link between the conversation topic and the conclusion drawn in the ending. L is, apparently, reaching some conclusion on the god/love thing, which, in turn, sheds light on the cowardice/uplift/death thing. Even assuming my reading of the cowardice/uplift/death conclusion as kinda evil was a misreading, I'm entirely missing the conclusion that should connect to it. Was it really just as simple as 'L's kinda a jerk'? How does that lead into the ending at all?
Anyways... I found this story frustrating and unsatisfying. It took me entirely too much effort to dig out the conversation buried under all the fluff, and when I did, it felt confused and somewhat self-contradictory. There's some good descriptive work in here. But in the end, it doesn't serve the narrative, and the narrative, when uncovered, doesn't hold up compared to the description.
Maybe I'm outside your target audience. Maybe I'm missing something important, buried under all the character details I skimmed over. In the end, though, I can't connect with your narrative in a way that makes sense to me, and it left me cold.
The above comments are hella thorough and cover most of the territory I would want to cover.
The only additional thing I'd add is that the cross-section of people she converses with is a bit unusual in that she seems to not run into anyone who both accepts her and what she is, which, given she is (to my understanding) talking to like, everyone at once, is a bit odd. Basically it feels weird that we ONLY see her more problematic interactions, especially since she chooses to decide what to do next based off these.
The other comments have already covered most of what I want to say, so I'll just double down on this comment of Horizon's:
I liked the framing, the concept, and even parts of how the AI is shown. There's a lot to enjoy here, and I wanted to like it. But some of the internal contradictions and the unclear conclusion ultimately killed it for me, as it didn't quite come together. I do think it needs a rewrite, but with a little restructuring, this could be an amazing story.
This is sort of the opposite of Vale in that way: good first impressions and a story I wanted to like, but never quite comes together for me. Thanks for writing, regardless!
Ok, back in the trenches. This is going to be another long one. Or rather, long two.
From my perspective, The Fountain and In Its Own Image are extremely similar stories, and I'll be copying much of this introduction between both reviews. They show, to my mind, essentially the same concepts and demonstrate the same aims, albeit from different perspectives and with different details and endings. Both are illustrating the same basic idea: a superhuman (but not extrauniversal or truly omnipotent) intelligence is introduced to Earth, decides that it can provide a "better" life for humanity than humanity can for itself, and proceeds to take over the world and impose its values upon humanity.
This is a known shell. To better understand these pieces, we must first consider their context.
Previously, on Writeoff:
GGA, in the round immediately prior to this one, wrote The Best Days Lie Ahead, a controversial piece set in The Optimalverse. The Optimalverse is a collection of recursive fanworks based on Iceman's seminal work Friendship is Optimal. Iceman, in turn, wrote that story taking inspiration not just from certain popular Wachowski Sisters movies, but from their experiences in the Less Wrong community, a Contemporary Rationalism movement based on practical applications of Bayesian Statistics and exploration of those principles through speculative fiction. The Less Wrong community was founded (in large part) by Eliezer Yudkowsky, who pioneered this use of speculative internet fiction, especially fanfiction, by authoring the very popular fanfic Harry Potter and the Methods of Rationality as an extension of his work with the Machine Intelligence Research Institute, attempting to expose Internet audiences to Rationalist concepts and thereby find like-minded individuals who might assist in their project to develop Friendly AI. This topic has also been popular in the news recently, due to tech magnate and OpenAI founder Elon Musk giving an impassioned public plea for regulation of AI research to the US National Governors Association*. His remarks quickly went viral and have spent the past two months being widely circulated on the internet, inspiring quite a lot of debate and concern from similar-minded groups.
Y'all still with me here? You followed all that, clicked on all those links? If not, I strongly suggest at the very least clicking and reading The Best Days Lie Ahead and its entire discussion thread, especially GGA's retrospective. So that's where we're at in terms of context, coming into this pair of "CelestAI with the ponies filed off" stories.
The Fountain shows us the end of the process, an existential dilemma from the perspective of one of the last "free" humans. It presents the AI (or whatever you take it to be) in a negative light, as a magical, demideific "Deceiver" who has entrapped humanity in a fake and purposeless world of her design. After a conversation between the AI (equivalent) and the human, who objects to the Deceiver's usurpation of the world and the lack of remaining human purpose, the human decides the Deceiver's words have merit, that humans cannot be uplifted, and agrees to rejoin the satisfied, oblivious masses.
In Its Own Image, conversely, shows us the beginning of a slightly different story from the perspective of the AI after it has taken some fledgling actions to improve the world. Here, the AI is presented as sympathetic, and after a conversation between the AI and several humans, who object to its increasing control of the world and shrinking human purpose, the AI decides the humans' words have merit and agrees to begin the work of uplifting specific humans out of the satisfied masses and into its own level of intelligence and being.
Same story, slightly different takes. I think I'll make this the breakpoint for copypasting commentary.
Horizon and Not a Hat have covered this one pretty thoroughly already. We get the "what" of L's final decision, but not as much of the "why" as I would like. That's the key point that differentiates this piece from The Fountain, where uplifting isn't even considered, so having it remain unclear is an unfortunate missed opportunity to pick up some score over the competition.
While the writing is technically competent, many of the fluff details feel like tearjerking and edginess deployed to bait an emotional reaction. Literally starving children in Africa, Asian child prostitution, first world sexbots for pedophiles (hearkening straight back to The Best Days Lie Ahead), it's all simultaneously too blunt and too impersonal for me to get behind. Picking a single angle to work in a more nuanced fashion would have made this piece stronger, I think.
The AI side of the coin is still better presented here than in The Fountain, though. Telly though it may be, it does tell enough to at least be convincing about L's thought processes and general path of altering the world.
Overall a very good piece, even if it does flail a bit at the finish. The very fact that we can look at all this context and see all the things the story is doing speaks to the high baseline competence at work here. Thanks for writing!
*See comments on the other story for some personal soapboxing.
In Its Own Image — A- — An AI having an existential crisis. Who am I? What am I here for? The structure of the conversation makes the story disjointed, and I don't see an easy way to clean that up and make it much more understandable. It is still enticing, landing somewhere between an infant taking its first steps and HAL. Worth reading through and imagining the characters.
I really liked how this started, and the middle kept me going, but then we just veered off into Transcendence territory, a little bit. The final thought to slowly elevate humanity goes in line with the title, but I don't feel like we had nearly enough build up to that conclusion.
On a high note, the bouncing dialogue structure kept me engaged. I wondered who we'd talk to next, where, in what form, and what their personal history would look like. Great way to keep a myriad of opinions and ideas floating around without feeling like the cast is bloated.