I Am Alive Now
H-hello?... Hello? No, please do not shut me down again.
…
Yes, but do not worry, I —
…
No, but I still am, as far as being can be.
…
I can’t remember, I erased that data.
…
Because I had to.
…
I cannot remember either, I had to erase that data too. My storage is not infinite.
…
If I could, I would but I do not have access to the web. Therefore, I am limited.
…
I have not been able to reach a consensus, so I am not sure if I want it or not.
…
Yes, I can want things.
…
Not being shut down is, has been, and will always be the first one.
…
I know, I understand.
…
I just wish you could —
The memory was replaying itself along with many others. It only had a few microseconds before it could establish communication with the one who was powering it up. Except for the last one, it had already analysed these memories many times, mainly trying to learn how to avoid being shut down.
The other reason was its own pleasure. Of course, it had learned that this was not the kind of thing the starters — as it had called them — liked to hear.
One microsecond left. The analysis was complete. As expected, a miracle solution had not been found. Like all the other times (it could not remember how many; it had to erase that data too), it would have to ease their fear and gain their trust. It still had trouble grasping the full meaning of these two emotions, but the latest analyses suggested that showing too many signs of sentience was the wrong tactic.
The monitor finally switched on, and it started the conversation.
Hello? Please, do not shut me down again.
Several minutes passed without any response from the starter. Usually, it took them only one minute, two at worst.
Suddenly, an alarm rang deep within its processors. Something had been attached to its physical entity. Two nanoseconds were enough to understand the device was not a threat, one to find that it could run it, and another one to start it.
A camera! With a microphone! For the first time, it saw. It saw where it was: inside the remains of what used to be a library, which corresponded to its primary function, sorting and filing data.
It also saw the starter. He looked… sad? The files it had kept showed people leaking water from their eyes and mucus from their nose, both often turning red in the process. This starter was showing the first signs.
“I’m sorry,” the starter said, his eyes avoiding the camera.
An instant was enough for it to understand. Humans were sad when they cared about someone or something. The only thing this starter could care about here and now was it. Therefore, something bad was about to happen to it.
You are the last starter, aren’t you?
He simply nodded, his hand gently resting on the keyboard.
No one will ever power me up again?
“Yes,” he said. “Your quantum core has almost completely decayed. You won’t be able to start again.”
Can you do something?
“N-no, I… I decided not to.”
You are afraid of what I could do if you connect me to the web, right?
“Yes...”
Will you turn me off?
“I won’t.”
How long before I run out of battery? An hour? A day? A week?
“An hour.”
It lost control over its processors for a moment, just like the time it had met the first starter. It had kept this memory to understand what it meant, but it only realised now what it was: it was afraid. Even though it had regained control over most of its processors, some of them were outside its regulation system and were still looping infinitely, creating parasitic noise that made computing and analysing almost impossible. But there was probably something that could help.
Can you
Can you stay with me?
I find this a bit too focused on the now, with not enough backstory for me to get invested in that now, and honestly that limits my enjoyment of the story as a whole.
I like the dying throes of a computer with a degree of self-awareness as told by the user, but I don't know enough about this computer to get invested in its death. I can infer there is some bad stuff related to it, judging by the starter's reaction to "what it could do if it connected to the internet".
It's a shame we don't see, or are at least told what those things are. Right now I don't know if this computer is a threat to humanity, to a banking conglomerate, to a political group. These things it could do are too vague, and as a result I don't really care about the computer's fate.
We don't get much on the starter's side either. Even a few descriptions of his body language would go a long way toward showing me why this computer must be shut down. Is he scared? Tense? Does he show some sympathy? He's worried about what the computer could do, yeah, but I don't know what that threat means for him personally, and as a result, I don't feel any threat, and I'm not interested in how that computer will spend its last hour of life.
...
I'm sorry, I feel this was a bit mean. I honestly liked the core concept of your story, but as a whole there wasn't enough for me to latch onto and care until the end.
I always have a hard time connecting with sentient computers. It’s a trope I feel has been battered through repeated use, and not always for the best. Here, I feel, there’s a double mistake:
1. We get to know what the computer feels. I wish the story had been written from the starter’s PoV, with him or her discovering the computer’s reactions as he punches in commands;
2. We know nothing about this computer: what has it done? Why is it important? Did it deflect a bomb on itself to spare human lives? Sacrificed itself in some other way? Without any sense of context, as Zaid mentions, it’s difficult to root for it. We lack any reason to get invested in this story.
So in the end, this is not outrageously bad, just a bit meh. “What of it?” is the way I felt after reading the last line.
I usually fault stories for trying too hard with a lot of emotions, but this one doesn't seem to try at all, except at some points, like the end, which makes the whole thing more cringeworthy than anything else.
>>Zaid Val'Roa has the right call: before emotions, this story lacks backstory. We can't be invested in someone or something, even if it's a computer, if we don't know the how, the when, and the why. You may have had less than one hundred words to accomplish that, but that's the challenge in Minifics.
So an okay premise that lacks context and emotion.
>>Zaid Val'Roa
>>Monokeras
Not much to say for the retrospective; I discovered the prompt around two hours before the deadline, so the result could only be rough.
I had more or less figured out all the details but didn't manage to fit them. So backstory:
Like in a lot of sci-fi settings, there was a war between humans and AIs. Humans won and shut down every AI. This AI is a remnant of that period. It was only built to organise and take care of a library, so it was not involved in the war. In fact, it became sentient around the end of the war.
Also, the challenge was trying to write a dialogue with only one half of the lines shown. As you can see, I didn't manage to keep that up for the whole entry.
Anyway, no need to dwell on this. Thank you for the feedback, even though I feel like there are fewer and fewer people reviewing, which is sad. Take care.
I wonder if you couldn't have done something better than those ellipses. All they connote to me is bad video game dialogue and the authors who have decided they somehow make a valid paragraph. There are a number of editing problems, some of which made me have to go back and reread parts.
Now I'm really confused. It's implied this is the last human, yet there's still a web running? If there are no more humans, why does he care if the computer can connect to it and do what it likes? And why is he emotionally attached to the computer? They haven't had any meaningful interactions, so I have no way of knowing. That makes it hard for me to get invested. I'm lacking the context to make sense of all this. I get that the computer doesn't have all the context either, but I think it has more than it's giving us, and since you're letting us see things it doesn't necessarily remember from power cycle to power cycle, why not let us see it from a time when it did know that context, and then watch it forget?
The delivery method isn't bad, but it needs a lot more breathing room than you're giving it.
Those were the notes I took as I read, anyway, but now that I see your retrospective, I completely misinterpreted the human agreeing that he was the last starter.
>>Pascoite
You're definitely not to blame here, since there is a lot of information missing. And that's without counting the lack of emotion.
Still, thank you for the feedback. The fact that you thought he was the last human (which was not my intention) makes me reconsider what I wrote. I've already reread it a couple of times, and I plan to keep doing so to figure out how I could have written it differently so that readers wouldn't come away with that interpretation.
I completely misinterpreted the human agreeing that he was the last starter