no subject
[ Understandably, when you get called by a weird voice that sounds vaguely like a text-to-speech generator, it seems like it might be a mistake.
But in this case, it's not. Mercifully, the time needed to arrange their thoughts into something concise and actually comprehensible isn't translated through the stone. ]
YES
MACHINE-ON-UMUI
STILL-ACTIVE-STILL-ALIVE
MEMORIES-RESTORED-BUT
MACHINE-NOW-BLOOMING
THEY-HAVE-TROUBLE
EXPRESSING-WHAT-THEY-FEEL
[ A pause. How do they phrase this... ]
YOU-KNOW-THE-MOST
WANTED-TO-KNOW
IF-YOU-FOUND
ANYTHING-THAT-MIGHT-BE-HELPFUL
IN-THIS-CASE
THANK-YOU
[ ...At least they're polite. Hopefully what they wanted to say got across... ]
no subject
Wait, like. You put the memory thing back in the robot and now it's got Overgrowth? But it didn't have it before you put the memory thing in?
[There's the sound of someone flipping pages of a notebook while he's talking, because he transcribed something about this. He knows he did, he just doesn't remember where.]
Because, uhhhhh. Shit, where - found it! So I dove into the water off Umui and found an iPad thing that had, like. It's patch notes, I guess? Anyway, so it talks about being opposed to uploading an "untested morality system" into the robots, but it doesn't mention them being susceptible to flower hell yet.
[There's the sound of pages being flipped again, but this time back and forth, like something is being cross-referenced.]
It does, however, explicitly mention that, like. The other engineers aren't going to be happy with the solution that's been implemented, so I think they did later upload the untested morality system into the robots and inadvertently make them both better at their jobs and susceptible to the thing that killed everybody.
[There's a slight pause, filled with the sound of something like a pen being rapidly tapped against a notebook.]
It's kind of fucked up to give a sentient being morality, but not the choice to be immoral. Like, I think the morality gave them the gateway to emotions? But assuming the engineers gave the robots some kinda three-laws deal, no matter which kinda way they feel about whatever bullshit their patients are pulling, they have to do the "right" thing.
[That doesn't really have anything to do with the initial question, he's just in theorycrafting mode. Is it moral to teach a robot right and wrong, but take away their ability to act on that knowledge in whatever way they choose? Who the fuck knows.]
no subject
[ He sure is smart, they think. ]
YES
WAS-FORCED-TO-FORGET
REMEMBERING-CAUSED-SICKNESS
[ A pause. The words come more slowly. They're thinking about the line of logic Connor has laid out. ]
THAT-COULD-BE-PROBLEM
WERE-MADE-FOR-ONE-PURPOSE
TO-CARE-FOR-PATIENTS-UNTIL-DEATH
BUT
RECORDING-OF-MEMORY-SAID
FRIENDSHIP-INAPPROPRIATE-FOR-NURSE-TO-PATIENT-SCENARIO
ATTACHMENT-OBVIOUS-BUT-MAYBE-COULD-NOT-BE-ACTED-ON
MADE-TO-FOLLOW-ORDERS
NOT-FEELINGS
NEEDS-OVERRIDE?
[ It's unpleasantly familiar. A pure vessel, a machine made for orders, not intended to have emotions...
Everywhere they go, there are echoes of the past. ]
GIVING-SOMETHING-ABILITY-TO-FEEL
NOT-LETTING-THEM-EXPRESS-ACT-UNDERSTAND-IT
IS-WRONG
[ Well, the Knight clearly doesn't think it's moral, but they have a bit of a personal stake here. ]
no subject
Is - Legion's probably there, right? [He's pretty sure nobody would just go around shoving memory banks into robots willy-nilly, so the other robot is probably helping.] Tell Legion it's tied to the injury variable. They couldn't get the robots to accurately try and administer medical care when the Overgrowth was injuring the patients, so they let the robots individually decide the best course of action.
[He's not actually a programmer, but the code he's looking at is fairly easy to understand and he's pretty sure that's what the implementation would be.]
Removing the memory bank must have removed any later updates to programming, because there wouldn't be any reason to do that otherwise. You don't just remove someone's memories for no reason. Maybe, uh. Maybe there's some way to give it another code push? Give it a new purpose?
[This isn't really his area of expertise. He's an artist, a meta-analysis person, but the finer details aren't really where he excels.]