lightlessfuture: (i heard a nostalgic song) (Default)
little ghost ([personal profile] lightlessfuture) wrote 2018-04-12 10:15 am
Entry tags:

ic inbox

[ Somehow I forgot to put one of these up. ]
yourattention: (i gotta tell ya life without you)

[personal profile] yourattention 2018-11-07 12:16 pm (UTC)
Wait, like. You put the memory thing back in the robot and now it's got Overgrowth? But it didn't have it before you put the memory thing in?

[There's the sound of someone flipping pages of a notebook while he's talking, because he transcribed something about this. He knows he did, he just doesn't remember where.]

Because, uhhhhh. Shit, where - found it! So I dove into the water off Umui and found an iPad thing that had, like... it's patch notes, I guess? Anyway, so it talks about being opposed to uploading an "untested morality system" into the robots, but it doesn't mention them being susceptible to flower hell yet.

[There's sounds of pages being flipped again, but this time back and forth like something is being cross-referenced.]

It does, however, explicitly mention that, like. The other engineers aren't going to be happy with the solution that's been implemented. So I think they did later upload the untested morality system into the robots, and inadvertently made them both better at their jobs and susceptible to the thing that killed everybody.

[There's a slight pause, filled with the sound of something like a pen being rapidly tapped against a notebook.]

It's kind of fucked up to give a sentient being morality, but not the choice to be immoral. Like, I think the morality gave them the gateway to emotions? But assuming the engineers gave the robots some kinda three-laws deal, no matter which kinda way they feel about whatever bullshit their patients are pulling, they have to do the "right" thing.

[That doesn't really have anything to do with the initial question, he's just in theorycrafting mode. Is it moral to teach a robot right and wrong, but take away their ability to act on that knowledge in whatever way they choose? Who the fuck knows.]
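[ If he had to put his theorycrafting into actual code, it'd look something like this - a rough sketch of the "three-laws deal" he's guessing at, with every name in it made up on the spot rather than pulled from the patch notes: ]

    # Hypothetical sketch: the robot can feel however it wants about an
    # action, but a hard-coded morality check overrides its preference.
    def is_moral(action):
        # Stand-in morality system: only treating the patient is "right".
        return action == "treat_patient"

    def act(preferred_action):
        # The robot's feelings pick a preferred action...
        if is_moral(preferred_action):
            return preferred_action
        # ...but anything "wrong" gets overridden. Morality without the
        # choice to be immoral.
        return "treat_patient"

    print(act("ignore_patient"))  # -> "treat_patient", no matter how it feels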
yourattention: (sincerely me)

[personal profile] yourattention 2018-11-09 04:48 am (UTC)
Is - Legion's probably there, right? [He's pretty sure nobody would just go around shoving memory banks into robots willy-nilly, so the other robot is probably helping.] Tell Legion it's tied to the injury variable. They couldn't get the robots to reliably administer medical care when the Overgrowth was injuring the patients, so they let the robots individually decide the best course of action.

[He's not actually a programmer, but the code he's looking at is fairly easy to understand and he's pretty sure that's what the implementation would be.]
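[ His back-of-the-notebook guess at the implementation, for Legion's benefit - the variable and function names are his inventions, since he's only skimmed the code: ]

    # Treatment is gated on the injury variable; past that, each robot
    # ranks the options by its own weights instead of following one
    # hard-coded response, so two robots can make different calls.
    def decide_action(injury, weights):
        # No injury, nothing to decide: just keep watching the patient.
        if injury <= 0:
            return "monitor"
        # "Individually decide the best course of action": the weights
        # belong to this robot alone, not to a shared rulebook.
        return max(weights, key=weights.get)

    # Two robots, same injured patient, different decisions:
    robot_a = {"apply_bandage": 0.9, "administer_medicine": 0.4}
    robot_b = {"apply_bandage": 0.3, "administer_medicine": 0.8}
    print(decide_action(0.7, robot_a))  # -> "apply_bandage"
    print(decide_action(0.7, robot_b))  # -> "administer_medicine"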

Removing the memory bank must have removed any later updates to its programming, because there wouldn't be any reason to pull it otherwise. You don't just remove someone's memories for no reason. Maybe, uh. Maybe there's some way to give it another code push? Give it a new purpose?

[This isn't really his area of expertise. He's an artist, a meta-analysis person, but the finer details aren't really where he excels.]