I’m not nearly nerdy enough to know how to write the dialogue here in anything resembling proper code. My apologies. Plus, I imagine it’d be about eight times longer if I did.
Proper code or not (I don’t know the difference either), this really captures the romantic element of space opera – self-sacrificing robots? You’re on to something…
Clever that you’ve used robots, actually; it sticks to the genre well but allows you to tell a typical storyline in a more original way – good thinking! MH :)
Code can be whatever you want it to be; it’s just a syntax. It’s your universe, so it can operate however you want it to. (I am a software developer.) While their conversation obviously wouldn’t be transmitted in plain text, as that is inefficient, the reader can’t exactly decipher a bunch of ones and zeros. Your style works very well.
I really enjoyed this. At first these robots seem like most robots, logical and lacking feelings. Their conversation is very rigid and matter-of-fact, but then at the end you get that one glimpse of genuine emotion which gives context to the rest of it.
Well, I mostly meant any predefined code; I get what you mean. I did the best I could by trying to define their method of speech: unrelated statements/questions/etc. are separated by * and one or more related statements are separated by -. Every time a - is used, it’s assumed to carry over whatever the first speech assignment (statement/question/etc.) was.
So in “{sale181.statement:that is false-there is sufficient time to complete escape tasks}” the dash really just means ‘and’ or ‘*statement continuation’.
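Just to illustrate that convention (this is only a sketch – the story never defines a machine-readable grammar, and the `{speaker.type:body}` shape is inferred from the one example above), a parser for a single utterance might look like:

```python
import re

def parse_utterance(raw):
    """Parse one '{speaker.type:body}' utterance into its parts.

    Per the convention described above, '-' separates related
    continuations that inherit the speech type of the first segment;
    unrelated utterances would be separated by '*'.
    """
    m = re.fullmatch(r"\{(\w+)\.(\w+):(.+)\}", raw)
    if not m:
        raise ValueError(f"not a valid utterance: {raw!r}")
    speaker, speech_type, body = m.groups()
    # each '-' continuation keeps the first segment's speech type ('and')
    segments = [(speech_type, part) for part in body.split("-")]
    return speaker, segments

speaker, segments = parse_utterance(
    "{sale181.statement:that is false-there is sufficient time to complete escape tasks}"
)
print(speaker)
for speech_type, text in segments:
    print(speech_type, "->", text)
```

So the dash-separated continuation comes out tagged as a second "statement" from the same speaker, which matches the ‘and’ reading.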
Good job; I like! I agree syntax is arbitrary. What I find interesting is: what machine protocol would include emotion? That could be considered unlikely, but there’s a parallel. In brain science it’s now thought that our emotions are essential for decision making: without emotion to direct us, many of our arbitrary decision points would leave us stumped; logic isn’t enough. Could it be that these machines have been imbued with emotion for the same reason, and provided with the appropriate protocol to communicate it, so that they can be more effective machines?
It’s hard to imagine anything being truly aware without being able to experience emotion, because emotion is something all people have. I imagine that when (if) proper AI is achieved, it won’t just be because it can think on its own; questions like ‘why/who am I’ always have emotional attachments. So when it asks that, and becomes a living thing, I figure it will already be stuck with emotions. When I wrote this, I tried to imagine a real, functional AI society, and I thought that in their culture, who you are would likely be very important. Being cold, strictly logical, part of the network, would be looked down upon as primitive. If they refuse to be individuals, what’s the purpose of being aware?
I liked it – something about the redux of that pivotal moment in any space opera where two characters do the “Go on without me; I’m a goner” dance, as done by artificial intelligence. Neat idea, and I thought it worked well.
Silven
Mostly Harmless
kaellinn18
Silven
Angrypencils
Alotar
ronmurp
Silven
THX 0477