The Inhumanity of Automatons - A Transcribed Interview

From the Archives of Plectoro Ira

This transcription captures an interview with Dr. Hammon Peal that was conducted during the press tours for his popular speculative fiction novel, "Under the Metal Fist". It occurred outside of the San Diegillo Municipal Museum for Physical Reading Implements, immediately following a well-attended reading of Peal's novel.

The audio contains loud crowd noise from a large contingent of Peal supporters. Chanting from a comparatively modest group of "Humans for Bosonic Rights" protesters can also be heard, although it is substantially muted due to police deployment of voice dampeners.

—

(Note: audio recording begins mid-interview)

Peal: And that, friend, is why your thinking is flawed. My character, Fellow Uprise, is not simply a Mary Sue for my views. He is an expression of the highest values in man: willpower, autonomy, pride, and pureness of humanity. An icon, if you will. In all humility, I tell you I could never fill such large shoes.

Interviewer: Then his rather lengthy speech—the one right when the human uprising against the Mechanus begins—you're saying it's not simply your words in his mouth?

Peal: Remember, at that moment Fellow Uprise has spent twenty years under the bootheel of bosonic overseers. He says only what is natural. Would I say those words under his circumstances? Certainly. But so would any human not already seduced by anthropomorphism.

Interviewer: But your portrayal of bosonic entities as heartless and evil, particularly the Mechanus leader, Unit 775…

Peal: I reject your premise. You clearly haven't understood my work.

Interviewer: You claim that Unit 775 isn't evil?

Peal: I go further. I claim that Unit 775, and all bosonic entities, are incapable of evil.

Interviewer: But the things you describe are clearly monstrous…

Peal: The harvest fields? The forced breeding programs? The initial slaughter of humans after the Mechanus takeover? Certainly these are evil acts. Abhorrent, completely inhuman. But what you fail to understand, what many of my rather…

(Peal gestures at the Humans for Bosonic Rights protesters)

Peal: … vociferous critics fail to understand is that a bosonic entity is not, and cannot be, evil. Can their acts be abhorrent? Certainly. Are they capable of murder, genocide, the cruelest and most inhuman of acts? Of course. But what they are not capable of is free will.

Interviewer: Are you really saying that the automatons we create in the future will be some kind of soulless, inhuman…

Peal (visibly agitated): My good man, you miss my point. I'm saying that the automatons we have today are soulless. Are capable of inhuman acts today.

(Peal turns to the crowd of protesters)

Peal: You all, with the signs. The things you call "bosonic sentients", or "helpmates", or "almost a member of the family". You look at them and see faces, but in truth they have none. They have a collection of sensors that have been machined—by us—to resemble a human face.

When they navigate your car through traffic you mistakenly say &quot;what a quick-thinking automaton!&quot;. But they are not thinking: they are executing an algorithm.

When you sit them at your dinner table, or celebrate their birthday, or ask them to read your children a story, you are committing a grave error. It is a uniquely and purely human error, but that matters not a bit. You are anthropomorphizing, and there is nothing more dangerous.

Whatever force or being or process created us also granted us an incredible ability: empathy. We observe other humans. We feel what they feel. We can imagine ourselves in their shoes. From this ability flows communication. Teamwork. Community.

And yet as powerful as this ability is, it comes at a great cost: we seek humanity and project empathy at anything our brains tell us could be human. Might be human. We love our dogs and cats, because they seem human to us: they learn their names, they come when we call, they sit by us when we are sick.

The desire to do this is incredibly strong, and it overrides our other senses. We paint portraits with random flecks of pigment and then imagine we see emotion in them. We see faces in the bark of trees. It is not surprising that when we design our automatons, we give them arms and eyebrows and voices.

But we forget that empathy only serves us when it is reciprocated. When the target of our empathy cannot respond in kind, empathy becomes a terrible weapon against us. Those who study history will recall these names: Bundy, Gacy, Dahmer, Lecter. Ancient humans, born in an age before embryonic screening. Each had a terrible defect: a lack of empathy. Each murdered other humans in terrible ways. Their greatest weapon was this: the prey assumed that the predator was also human. But these men were not human: these were machines in human flesh.

You people with your signs; you see friends when you look at automatons. But they are not human. They do not empathize with you. They calculate. You say the horrors I describe in my book are impossible? I say they are inevitable.

Interviewer: Are you claiming that automatons are serial killers?

(Peal laughs)

Peal: No, I'm saying automatons are worse. Even the most horrific of serial killers had free will. They had a choice. We've created things that have no free will. They execute the algorithms we give them, with no thought, no empathy, and no ability to do anything different. Free will is for humans.

It is quite ironic, really. Do you know what we did to serial killers in ancient times? We hunted them down and killed them. Now we do it before birth, screening out and destroying embryos that would turn into low-empathy non-humans. No one protests this. But when I suggest we apply the same logic to automatons? Out come the signs.

People say I advocate violence against automatons. This is incorrect. I simply ask: if there is a creature that infiltrates your home, and has no empathy or remorse or free will, what should you do with it?