Perihelion Science Fiction

Sam Bellotto Jr.

Eric M. Jones
Associate Editor


Peripheral Hope
by Derrick Boden

by M. Luke McDonell

Penal Eyes
by Frederick Obermeyer

Tells of the Block Widowers
by Jez Patterson

Cretaceous on Ice
by K.C. Ball

Some Quiet Time
by Eric Cline

Three Breaths
by Karl Dandenell

by Kathleen Molyneaux

Shorter Stories

Left Hand Awakens
by Beth Cato

Laws of Humanity
by Alexandra Grunberg

Aggressive Recruiting
by Drew Williams


Remakes, Sequels Sizzle in 2017
by Joshua Berlow

Calderas: Doomsday Underfoot
by John McCormick



Comic Strips





Laws of Humanity

By Alexandra Grunberg

RAZIA SIMWELL WATCHED THE NEW entrants shuffle into the shock room in various states of distress. The clanging of their shackles on the stainless steel floors had become a sort of music to her, not exactly pleasant, but soothing in its constant repetition. It did not matter who they were, each entrant sounded the same in Entrant Expeller 5. She wondered if all the Expellers were composed of stainless steel, and she assumed that they were, each one an exact copy of the other, with its same symphony of metal on metal. She stopped herself from smiling at the thought of such constant behavior across the nation. It was nice to have rules. Simplicity was pleasant.

“That’s unusual.”

Halle Rosen came to stand beside Simwell, making a small gesture towards one of the robots that walked by. It was important to keep the gestures small. If you drew attention to yourself, the entrants reacted violently. They realized that there were human beings watching them, and convinced themselves that their judgment might not yet be over. They were foolish to think it, but it still happened sometimes. It was best to be part of the background, a silver uniform blending in with the silver walls and floors. You could not attack the walls. You could not talk to the walls. You knew they would not listen.

The robot in question was a Nurse-Bot, vaguely humanoid in shape but not in overall design. Its synthetic skin was made to resemble a teddy bear more than a human nurse. The children reacted more positively toward the soft embrace that mimicked their inanimate confidants, and it eliminated any confusion between caretaker and mother. This was the first Nurse-Bot entrant in Expeller 5.

Simwell scanned her chart for the robot’s place in line and found the Nurse-Bot’s entry.

“She could not bear the suffering of her wards,” said Simwell.

Rosen did not reply, but Simwell could feel her distaste. Simwell could understand Rosen’s reaction, even though there was no law that said killing a minor was a more grievous offense than killing an adult.

There were only three laws.

When artificial intelligence reached a point where robots were nearly identical to humans, humans decided that tests must be performed to truly recognize the autonomy and consciousness, the humanity, of their creations. An issue arose with almost every initial test. Intelligence tests seemed only to measure a robot’s trickery. Reasoning tests were more like learning tools, and some less-than-intelligent humans could not pass them. Empathy tests seemed a proper measure of humanity, but the only existing one based humanity entirely on empathy toward animals. It was an emotional state from the fictitious world of a dystopian future, not the real world of the present.

It was this dichotomy between fiction and reality that made humans and robots alike realize that the idea of “humanity” was a social construct, to be created as desired within modern society. If robots could meet all the rules of humanity, they could be deemed equal to humans. But what rules could dictate the terms of humanity? How could you recognize the new humanity, the true “human beings”? Again, reality turned to fiction, rewriting three ancient laws:

“A human being may not injure a human being, or through inaction allow a human being to come to harm.

A human being must obey the orders of human beings regarding physical interactions and acts of an intimate nature, except where such orders would conflict with the First Law.

A human being must protect its own existence as long as such protection does not conflict with the First or Second Law.”

Robots that met these standards were held to possess a sense of self and a respect for the humanity of others. Robots that did not, even intelligent robots, were merely “artificial intelligence.”

The first backlash against the laws was that there were many organic humans who failed to meet the laws, who were imprisoned and punished daily for their crimes. And then came the Great Realization and the beginning of the New Common Era. Not all humans possessed humanity. Not all robots failed to possess humanity. Humans and robots who possessed humanity as seen through their adherence to the Three Laws of Human Beings were “human beings.” Humans and robots who failed to adhere to these laws were merely “artificial intelligence.”

The second backlash came from the robots, who protested that while organic humans who failed to follow the laws were merely imprisoned and punished, robots who failed to follow them were decommissioned as dangers to society, a punishment that, because they were lesser beings, was not seen as an affront to their humanity. But since the established laws stated that organic, or human, artificial intelligence did not possess humanity, it no longer made sense for their deaths to be seen as an affront to their absent humanity. All artificial intelligence, organic or robot, could be expelled. And all artificial intelligence in the Pacific Northwest was brought to Entrant Expeller 5.

The Nurse-Bot who killed her wards.

The widower who murdered his wife.

The first-person shooter gaming system that tried to incite violence in users to bring about its own demise.

The teacher who encouraged her male students to call her MILF, who ignored their pleas in her personal detentions.

The parade of artificial intelligence marched by, making music with their shackles, indistinguishable in the eyes of the law, the eyes of Simwell and Rosen, and the recording eyes of cameras set in the ceiling, there both to make sure the guardians of the entrants did not come to harm, and to ensure that the guardians did not reveal themselves to be less-than-human in their treatment of the entrants.

The widower stumbled as he stepped into the Expeller Center, splashing the water that rose to his heels. His stumble jostled the Nurse-Bot, but she did not react. The robots, for the most part, were more composed than their organic brethren. For the most part.

Cold hands gripped her before Simwell had time to react.

“Please don’t make me do this, please let me go.”

The robot’s voice was calm, but Simwell doubted it had the ability to raise its voice. Its grip tightened, and a frightening thought consumed her: she could imagine her hands popping right off, becoming tangled in the shackles of the entrants, too soft to join the constant music, thwacking the floor with each step in a soft pang of dissonance.

The entrants could not continue to march forward with this robot stopped in their path. Its overlong arms reached out from its place in the line. It was a SearchMan, designed to rescue human beings from disaster situations, reaching deep into rubble and earth to save humanity, pulling them back from death into the light. Simwell did not know how it had violated the Three Laws. She had dropped the chart when the artificial intelligence grabbed her hands.

“I’m not hopeless like the organics,” the SearchMan tried to plead. “I can be reprogrammed to follow the Laws. I don’t belong here.”

“Gaming systems are slaves,” shouted the first-person shooter, its pseudo-organic voice harsh in the confined space. “You cannot play at violence and condemn violence. You cannot punish me for refusing to be a part of your insanity!”

“I’m not artificial,” cried the teacher. “I have a soul.”

“I am a human being,” said a voice from somewhere in the line.

Rosen shocked the SearchMan with her electric prod. It let go, twitching. Though it could not choose to move forward, its wheels allowed it to roll at a steady pace when Rosen encouraged the rest of the entrants to continue into the Expeller Center. Simwell rubbed at her wrists.

“Thank you,” she whispered to Rosen.

The other woman shrugged. Simwell picked up her chart. They tried to become part of the wall once more. Sometimes there were outbursts like that. But artificial intelligence had a history of using trickery to try to pass tests of humanity. These entrants had already proved what they were.

When all of the entrants were inside the Expeller Center, Simwell closed the door behind them and Rosen turned on the Expeller Source. An electric current flowed through the water. It was quick and always surprisingly quiet. Music could turn to silence in an instant. The smell lasted much longer.

When all was safe again, the door opened. Simwell and Rosen looked in to make sure each body was still. Today, they were. When the artificial intelligences were all expelled, it was hard to imagine that they had once tried to pass as human beings. There was no humanity in the figures lying in the Expeller Center. The sight was distinctly unpleasant.

Rosen cried, turning her face towards the cameras to ensure that they saw her distress. Simwell could not cry, but made sure the cameras caught how she wrung her hands, the way she shook her head and trembled.

“You know, you could always get tear-producers installed,” said Rosen. “A lot of the other robot guardians are doing it. Makes it easier to show your humanity.”

“I don’t want it to look like I’m trying to trick them,” said Simwell to the organic Rosen, who still had tear tracks on her cheeks though she had ceased crying almost immediately. “Enough artificial intelligences have tried to prove themselves through trickery. If I have humanity, it will show in my adherence to the Three Laws.”

It was important to show that expelling the entrants affected you, whether you were organic like Rosen or robot like Simwell. If you could sympathize with an artificial intelligence, then you clearly possessed empathy for human beings. Guardians, as a rule, tended to be more expressive with their emotions. Simwell gave a little sniffle, an unnecessary sniffle. Just to be safe. Just in case the tears were not enough. It had never been an issue in the past, but sometimes new rules came out, stricter rules, and Simwell would rather be overprepared than shackled with these monsters. Though perhaps she would feel differently if she were there among them.

Simwell shook her head, dismissing the thought. There was acting empathetic, and then there was ... dangerous thinking. That kind of thinking could get you into trouble, even if you did not say anything, even if you kept it locked in your thoughts. Technology was so advanced, the cameras could read minute reactions in the facial muscles that could betray her. But no one came to take her away. No one paid any attention to her.

Guardians had already begun cleaning out the Expeller Center. Far down the hall, Simwell could hear the beginning of another symphony, shackles against metal, the constant music of her life. The sound was not exactly pleasant, but she could not say it was unpleasant either. Simwell realized she was smiling and hoped the cameras could not see. But Rosen was smiling, too, so maybe they could be forgiven this one time for their transgression.

They were only human. END

Alexandra Grunberg has previously been published in “Daily Science Fiction,” “Fantastic Stories of the Imagination,” and “Flash Fiction Online.” Her most recent short story for “Perihelion” was published in the 12-DEC-2015 issue.