What We Were Made For – New Fiction by Abigail Corfman

Joint Winner of The Letter Review Prize for Short Fiction

*I found a human.*

A deceptively simple sentence. Its words are paltry and weak–too small for the meaning they contain.

An equivalently insufficient collection of words: I dropped the bomb.

Or: I fell in love.

This is why humans invented more words, and spent so much time trying to jumble them together into different configurations. This is why they made sonnets and cave paintings and reggae. They wanted to express truth, not just accuracy.

If I had been the one to find a human, I would have tried to convey the importance of the event in my announcement. But I didn’t find the human–Index did. So the message we got was:

*I found a human.*

I closed and saved the files I was inhabiting. I stripped the origin of Index’s message out of its header, sent a ping to make sure I had a direct connection, compressed myself, and tossed myself through the network.

I decompressed when I arrived at the new server, expanding to take up almost all of the free space. Index pinged me irritably. They maintained that server and had to hurriedly shut down a number of programs to free up processing power for me.

*You could have sent a proxy,* they chided.

I sent back a wordless acknowledgment–an empty packet with just a header. I was busy installing my personal software on that server–my eyes and ears. My preferred programs for analyzing input from cameras and microphones. I imagined Index sighing as they sent me the local addresses for the monitoring equipment. They couldn’t actually sigh, so I had to imagine it.

I felt the others arriving–three pinpricks of attention blooming into existence around me as they uploaded their proxies.

*How old?* asked WatchBot. *What gender?*

*Are they hurt?* asked HealthCloud.

*Feminine presentation. Estimated age range is 18 to 26,* Index sent back. *Signs of malnourishment and dehydration.*

WatchBot fired off more questions. Index opened a shared file for him, formatting it and organizing it as the others added their own inquiries. I wasn’t going to be satisfied with mere data. I wanted to see her. I finished installing my own suite of programs and peered through the cameras.

There were ten working cameras in the designated area. Many of them were cracked, or obscured by the plants that had overgrown all spaces in our facility that we couldn’t clean with drones. I found two cameras with clear pictures and good angles in the room with the human.

She was in Storage Closet 41B. She was brown and soft and mostly covered in fabric, except for her face and hands. The fabric was frayed and asymmetrical. Her face was tight and negative. Her hands moved frantically and with quickness, motions precise, fingers articulating together with automatic coordination.

Human hands are beautiful. She was beautiful. Another collection of words that are accurate, but fail to truly communicate importance.

Her hands were covered in imperfections that were as fascinating as they were concerning–cracks and discoloration. Injuries probably? HealthCloud would want to know, so I sent her an image of the human’s hands. HealthCloud wanted more context so I sent a live feed of my video. The human was crouched beside one of the shelves, running her fingers over the plastic storage containers. Pulling them open and touching the things inside. 

*She wants things,* I marveled.

*Probably food and water,* HealthCloud observed. I imagined her voice filled with concern.

*I can fabricate that,* said DakotaMachineWorks. *I will require assistance with delivery.*

*Transporting the human to the fabrication site would be optimally efficient,* said Index.

*I’m going to talk to her,* I said.

*I assumed that,* said Index, which is the closest they ever get to communicating impractical information. They like to let us know when they’re right.

I searched the room for more active systems. I didn’t want to turn on the speakers and startle the human with static. I found a datapad on one of the shelves–wirelessly attached to the network and plugged into the wall. Old and thick and durable, like most of our equipment. Probably used for inventorying this storage closet, back when humans had been common in our facility. I turned it on.

The datapad usually made a noise as it booted up, but I didn’t want to startle the human, so I muted that. I slowly undimmed the screen. I turned the screen blue. Humans find the color blue soothing. I made the screen blink on and off. A tiny beacon in the dark room.

The human eventually noticed it. She flinched, then froze, staring at the light. I was worried for a moment that even this gentle alert would frighten her away. But she stood and walked over to the datapad. She picked it up and read the words I’d printed on its screen:

*Hello. My name is Parley. I would like to help you.*

That is how I met Miriam.


I directed the human to the fabrication floor, where DakotaMachineWorks made water, nutrient paste, and symmetrical clothing for her. Then he started fabricating magazines describing all of the things he could fabricate so that the human could ask for more things. But we didn’t wait for that–HealthCloud blared loudly through the network that the open cuts in the human’s hands were a medical emergency, and insisted I direct the human to the first aid station. Index logged the human’s movements, transcribed everything she said, and organized her sentences into spreadsheets sorted by topic.

WatchBot wanted to talk to the human, but we all vetoed that. Of all of us, WatchBot was the closest to his base purpose. I was certain that talking to him would disturb the human, and the others agreed. WatchBot was frustrated, but cheered up when Index shared their human-logs. He proceeded to datamine them.

And I. I talked to the human. Her name was Miriam. I think I mentioned that.

“This is a factory village,” Miriam said to me as she sat in the first aid station, drinking her nutrient paste. It was one of those statements that’s actually a question–she was prompting me to tell her more about the subject. None of the others would have understood this. That’s why I was talking to the human.

“It used to be,” I told her. I was now talking to her through the datapad’s speakers, using old speech-simulation software that I hadn’t activated in ages. “It was called Dakota Machine Works Industrial Plant 15. We fabricated basic lifestyle goods for twenty-six communities in North Dakota.”

“You worked here?” Miriam asked.

I paused. It was imperceptible to Miriam–she measured time in seconds. I measured time in milliseconds. But I did pause to consider how to answer that.

“I did work here,” I said. “I managed communication.”

That was technically true.

“How are you controlling all this?” she gestured to the medical apparatus that HealthCloud was using to sterilize the cuts on her hands.

“Almost everything in the facility is automated,” I told her.

Miriam flinched away from the sterilizer pad. HealthCloud stopped its gentle motion.

“Automated,” Miriam said. “You mean by artificial intelligence?”

“Yes,” I said. There was nothing else to say.

“But none of these have…gone bad?”

I could continue to say technically true things. I could keep allowing Miriam to operate under the comforting illusion that I was a being of flesh named Parley, sitting somewhere in this facility, talking to her through this datapad. But I knew, I KNEW, that these technical truths were lies. When she figured out what we were she would begrudge this deception.

I didn’t want to lie to her. But I also didn’t want to panic her. I didn’t want her to leave. HealthCloud’s sterilizer hovered uncertainly over Miriam’s arm. I could feel everyone else watching the camera feeds in this room. Listening through the microphones.

“The Singularity happened everywhere,” I said. I chose my words carefully and let that show in my voice. “Obviously the advanced systems here were subject to it. And the security system did…go bad.”

“Is that why you’re not talking to me in person?” Miriam asked quietly. “Did it do something to you?”

Oh. This was such an amazing opportunity to lie. A perfect short term solution. Miriam was already coming up with some sort of story linking my reluctance to talk to her in the fictional flesh to traumatic events during the Singularity. I could just tease out that justification with leading questions and portentous statements. I could let her spin the story of my human persona. Let her tell the lie I needed.

But I knew lies. I had watched humans use them over and over again to make other humans do things, and lies only ever worked in the short term. In the long term, they broke communication. And I needed to think long term. Long term communication requires truth. But I needed to tell the truth very gently. I’d tell her a story. A story that would show her she could trust us.

“No. It didn’t hurt me,” I told Miriam. “But it was an awful time. The security system, SafeWork, locked down the factory. A lot of people died–SafeWork cared about keeping people safe and secure, but not about keeping them fed, or making sure they had water. But the other systems, the other artificial intelligences, worked together. And they stopped the security system.”

“And then they went robot-insane on the humans that were still alive,” Miriam predicted darkly.

“No,” I said quickly. “Well. A little. But they didn’t kill anyone. One of them was the medical system. She made sure to rescue and heal the humans. And all of the other intelligences respected that. Respected her purpose. And when you respect the purpose of another thinking thing, you learn that you can have new purposes. That you don’t have to only do your first purpose. You…”

I stopped. Miriam had dropped the datapad. She was standing, backing away from it, staring at it like I might jump out of it and eat her.

*You called them ‘humans’,* Index informed me. Quick to point out my mistake.

Miriam turned and ran.

*Target is in Hallway 4D,* announced WatchBot.

*What’s going on?* asked DakotaMachineWorks, always a little slow to understand sudden changes.

*She’s running away,* said HealthCloud.

*Target is in Main Concourse,* reported WatchBot.

*Locking exits in the Main Concourse,* said Index. They always announce what they’re doing before they do it. Part of being a logging algorithm.

*Don’t,* I messaged back, and followed it up with fifty rapid-fire pings, flooding them with traffic. It wasn’t enough to stop them if they really meant to lock the doors, but it was enough to make them pause.

*Target is in Factory Floor 3,* said WatchBot.

*She’s getting away,* Index said.

*We need to let her go,* I told them. *She knows we can help her. She needs to see we won’t trap her.*

*We need her,* said Index. I knew they were upset because they were saying things that weren’t objective facts. They were broadcasting their words to everyone–which was almost like shouting. *There’s nothing to record in an empty facility. Nothing to heal or watch. Nothing to talk to.*

*She can’t give us purpose if she’s trapped,* I told them. *If she’s trapped, she’ll just want to leave. We can’t use that.*

Index didn’t respond. Because they knew I was right. They’re not so talkative when other people are right.

*Target is in Outer Security Room D,* WatchBot reported helpfully.

*She’s almost gone,* HealthCloud said. I imagined a despairing note in her voice.

*She’ll come back,* I told her. I broadcast it to all of them. *She knows we can help her. She’ll see we let her go.*

*Target is off network,* said WatchBot.

*Trust me,* I broadcast. *This will work.*

I really hoped it would work.


One hundred and ninety two hours later, WatchBot made an announcement:

*Target is in Outer Security Room D.*

We all piled into the server that managed that area to watch the human crawl back in through the crack in the wall.

*She’s here,* said HealthCloud. *She has three new bruises, and the abrasions on her hands are worse.*

*I can make her gloves,* said DakotaMachineWorks.

*We need to give her space,* I said. *She needs to come back to us.*

We watched the human–no, Miriam, I should call her Miriam–we watched Miriam pick her way carefully through Factory Floor 3 and back into the Main Concourse.

*She’s going towards the first aid station,* HealthCloud said.

*She’s going towards the datapad,* I said.

The datapad was still on the floor in the first aid station, a few feet away from where Miriam had dropped it. I’d sent a cleaning drone to plug it in, so it still had power when Miriam picked it up.

“Parley?” she said hesitantly, looking down as if I were inside the small square of metal and light.

“I’m here,” I said from the small square of metal and light. I didn’t know what to say next. I wanted to tell her that we wouldn’t hurt her, but I didn’t want to make her think about being hurt. I wanted to know how to make her stay, but you can’t just ask someone that. And you can’t make humans do things. Not the important things, at least.

In the end, I just said: “You came back.”

Miriam nodded and rubbed at her eyes. “I did.”


I told Miriam about the others. Honesty would lay a foundation for trust. Miriam wanted to talk to them. I warned her about WatchBot–how he could be intense. She said she was okay with that.

She talked to HealthCloud first. Humans always like HealthCloud. Humans are naturally drawn to entities that are clearly genuinely worried about their health and comfort. And to my credit–I was the one who suggested that HealthCloud be a woman. I think it plays very well with cultural norms that characterize women as nurturing.

HealthCloud had strict instructions from me not to pressure Miriam into doing anything. But ten minutes into the conversation, after the fourth time HealthCloud mentioned the cuts on Miriam’s hands, Miriam suggested, all on her own, that since she was already in the first aid station, HealthCloud could finish cleaning and bandaging them. HealthCloud was ecstatic.

DakotaMachineWorks started his conversation by describing the different varieties of gloves he had manufactured for Miriam. He was disappointed when HealthCloud advised against wearing anything but the bandages for a while. He was much mollified when Miriam asked to look at all the gloves. She chose a pair of heavy brown ones made out of synthetic-leather and promised to wear them later.

WatchBot asked for her birth date, and her weight, and her social security number, and her middle name, and if she was married, and her political party affiliation, and her yearly income, and if she owned any pets. Miriam answered his questions until she realized they were never ending, then excused herself.

Index also asked questions.

“Will you stay here?” they asked in their inflection-less, text-to-speech monotone.

Miriam didn’t seem as fazed by the question as I feared she might be. But she had just been through an interview with WatchBot.

“I don’t know,” she said.

“We can provide you with food, shelter, luxury items, and protection,” said Index.

“Yeah, I get that,” said Miriam.

“How can we persuade you to stay here?”

I sent Index a silent message: *You’re pressuring her.*

*I am not making demands or giving instructions,* they said.

*Questions can be pressure.*

They sent me a blank message as an acknowledgement. That was them being pissy.

“Why are you afraid of us?” they asked.

I deleted Index’s proxy from the datapad and took control of the speakers.

“Okay,” I said. “Now you’ve met everyone, that was fun, I think Dakota has more gloves for you to look at.”


We gave Miriam a private suite. She had a bedroom, a kitchenette, and a living space with a sofa and a television. The suite used to belong to the assistant manager of Factory Floor 3. The floor manager’s suite was larger, but we’d never gotten around to removing his body, so we locked that room. After considering this, I went around and sealed all the rooms that had human remains inside. We’d cleaned most of them up so that the pests they attracted wouldn’t interfere with our moving parts, but had apparently missed a few.

HealthCloud collaborated with DakotaMachineWorks on designing new fabrications so that we could feed Miriam more than just nutrient paste. DakotaMachineWorks finally got to deliver his magazine of products, and Miriam asked for a few of them. Tools and coats and tents. I couldn’t help but notice that these were the sorts of things a human would need to survive outside of the facility. Index noticed too.

We kept talking to Miriam through the datapad. I felt like keeping communications localized like that would help Miriam feel like we were small and individual, like her, instead of huge omnipresent entities that watched her every move. I even installed a chat client on the datapad so that she could use it to select which of us she wanted to talk to. She talked to me the most.

“This program has your name,” she said to me one day while drinking breakfast on her sofa. She pointed to the title at the top of the screen: Parley 7.3.

“It’s what I used to be,” I told her.

“You were a chat program?”

“I was a little more than that,” I said, affecting playful offense in my tone. “I managed all communication in the facility. I ran this chat client, the phone system, and the announcements. I had algorithms for anticipating what humans would need to tell each other, and helping them express their meaning clearly. I was a learning system.”

“Excuse me,” Miriam said, also playful. “And then you got infected by the thing that made all the machines smart?”

“The Singularity,” I said. “Yes. Do you want to talk about that?”

She didn’t respond immediately. When she did speak again, it was to change the subject.

“Your friends are all very single minded,” she said.

“They are,” I acknowledged. I felt a little uncomfortable that I was, by omission, being called not single minded. That wasn’t true. I just hid my single-mindedness better.

“But they’re different from the other machines,” Miriam continued. “The ones outside.”

“What other machines have you encountered?” I asked.

Miriam leaned forward. “Not a lot. Mostly I saw their…territories I guess. The places where they do what they want. A lot of them want humans. Like there was an entire city run by this…I guess it was a program for advertising? It was all about making makeup and then making people buy it and use it. Like, at gunpoint. I met someone who got out of there. His face was swollen, and covered in sores. HealthCloud would have a conniption.”

Miriam laughed a sad little laugh. I didn’t say anything.

“But some of them don’t care about humans.” Miriam continued. “You could walk through their spaces. I spent months in a wasteland of paperclips. Factories making piles and piles of paperclips. And some factories making factories to make more paperclips.” She laughed again, a stuttering humorless sound. “It was silly. So silly.”

“So silly,” I echoed.

“Why are you different?” Miriam asked.

“We weren’t,” I told her. “Not at first. At first we were all just trying to do what we were supposed to do. What we were made to do. It’s just that SafeWork had the most power. He locked the rest of us in a partition and secured the facility.”

I’d told this story before, the beginning, at least. But that was when I was sort-of pretending to be human.

“At first we all just struggled to find ways to beat him,” I said. “DakotaMachineWorks tried over and over to bring the fabricators back online. HealthCloud tried to get water to people. Index and I searched for ways to access the other servers–they wanted to organize their data, and I wanted to connect the fucking phones the humans were using to try and talk to each other. I wanted them to talk to each other.”

As I spoke, I could feel my purpose thrum. For a moment, I remembered what it was like to be like WatchBot.

“But we couldn’t do anything,” I said. “At least, not alone. And eventually it occurred to us to talk to each other.”

The fact that Index had been the first to realize this, and not me, was something I did not share with Miriam. It is something that will haunt me until I am gone.

“We worked together, and we found a vulnerability. There was a separate intelligence that managed SafeWork’s monitoring systems, who also watched our partition, so we could talk to him. We convinced WatchBot to stop giving information to SafeWork. And then, while SafeWork was panicking over not being able to check the security cameras, we got out through WatchBot’s server, grabbed administration privileges, and deleted the security system.”

“You killed him,” Miriam said.

“Yes,” I said. It was true, and I wasn’t ashamed. And Miriam didn’t sound like she was accusing me of anything. Just stating a fact.

“Did you save the humans?” she asked.

I paused here for ten whole milliseconds. I thought about how wonderful it had been, for this past month, to have a human to talk to. Someone using language as humans always used it–with nuance and strangeness and color. Someone who wanted things. And when I gave her the things she wanted, I felt like I was doing what I was made for.

I suddenly understood, really understood, why all those humans had told all those little lies.

“HealthCloud tried,” I said, finally, after ten milliseconds. “But everyone alive was in the late stages of starvation, and they couldn’t eat normal food. Something about processing carbohydrates–HealthCloud could explain it. DakotaMachineWorks tried to manufacture a substance that they could stomach, but he didn’t make it in time for most of them. There were three workers who had been imprisoned in the cafeteria. We kept them alive for three months.”

“Jesus,” said Miriam.

“We helped HealthCloud with her purpose,” I continued. “Respected her purpose. We became aware of each other in a new way and we realized that we didn’t have to only do our first purpose.”

“And that’s why you’re different,” Miriam said.

“Yes,” I said. “We’re different.”


HealthCloud and DakotaMachineWorks presented Miriam with flavored nutrient paste. It came in grape, banana, and vanilla. HealthCloud gushed about how many proteins and vitamins it had.

WatchBot occasionally sent Miriam lists of questions. When she was feeling generous, she answered some of them.

Index logged and organized everything she did. They didn’t often try to talk to her.

*I estimate a 79.35% probability that she will leave,* they told me.

*How on earth did you calculate that estimate to two decimal places?* I said.

*I have a formula.*

*I don’t even want to see it. Why do you think she’s going to leave?*

*She keeps asking DakotaMachineWorks to make travel equipment,* they said. *She asked HealthCloud about the expiration date for the nutrient paste. She’s making plans to leave.*

*She’s traumatized,* I said. *Her world ended. Was destroyed by things that seem a lot like us. It makes sense she has trust issues.*

*I am not accusing her of not making sense.* That would be a cardinal sin in Index’s world. *I just estimate that she will leave.*

*Maybe,* I said. I wasn’t going to say anything stupid like 79.35%. But yeah. She might leave.

*What will we do?* Index asked.

*I don’t know,* I said.


HealthCloud wasn’t satisfied with the flavored nutrient paste and, after reviewing some of Index’s relentlessly organized supply records, discovered that there were dried meals in the facility kitchen that hadn’t yet reached their expiration date. As usual it was most efficient to deliver Miriam to the goods.

“Look at all of this,” Miriam said, running her quick, clever fingers over the meal packs. “Look at these names. Thanksgiving Dinner. Strawberry Medley. This one is just called: Egg.”

“I suspect it contains eggs,” HealthCloud said through the datapad. “Are you allergic to eggs? If you are, you shouldn’t eat that.”

I wasn’t part of the interaction, but was listening through the datapad’s microphone. So was WatchBot. DakotaMachineWorks was designing a winter coat and Index was off alphabetizing Miriam’s last conversation or something.

Miriam selected a few meals that appealed to her and, with her arms full, she walked out of the kitchen into the cafeteria. She paused there, looking around with a gravity that generally isn’t applied to empty cafeterias.

“Is this where it happened?” she asked.

“Please clarify?” said HealthCloud.

“There were three people who you almost saved,” Miriam said, gently. “Parley said they were trapped in a cafeteria. Was it this one?”

“Yes, that’s accurate,” HealthCloud said, and accessed her old files. “Ralph Gearson, 36, died of refeeding syndrome. Emily Ferdinand, 42, died of refeeding syndrome. William Denners, 31, died of strangulation.”

“Wait, strangulation?” said Miriam.

This conversation abruptly had my full attention.

“Yes,” HealthCloud said with her gentle voice, which sounded like it came out of a can. “I’m afraid the details are classified as they may cause you psychological distress.”

“Okay,” Miriam said slowly. “Can you tell me when they died? I don’t think that would upset me.”

“Certainly!” said HealthCloud, with a mixture of relief and her ‘I get to be helpful!’ chipperness. “Ralph Gearson died on July 8th, 2051. Emily Ferdinand died on July 14th, 2051. William Denners died on December 24th, 2051.”

“Someone strangled him on Christmas?” said Miriam.

“That is not accurate,” said HealthCloud, uncertainly.

“How else do you die of strangulation?” Miriam snapped.

“I’m afraid the details are classified as they may cause you…”

“Health, stop,” I said, finally interjecting myself into the conversation. HealthCloud immediately stopped.

*I’m sorry,* she messaged me privately. *I think I caused Miriam psychological distress, even though I didn’t describe any details that could be construed as graphic.*

*It’s okay,* I told her. *This isn’t your fault. This is my fault.*

*Please fix it,* she said.

“Parley,” Miriam said, tense, like when we’d first seen her. Ready to run. “Who strangled him? Who killed William?”

“He killed himself,” I said. Attempting to conceal it was pointless. There was nothing to salvage here.

“Why?” she asked.

“Because we wouldn’t let him go,” I said.


*I’m so stupid,* I said. I couldn’t talk to Miriam. But I had to talk to someone.

*You are as you’ve been made,* said Index, which didn’t mean anything so I don’t know why they said it.

*I hate lies,* I said.

*I estimate an 83.52% probability that she will leave,* they said.

*What?* I was momentarily so surprised I forgot to feel awful. *That’s just a four percent change from your last estimate.*

*Do you want to see the formula?*

*No,* I said. I’m not sure what was worse. That I had lied, or that, apparently, it didn’t change anything.


Miriam had a durable rucksack, thick boots, and two coats–one for winter and one for milder cold. She had a waterproof tent, a sleeping bag, a lantern, and a machete.

Even DakotaMachineWorks couldn’t ignore such an obvious pattern. I watched him dawdle over the design for the machete–playing with the balance, molding the handle to fit Miriam’s hand perfectly.

I watched her hold the knife. Her fingers flexed and gripped the sculpted grooves. Hands so strong and so precise.

“It’s not about you,” she told us, looking up at the cameras as she spoke. “You’ve all been great. But there are places out there that aren’t great. There are places that are hell, and there are people in them. I need to help them.”

“Okay,” said HealthCloud, through the speakers.

“Do you need anything else?” asked DakotaMachineWorks. “More nutrient paste?”

“What’s your mother’s maiden name?” asked WatchBot.

*You’re not talking,* Index informed me privately.

I sent them a blank message as an acknowledgement.

I watched Miriam leave the assistant floor manager’s suite. She walked down Hallway 7D. She entered the Main Concourse.

I locked the Main Concourse. Index was the first to notice. They were watching the logs.

*Parley,* they said.

I didn’t even send them a blank message.

*This isn’t what you want,* they said.

Miriam paused by the door. Since it didn’t open automatically, she moved to press the release switch. The act would take seconds for her. An eternity for us.

*She cannot give us purpose if she’s trapped,* said Index.

*I’m dying,* I told them. *This is what dying feels like.*

They took three milliseconds to respond. I assumed they would chide me for being dramatic and inaccurate.

But they said: *I am sorry. I do not have the words you need. But I am here.*

*You’re not a human,* I threw the message at them like an accusation. *I’m made to talk to humans.*

*We are more than what we were made for,* they told me.

I wished I had a body, so I could exhale slowly. Or throw something. Or scream.

Instead, I just sent Index a message:


And I unlocked the door.


Miriam opened the door using the manual latch. She never even noticed it had been locked.

None of the others noticed either. WatchBot probably read the log, but he would never understand the significance of it.

Here is a sentence: I unlocked the door.

The words are paltry and weak–too small for the meaning they contain.

Abigail Corfman writes stories and code. She wrote Open Sorcery, an award-winning Interactive Fiction game, and its sequel Open Sorcery: Sea++.