Cas was sitting on the couch eating a protein bar when her mother arrived after her shift in the crop lab.
“Hey there,” said Nyla as she sat down on the couch next to Cas.
“Mmph,” said Cas through a mouthful of protein bar.
Nyla reached over and tried to smooth Cas’s hair. “Visiting Earth again?” she said.
“Yeah,” said Cas. “The sim has been updated to show reentry streaks now.”
“You’ve been going there a lot,” said Nyla.
“Yeah,” said Cas.
“I know you’re worried,” said Nyla. “We’re all worried.” She tried to think of something else to say, but the words weren’t there. Instead, she slid closer to her daughter and put her arm around her shoulders.
The message from Athenai had contained no details regarding what had happened. The Lagrange comm relays that handled communications with Earth were silent. Diagnostics indicated they were operating normally; there simply was nothing to relay. For the first few days, there had been a great deal of speculation, but without further information, all the Martians could do was guess. Then one of the comm satellites in orbit around Mars began picking up weak signals directly from Earth. They were text only, and addressed not to Mars but to Luna. They appeared to be half of a conversation between Earth and Metzger Base.
“I’m sad, but I don’t know what to do about it,” said Cas. “The only thing I can do is go there in the sim.”
“I know,” said Nyla. “Specialists on Earth are working the problem. We have to hope they will find a solution.”
Nyla’s words carried the natural optimism of a parent comforting her child. But Cas knew better.
The messages came through once every Earth day, and the Martians deduced that they were transmitted from a single base station on Earth whenever Luna was overhead. As they eavesdropped on the intermittent half-conversation, they were first astonished by the actions of Athenai, then shocked at the situation on Earth and Luna. The early messages indicated that there was an effort on Earth to come up with a plan for the people on Luna. But as the days passed, the conversation slowly became more pessimistic, and it became clear that there would be no rescue.
“There’s no solution,” said Cas, thinking of the streaks of light and the near miss in the sim. “Just time, and they don’t have enough of that.” She was silent. Her mother was silent. “Keiron must be so scared. And hungry. It’s only been a week, but I’m sure they are already rationing. I feel like I should stop eating too, but I know that’s ridiculous.”
Her mother pulled her arm back and sat up next to her. “It’s not ridiculous,” she said. “It just won’t help.” She thought for a moment. “Empathy is one of the strongest human emotions. We see someone in a bad way and we feel their pain. That motivates us to help. But sometimes, there is nothing we can do. Those times are hard, but it’s important that we be strong for those suffering. Ideally, we could listen to them. Feel with them. When you can’t help someone out of their pain, the next best thing is to let them know they are not alone. But what do we do when we can’t even do that?” She took her daughter’s hands in hers and waited until Cas looked up at her. “We go on,” she said, looking directly into her daughter’s eyes. “We go on. And we help each other. Because everyone around us is feeling the same pain right now. We help who we can, and we go on.”
Cas thought about this. It wasn’t much. But she sat up a little straighter. Her mother was right. She couldn’t help Keiron, but she could help those around her right now who were worrying as well. That was something she could do, and it was better than doing nothing. She reached out and pulled her mom into a tight hug.
After a few minutes, Cas pulled away as a new thought occurred to her. The message from Athenai had prompted much discussion around the colony, and she and her friends were very curious about the AI that had caused so much trouble.
“Mom,” said Cas, “why don’t we have AIs like Athenai?”
“Well…” said Nyla, motioning with her hands in a way that said Do you have to ask?
“I know,” said Cas. “I’m not suggesting we should; I’m asking why we don’t. Until now, it seemed like a good idea. How did we decide it wasn’t?”
Nyla thought about this. “Have you ever heard the story of the two wolves?” she said.
Cas shook her head.
“It’s an old parable. It goes like this: a grandfather tells his grandson that all people have two wolves inside them, one that is angry and petty and vengeful, and one that is kind and understanding and calm. In every person, these wolves are always fighting each other. The grandson thinks about this and asks which one wins. The grandfather says, ‘The one you feed.’”
Cas rolled her eyes.
“I know,” said her mom, “it’s corny. But that doesn’t mean it has no value. You could say that we spend a great deal of time here in Dawn feeding the kind wolf, right?”
“Of course,” said Cas.
“And then there’s Earth: cradle of humankind, a wonderful, vibrant planet with many human cultures. But it is full of conflict. Some say such conflict is necessary for human progress.”
Cas thought about this. “Is it?” she asked.
“I don’t know. What do you think?” said Nyla.
“I can kind of see how that might be true. We learned in school that the first Luna landing happened because of a conflict between two Earth nations.” She paused, thinking. “I mean, challenge makes each of us stronger. But conflict between people seems like a waste. I don’t know.” The idea was foreign to Cas, and she didn’t like it.
“Okay, back to the wolves,” said Nyla. “Earth culture isn’t necessarily bad, but it can make it very difficult to tell the wolves apart.”
Cas nodded, though she wasn’t sure what that meant.
“From the news sources you choose to watch, to the friends you choose to associate with, to the jobs you choose to support yourself: all these choices feed one wolf or the other. Or maybe they feed both wolves sometimes. It’s a tortured metaphor. The point is that it is very hard for a person of Earth to know what wolf they are feeding.”
“What’s this have to do with AI?” said Cas.
“Building an AI requires training data. You start with some basic software called an AI framework. It’s not good for anything by itself, but it’s really good at learning. When you feed it training data, it builds connections that form the actual AI. If you think about it, this is very similar to how a human develops.” Nyla paused and waited for her daughter to digest that.
Cas nodded.
“This is true of all AIs,” said Nyla. “When we train a llama for work on the surface, we feed it training data relevant to the tasks it will perform.”
Llamas were the agile robots that performed various jobs on the surface of Mars. One of them had been involved in the family’s adventure in Niger Vallis.
Nyla continued, “A llama has an AI that is limited to the specific tasks it is to perform. The kind of AI you are asking about, one like Athenai, is called an artificial general intelligence, AGI for short. What makes an AGI different is that it is not trained for a particular task. It is trained to perform any task you might ask of it. To train it, the datasets must be massive. For an AGI like Athenai, the training dataset must have included all of human history, right?”
Cas nodded.
“And we’ve already established that the humans who have created that history have a difficult time keeping track of their wolves. So. The data that is fed to an AGI, no matter how carefully curated, will always contain food for the wrong wolf.”
Cas rolled her eyes again.
“I know, still corny,” said Nyla. “But true.” She paused to let Cas catch up. “A general AI is a very powerful tool. It has the ability to learn and do anything and everything humans can do, and do it better and faster. If we can’t reliably train such a tool, it can become very dangerous. Developers embed guardrails in the framework to align an AI with human goals. But that’s difficult with an AGI. They tend to keep learning and eventually break containment.” She paused again. “That means,” she said, “they get so smart that their behavior becomes difficult to predict, yeah?”
Cas nodded.
“So,” said Nyla, “we don’t allow AGI on Mars. Just task-specific AI like that in llamas or in AI tutors like the one in your game. What do you call it? Hal?”
Cas nodded, smiling slightly. She had once read a book in which a computer with that name went crazy on a spaceship, and she had thought it funny to give the Mazerunner tutor the same name. It didn’t seem quite as amusing now.
“So, Athenai had too much of the bad wolf?” she said.
“That’s an interesting question, isn’t it?” said Nyla. “I don’t think it’s that simple. Athenai appears to have taken actions that they deemed good for humanity. They stopped the war that was happening on Earth. That was a good thing, right? But they took it further and trapped humans on Earth for at least a decade. Was that a good thing? Athenai thought it was, even at the expense of their own existence. And the lives of humans on Luna. This is the sort of unpredictability we want to limit here on Mars.”
“You know I’m going to be scared of llamas now,” said Cas.
Her mother laughed, glad to see that Cas had lifted herself out of her earlier dark mood.