After several hours of distractions and detours, Morgan finally reaches Deep Storage. She’s here to get her arming key, so she can set the station to self-destruct.
As she enters, Alex calls her up. He’s locked the door behind her, sealing her in.
This is actually an attempt at protecting Morgan. On one hand, Alex really doesn’t want his sister to blow up the space station and destroy all of their work. The station is new to memory-wiped Morgan, but to Alex it’s home. The two of them have spent the last few years studying the Typhon and unlocking fantastic new technologies. From Alex’s point of view, his sister has gone a little crazy due to her personality drift and is acting irrationally.
But while he doesn’t want her to destroy the station, he also doesn’t want to see her get hurt. He’s actually helped you out at a couple of points during the adventure. As much as he wants to see your quest fail, it’s even more important to him that you survive.
I love Alex. He’s one of my favorite villains.
Cringe Executive Officer
I recently watched the Netflix original The Old Guard. I liked the premise (a small group of immortal people who use their long lives to try to do good in the world by taking down madmen and tyrants), but the whole thing was ruined by the childish cartoon villain at the center of the story. He was the typical strawman corporate tycoon, constantly talking about how much he wants to make more money. He was pure cringe.
That’s not how corporate leaders talk! Most corporate leaders aren’t nakedly motivated by money like this. If you got to know them, you’d probably discover that their real goals are more likely fame, personal glory, sex appeal, expensive hobbies, or social standing, and making shitloads of money is simply a means to that end.
Even if they really only care about money for its own sake, CEOs don’t brazenly say so in front of other people. People criticize Mark Zuckerberg for being a weirdo alien lizard robot, but even with his particular social handicaps he still knows better than to walk around talking about how awesome money is. Maybe deep down a CEO just cares about money, but on the surface they’ll talk about things in terms of saving jobs, serving customers, creating new technologies, and solving large-scale societal problems.
The CEO Villain
As our society has become more technology-oriented, tech CEOs have become popular villains, occupying roles previously reserved for politicians and crime bosses. That’s fine. Stories need to change with the times. My problem is that Hollywood has no idea how to write this sort of villain. CEO villains end up being strawman loonies who will shove puppies in a blender to make an extra 1%. That’s not interesting, and it doesn’t feel genuine.
EA CEO Andrew Wilson is a heartless jerk who cares nothing for the art of videogames, the artists who create them, or the audience that consumes them. His decisions are entirely focused on direct short-term profits. He is the stereotypical money-chasing CEO, and even he has the wit to avoid saying so. He doesn’t go around talking about how much he loves money and wants more profits. He always frames his obnoxious decisions in terms of “giving consumers what they want” and “innovating”.
This is why I like Alex Yu so much. He’s a villainous CEO who doesn’t talk like a Ferengi at a shareholders meeting. He’s a bad man, but he’s bad in a lot of understandable ways. He does evil things, but he didn’t set out to do evil. He drifted towards evil through a long series of personal compromises.
This is NOT Caveman Science Fiction
I really dislike the trope known as Science is Bad. The story takes the rhetorical position that curiosity is hubris and we shouldn’t meddle with things we don’t understand. You can see the trope parodied in the Dresden Codak comic Caveman Science Fiction.
Stories need some sort of challenge for our protagonists to overcome. We can do the action story thing and have some evil jerk doing nefarious deeds. That’s fine if you’re writing a good guys vs. bad guys story, but if you’re a fan of the sci-fi genre then you’re probably more interested in the science and less interested in villains and their motivations. If you’re trying to write a story about Science Stuff then Dr. Damien B. Nefarious is a distraction and his various machinations are a waste of page space.
But if we’re here for Science Stuff then the economics of storytelling suggest that it’s better to ditch the evil antagonist – or reduce him to the role of a Pandora-style catalyst – and have the mystery and danger come from all the cool Science Stuff that the book is about. At the same time, “Scientists invent a Cool Thing and everything is fine” generally makes for a boring story. We want stories to feature science, and stories need conflict. So we write stories where science-things go horribly wrong and create lots of thrilling danger for our protagonists to overcome and mysteries for them to solve.
A single story like this is fine in isolation, but across an entire culture this constant flood of fictional scientific disasters begins to feel anti-science. Authors need to be careful with their messaging or the whole thing can come off as embarrassingly pro-luddite.
The first Jurassic Park movie falls into this trap with Professor Jeff Goldblum constantly reminding us that there are things we just shouldn’t mess with and things we weren’t meant to know. A more sensible message might be, “Maybe you shouldn’t revive apex predators from millions of years ago and then put them on display for thousands of clueless civilians.” But instead the writer takes the shorthand approach and suggests that it’s foolish to even experiment with this branch of knowledge.
A lot of popular culture science fiction has this sort of thing going on. Aside from the yucky implication that this can frame ignorance as virtue and incuriosity as wisdom, there’s also the problem that it generally makes for a predictable story.
Prey sort of splits the difference. Sure, Alex Yu is undone by hubris, but no character in the story ever suggests that it was fundamentally wrong or foolish to study the Typhon. Alex is driven, reckless, and casual about the horrific sacrifices he’s willing to force others to make for the good of mankind. When things begin to unravel he’s optimistic to the point of self-delusion about his ability to regain control.
The message here isn’t that science is bad, but rather that bad science is bad. I can get behind that.
Alex Yu Bad Guy
I like Alex Yu because he feels real. I imagine that Alex justifies a lot of his reprehensible actions by saying that he has “no choice”. And in many cases he’s right! He is indeed trapped in “damned if you do” / “damned if you don’t” situations. The problem is that he’s trapped in those circumstances because of poor decisions he’s made in the past. He’s spent years spreading lies of omission, cutting corners, and making moral compromises in the name of some eventual greater good. He didn’t realize it, but he’s gradually created a world that runs entirely on lies and secrets. He can confess the truth and his kingdom will unravel when everyone turns on him, or he can continue his lies and the kingdom will self-destruct due to its own ignorance.
The thing is, Alex really is onto something with this neuromod technology. The ability to share knowledge and skills like this would literally change the world for the better. This would be the biggest boost to human potential since the printing press. This technology is indeed worth risking lives for. The problem with Alex isn’t the risk-taking, it’s the dishonesty and literal human sacrifice.
Alex doesn’t tell the people aboard Talos 1 about the Typhon. If he were honest with people, then everyone on the station could be educated about the risks. Everyone would have a predetermined plan to follow in the event of a containment breach. People who weren’t up for that sort of risk could quit and go back to Earth, and it would be trivial to replace them with people who were willing to embrace the job, risks and all. Heck, we have people right now who are willing to go die on Mars with no way home, just because that’s the next big leap for our species. Instead Alex has built a world where there are no fire alarms, no fire drills, nobody knows what to do in the event of fire, nobody knows what fire looks like, and everything is made of wood.
Alex hides the truth from everyone, which means he also needs to hide the problems from everyone, thus forming a feedback loop of escalating secrecy and coverups. His secrecy is inherently self-destructive.
Alex doesn’t tell people that neuromods are based on the Typhon. Maybe this would make people wary of using them. That’s fine! Everyone could decide for themselves how much risk they were willing to take on. Most sane people would pass on the early neuromods, opting to let someone else take the first leap. The thing is, the promise of neuromods is incredible. Regardless of the risk, some people would jump at the chance to use them. Which means Alex would always have a pool of willing volunteers. He could use that eagerness to set up a proper test with sensible safeguards. “Hey, you can have this neuromod that will let you play guitar like Steve Vai. But you need to agree to live in quarantine for X months while we observe you.” (Or whatever. I don’t know how to run a proper scientific / medical study, but I’m pretty sure smart people could devise rules and guidelines for how you ought to do this sort of thing.) If anything went wrong and their brains melted, the blame wouldn’t fall entirely on Alex, because he would have the informed consent of the test subjects.
But instead Alex keeps everything a secret, which means everyone is randomly jamming neuromods into their face, unaware of the possible risks. In this case Alex is lucky and the neuromods don’t seem to cause serious problems (at least, not until the containment breach). Which is good, because if there were problems it would have necessitated a massive cover-up.
(Actually neuromods might cause long-term problems. The story isn’t clear on this. There are reports of some people exhibiting symptoms of anxiety and paranoia, but I couldn’t determine if this was a problem with early-gen neuromods, or if it was a result of working with the technology in the Psychotronics lab. Or maybe it was the result of living on this spooky space station for extended periods? Or maybe it’s a problem with living in close proximity to the Typhon for too long? This topic didn’t get a proper study before the station fell. Also there’s the problem of Morgan’s personality drift. Maybe that’s a long-term problem with neuromods, or maybe that’s just a side effect of installing and stripping out mods again and again. These are all important things that need to be studied openly, but that doesn’t happen because Alex is in a constant state of denial and cover-up.)
Alex hides security lapses. That spares him from scrutiny and criticism, but it also prevents everyone from taking proper steps to correct those lapses.
At several points in the game we find people who learn “too much”, and freak out. Alex is forced to silence these people to protect his secrets. At one point a researcher finds out that neuromods contain Typhon cells and he freaks out, saying that he needs to “tell everyone”. Alex replies somewhat ominously: “Calm down. Don’t do anything rash. I’m sending someone to help you.” Stuff like this wouldn’t be a problem if everyone already knew.
The story never explicitly states that Alex has anyone killed (er, aside from the “volunteers”; I’ll talk about them in a minute), but he very likely takes these people and memory-wipes them by removing their neuromods. I’ll bet he hands out free neuromods to every manager arriving on the station, just so he has a way to wipe them if they learn something they shouldn’t.
Morgan’s actions were more overtly evil. Her tests where she fed prisoners to the Typhon were deeply, profoundly immoral. There was no ambiguity in her actions. Alex was complicit in these crimes, but Morgan was the chief instigator.
But despite Morgan’s overt evil, in the end it was control-freak Alex who did the most damage. Morgan’s heinous tests killed a small handful of people, while Alex’s aversion to accountability placed the entire station – and indeed the human race itself – at risk.
The Neuromod Conundrum
Earlier I talked about the virtuous potential of neuromods. If we have a way to download and share knowledge as easily as we share music and books, then this creates the potential to turn humanity into a species of brilliant scholars and innovators. What if anyone could become a brilliant surgeon or physicist in just five minutes? Just imagine the boost to innovation if researchers could be masters of multiple disciplines at once. Someone trying to create anti-aging treatments is likely to find progress easier if they can master genetics and biology and chemistry and internal medicine before age 25. Someone with the energy of youth but the knowledge of the old would be really good at pushing scientific progress forward.
On the other hand, neuromods come from Typhon creatures, and the only way to make those creatures is to feed them humans. You could argue that this means neuromods are inescapably immoral, regardless of any possible boon they might offer humanity. They are based on human sacrifice, and therefore they’re inherently evil.
Maybe that’s what the author intended to say, but I can’t escape the impression that perhaps we can find a way around this problem. For example, what happens if you feed the Typhon a bunch of lab rats? The Typhon feed on “consciousness”, and maybe the mind of a rat is too simple for a Typhon. Or perhaps Typhon are limited by the minds they consume, and therefore feeding them rats would yield a bunch of useless dum-dum Typhon that aren’t sophisticated enough to be made into neuromods?
I don’t mind that the author never answers this question. If we can’t make more Typhon with animals, then fine. What bothers me is that I can’t find any evidence that anyone tried.
What about willing donors? Instead of harvesting death-row inmate “volunteers”, what if we used actual volunteers? Perhaps there are terminally ill patients that are willing to donate their last few days of life to further the cause of neuromods. Maybe they’d do this in exchange for money, or maybe they’d do it as a “fuck you” to cancer. I’m not an ethicist and I don’t know how the general public would react to this proposition, but I can imagine a world where people regard this sort of thing as morally equivalent to (say) organ donors. It sounds creepy to me here in 2021, but organ donation sounded really creepy to people 100 years ago, and we’re basically okay with it now.
The point I’m getting at is that there could be other routes to making neuromods that are less obviously evil. Maybe you could do it with animals. Maybe you could do it as a form of hospice “care”. We can make hamburgers without killing animals these days. If we’re willing to go to that much trouble for a sandwich, then maybe we could figure out a way to make neuromods without needing to kill people.
The problem with these routes is that you’d need to have an open and honest conversation about them. Alex is unwilling to allow other people to judge him or second-guess him, so he can’t allow that conversation to take place. Thus he’s trapped himself in this situation where it’s either direct human sacrifice, or nothing.
I’m Just Doing my Job!
Another thing I love about Alex is that he shows genuine enthusiasm for his work, which you don’t often see in villainous scientist types. He isn’t doing this for personal glory. He isn’t doing this because he wants to show the world they were wrong to doubt him. He’s not in it for money, or power, or to impress a girl. He really is excited to unlock all of this human potential. He loves his sister and he likes working with her. Also, while the story doesn’t delve too much into his upbringing, I get the impression that he feels he has something to prove to his parents. These are all very relatable, non-evil motivations.
Alex didn’t wake up one morning and decide to start murdering people and wiping minds so he could get rich. He was tempted by his natural curiosity to find out what the Typhon can do and what humanity can gain from them.
Once he discovered that Typhon cells could be used to copy knowledge, the potential was obvious. And once he and Morgan discovered that Typhon-based technology could grant people literal superpowers like telekinesis, he couldn’t just walk away. Eventually they ran low on viable Typhon specimens. At that point it’s easy to see how he talked himself into his current behavior. “Hey, these inmates are horrible people that have killed and tortured others. And they’re going to die anyway, right? Why not use their deaths to help humanity? We could save countless lives with this technology. They kinda owe it to humanity anyway. I’m actually doing a public service!”
Of course he couldn’t tell people he was doing this, so he was obliged to lie about it. And once you start with lies, you’re trapped by them. You need to guard your secrets, which prevents you from asking for help when things go wrong. You’ll need additional lies to cover up the Big Lie, and those lies will require additional supporting lies. Pretty soon your job isn’t doing research, but maintaining an ever-growing deception.
Alex is a coward, a weasel, and a bastard. His quest to bring neuromods to the world is theoretically a noble one, which he pursues by deeply unethical means. He’s arrogant enough to think he can do things alone, and that arrogance eventually leads to the fall of Talos 1 and the death of nearly everyone on the station. But he made these mistakes for understandable reasons that come from his personality and upbringing.
All of this makes him so very human to me. His decisions are so much more interesting than the antics of another buffoonish strawman tech billionaire obsessed with “profits”.