For the past quarter of a century, Star Trek: The Next Generation has been regarded as a bastion of intellectual entertainment that imagined how humanity would operate in its finest hour. It’s hardly surprising, since the show was meant to encourage us, the viewers, to greatness—to a future where creator Gene Roddenberry envisioned we would never want for food, shelter, or material possessions. Where we would explore and philosophize and pursue creative endeavors to our hearts’ content. It sounds like a glorious future; it should be a glorious future.

But… there’s a problem with Data.

When I was very small, my favorite character on The Next Generation was Data. Which makes sense, as Data was the Spock stand-in of the series, and Spock was my real favorite. Before everyone starts roasting me alive for not appreciating how unique Data is in his own right, let me assure you that I find Data to be perfectly singular. But in his original inception, he fulfilled a function—the straight man who doesn’t understand all these wacky emotional humans. He was a variation on Spock, one that all Treks (and many other shows besides) have in some capacity. Spocks are often essential to the science fictional experience because they allow us to view humanity from the outside.

It is here that the similarity ends, however. Spock was working to suppress his humanity, at least initially, and then to find a way to balance it with his Vulcan half. Data was doing just the opposite—working to become more human with every experience, piece of knowledge, and new hobby he picked up along the way. And that… depresses me.

Which is probably confusing at first blush, so allow me to elaborate:

Spock’s portrayer, Leonard Nimoy, is fond of pointing out that his character’s struggles are, in their essence, entirely human. That we are all, in our everyday lives, looking to balance exactly what Spock is balancing: emotion and logic. The place where these dueling natures meet and the importance of their co-existence are the building blocks of his entire character arc. The fact that Spock finally comes to terms with his need to embrace both the human and Vulcan halves of himself is a solid progression; at the end of the day, Spock has two legacies. Allowing them to live side by side in him without anger or confusion is a healthy place for him to end up.

But Data is not half human. Rather, he is created by a single man (and his wife, we later find out) with a massive ego and the brain to match. A guy who was so full of himself, he decided to make all his kids look exactly like him. In reality, Noonien Soong was doing via scientific means the same thing that many humans decide to do—extending his legacy with progeny. He and his wife Juliana considered the androids they built their own children. But rather than respect the newness of what he had created, Soong worked hard to make his kids fit in. He created a brand new species and decided that it was only as good as it was human.

Does anyone else see my problem with this?

Data is childlike in many ways due to operating with a limited experience set. And one of the ways that he remains childlike is in his reluctance to question what Noonien Soong wanted for him. The android takes his father’s desires as gospel—if Soong intended to create an android that could pass as a human, surely that is what Data must become. Never mind the fact that emotions are capable of being realized by countless species that the android himself has encountered. It is an equivalency problem; in Data’s positronic mind, Human = Good. Of course he should emulate them.

And the majority of Data’s friends and crewmates never bother to disabuse him of that notion. More distressing, they constantly project their own human viewpoints onto his development and behavior. The episode “In Theory” is a perfect example of one of these situations amped up to its most cringe-worthy. Jenna D’Sora assumes that because Data is kind to her, because he shows concern for her emotional well-being, he must have romantic feelings for her. After striking up a relationship with him, she shows dismay at learning that Data is running a program to accommodate their status, that he can multitask when kissing her. This despite the fact that Data had told her he has no emotions. Move a few words around in the scenario: let’s say that D’Sora had been a man and Data had been a Vulcan woman, and that D’Sora had pushed for the relationship because any Vulcan woman who asked after his well-being had to be romantically inclined toward him.

That scenario just got super uncomfortable, didn’t it?

Of course, we’ve met an android who presumed that he could be more than simply human—Data’s psychopathic predecessor, Lore. What’s noteworthy is that Lore was “more human” than Data was before his deactivation; he possessed an emotion chip that allowed him to feel as humans did. Unfortunately, he lacked the empathy to use that ability to evolve. It’s telling (and common to science fiction in general) that most examples we get of advanced mechanical beings on Star Trek use their impressive abilities to try and wipe us out, either by accident or design: from the M-5 computer to V-Ger to Lore, becoming more often means that humans are on the Quick and Easy Offing Menu. Data, one of the very few examples who is not in the habit of snuffing out Terrans, is apparently only inclined in that direction by virtue of wanting to be one of us.

And this perspective is incredibly limiting, especially when Next Gen is constantly expounding on Data’s status as the very sort of “New Life” Starfleet means to seek out. Why not let Data be unique, then? Why not let him know that he’s supported if he chooses to own the parts of himself that are not human at all? Examples are always useful in forming behavior, I grant that, and he is on a ship where humans are the primary species he comes into contact with. But the only person who ever seems to intimate that Data could be something far beyond human is Captain Picard. He is the only person who comes close to asking Data harder questions, to examining exactly what Data’s emotions or lack thereof incorporate into his being, to telling Data that he doesn’t always have to make the same choices a human would make if they aren’t the choices he’d prefer.

Every other person on the ship is either tickled or irate when Data makes a human faux pas, and that’s often treated as comic relief within the confines of the show. But why is that comical? Why isn’t it instead looked upon as narrow-mindedness for refusing to consider the ways in which their fellow crewmember and friend is vastly different from them? When Spock was harangued aboard the Enterprise, at least we knew that he was being teased toward the humanity within him that he refused to admit. It wasn’t “Vulcans are bad, humans are good,” (at least, not amongst the bridge crew), rather “Vulcans are good, but you are also human and that’s good too.” Doctor McCoy was the first person to razz Spock into an emotional reaction over tedium, but was incredibly protective and furious if anyone ever tried to force emotional displays out of his friend. Data, on the other hand, is simply being laughed at for not knowing that his reactions are odd.

Which is sort of bullying. But it’s fine because he doesn’t have the emotions to know it’s hurtful, right?

Lal, Data’s created “offspring,” provokes even more interesting questions on this front. Data offers her much more freedom than he was allowed—he lets her choose her own skin (and gender, which he and Counselor Troi are very adamant about for some reason) from thousands of composites that he has created. So here’s a question: if Lal had chosen the Andorian skin she considered, would he have expected her to emulate Andorians? Is Data incapable of understanding why anyone would wish to behave in a way that he considers contrary to their appearance? And if so, who is responsible for instilling that belief in him?

Even more unsettling is Troi’s reaction to the whole process. She is mainly concerned with making sure that Lal is attractive and easy to socialize. Being Andorian in appearance might make it difficult for all the people (that’s humans, by the way) on the ship to relate to her. When she sees a human male possibility, she remarks that he’s attractive, so there shouldn’t be any problems. In other words, humans—in this enlightened age—are still so vapid and appearance-obsessed that they will only be comfortable with Lal if she appears as the same species and is good-looking by their modern standards. Moreover, they insist that Lal come to this gender-appearance decision immediately, and that she choose carefully because this will be who she is forever.

Um, why? She’s an android, she should be able to change her appearance if and when she pleases. Humans themselves are capable of changing their genders if they find that the one they were born with doesn’t suit them. Why isn’t Lal afforded the same options? Perhaps Data lacks experience with a suitably diverse population to know this about humanity, but what is Counselor Troi’s excuse? What is wrong with the 24th century?

These problems are compounded in the episode where Data meets his mother Juliana, “Inheritance.” In a scene that roughly parallels Spock’s mother talking to Doctor McCoy in “Journey to Babel,” Juliana tells Geordi about the things that young Data used to do that other humans might find amusing. (You know, before they wiped his early memory and replaced it with the memories of the colonists on Omicron Theta. Because that’s totally a legitimate thing to do to anything that you’re planning on treating like a human being.) She has a chuckle over how Data originally never wanted to wear clothes, which made the settlers very uncomfortable around him. Because Data didn’t see the need for them, Juliana and Dr. Soong gave Data a Modesty Protocol to ensure he would want to wear clothes and make everyone less nervous.

Because in the 24th century, the nudity taboo is still so strong that Data—who, I feel the need to remind us all, is still not human—must be altered fundamentally to ensure adherence to human cultural norms. (By the way, Dr. Soong, would you care to explain why you felt the need to make Data anatomically correct in the first place? I’d be real interested in hearing that one.) Because he’s supposed to be easy on the eyes for us, to blend in. And it’s hilarious when he doesn’t, isn’t it? I understand that parents love to tell stories like this about their kids, but those stories do not usually end with “And then I opened little Harry’s brain and reorganized some synapses so that he would never take his pants off in front of grandma again.”

This is not evolved, highbrow humanity at its finest. This is shoving anything different in a box because considering how the universe looks from Data’s perspective would just be plain silly! I mean, he wants to walk around naked because he physically has no need for clothes! That’s not logic, that’s lunacy—what a character! Look, I am all for celebrating humanity in fiction, but suggesting that everything in the universe would be better if it were more like us is a poor way to go about it. That’s not a celebration, it’s ego. Ugly, poorly-informed ego.

It doesn’t make me hopeful for our future when I watch how people treat Data. It makes me wonder how we will ever become evolved enough, open-minded enough to be what Jean-Luc Picard insists we are. Flaws are part of human nature, yes, but superiority and even the most mild of prejudices are learned. We can do better. Even Star Trek can do better.

For Data’s sake.

Emmet Asher-Perrin just wants Data to enjoy being an android for once. You can bug her on Twitter and read more of her work here and elsewhere.