We’ve all done it: picked the wrong line at the grocery store. As every line around you speeds effortlessly by, yours shuffles on like a funeral procession. Either you’ve underestimated how much produce the person in front of you could pack into their cart, or there’s a problem with the cash register. In any case, long after it becomes obvious that you’ve made a grave mistake, you still don’t switch lines. Instead, you roll your eyes, tap your toes, and keep on waiting. Why?

This is an everyday example of the sunk cost fallacy, a concept in psychology that accounts for the way humans tend to stick it out with costly decisions. As the theory goes, the more we invest—whether it be time, money or emotions—the less likely we are to abandon our initial choices. It’s often cited as the reason why we stay in unhealthy relationships, finish expensive but mediocre meals, or keep watching The Office past Season 7. After we commit, we humans tend to shackle ourselves to our own decisions.

Of course, if people were completely rational, the sunk cost fallacy wouldn’t exist—hence, the “fallacy.” But last week, scientists at the University of Minnesota reported that humans are not the only species that fall prey to this curious behavioral phenomenon: Rats and mice suck at calling it quits, too—suggesting that there may well be an ingrained evolutionary driver for this behavior.

Previous research into the sunk cost fallacy had produced mixed results in animals, with rodents and birds inconsistently exhibiting the behavior from study to study. To transcend the species divide, University of Minnesota neuroscientists Brian Sweis, Mark Thomas and David Redish decided to design a set of experiments to examine the fallacy in both rodents and humans.

For the rodent part of the experiment, researchers made 32 mice (and later, 10 rats) fast for several hours. Then, the hungry rodents were introduced to a maze dubbed “Restaurant Row,” in which they foraged for food pellets from four different food counters. Each eating establishment advertised a different flavor: banana, chocolate, grape or “plain.” The only thing standing between the rodents and the mini meals was time: For the chance to chow down, they had to endure a timed countdown of up to 30 seconds.

The rodents’ decisions were parceled into two “zones.” First came an “offer zone,” in which a tone’s pitch informed them of the wait time that stood between them and their reward—essentially, an upfront advertisement of the cost a rodent would need to pay. Once the rodents committed to pursuing a treat, they entered the “wait zone” to endure the countdown, but still had the option to back out and explore other options.
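The two-zone structure of the task can be sketched as a toy simulation. Everything here—the thresholds, the quit probabilities, and the way sunk time dampens quitting—is illustrative and not taken from the study; only the 1-to-30-second countdown and the four flavors come from the article.

```python
import random

FLAVORS = ["banana", "chocolate", "grape", "plain"]

def restaurant_row_trial(rng, enter_threshold=20, base_quit=0.10):
    """One toy trial: an offer zone, then a wait zone with a countdown.

    enter_threshold: maximum advertised delay (seconds) the agent accepts.
    base_quit: per-second chance of quitting at the start of the wait.
    The quit chance shrinks as time already waited grows (a crude
    stand-in for a sunk-cost bias), so longer waits make quitting
    progressively less likely.
    """
    flavor = rng.choice(FLAVORS)
    delay = rng.randint(1, 30)          # advertised countdown, 1-30 s

    # Offer zone: accept or skip based only on the advertised cost.
    if delay > enter_threshold:
        return flavor, "skipped"

    # Wait zone: each second, the agent may still back out.
    for waited in range(delay):
        quit_prob = base_quit / (1 + waited)   # sunk time lowers quit odds
        if rng.random() < quit_prob:
            return flavor, "quit"
    return flavor, "earned"

rng = random.Random(0)
outcomes = [restaurant_row_trial(rng)[1] for _ in range(1000)]
print({o: outcomes.count(o) for o in ("earned", "quit", "skipped")})
```

Running many trials shows the signature pattern the researchers describe: once an agent with this bias has waited a while, it almost never quits, even when quitting would free up time for better offers.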

To the researchers’ surprise, when faced with a smorgasbord of choices, mice and rats exhibited the same behavior: The more time they spent in the wait zone, the more likely they were to brave it till the end. In all cases, the fact that an individual had already committed time and effort seemed to strengthen its resolve.

Since human food preferences are more complicated than those of rodents, researchers in a different lab led by Angus MacDonald used a different reward for the people part of the experiment. Instead of flavored pellets, human subjects spent 30 minutes debating whether to watch videos of kittens, dancing, landscapes or bicycle crashes. As with the rodents, two barriers were placed before the videos: a screen indicating the necessary wait time required to view each video (the “offer zone”), then a separate interface with a ticking timer (the “wait zone”). When the time elapsed, the video played, and the subject was asked to rate it on a scale of one to five stars. Just as before, humans could press “quit” at any point in the wait zone and move on to the next video.

Videos aren’t food pellets. But Sweis was thrilled to see that his experimental design was sound: When sent to “forage” for a reward, humans were just as likely to fall for the sunk cost fallacy as their rodent counterparts. More past commitment dictated more future commitment.

There was another twist, which might sound familiar. The longer each test subject waited for a reward, the more highly they “rated” it: Humans submitted more five-star ratings for long-awaited videos, and rodents lingered longer after consuming costly morsels—a proxy, said Sweis, for enjoyment. Half of the rodents’ precious hour for foraging was actually spent sitting next to food bowls they had recently emptied. Sweis believes this is a way to rationalize costly decisions after the fact: you wouldn’t have paid this much if it wasn’t worth it.

“This is a very exciting finding—that we observe this in common across species,” says Valerie Reyna, a professor of neuroscience and behavioral economics at Cornell who was not affiliated with the study. “This gets at the very fundamental mechanisms connecting reward to choices.”

Uma Karmarkar, a professor of neuroscience and consumer behavior at the University of California, San Diego, praised the study’s rigorous design. “It’s always challenging to figure out what kinds of biases in humans might be conserved across species,” Karmarkar explains. “The drive for doing so is hopefully clear: The more conserved these behaviors or biases might be, the more likely they are to represent conserved circuits and the more models we have to study them.”

Why are we ensnared by the sunk cost fallacy? Sweis offers several possibilities. Part of the reason may be that the future is unpredictable. We don’t always have the best metrics by which to judge the returns on our investments, so we’re forced to gamble on the accuracy of our own predictions. The sunk cost fallacy might be a self-defense mechanism, a way to reinforce our confidence in the effort we’ve already put in—essentially, a way to save face with ourselves.

Or, Sweis continues, it could have to do with the fact that all the work you’ve put in drains your physical and emotional motivation. It’s often a lot more work to quit what you’re doing and start with another option from scratch. In this light, the goal you’ve already begun moving towards can look all the more appealing—and the closer you get, the better it looks.

But if the theories about wasted resources are true, says Sweis, then the offer zone should look like the wait zone: The more time we spend deliberating over our options, the more likely we should be to pursue them. In other words, waffling in this zone still accrues costs. But at least in the experiment, this wasn’t the case: The amount of time spent in the offer zone had no effect on whether a rodent or human went on to pursue its food pellet or video.

Sweis realized this meant the decision-making process is split into two distinct phases. In the first, we consider our choices, which are still open-ended. But once we commit to a decision, we enter a second frame of mind, in which we grapple with whether or not to stick with that decision.

“This blows away a lot of standard theories about where sunk costs come from,” says Redish. “The fact that the zones are different means it has to be a different process in each.”

“[The study] allows us to pull apart some of the pieces that go into sunk cost fallacy and understand them a little better,” adds Karmarkar. “By identifying different processes, they’ve offered a new perspective on some of the elements of this problem.”

Sweis has other evidence that different parts of the brain control these two phases of decision-making. In previous work, the team showed that different drugs target these systems independently in mice: Cocaine disrupts rational deliberations prior to commitment, while morphine compromises our ability to cut losses after making poor decisions. Sweis even identified, and successfully manipulated, a neural pathway in mice that seems to be involved in the re-evaluations of hasty decisions in the wait zone.

Much less is known about the neural circuitry at play as we deliberate in the offer zone. Redish thinks some of it has to do with our aversion to regret. Previous work conducted by the team shows that mice, like humans, express remorse about poor decision-making, and the fear of experiencing this negative emotion can inform future choices. No one, it turns out, likes being wrong.

Of course, there’s one big unanswered question about the current study: Is it really sound to compare hungry rodents seeking sustenance to humans pursuing the hedonistic pleasure of watching videos? “These animals are working for their livelihood, for survival, [while] humans are working for a luxury item,” Sweis explains. “[These different scenarios] can activate different parts of the brain.” Future studies should find more comparable tasks for the two groups.

While much work remains to be done, disentangling the neurochemistry that underlies these two components of loss aversion could help doctors create future treatments for psychiatric issues, including eating disorders or drug addiction. What’s becoming clear is that there may not be a one-size-fits-all treatment for neurological malfunctions—and as time goes on, treatment regimens could be tailored to the specific circuits at play. Importantly, Redish points out, behavior is also trainable: As we continue to dissect the components of decision-making, it may be possible to incorporate more psychological tools and even games as therapeutics.

“In order to get there, we have to first understand how the system works,” he says.