Whether you prefer to play it safe or wade into risky business for larger payoffs, your decision process may largely depend on a tiny bundle of cells deep in your noggin.

By tagging and tweaking those cells in the brains of high-rolling rats, researchers were able to turn them from ballsy to cautious decision-makers. More specifically, the rodents switched their preference from a lever that released a jackpot of sugary treats 25 percent of the time to one that served up smaller treats 100 percent of the time.

The finding, published in Nature, backs up previous studies in humans showing that drugs that interfere with those same brain cells can lead to gambling problems. The study also offers a neurological explanation for differences in risk-taking behavior as well as a target for new treatments for gambling addictions.

“These findings indicate interesting directions for further study,” the authors conclude, which “will benefit from deeper knowledge of how precisely defined cell populations, brain regions, and connections support risky choice.”

The brain cells at the center of the study are dopamine receptor type-2-expressing cells, or D2 cells. Bundles of these specific cells are found in the nucleus accumbens, a small region in the center of the brain that's involved in motivation and reward and plays a key role in addiction.

Early clinical work found that a dopamine-related medication used to treat Parkinson’s disease can disrupt the activity of D2 cells, which can increase the risk of gambling problems.

To better understand the role of D2 cells in risky decision-making, researchers at Stanford trained a bunch of rats to recognize two different levers. From repeated exposure, the rats learned that one lever offered up modest-sized sugar treats with every paw push. The other lever, however, would release just a tiny sweet treat 75 percent of the time and massive candy winnings 25 percent of the time. Once the rats were familiar with each of the levers, the researchers exposed them to both levers at once and noted which levers they pushed and when.
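The two-lever setup above boils down to a simple expected-value comparison. The article doesn't give the actual sucrose amounts, so the reward sizes below are hypothetical placeholders chosen only to illustrate the 100 percent versus 75/25 percent payout structure:

```python
import random

# Hypothetical reward sizes (arbitrary units) -- the actual sucrose
# amounts used in the study are not specified in the article.
SAFE_REWARD = 2      # modest treat, delivered on every press
RISKY_SMALL = 1      # tiny treat, delivered 75% of the time
RISKY_JACKPOT = 5    # jackpot, delivered 25% of the time

def pull_safe():
    """The sure-thing lever: same modest treat every press."""
    return SAFE_REWARD

def pull_risky(rng=random):
    """The slot-machine lever: jackpot 25% of the time, tiny treat otherwise."""
    return RISKY_JACKPOT if rng.random() < 0.25 else RISKY_SMALL

# Expected value per press of each lever
ev_safe = SAFE_REWARD
ev_risky = 0.75 * RISKY_SMALL + 0.25 * RISKY_JACKPOT
print(ev_safe, ev_risky)
```

With these placeholder values both levers average the same payoff per press (2.0 units), so a rat's lever choice reflects its appetite for variance rather than a difference in average reward; whether the study matched the levers' expected values this way is an assumption here, not something the article states.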

Like humans, the rats tested had a range of risk preferences. Some frequently wanted to take their chances with the slot-machine-like lever, and others preferred the safe choice. This was particularly noticeable after each type of rat experienced a "loss"—defined as when a rat pulled the inconsistent lever and came up with just a tiny treat. After such a loss, the risk-averse rats were more likely to switch to the sure-thing lever. But the daredevil rats stuck with the risky lever, trying for the jackpot.

When the researchers looked at the rats’ brain activity, they found that the D2 cells were fired up after losses—and that their level of activity seemed to predict the rats' next moves. Rats with the most D2 activity after a loss became cautious and moved to the sure-thing lever; rats with relatively little D2 activity kept taking chances with the risky lever.

Next, the researchers engineered the rats so that their D2 cells could be made to fire on command. The researchers then took the risky rats and had them experience a loss. By firing up their D2 cells right after the loss, the researchers switched the rats from risk-takers to chickens who opted for the safe lever. (Researchers did the same thing with the cautious rats, and it didn’t alter their previous preference for the safe lever.)

The Stanford group’s findings may suggest that D2 cell activity alters how the rats—and potentially humans—perceive losses or undesirable outcomes, according to neurobiologist Nick Hollon of the Salk Institute and neuropharmacologist and psychologist Paul Phillips of the University of Washington. The pair wrote a commentary on the study that also appears in Nature. If a loss is felt particularly hard, the idea goes, the rats would be reluctant to keep taking risks. As such, the researchers speculate their findings might offer insights into the brain workings that affect financial and economic decisions in humans.

Nature, 2016. DOI: 10.1038/nature17400 (About DOIs).