Much as a human observer might posit the relationships between people based on physical features like eye color, hair color and height, the classifier did the same using neural connections. Functional fingerprints appeared most similar between identical twins, followed by fraternal twins, nontwin siblings and, finally, unrelated participants.
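The comparison behind that ranking can be illustrated with a toy calculation: treat each person's functional fingerprint as a symmetric connectivity matrix and score similarity as the correlation of its off-diagonal entries. Everything below, including the matrix size, the noise levels, and the choice of Pearson correlation, is invented for illustration and is not the study's actual method.

```python
import numpy as np

def fingerprint_similarity(conn_a, conn_b):
    # Correlate the off-diagonal connectivity values of two people.
    iu = np.triu_indices_from(conn_a, k=1)
    return np.corrcoef(conn_a[iu], conn_b[iu])[0, 1]

def symmetric(m):
    return (m + m.T) / 2  # connectivity matrices are symmetric

rng = np.random.default_rng(0)
base = symmetric(rng.normal(size=(10, 10)))                # one participant
twin = base + 0.1 * symmetric(rng.normal(size=(10, 10)))   # near-identical twin
stranger = symmetric(rng.normal(size=(10, 10)))            # unrelated person

print(fingerprint_similarity(base, twin) > fingerprint_similarity(base, stranger))
```

With a shared "base" connectome plus a little individual noise, the twin's fingerprint correlates far more strongly with the original than a stranger's does, mirroring the ordering the classifier exploited.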

Research assistant professor Oscar Miranda-Dominguez — a member of Fair’s lab and the first author on the study — was surprised they were able to identify adult siblings using the models trained on children. The models trained on adults could not do this, possibly because the adults’ higher-order systems had already fully matured, making their features less generalizable to young, developing brains. “A further study with larger samples and age spans might clarify the maturation aspect,” Miranda said.

The model’s ability to draw nuanced distinctions between family members, he added, was remarkable, because the researchers had trained the classifier to delineate only “related” and “unrelated,” rather than degrees of relatedness. (Their 2014 linear model was able to detect these subtle differences, but more traditional correlational approaches were not.)

Although their twin sample was not big enough to finely parse genetic influences from environmental ones, there’s “no question” in Fair’s mind that genetics plays a large part in shaping the functional fingerprint. Their supplemental materials described a model to differentiate shared environment from shared genetics, but the team is careful not to draw firm conclusions without a larger data set. “Most of what we’re seeing here is about the genetics and less about the environment,” Fair said, “not that the environment doesn’t have a big influence on the connectome, too.”

To dissociate the contributions of shared environments from those of shared genetics, Miranda said, “one way to proceed could be to find the brain features that can distinguish identical twins from nonidentical twins, since the two types of twins share the same environment but only identical twins share the same genetic contributions.”

Although all the neural circuits they examined demonstrated some level of commonality between siblings, the higher-order systems were the most heritable. These were the same areas exhibiting the most variation among individuals in the study four years prior. As Miranda pointed out, those regions mediate behaviors stemming from the nexus of social interaction and genetics, perhaps predicting a “family identity.” Add “distributed brain activity” to the list of traits that run in families, right after high blood pressure, arthritis and nearsightedness.

Seeking Signs of Brain-Predicted Age

While Fair and Miranda in Oregon characterize the genetic underpinnings of the functional connectome, at King’s College London the research fellow James Cole is hard at work using neuroimaging and machine learning to decrypt the heritability of brain age. Fair’s team defines brain age in terms of the functional connections between regions, but Cole employs it as an index of atrophy — brain shrinkage — over time. As cells shrivel or die throughout the years, neural volume decreases but the skull remains the same size, and the extra space fills up with cerebrospinal fluid. In a sense, past a certain point in development brains age by withering.

In 2010, the same year that Fair co-authored the influential Science paper that generated excitement around harnessing functional MRI data to assign brain age, one of Cole’s colleagues led a related effort, published in NeuroImage, that used anatomical data instead, on the premise that the difference between inferred brain age and chronological age (the “brain age gap”) might be biologically informative.

According to Cole, aging affects each person, each brain and even each cell type slightly differently. Precisely why such a “mosaic of aging” exists is a mystery, but Cole will tell you that, at some level, we still don’t know what aging is. Gene expression changes with time, as does metabolism, cell function and cell turnover. Yet organs and cells can change independently; there’s no single gene or hormone that drives the whole aging process.

Although it’s widely accepted that different people age at different rates, the notion that various facets of the same person might mature separately is slightly more controversial. As Cole explained, many methods to gauge aging exist, but not many have been combined or compared just yet. The hope is that by measuring many tissues within an individual, researchers will be able to devise a more comprehensive assessment of aging. Cole’s work is a start at doing this with images of brain tissue.

The theoretical framework behind Cole’s approach is relatively straightforward: Feed data from healthy individuals into an algorithm that learns to predict brain age from anatomical data, then test the model on a fresh sample, subtracting the participants’ chronological age from their brain age. If their brain age is greater than their chronological one, this signals an accumulation of age-related changes, possibly due to diseases like Alzheimer’s.
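That pipeline can be sketched in a few lines, with ordinary least squares standing in for the real model; the cohort, the anatomical features, and their loadings below are all synthetic stand-ins, not data from any study.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_cohort(n, weights):
    # Synthetic stand-in: each "anatomical feature" scales with true age.
    age = rng.uniform(20, 80, size=n)
    features = np.outer(age, weights) + rng.normal(scale=1.0, size=(n, len(weights)))
    return features, age

weights = rng.normal(size=5)                            # invented feature loadings
X_train, age_train = make_cohort(200, weights)          # healthy training sample
X_test, age_test = make_cohort(50, weights)             # fresh test sample

# Fit a least-squares model mapping features to chronological age.
A = np.column_stack([X_train, np.ones(len(X_train))])   # add an intercept column
coef, *_ = np.linalg.lstsq(A, age_train, rcond=None)

# Predict "brain age" on the new sample; the gap is brain age minus
# chronological age, so a positive gap flags an older-looking brain.
brain_age = np.column_stack([X_test, np.ones(len(X_test))]) @ coef
brain_age_gap = brain_age - age_test
print(np.sum(brain_age_gap > 0), "of", len(brain_age_gap), "gaps are positive")
```

The key output is not the predicted age itself but the gap, which is what gets correlated with health outcomes in studies like Cole's.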

In 2017, Cole used algorithms called Gaussian process regressions (GPRs) to generate a brain age for each participant. This allowed him to compare his own assessment of age to other existing measures, such as which regions of the genome are turned on and off by the addition of methyl groups at various ages. Biomarkers like methylation age had been previously used to predict mortality, and Cole suspected brain age could be used to do so as well.
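A Gaussian process regression of this kind can be written out in textbook form: place a squared-exponential kernel over the anatomical features and compute the exact posterior mean. The kernel hyperparameters, the single made-up "volume" feature, and the simulated cohort below are all illustrative assumptions, not Cole's actual setup.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=10.0, variance=400.0):
    # Squared-exponential kernel; these hyperparameters are made up.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gpr_predict(X_train, y_train, X_test, noise=1.0):
    # Textbook GP regression: exact posterior mean around the sample mean.
    K = rbf_kernel(X_train, X_train) + noise**2 * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train - y_train.mean())
    return y_train.mean() + rbf_kernel(X_test, X_train) @ alpha

rng = np.random.default_rng(2)
# Hypothetical one-dimensional feature (say, gray-matter volume)
# that shrinks linearly with age, plus measurement noise.
age = rng.uniform(20, 80, size=100)
volume = 100.0 - 0.5 * age + rng.normal(scale=1.0, size=100)

# Train on 80 "healthy" participants, predict brain age for the rest.
pred = gpr_predict(volume[:80, None], age[:80], volume[80:, None])
print(np.mean(np.abs(pred - age[80:])) < 10)  # rough accuracy check
```

Unlike a plain linear fit, the GP makes no assumption that volume falls off linearly with age; it only assumes the relationship is smooth, which is part of why the method suits messy biological data.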

Indeed, individuals with brains that appeared older than their chronological age tended to be at a greater risk for poor physical and cognitive health and, ultimately, death. Cole was surprised to learn that having a high neuroimaging-derived brain age didn’t necessarily correlate with a high methylation age. However, if participants had both, their risk of mortality increased.

Later that same year, Cole and his colleagues extended this work by using deep neural networks to assess whether brain-predicted age was more similar between identical twins than fraternal twins. The data came straight off the MRI scanner, and included images of the whole head, complete with nose, ears, tongue, spinal cord and, in some cases, a bit of fat around the neck. With minimal preprocessing, they were fed into the neural network, which, after training and testing, generated its best estimates of brain age. In keeping with the genetic-influence hypothesis, the brain ages of identical twins were more similar than those of fraternal twins.
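The twin comparison itself reduces to asking whether within-pair brain ages correlate more strongly for identical than for fraternal twins. A toy simulation shows the shape of that test; the variance components below are made up, not estimated from the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def twin_pairs(n_pairs, shared_sd, unique_sd):
    # Simulated brain ages for twin pairs: a shared component
    # (genes plus shared environment) and a twin-specific component.
    shared = rng.normal(scale=shared_sd, size=n_pairs)
    twin1 = shared + rng.normal(scale=unique_sd, size=n_pairs)
    twin2 = shared + rng.normal(scale=unique_sd, size=n_pairs)
    return twin1, twin2

# Identical twins share more of the variance than fraternal twins do.
mz1, mz2 = twin_pairs(500, shared_sd=3.0, unique_sd=1.0)
dz1, dz2 = twin_pairs(500, shared_sd=2.0, unique_sd=2.4)

r_mz = np.corrcoef(mz1, mz2)[0, 1]   # within-pair correlation, identical
r_dz = np.corrcoef(dz1, dz2)[0, 1]   # within-pair correlation, fraternal
print(r_mz > r_dz)
```

A higher within-pair correlation for identical twins than for fraternal twins is exactly the signature consistent with a genetic influence on brain age.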

While his results indicate that brain age is likely due in part to genetics, Cole warned not to neglect environmental effects. “Even if you do have a genetic predisposition to having an older-appearing brain,” he said, “chances are if you could modify your environment, that could more than outweigh the damage that your genes might be causing.”

The help that neural networks provide to this effort to read brain age comes with trade-offs, at least for now. They can sift through MRI data to find differences between individuals, even when researchers don’t know what features might be relevant. But a general caveat of deep learning is that no one knows what features in a data set the neural net is identifying. Because the raw MRI images he used included the entire head, Cole acknowledges that perhaps we should call what they are measuring “whole-head age” rather than brain age. As someone once pointed out to him, he said, people’s noses change over time, so what’s to say the algorithm wasn’t tracking that instead?

Cole is confident this isn’t the case, however, because his neural networks performed similarly on both raw data and data processed to remove head structures outside the brain. The real payoff from eventually understanding what the neural networks are paying attention to, he expects, will be clues about what specific parts of the brain figure most in the age assessment.

Tobias Kaufmann, a researcher at the Norwegian Centre for Mental Disorders Research at the University of Oslo, suggested the machine learning techniques used to predict brain age almost don’t matter if the model is properly trained and tuned. The results from different algorithms will typically converge, as Cole found when he compared his GPRs to the neural network.

The difference, according to Kaufmann, is that Cole’s deep learning method reduces the need for tedious, time-consuming preprocessing of MRI data. Shortening this step could someday speed up diagnoses in clinics, but for now, it also protects scientists from accidentally introducing biases into the raw data.

Richer data sets might also permit more complex predictions, like identifying patterns indicative of mental health. Having all the information in the data set, without transforming or reducing it, might therefore help the science, Kaufmann said. “I think that’s the big advantage of the deep learning method.”

Kaufmann is the lead author on a paper currently under review, constituting the largest brain-imaging study on brain age to date. The researchers employed machine learning on structural MRI data to reveal which brain regions showed the strongest aging patterns in people with mental disorders. Next, they took their inquiry one step further, probing which genes underlie brain aging patterns in healthy people. They were intrigued to note that many of the same genes that affected brain age were also involved in common brain disorders, perhaps indicating similar biological pathways.

The next goal, he said, is to go beyond heritability to unravel the specific pathways and genes involved in brain anatomy and signaling.

Although Kaufmann’s approach to decrypting brain age, like Cole’s, focuses on anatomy, he underscored the importance of gauging brain age in terms of connectivity as well. “I think both of these approaches are extremely important to take,” he said. “We need to understand the heritability and the underlying genetic architecture of both brain structure and function.”

Cole, for one, has no shortage of further research endeavors in mind. There is something compelling about the need for artificial intelligence to understand our own, underscored by advances that illuminate the connection between genes, brains, behaviors and ancestry. Unless, of course, he finds he’s been studying nose age all along.

This article was reprinted on Wired.com.