New Year’s Eve is a time for reflection about the year that passed and a time to set goals for the future. Should we keep doing what we are doing, or should we tackle new challenges? If you’re seven, or twelve, or twenty, it’s easy to think about new ambitions: learn Spanish, learn to paint, do a flip off your skateboard. But what if you’re older?

For me, much of the past year revolved around discussions prompted by a book of mine that was published in January, called “Guitar Zero,” about the science of learning and my own adventures in learning guitar at the age of forty. The basic premise was that the scientific evidence for a widespread view called the “critical-period effect” was far weaker than widely supposed.

The critical-period effect is the idea that you can’t do certain things—like learn a language, or learn an instrument—unless you start early in life. It’s a discouraging thought for anyone past adolescence. But, recently, the evidence for this idea had started to unravel. Barn owls have, for years, been a model illustration of critical periods. Young barn owls could readily adapt to a kind of virtual-reality experiment in which a prism distorted their perception of the world; older owls couldn’t. Or so the textbooks all said. But Eric Knudsen, a neuroscientist at Stanford, kept probing and found that there was, in fact, a simple way of teaching old owls new tricks: by breaking up a difficult job into small, bite-size pieces. Old owls couldn’t learn as fast as young owls, but they could come a long way if they took things incrementally, rather than all in one bite. I fancied myself as an adult owl and did the best I could to tackle the guitar bit by bit, keeping my expectations low and my persistence high.

While I was writing, I imagined that I was alone in my quest; the conceit was that I was going to practice for ten thousand hours, because nobody else my age would ever be willing to invest that kind of time. But in the past year, I’ve been deluged with e-mails from other adult learners. A journalist wrote to say that her seventy-six-year-old father had learned the guitar late in life, and had just told her that he was starting a band with his friends called “The Three Grandfathers.” In Portland I met (and jammed with) Rick King, an engineer who was keeping an Excel spreadsheet tracking every hour of his practice, having returned to the guitar in his sixties after surviving a heart attack. Looking back, it was silly to think that there was anything unique about what I was doing.

The central premise, however, turned out to be more right than I imagined: the suggestive evidence that I saw from the studies on barn owls was taken to a new level in human studies. Consider, for example, amblyopia, another long-standing example in the literature on critical periods. Amblyopia is a visual disorder in which one eye fails to develop normal vision, often because the two eyes don’t properly align; sometimes it’s called “lazy eye.” The standard medical advice is to treat your child early, by getting them to wear an eye patch over the good eye (in order to strengthen the weak one). If you don’t treat the problem early, you can just forget about ever fixing it. Just after my book went to press, however, Dennis Levi, the dean of the School of Optometry at Berkeley, ran a brilliantly simple study, one that was easy to conduct yet would have seemed like a waste of time to anybody steeped in critical-period dogma. Levi and his collaborator stuck eye patches on the good eye of adult amblyopics, aged fifteen to sixty-one, whom everyone else had written off on the presumption that they could not learn anything new. He then sat his subjects down at a video game—a first-person shooter called Medal of Honor: Pacific Assault, to be exact—and told them to have fun. Levi found that his subjects got better at virtually every aspect of visual perception he could measure. It wasn’t that it was too late for adults to overcome amblyopia; it was that the myth of critical periods had kept people from trying.

Learning a new skill can also have unexpected benefits. Recently, the neuroscientist Nina Kraus published a pile of new studies showing that learning about music can facilitate getting better at other things, like language skills and hearing in noisy places—and can do so in ways that last for decades. (Her first studies were with children; other studies are now in progress to see if the same holds true for lessons taken by adults.) Music training can help the brain better decompose the elements of sound—in ways that Kraus was able to directly measure in the lab—and seems to improve working memory, too. And, in another recent study, a team of Canadian researchers found evidence that a mere twenty days of music lessons can lead to better scores on a test of verbal intelligence.

Moreover, whether or not picking up a new skill makes you smarter, it can certainly make you happier. We can’t all be rock stars. But, as the cliché goes, the journey can be every bit as rewarding as the destination.

A New Year’s resolution shouldn’t just be about becoming great at something. It should be about becoming a better or happier or more fulfilled person. Whether your dream is to play piano, cook steak sous-vide, or finally learn to speak French, the lesson from all this new research is clear: there is no better time than now to take on something new. Happy New Year!

_Gary Marcus is a professor at N.Y.U. and the author of “Guitar Zero: The Science of Becoming Musical at Any Age,” and has written for newyorker.com about Noam Chomsky, the facts and fictions of neuroscience, moral machines, and Ray Kurzweil._

Photograph by Martine Franck/Magnum.