By Christian Jarrett

If you spend time building your physical strength and stamina in the gym, you can expect to carry these benefits into everyday life. It will be easier for you to lug heavy shopping bags around or run for the bus. You will likely reduce your chances of developing cardiovascular and other illnesses. A new review of brain training games in Psychological Science in the Public Interest – the most comprehensive ever conducted – shows that unfortunately the same principle does not hold for these games. When you spend time completing mental exercises on your phone or computer, you will most likely only become better at those exercises or very similar tasks. Currently available evidence suggests you probably won’t see consequent improvements in your performance at school or work, or reductions in your chances of experiencing age-related mental decline.

Commercial brain training games, which involve simple memory, attention and reaction-time tasks that become more difficult as you improve, are hugely popular. They form a multi-million-dollar industry that is projected to be worth billions within a few years. People are drawn to the games as a fun and convenient way to boost not only their brain power but also, as the products promise, their brain health and their success in life.

But in 2014, a group of more than 70 psychology and neuroscience experts signed an open letter warning that brain training companies, such as Lumosity, Posit Science and Cogmed, were making inflated claims about the benefits of their products, especially in relation to preventing dementia or reversing age-related mental decline. This prompted a rival, larger group of experts and practitioners to retort in their own open letter that the evidence for the games’ benefits is compelling and growing. In the latest twist, in January this year, the U.S. Federal Trade Commission ordered Lumos Labs (the makers of Lumosity brain training) to pay $2 million to settle charges that it had made unsubstantiated claims about the benefits of its products.

When expert opinion divides in this way, it can be difficult to know whom to trust. That’s why this new review is so timely and important. It represents a significant advance for several reasons. Stretching to 84 pages and including critical evaluation and interpretation of all 374 published studies that have ever been cited in support of the benefits of brain training, it is certainly comprehensive.

It is also objective. The reviewers, led by Daniel Simons at the University of Illinois at Urbana-Champaign, set out the standards of best practice for evaluating brain training interventions – ensuring baseline tests of performance, randomly allocating participants to adequately sized brain training and active control groups, and pre-registering the design, among other things – and they assess the current literature against this benchmark. Simons and his six psychologist colleagues also have no conflicts of interest.

Their verdict is stark. Simons and his team find that much of the published evidence is of poor quality, for example because of a lack of a proper control condition against which to compare the benefits of brain training, and a failure to measure objectively any real-life benefits. The field as a whole has also largely failed to attempt to control for the effects of participants’ expectations that they will experience benefits after completing brain training.

The researchers do single out the ACTIVE trial (Advanced Cognitive Training for Independent and Vital Elderly trial) for particular praise for its methodological rigour, but even this trial did not meet all the best practice requirements for investigating brain training. Overall, Simons and his colleagues conclude that the evidence that brain training leads to real-world benefits, beyond gains in the training exercises themselves, is “inadequate”.

The reason, the researchers explain, is likely that the mental exercises involved in brain training games are “decontextualised”. Improving our mental abilities as applied in real-world settings requires practice and experience in those domains. A person who spends many hours on brain training games but never engages in any real-world challenges is like the karate pupil who has only ever performed solo exercises in the dojo. Woe betide them if they ever find themselves in a fight. “We know of no evidence for broad-based improvement in cognition, academic achievement, professional performance, and/or social competencies that derives from decontextualized practice of cognitive skills devoid of domain-specific content,” the reviewers write.

What advice do Simons and his team have for consumers? Brain training games are unlikely to do you any harm, except perhaps to your pocket. But it’s worth remembering that time spent on the games is time you could spend doing something more beneficial, such as, to quote Simons et al, “… learning things that are likely to improve your performance at school (e.g., reading; developing knowledge and skills in math, science, or the arts), on the job (e.g., updating your knowledge of content and standards in your profession), or in activities that are otherwise enjoyable.”

This is not the end of the story – it’s possible that future research will provide new evidence more favourable to brain training. The review sets out advice to help researchers conduct better-quality trials in future (whether they attract the funding to do so is another matter), and Simons’ team have created an open-access website where they will post links to any new brain training trials and publish any errors that are discovered in their analysis of the current literature.

—Do “Brain-Training” Programs Work?

—The response of Posit Science to the new review.

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest