
Research into neural plasticity has been fruitful over the past few decades. Some writers, such as Steven Pinker in The Blank Slate, attempt to downplay the effects of neural plasticity with regard to things like TBI and IQ. Yet the plasticity of our brains is central to how they evolved (Skoyles and Sagan, 2002). If our brains are so plastic, then might certain tasks improve ‘processing speed’, reaction time, and overall cognitive ability?

Science Daily reported on a new meta-analysis covering 15 years of research on how action video games affect reaction time and cognitive performance. What the authors found is something I have discussed before: playing these types of games improves one’s reaction time and even one’s cognitive ability. Unfortunately, the paper is not on Sci-Hub yet, but when it appears there I will go more in depth on it.

The authors (Benoit et al, 2017) reviewed 15 years of papers on action video games and cognitive performance, covering the years 2000 to 2015. They focused on war and shooting games to gauge whether action video game playing has a causal effect on cognitive performance, and they produced two meta-analyses from this body of research.

The first covered 8,790 people aged 6 to 40 who were given a battery of cognitive tests. These tests included spatial attention tasks as well as tests of how well one could multi-task while changing plans in line with the rules of the game. “It was found that the cognition of gamers was better by one-half of a standard deviation compared to non-gamers.” This meta-analysis, though, cannot answer one question: do people who play games have higher cognitive ability, or do people with higher cognitive ability play more games? The classic chicken-and-egg problem.
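To make the “one-half of a standard deviation” figure concrete, here is a minimal sketch of how a standardized mean difference (Cohen’s d) is computed. The numbers below are made up for illustration; they are not data from the study:

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference between two groups (Cohen's d)."""
    # Pooled standard deviation weights each group's variance by its degrees of freedom.
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical example: gamers average 105 on some cognitive test (SD 15),
# non-gamers average 97.5 (SD 15) -- a gap of half a standard deviation.
d = cohens_d(105, 15, 200, 97.5, 15, 200)
print(round(d, 2))  # 0.5
```

A d of 0.5 means the average gamer scores higher than roughly 69 percent of non-gamers, which is a sizeable gap on this kind of scale.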

The second looked at intervention studies of 2,883 individuals partitioned into two groups: the first played action games such as war and shooter games, while the second played games like The Sims, Tetris, and puzzle games (which I would loosely term strategy games). Both groups played for 8 hours per week, netting 50 hours of gameplay over 12 weeks.

The results came out overwhelmingly in favor of war and shooting games improving cognition. Notably, the data span many years and come from all over the world, so the findings do not hold only in America, for instance. In the abstract of the paper (all I have access to at the moment), Benoit et al (2017) write:

Publication bias remains, however, a threat with average effects in the published literature estimated to be 30% larger than in the full literature. As a result, we encourage the field to conduct larger cohort studies and more intervention studies, especially those with more than 30 hours of training.
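As a rough illustration of what that 30% figure means (my own back-of-the-envelope arithmetic, not a correction method from the paper): if published effects run 30% larger than those in the full literature, a published effect can be deflated accordingly:

```python
def deflate_published_effect(published_d, inflation=0.30):
    """Estimate the full-literature effect size if published effects are
    inflated by the given fraction (published = true * (1 + inflation))."""
    return published_d / (1 + inflation)

# Under this simple assumption, a published effect of d = 0.5 would
# correspond to a full-literature effect of roughly d = 0.38.
print(round(deflate_published_effect(0.5), 2))  # 0.38
```

Even deflated this way, the effect would not vanish, which is presumably why the authors call for larger cohort and intervention studies rather than dismissing the result.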

This is in line with numerous other papers on cognitive abilities and action video games. Green and Bavelier (2007) showed that video game players “could tolerate smaller target-distractor distances” whereas “similar effects were observed in non-video-game players who were trained on an action video game; this result verifies a causative relationship between video-game play and augmented spatial resolution.” They found that action video games ‘sharpened vision’ by up to 20 percent. Green and Bavelier (2012) also show that playing action video games may enhance the ability to learn new tasks and that what is learned from playing these types of games “transfers well beyond the training task.”

Green and Bavelier (2003) show that action video game players had better visual attention than those who did not play games. Even non-gamers who were trained on an action game saw improvement in visual attention, which, again, shows that video games have an actual causal effect on these phenomena and that it’s not just ‘people with higher cognitive ability choosing to play video games’. (See also Murphy and Spencer, 2009, who found that “There were no other group differences for any task suggesting a limited role for video game playing in the modification of visual attention.”)

Dye, Green, and Bavelier (2009) show that action video games improve (that is, shorten) reaction time (RT). Video game playing is therefore a serious confound when testing cognitive abilities, since people who play action video games react faster than those who do not; and, as I’ve shown, the relationship is causal, because even controls who did not previously play action games saw their RT improve after training. Achtman, Green, and Bavelier (2008) show yet again that action video game playing enhances visual attention and overall visual processing.

Green (2008: iii-iv), in an unpublished doctoral dissertation (the first link on Google should be the dissertation), showed that video game players “acquire sensory information more rapidly than NVGPs [non-videogame players]”.

Appelbaum et al (2013) showed that action game playing “may be related to enhancements in the initial sensitivity to visual stimuli, but not to a greater retention of information in iconic memory buffers.” Bejjanki et al (2014) show that action video game playing “establish[es] … the development of enhanced perceptual templates following action game play.” Cardoso-Leite and Bavelier (2014) show that video games enhance “behavior in domains as varied as perception, attention, task switching, or mental rotation.”

Boot, Blakely, and Simons (2011) argue that there may be a ‘file-drawer effect’ (publication bias) in the literature on action video games and cognition, which Benoit et al (2017) acknowledge in pushing for more open studies.

Unsworth et al (2015), on the other hand, state that “nearly all of the relations between video-game experience and cognitive abilities were near zero.” So there are numerous studies both for and against the effect (most of those in favor coming from Green and Bavelier’s group), and the meta-analysis by Benoit et al (2017) may finally begin to answer the question: does playing action video games increase cognitive ability, improve visual attention, and shorten reaction time? The results of this new meta-analysis suggest yes, and that may have implications for IQ testing.

Richardson and Norgate (2014), in their paper Does IQ Really Predict Job Performance?, note that there are numerous reasons why some individuals may have slower RTs, among them action video game playing, anxiety, motivation, and familiarity with the equipment used. If one is experienced in video game playing (action games specifically), that experience may cause differences between individuals that do not come down to ‘processing speed’ or native ability, as is usually claimed. And with correlations as low as .2-.3 between reaction time and IQ, other factors that are not genetic in nature must mediate the relationship.
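To put those correlations of .2-.3 in perspective (simple arithmetic on my part, not a calculation from Richardson and Norgate): squaring a correlation gives the share of variance in one variable accounted for by the other:

```python
def variance_explained(r):
    """Fraction of variance in one variable accounted for by another,
    given their correlation r."""
    return r ** 2

for r in (0.2, 0.3):
    print(f"r = {r}: variance explained = {variance_explained(r):.0%}")
# At r = 0.2, RT accounts for only 4% of the variance in IQ; at r = 0.3,
# only 9%. Over 90% of the variance is left to other factors.
```

That is a lot of room for anxiety, motivation, equipment familiarity, and game-playing experience to do their work.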

Now, let’s say the effect is as large as Benoit et al (2017) report: one-third of an SD. Would one then need to control for video game playing when testing, say, IQ or RT? I believe the answer is pointing in that direction, because it is clear, given the mounting evidence, that action video games can reduce RT and thus confound certain tests. If these new meta-analyses hold up and playing action video games affects both RT and cognitive ability by one-third of an SD (about 5 IQ points), then the case can be made that this variable must be controlled for.
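The “about 5 points” figure follows directly from the IQ scale itself. A sketch of the arithmetic, using the conventional IQ standard deviation of 15:

```python
IQ_SD = 15  # conventional standard deviation of the IQ scale

def effect_in_iq_points(effect_in_sd, sd=IQ_SD):
    """Convert a standardized effect size into IQ-scale points."""
    return effect_in_sd * sd

print(round(effect_in_iq_points(1 / 3), 1))  # 5.0 -- the intervention-study effect
print(round(effect_in_iq_points(1 / 2), 1))  # 7.5 -- the cross-sectional effect
```

By the same conversion, the half-SD gap from the cross-sectional meta-analysis would be worth roughly 7.5 points, which is why the confound matters for comparisons between individuals.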

In sum, if the effects from this new meta-analysis hold and can be replicated by other studies, then that is a whole other variable that needs to be accounted for when testing IQ and RT. RT is a complicated variable; according to Khodadadi et al (2014), “The relationship between reaction time and IQ is too complicated and revealing a significant correlation depends on various variables (e.g. methodology, data analysis, instrument etc.).” This is, in my view, one reason why RT should be tossed out as a ‘predictor of g‘ (whatever that is): it is not a reliable measure and does not ‘test’ what it is purported to test.