A new article from a number of psychology researchers attempts to throw cold water on studies that claim to show that playing games leads to important cognitive benefits.
In the piece, written for the journal Frontiers in Cognition (and thoroughly summarized in this Gamasutra blog post), FSU assistant professor Walter Boot, psychology doctoral student Daniel Blakely and University of Illinois collaborator Daniel Simons point out methodological flaws in many of these studies that they say throw the results into question.
Though these studies routinely show gamers have higher cognitive abilities than non-gamers, the authors argue that this could simply mean that those with higher cognitive abilities are more likely to become gamers, rather than that the games themselves are imparting any specific benefit.
Fliers seeking study participants who are “expert” gamers could heighten this self-selection bias, the authors argue, by signaling that participants should be able to “perform on challenging, often game-like computer tests of cognition.”
In addition, gamers may be more motivated to perform well in these tests because they “come into the lab knowing exactly how they are expected to perform,” as Blakely puts it, while the novices have no such motivation.
While the authors don’t entirely discount the possibility that games could have positive cognitive effects, they say no study yet has met the methodological “gold standard” required to establish the relationship.
For future studies, the authors suggest researchers recruit participants covertly, using surveys in which video game experience is just one of a number of evaluated metrics, and control more carefully for other confounding variables that could explain observed differences.
In 2009, a French research study found that playing the popular Brain Age series of titles did not lead to the cognitive improvements suggested by publisher Nintendo and Japanese neuroscientist Ryuta Kawashima.