Thursday, December 3, 2015

Is Online Brain Training Real?

Almost every morning I hear an advertisement for an online 'brain training' service called Lumosity on the radio. As someone who is pretty interested in brains, training, and pretend words - I decided to look into Lumosity and see what it's all about. Here's what I found:


First of all, brain training is nothing new

Since our ancestors discovered writing, math, and science, we have been interested in improving how we think. And we've gotten good at it. We've learned that people who practice certain mental skills tend to get better at those skills than people who don't. If you practice reading, you'll be able to read more complicated books. If you practice solving math equations, you'll be able to solve more complicated problems. If you practice guitar, you'll be able to play more complicated songs. And if you write a bunch of shitty blog posts, hopefully you'll eventually figure out how to write some good ones. 

Just about everyone has an opinion on which activities enrich your mind or rot your brain, but I think it's important that we examine those intuitions methodically. It's contentious whether we can objectively measure which activities make somebody smarter, but I think we know enough to rule out activities that definitely don't. Over the last couple of years, more and more companies have been using neuroscience and psychology to justify products for cognitive enhancement. Here are just a few:

  • http://www.lumosity.com
  • http://www.neuronation.com/
  • http://brainage.nintendo.com/
  • http://www.brainmetrix.com/


My concern is that while these approaches might have good intentions, it’s important to take a step back and critically examine whether they actually work.


TLDR: The biggest issue with new-age brain training services is that they haven’t been validated 

When someone comes up with a new method in science, somebody has to make sure that it works the way we think it does. If you come up with a new treatment for ADHD, we would like to know that it actually reduces the symptoms. If you discover a new species of octopus, we need to make sure that you didn't re-discover one that we already know about. And if you come up with a brain training service and sell it for $50 a year to people worried about their mental health...you should make sure that it works. Good science is full of sanity checks and re-dos. The history of science is littered with mistakes and assumptions from well-meaning people who didn't check their work well enough. The best way to make sure that we take more steps forward than backward is to constantly confirm that we know what we think we know. Doing this requires validation. None of the popular marketed brain training services have been appropriately validated to show that they work. This is a tricky point to get across, so I think the best way to explain what I mean is to walk you through an example:



The Example
This is the flanker task:
[Image: an example flanker task stimulus]


If you sign up to participate in a psychology experiment in college, this might be a task the experimenter asks you to do. You'll sit at a computer, a bunch of arrows will pop up on the screen, maybe in different patterns, and you're told to use the arrow keys to indicate which direction the central-most arrow is pointing, as quickly as you can. This task is interesting to neuroscientists and cognitive psychologists because it's pretty good at measuring how well people can filter out distracting information. That is to say, if you can ignore the arrows that aren't important and notice the one that is, you have to be using attention of some sort.
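To make the setup concrete, here's a minimal console sketch of a flanker-style trial in Python. This is just my own toy illustration, not the stimuli or timing any lab (or Lumosity) actually uses; the function names and the 50/50 congruent/incongruent split are assumptions made for the example.

# Minimal console sketch of a flanker-style trial (toy illustration only,
# not the exact stimuli or timing used in lab versions or in Lumosity).
import random
import time

def make_trial():
    """Build one five-arrow stimulus: the center arrow is the target; the four
    flankers either agree with it (congruent) or disagree (incongruent)."""
    target = random.choice(["<", ">"])
    congruent = random.random() < 0.5
    flanker = target if congruent else ("<" if target == ">" else ">")
    stimulus = flanker * 2 + target + flanker * 2   # e.g. "<<><<"
    return stimulus, target, congruent

def run_block(n_trials=10):
    results = []
    for _ in range(n_trials):
        stimulus, target, congruent = make_trial()
        start = time.perf_counter()
        response = input(f"{stimulus}   center arrow? (< or >): ").strip()
        rt = time.perf_counter() - start
        results.append({"correct": response == target,
                        "rt": rt,
                        "congruent": congruent})
    return results

if __name__ == "__main__":
    data = run_block()
    # Incongruent trials are typically slower and less accurate; that gap is
    # the "flanker effect" used as a rough index of filtering out distraction.
    for cond in (True, False):
        trials = [t for t in data if t["congruent"] == cond]
        if trials:
            mean_rt = sum(t["rt"] for t in trials) / len(trials)
            acc = sum(t["correct"] for t in trials) / len(trials)
            label = "congruent" if cond else "incongruent"
            print(f"{label:12s}  mean RT {mean_rt:.3f}s  accuracy {acc:.0%}")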


Now this is one of Lumosity’s games that claims to train "attention and response inhibition", called ‘Lost in Migration’:
[Screenshot: Lumosity's 'Lost in Migration' game]


In this game, players use the arrow keys to indicate which direction the central-most bird is pointing, as quickly as they can, without being distracted by the other birds. The flocks pop up in different patterns and in different locations on the screen.


Can you notice any similarities between the Flanker Task and Lost in Migration?

Clearly these tasks are nearly identical. It seems like most of Lumosity's games are designed to closely resemble a neuropsychological task. Cognitive psychologists have come up with lots of games like the flanker task that are thought to measure different parts of your cognitive processes, such as memory, response inhibition, spatial processing, and pattern recognition. The idea behind a lot of Lumosity's games is to 'gamify' these psychology tasks. If the only purpose of Lumosity were to practice and play games that are similar to neuropsychological tasks, I'd say they do a pretty good job. But I don't think that's why many of its 70 million (!!!) users play.


The more important question is whether practicing these games should measurably improve your cognitive abilities.


Lumosity never overtly claims that playing their games will improve your cognitive abilities but it is implied. Oh boy is it implied.

Here is a figure that pops up the first time you play Lost in Migration:
Let’s look at that graph. It looks like brain performance gets better with a month of training!
Awesome!
It's one issue that they deceitfully tiptoe around ever saying outright that "this will help improve your attention". But there's a bigger problem with this graph. Can you figure out what's wrong here?


One issue is that we have no idea where that data comes from, but let's assume it really is some global average of everybody who does this task, and that everybody actually improves over a month of training. My biggest issue is with the y-axis. What the hell is 'brain performance'? In more than five years of studying neuroscience, I've never encountered the term in any lecture or academic paper. What units would you even measure it in?

(Side note- Potential units of brain performance: RAM? "lumositrons"? pico-Elon's? hmmm...)

What they’re really showing on the y-axis of this graph is your performance on this task. They're showing that if you practice that task, you’ll get better at it. 
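To make that point concrete, here's a toy example of a "performance" number you could compute from nothing but game speed and accuracy. Whatever Lumosity actually plots, the formula, scaling, and session numbers below are entirely invented by me for illustration; the point is that any curve built this way tracks skill at the game, not a property of the brain.

# Toy illustration only: a "performance" number computed purely from game
# speed and accuracy. The formula and the numbers are made up; nothing here
# is Lumosity's actual scoring.
def task_score(mean_rt_correct, accuracy):
    """Faster, more accurate play -> higher score. Inverse efficiency
    (RT / accuracy) is a standard lab summary; the scaling is arbitrary."""
    inverse_efficiency = mean_rt_correct / max(accuracy, 1e-6)  # seconds per correct response
    return 1000.0 / inverse_efficiency

# Hypothetical practice data: each week you get a bit faster and a bit more
# accurate, so the score climbs. No claim about attention is needed to
# explain a rising curve like this.
sessions = [(0.62, 0.88), (0.58, 0.90), (0.55, 0.93), (0.51, 0.95)]  # (mean RT in s, accuracy)
for week, (rt, acc) in enumerate(sessions, start=1):
    print(f"week {week}: score {task_score(rt, acc):.0f}")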

Does that mean your attention improved? No. No. No. This is the most important part.

Humans improve at almost everything that they practice! Everybody knows that if you practice something, you tend to get better at it. If you practice Lost in Migration or the flanker task a bunch of times, you will get faster and more accurate. Does that really mean that you improved your ability to prevent yourself from being distracted, or did you just get better at the game? Or did you get better at doing a repetitive task over and over again? ...Or did you improve one small aspect of attention? ...Or did you just get better at using a computer? ...Or did you just get more confident in yourself? ...Or did you actually get better at attention? What even is attention?

It’s totally unclear which of these effects underlies improvement on this task over time but my guess is that people are probably just getting better at the games. 



What does the research say?

If Lumosity's attention training works, we would expect training to improve everyday use of attention, or at least a few neuropsychological measures of attention. Lumosity provides absolutely no evidence that improvement in these games generalizes to everyday experiences or even to other attentional measures. To me, this is the very first thing they should have tested and proven. And this is a really big problem, especially because they have an in-house lab and claim to have connections to 40 university collaborators.


They have done some research, though. Some of their work shows that when aging or cognitively impaired people play a Lumosity game (Lost in Migration), it improves their performance on the corresponding, highly similar psychological task (the flanker task). This is like showing that practicing checkers with black and red pieces makes you better at playing checkers with blue and yellow pieces. It doesn't mean that playing checkers will make you better at chess, or improve your SAT score, or make you more confident approaching girls at middle school dances; you just learned how to play a game, and there is no evidence to think what you learned will extend beyond that.

While I haven't found any evidence for the arrow-bird-game-practice-improves-real-cognition narrative, there is some evidence against it. One critical piece comes from a group at FSU that split undergraduate participants into two groups: one group played a commercial action/puzzle video game (Portal 2) and the other played Lumosity's games, each for 8 hours. Both groups were tested before and after training on problem solving, spatial skill, and persistence. The group that played the video game improved on all of these measures, while the group that played Lumosity's games improved on none (http://myweb.fsu.edu/vshute/pdf/portal1.pdf). While this is only one study, it's starting to seem less and less likely that Lumosity's games make you better at anything besides the games themselves.
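For contrast, here's a rough sketch of what a transfer test looks like as an analysis: pre/post scores on an untrained task for a training group versus an active control group, compared on their gains. The numbers below are placeholders I invented for illustration, not data from the FSU study or anywhere else.

# Sketch of a transfer analysis: do gains on an *untrained* task differ
# between a training group and an active control group?
# All numbers are invented placeholders, not real data.
from statistics import mean, stdev

def gain_scores(pre, post):
    return [b - a for a, b in zip(pre, post)]

def welch_t(x, y):
    """Welch's t statistic for two independent samples (no SciPy needed)."""
    nx, ny = len(x), len(y)
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / ((vx / nx + vy / ny) ** 0.5)

# Placeholder pre/post scores on a transfer task (e.g., a different attention measure).
training_pre, training_post = [52, 48, 55, 50, 47], [54, 50, 56, 51, 49]
control_pre,  control_post  = [51, 49, 53, 50, 48], [53, 50, 55, 51, 49]

t = welch_t(gain_scores(training_pre, training_post),
            gain_scores(control_pre, control_post))
print(f"Welch's t on gain scores: {t:.2f}")
# If training only improves the trained game and not the transfer task, the
# group difference in gains should hover around zero -- which is the pattern
# the FSU comparison reported for Lumosity's games.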


Surprisingly, some of Lumosity's research has been published in real peer-reviewed journals. But none of this research provides evidence that their method actually works. They show some evidence that these games can accurately measure cognitive abilities in aging adults, which is a different claim, but actually a pretty interesting one.

While I have been pretty critical of Lumosity's false promises of cognitive enhancement, the silver lining is that they have just conducted the largest neurobehavioral assay of all time. They have data from millions of people, men and women, of all ages, from different parts of the world, performing dozens of tasks that measure their cognitive abilities. This is really cool because, at an unprecedented scale, we might be able to find patterns in age-related cognitive decline and identify populations that are resilient to it. This is really valuable data that has the potential to move science forward, though it comes with some serious ethical 'is it worth it' considerations.

I'm really only worried about this because I'm sure that the vast majority of their users believe that these games are actually making them smarter, and I wouldn't be surprised if most of those users are older adults and/or people who are worried about their mental health. It seems likely that they are taking advantage of a vulnerable group of people who are suffering from real cognitive impairments such as dementia and Alzheimer's disease. If that is true, then they are being intentionally deceitful, using the smoke and mirrors of 'neuroscience' to claim that this is a product that might really help people. Frankly, they are snake-oil salesmen: they are selling a sham product that they have no reason to think will help anybody as something akin to psychoactive medicine.


Also, while most of Lumosity's games mirror real cognitive psychological assays, some appear to be completely made up, with seemingly no psychological basis at all!


Here is one where you play the role of someone in a kitchen who is supposed to match faces of cartoon characters to the food orders they place. In this scenario, you must help this Mark Zuckerberg-looking fellow get himself a juicy little cheeseburger.
What. the. fuck. is. this?


Cognitive Enhancement through brain training games might be possible after all
Ignoring the aforementioned neuromarketing schemes, there are a few groups studying brain training correctly. For instance, there is a group at UCSF showing cognitive enhancement through video game training, and they are doing it really, really well (http://gazzaleylab.ucsf.edu/neuroscience-projects/neuroracer/). After publishing a finding in Nature, a top peer-reviewed science journal, that their training works and generalizes, they are going through clinical trials to get their neurofeedback training approved by the FDA as a possible treatment for ADHD. They are different from Lumosity because they are going through a rigorous process of demonstrating that their method might actually help people with disabilities, while making sure that their users don't have adverse side effects like video game addiction. Moreover, this could actually revolutionize how we treat people with attention problems: instead of prescribing drugs that act on the entire brain, these cognitive training tasks have the potential to target the specific pathological networks. We just have to be sure that it works.


What did we learn?
Cognitive improvement through commercial brain-training games hasn't been proven (yet). Someday somebody might come out with a report providing tons of evidence that these games work, but until that day, I remain very skeptical. The other issue at hand is that we don't really evaluate many of the ways that we teach and learn. Besides the same basic formula we've used in schools for over a century, we don't yet know the most efficient ways to train our mental capacities. I think it's a great idea to experiment with cognitive enhancement methodologies, but it's important that we know whether these techniques work before moving forward.


In three sentences
Don't buy Lumosity. Sometimes people will do bad science to mislead you. We should challenge ourselves to think critically about how we teach and learn.



