This article originally appeared on the Christensen Institute blog and is republished here with permission.
Too often, educational research falls short of providing educators with actionable advice. However, a recent controversy surrounding Carol Dweck's well-known growth mindset research gives me hope that we can move toward research that better informs and supports professionals and students.
Fortunately, over the past two decades, educational research has moved to adopt randomized controlled trials (RCTs) when possible. However, even when an RCT is conducted, educational research tends to stop there, at a stage where all a researcher can claim is that some intervention produces a desired outcome on average.
Research stuck at this stage can only tell us what works on average: what people call “best practices.” However, what works on average often does not work for a specific individual in a specific circumstance. Only moving to more nuanced statements about what works for whom and under what circumstances allows researchers to offer practical insights that educators can use reliably and predictably.
So how do we do that? The key is to move beyond inductive research that looks for average correlations across large N sizes, to deductive research in which we look for anomalies: specific circumstances where the result we see is not what the RCT (or the large data set of correlations and studies) would have predicted.
Researchers often lament anomalies as failures of their theory. But anomalies are actually good news because they allow researchers to say, "There's something else going on here." And that is what leads to better understanding.
Instead, what often happens in educational research is that one group of academics conducts a study that shows a positive correlation between a set of recommended actions and a desired outcome, and another group of academics conducts another study that shows something different. However, almost always in these large data sets or RCTs there are anomalies lurking: a particular student, class, or school for which a given intervention did not produce the desired outcome.
When researchers avoid acknowledging anomalies and instead simply attack each other's opposing theories, all we get is a giant game of "my correlations are better than yours," but nothing that helps people on the ground.
A recent controversy over Dweck's famous findings on growth mindset, which Melinda Moyer covered in "Is the growth mindset a sham?", illustrates the point.
Growth mindset is the belief that one can improve one's abilities through effort, learning, and perseverance. Historically, the average statement has been that those individuals who have a growth mindset tend to achieve better results than they otherwise would and are able to overcome challenges.
But as Moyer wrote, a recent meta-analysis (a review of several independent studies on the same phenomenon) in Psychological Bulletin by Case Western Reserve University psychologist Brooke MacNamara and Georgia Tech psychologist Alejandro Burgoyne "concluded that 'the apparent effects of growth mindset interventions on academic achievement are likely attributable to inadequate study design, reporting failures, and bias'; in other words, the science on growth mindset is flawed and the approach doesn't actually improve children's grades."
This is very similar to the classic case of pitting one set of correlations against another: a classic "average" food fight that doesn't help people on the ground. As Moyer wrote, "Their goal was to determine whether, on average, growth mindset interventions improved academic performance." To do this, they grouped students together regardless of circumstances.
Moyer then profiles another meta-analysis, published in the same issue of the journal by several researchers, that came to a more nuanced conclusion: it "found positive effects on academic outcomes, mental health, and social functioning, especially when the interventions are delivered to people who are expected to benefit the most."
According to Moyer: “The other meta-analysis, on the other hand, attempted to determine when and where growth mindset interventions worked, and when and where they did not, using a slightly different data set. In essence, they did the (sic) opposite of grouping all the students together. These researchers found that growth mindset interventions worked for some groups and not others and that they helped struggling students the most, which, if you think about it, makes a lot of sense. When children are already earning excellent grades, growth mindset interventions are not as important or helpful since students are already performing well. But when students struggle in school, the researchers found, growth mindset interventions can help.”
Interestingly, the meta-analysis criticizing growth mindset also found some evidence of the same varied effects, Moyer wrote. “When they looked at the various studies and looked specifically at how growth mindset affected students who got low grades, they found that the interventions did have some beneficial effects.”
And even more interesting: "After conducting those two meta-analyses, Elizabeth Tipton, a statistician at Northwestern University, and her colleagues heard about them and decided to conduct still another meta-analysis of data on growth mindset. They looked at the same studies included in the 'growth mindset doesn't work' analysis, but instead of pooling the data together, they further separated the various effects. They concluded that 'there was a significant effect of growth mindset in the (at-risk) focus groups.' In other words, once again, the growth mindset seemed to help kids who weren't doing well in school."
Another way to put all this is that there is an anomaly: the growth mindset doesn't seem to work as well for those who are already high performers. I suspect Dweck might push back and say something like, "That's true, but when work gets difficult in the future and they experience difficulties, having a growth mindset will serve them well." That's certainly the implication of many of Dweck's stories about stars like John McEnroe in her book "Mindset" (however debatable it may be to analyze a star one doesn't know personally).
But leaving that aside, Tipton defends the need to improve research by looking for anomalies and borderline circumstances. As Tipton told Moyer: "There is often a real focus on the effect of an intervention, as if there was only one effect for everyone," she said. She argued that it is better to try to figure out "what works for whom and under what conditions." I agree with her. But not all researchers do this, which I find unfortunate for those who try to transcend supposed best practices to do what will work in their specific circumstances and with their specific students.
What's more, I've long heard researchers say that there are other anomalies in which growth mindset alone doesn't make sense. Moyer also writes about this: "Some researchers, including Luke Wood of San Diego State University, have argued that focusing solely on effort could be detrimental to children of color, who can benefit from being praised for both their ability and their intelligence. (Here is a great article by journalist Gail Cornwall that delves into Wood's concerns and recommendations in more detail.)"
Ultimately, we need more anomaly-seeking to continue strengthening growth mindset theory. It would be fitting if Dweck herself led this movement. Doing so would not only help disseminate findings about the theory's limitations, but would also help educators in the field know how, where, and when to put growth mindset into practice.
Because ultimately, when growth mindset fails to produce the results it aims to produce, that doesn't undermine the overall theory. Instead, it gives us the opportunity to make the theory grow.