Buzzfeed, January 14: A Mindset Revolution Sweeping Britain’s Classrooms May Be Built On Shaky Science.

Somebody needed to write this article. It’s written very well. I’ve talked to the writer, Tom Chivers, and he was very careful and seems like a great person. The article even quotes me, although I think if I had gotten to choose a quote of mine for thousands of people to see, it wouldn’t have been the one speculating about Carol Dweck making a pact with the Devil.

But I’m not entirely on board with it.

Growth mindset has been really hyped and Carol Dweck has said it can do implausibly exciting things, okay. A lot of smart people are very suspicious of growth mindset and think there has to be some trick, sure. There’s a high prior that something is up, definitely.

But one thing that needs to be at the core of any article like this is that, if there’s a trick, we haven’t found it.

I tried to be really clear about this in my own (mostly pessimistic) article on the subject:

It is right smack in the middle of a bunch of fields that have all started seeming a little dubious recently. Most of the growth mindset experiments have used priming to get people in an effort-focused or an ability-focused state of mind, but recent priming experiments have famously failed to replicate and cast doubt on the entire field. And growth mindset has an obvious relationship to stereotype threat, which has also started seeming very shaky recently. So I have every reason to be both suspicious of and negatively disposed toward growth mindset. Which makes it appalling that the studies are so damn good.

This is the context of my speculation that Carol Dweck has made a pact with the Devil. I haven’t accused (for example) the stereotype threat people of making a pact with the Devil. They did some crappy studies and exaggerated the results. That doesn’t require any diabolic help. Any social scientist can do that, and most of them do. What’s interesting about the growth mindset research is that it looks just like the sort of thing that should fall apart with a tiny gust of wind, but it actually hangs together pretty well.

BuzzFeed doesn’t really challenge that. The article spends most of its time snarking about how overhyped growth mindset is – and no objections there, given that its advocates claim that it can eg help defuse the Israel-Palestine conflict and bring peace to the Middle East. It spends a bit more time talking about how many people are doubtful – no objections there either, I’m doubtful too.

But in terms of the evidence against it, it’s kind of thin. I only see three real points:

First, it uses a technique called GRIM (granularity-related inconsistency of means). I like the article's explanation, so I'm just going to quote it verbatim:

It works like this: Imagine you have three children, and want to find how many siblings they have, on average. Finding an average, or mean, will always involve adding up the total number of siblings and dividing by the number of children – three. So the answer will always either be a whole number, or will end in .33 (a third) or .67 (two thirds). If there was a study that looked at three children and found they had, on average, 1.25 siblings, it would be wrong – because you can’t get that answer from the mean of three whole numbers.
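The check the article describes is simple enough to sketch in a few lines of code. This is my own rough implementation of the idea, not the statisticians' actual tool; the function name and the two-candidate shortcut are my choices:

```python
def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """Return True if reported_mean (given to `decimals` places) could be
    the mean of n whole-number scores.

    The total must be an integer, so the true mean is k/n for some integer
    k. Since k/n moves in steps of 1/n, only the two integers nearest
    reported_mean * n need to be checked.
    """
    target = round(reported_mean, decimals)
    base = int(reported_mean * n)
    return any(round(k / n, decimals) == target for k in (base, base + 1))


# The article's example: three children can't average 1.25 siblings,
# but they can average 1.33 (four siblings among three children).
print(grim_consistent(1.25, 3))  # False
print(grim_consistent(1.33, 3))  # True
```

Note that Dweck's "half scores" defense fits naturally into this framework: if halves are allowed, the total moves in steps of 0.5, which is equivalent to running the same check with n doubled.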

But Dweck says that she “took ambiguous answers as half scores” – maybe if the child was halfway between growth mindset and fixed mindset it was counted as a 0.5. It’s bad practice to do this kind of thing without mentioning it. But everyone engages in some bad practice sometimes. And I don’t see anybody claiming it affected the results, which were very strong and not likely to stand or fall based on these sorts of things. Nobody is claiming fraud, and Dweck released her original data, which suggests she was generally honest but had some sloppy reporting practices. Neither the statistician involved nor BuzzFeed claims this affects Dweck’s work very much.

Second, it mentions Stuart Ritchie’s criticism of a couple of recent Dweck papers which show “marginally significant” results. These results are so weak that they’re probably coincidence, but the papers hype them up. There are a couple of studies like this, but they’re all in very tangential areas of mindsetology, like how children inherit their parents’ mindsets. The original studies, again, show very strong results that don’t need this kind of pleading. For example, the one I cited in my original post got seven different results at the p < 0.001 level. And there are a lot of studies like this.

Third, it mentions a psychologist, Timothy Bates, who has tried to replicate Dweck’s experiments (at least) twice, and failed. This is the strongest evidence the article presents. But I don’t think any of Bates’ failed replications have been published – or at least I couldn’t find them. Yet hundreds of studies that successfully demonstrate growth mindset have been published. Just as a million studies of a fake phenomenon will produce a few positive results, so a million replications of a real phenomenon will produce a few negative results. We have to look at the entire field and see the balance of negative and positive results. The last time I tried to do this, the only thing I could find was a meta-analysis of 113 studies which found a positive effect for growth mindset and relatively little publication bias in the field.

My intuition tells me not to believe this meta-analysis. But I think it’s really important to emphasize that I’m going off intuition. There’s no shame in defying the data when you think that’s justified, but you had better be really aware that’s what you’re doing.

I guess my concern is this: the Buzzfeed article sounds really convincing. But I could write an equally convincing article, with exactly the same structure, refuting eg global warming science. I would start by talking about how global warming is really hyped in the media (true!), that people are making various ridiculous claims about it (true!), interview a few scientists who doubt it (98% of climatologists believing it means 2% don’t), and cite two or three studies that fail to find it (98% of studies supporting it means 2% don’t). Then I would point out slight statistical irregularities in some of the key global warming papers, because every paper has slight statistical irregularities. Then I would talk about the replication crisis a lot.

I could do this with pretty much any theory I wanted. Any technique strong enough to disprove anything disproves nothing.

(and this is especially important in light of recent really strange negative results that eg fail to find a sunk cost effect, something I would hate to enshrine as “well, guess this has been debunked, no such thing as sunk cost now”)

Again, this isn’t to say I believe in growth mindset. I recently talked to a totally different professor who said he’d tried and failed to replicate some of the original growth mindset work (again, not yet published). But we should do this the right way and not let our intuitions leap ahead of the facts.

I worry that one day there’s going to be some weird effect that actually is a bizarre miracle. Studies will confirm it again and again. And if we’re not careful, we’ll just say “Yeah, but replication crisis, also I heard a rumor that somebody failed to confirm it,” and then forget about it. And then we’ll miss our chance to bring peace to the Middle East just by doing a simple experimental manipulation on the Prime Minister of Israel.

I think it’s good that people are starting to question growth mindset. But at this point questioning it isn’t enough. In my essay I tried to find problems that might have caused spurious effects in Dweck’s studies, and patterns inconsistent with growth mindset being powerful. I think we need to do more of that, plus look for specific statistical and experimental flaws in the papers supporting growth mindset, plus start collecting real published papers that fail to replicate growth mindset. Instead of talking about how sketchy it is, we need to actually disprove it.

We owe it to ourselves, to Carol Dweck, and to her infernal masters.