No. No she is not. But Baroness Greenfield recently came under some considerable fire for making inaccurate and unevidenced claims about gaming’s effects on the brain. The once highly respected scientist has now become a go-to for the worst sorts of anti-scientific scaremongering. It’s one thing when a Melanie Phillips type writes any old rubbish that falls out of her face, but when it’s the former director of the Royal Institution, it’s especially sad and frustrating. So how did the member of the House of Lords, and professor of synaptic pharmacology at Oxford, respond to the widespread criticism debunking her claims? She produced more of the same, this time in the form of an edited extract from a forthcoming book, published in the Times (requires subscription) on Saturday.

Employing that most wondrous of scientific article beginnings, her headline reads:

“Are video games taking away our identities?”

Somehow the following article doesn’t consist of, “No. Don’t be so silly.” Instead it’s a couple of thousand words of yet another attack on gaming, without evidence or reason.

I have said this before, and I will likely say it every time: neither I nor RPS is dismissive of, or hostile toward, research into the dangers of gaming. In fact, we enthusiastically encourage it, because as gamers we have a heavily vested interest in being informed about such matters. If gaming is proven harmful (which will admittedly come as something of a surprise, what with the ubiquity of gaming and the lack of demonstrated widespread harm) we absolutely want to know about it, for our own protection and the protection of our readers. And it is for this reason that we get so angry about spurious rubbish being published in the guise of scientific findings or expertise. The more the subject is obfuscated by scaremongering and unevidenced, biased speculation, the more potential danger there is for gamers. So we are very much dismissive of and hostile toward those who perpetuate this. Thus.

Greenfield’s untiring campaign against all forms of modern technology is this time being focused on how exposure to such electricity-based items will irrevocably undo our identities.

“Will ‘identity’ remain a robust and continuous experience, or change in some new way, or – bleakest of all prospects – cease to have any real meaning altogether?”

Let’s hope it’s the last one, eh, Baroness? She explains that using a computer causes you to become a “nobody”, in a way that even drugs can’t. Unfortunately, Greenfield hasn’t heeded Ben Goldacre’s excellent advice in his recent Grauniad column, in which he argued that such serious claims require serious scientific evidence, cited, peer-reviewed, and published in scientific journals. Because when making such astonishing claims, one must not only provide astonishing evidence, but also respect the scientific process. It’s Greenfield’s disrespect for the scientific process, of which she was until recently a part, that makes her articles quite so offensive. For instance, in the Times piece she states,

“One study reports that addicts, with an average age of 27, are spending an average of more than 80 hours a week in online gaming…”

A few questions:

1) What study? Where was it published?

2) Addicts of what?

3) What circumstances are these “addicts” in?

But so apparently uninterested is she in including anything approaching evidence that not only does she not stop to answer any of those questions, she even changes the subject before she’s put in a full stop.

“…while another survey revealed that children in the UK, between their tenth and eleventh birthdays, spend on average 900 hours in class, 1,277 hours with their family and 1,934 hours in front of a screen – be it television or computer (it doesn’t matter, as the two types of device are converging).”

1) What survey? Where was it published?

2) What was the scale of the survey?

3) What were they doing for the other 4,649 hours?

4) How many hours were spent with family before the ubiquity of video gaming?

See, it’s clever the way she lists two activities with hours lower than looking at a screen, making it look like screens are taking up the rest of their time, eh? Somewhat ignoring the 50%+ of their time not accounted for by the apparent “survey”. But hang on – why should we be concerned about these things? What’s the problem here, Baroness?

“Prolonged and frequent video gaming, surfing and social networking cannot fail to have an effect on the mental state of a species whose most basic and valuable talent is a highly sensitive adaptability to any environment in which it is placed.”

Oh! Well then. It cannot fail to! It’s beyond parody at this point. The core of her argument, the crux of everything she wishes to convey, appears to come down to the wildly ambiguous declaration that it’s just the way it is, so it is. Because she said so. And we’re not even out of the first of five columns of this speculative, unevidenced waffle. It gets so much more demoralising.

Greenfield argues that “screen experiences are literal”. Well, what she actually says is, “Might the reason be that screen experiences are literal?” and then continues on as if asking the question is all the evidence we need to assume the question to be accurate. “After all, what you see is what you get,” she argues, explaining that “screen images do not depend for their impact on seeing one thing in terms of anything else.” May a hundred years of film theorists please now headbutt their screens. (It’s important to state that Greenfield frequently conflates television and gaming when convenient in her piece, and thus I feel it’s only fair to do the same when interpreting her statements.)

Of course, the absolutely literal nature of screen images is worse in gaming. Because, as Von Greenfield explains as fact,

“Just as it would be hard to translate inner feelings into literal screen images, so it would be difficult to expect software to help the user to gain a sense of abstract concepts or metaphor. How might ‘honour’, for example, be depicted as a simple visual icon? Or how could the famous lines from Macbeth starting ‘Out, out, brief candle…’ be shown as a visual image that conveys its power, and meaning, as metaphor for death?”

If we can ignore the extraordinary notion that all screen-based experiences are apparently static, visual-only icons, this consistent asking of questions to which she seeks no answer becomes extremely painful for anyone who has experienced, enjoyed or been moved by the visual metaphor of gaming. Let alone film. And that’s ignoring the bewildering logic of Macbeth apparently requiring something screens prevent to be understood. It seems that Macbeth is supposed to only exist as a book, because surely to represent that line in theatre, one would have to use a visual image, rather than hold up the line for the audience to read on giant flash cards. (Oh, and that no line in Macbeth can be described as “starting” with “Out, out, brief candle,” what with that being where the line ends. It in fact comes midway through Macbeth’s twelve-line speech during Act V, Scene V, the second half of the seventh line, following, “The way to dusty death.” Which may, in some small way, help us to understand its metaphorical context as related to death. Just a thought.)

Metaphor, she continues, is essential for distinguishing our brains from those of chimps. So she asks another question.

“Could constant exposure to a literal world, devoid of metaphor and abstract concepts, actually lead to a situation where the user’s brain remains trapped in a literal present with images that really ‘mean’ nothing other than what they literally are?”

Good question! Perhaps the response to such a question would be to begin a study into this matter, and have her findings published in a journal, so that her peers can assess her process, results and conclusions. Although I’m not entirely sure where she’s going to find these screen images devoid of metaphor. Oh no, wait, she’s found them.

“When you play a computer game to rescue the princess, it is not because the princess is meaningful or significant to you – you probably won’t care about her as a person – but because of the thrill of the process of playing and winning. Yet when you read a book, it is because you care about the characters, their relationships with others and their fates: their past, present and future and interrelations with other characters give them meaning.”

To me it seems that Greenfield doesn’t want to hear otherwise. If she had even the merest interest in actually understanding the subject she is so willing to write about, surely she would have asked a single gamer if her statements were accurate. Since she simply cannot have done this, I can only conclude that she doesn’t care about accuracy.

Having just played the indie masterpiece, To The Moon, and encouraged so many others to play it, I’ve been receiving very many messages from people who have been moved by it. Many men, in fact, letting me know that the game caused them to cry, some to sob. A game that’s exactly about the characters’ pasts, presents and futures, and the interrelations with the other characters, that gives the game its meaning. It’s literally about characters’ pasts, and the significance of their interrelations! That game comes to mind as it’s the most recent game I’ve finished. I’m currently playing Saints Row: The Third – a big, AAA action game – and I’m absolutely gripped by it, not just by the compelling action, but because I find I’m completely hooked by the story, the politics of the characters, their relationships, and how they are changing as the ludicrous story progresses. That’s the game I was playing yesterday. I’m not picking and choosing my examples here.

But even those protestations from me are to ignore the complete mystery of suggesting that television, film and gaming are incapable of offering the metaphor that live theatre can. Does Greenfield storm the stage at Shakespeare productions, demanding that she be allowed to interact with the cast, lest the experience become too passive? (Um, because even if that were her requirement, isn’t that what gaming does that other screen-based offerings do not?) Or does she truly believe Shakespeare to have written novels? Because we’re getting to novels.

Then things get really inaccurate. She was saving it for halfway.

“A recent report on 1,400 US college students showed a decline in empathy over the past 30 years, with a particularly sharp drop in the past decade. Screen-based violence has been associated with lower empathy, while repeated exposure to violent video games in turn increases aggressive behaviour via changes in personality factors associated with desensitisation.”

Wow. So much in so little space. Yet again not even a hint as to where this survey came from. How was empathy measured? Where has screen-based violence been associated with lower empathy? I’ve never heard of such a claim, let alone seen a study demonstrating this. And that spectacular last sentence, in which she attempts to use sciency-sounding words to say absolutely nothing whatsoever.

But it gets better/worse. The neuroscience expert explains how human empathy is in fact reliant on the reading of novels.

“Normally we learn to empathise from real conversations where we rehearse eye contact and learn to interpret body language and how and when to give someone a hug. We then progress to reading novels and understanding how differently people can see the world, how they feel and interpret the actions of others.”

I mean, kudos. It’s hard to imagine a more peculiar thing to claim. Humans, as they evolved, would progress from hugging to novels, and that’s how we have empathy. But in Second Life (oh yes, she goes right there, bypassing even WoW!) it’s not the same as learning when to hug before moving on to novels. (Oh, and remember, 10-11 year olds spend 1,934 hours in front of a screen, but only 1,277 hours with their family, and so because the screen number is higher, that means the family number doesn’t exist at all. Right? Surely? It’s birth, then Second Life.) And just in case she weren’t being insulting to everyone possible, she adds,

“It’s not surprising that Second Life is popular with those with autistic spectrum disorder, characterised by their impairments in empathy.”

Yeah, you beastly autistic people, stop finding ways to communicate through technology! Go back to angrily sitting on your own, like you should be. Read a novel.

(Which is to say, stating an example of how a piece of software can be appreciated by those with autism in NO WAY suggests that a piece of software can cause or reproduce the effects of autism, and the implication here is despicable.)

And Twitter. That’s bad too.

“One could even suggest that the constant self-centred readout on Twitter belies a more childlike insecurity, an existential crisis.”

Yes, one could suggest it, if one were prepared to bother to study it and provide some evidence demonstrating it, before declaring it in a national newspaper. If one were a scientist, rather than a scaremongering page-filler.

This continues for hundreds more words. Despite listing far more likely reasons for the three-fold increase in prescriptions for ADHD over the last decade, she still goes on to blame it on gaming. She goes into some detail on damage to the prefrontal cortex in relation to physical injury and schizophrenia, as if some of her scientific past manages to unconsciously force its way out, but then deflates into a “could”- and “might”-laden paragraph suggesting that games could/might cause the same effects, without any evidence for that.

To see a formerly respected scientist acting this way is depressing, and potentially extremely damaging to those wishing to take the subject seriously. But then, this is the same Baroness Greenfield who, when Ben Goldacre suggested she should provide evidence for her claims, replied saying that Goldacre was, “like the people who denied that smoking caused cancer.” This is not, I believe, the way anyone in her position should go about things. Articles like the one described above are unhelpful, unscientific, and impede the progress of any genuinely useful research.

Huge thanks to Michael Cook for letting us know, and to Laurence Pope for the scan.