Susan Greenfield: 'I've always marched to the beat of my own drum'

I meet Susan Greenfield in a hotel room on Albert Embankment with a spectacular view across the Thames to the Palace of Westminster, where she regularly appears in one of her public guises, Baroness Greenfield of Otmoor. It goes with her other roles as a scientist, writer, broadcaster, administrator and, not infrequently, subject of controversy.

The professor of synaptic pharmacology at Oxford has long, youthfully blond hair and is dressed in an above-knee pink dress and cream wedges. She tells me she plays squash as often as possible with a 21-year-old trainer. A remarkably fit-looking 63-year-old, she is an impressive figure in more ways than the merely professional.

Greenfield is also a public figure, and, among other things, that means being a target for animosity, cynicism and ridicule – never more so than in the age of internet anonymity. With rare exceptions – Sir David Attenborough comes to mind – everyone in the media glare receives their share of flak. It's just that some receive more than their share, and a few really seem to get up people's noses.

I ask Greenfield whether she considers herself to fall into that category.

"Probably," she says.

Does that concern her?

"Obviously everyone likes to be liked," she replies brightly. "Hence the popularity of social media where everyone wants to have friends. That actually constrains people's behaviour because they so desperately seek approval and therefore have to conform. I've always marched to the beat of my own drum."

The reference to social media is pertinent because Greenfield has been issuing warnings of the potential dangers presented by the internet, gaming and social media for a number of years now. Her argument is that digital technology is having a huge influence on the way we lead our lives. She speaks of the brain's "plasticity" and how it's known that environment affects brain development. Therefore, she says, it follows that the pervasive and parallel reality of our modern-day screen culture is rewiring the brain in ways that may well be detrimental.

It's not an outrageous suggestion, and indeed many parents have doubtless at some point questioned the healthiness of computer-based activities. Greenfield, though, is not a parent. She is a neuroscientist who talks about the "damage" the 21st century is doing to the brain.

"I've never said 'damage'," she interjects. "Never. That must be someone who has misquoted me. As a neuroscientist it's a term I'd be very cautious to use."

Except that caution was absent in a Daily Mail article that appeared under her byline to promote her 2008 book ID: The Quest for Meaning in the 21st Century.

"Unless we wake up to the damage [my italics]," she wrote, "that the gadget-filled, pharmaceutically enhanced 21st century is doing to our brains, we could be sleepwalking towards a future in which neuro-chip technology blurs the line between living and non-living machines, and between our bodies and the outside world."

It's this sort of thing that has led her many detractors to accuse her of being alarmist and a scaremonger. In her defence she cites various studies and surveys that show worrying trends in areas such as attention span, empathy and aggression, many of which she refers to in her new book, Mind Change: How Digital Technologies Are Leaving Their Mark On Our Brains.

"Mind change" is a phrase Greenfield has coined to draw on the analogy of climate change. The idea is that what's taking place with the mind is, like the weather, global, unprecedented, multifaceted and controversial. The difference, she says, is that "climate change is damage limitation; mind change doesn't have to be".

There she goes again, using that word "damage". The good news, though, is that she believes digital technology should be harnessed "to deliver a wonderful life such as no generation before has had".

With such balancing statements, Greenfield seeks to appear impartial – a neutral scientist simply following the evidence and placing it before us as objectively as possible so that we are in a position to reach our own informed conclusions about the risks posed by the digital world.

But it's abundantly clear from the book and much else that she has written that her argument is weighted overwhelmingly in favour of the contention that the digital world is a very real threat to our wellbeing.

"All that I've done is review the literature, of which there is a small proportion that is positive," she objects. "Inevitably people will say that I'm biased, and everyone is. But I've tried my best to be even-handed, and if the impression you get is that there are negative things, then there are negative things – that's not me."

One methodological problem in this debate is that studies of human behaviour, of the kind that Greenfield cites, are notoriously difficult to control, and even the most rigorously designed experiments are open to a great deal of interpretation. Another problem, say her critics, is that Greenfield prefers to take her concerns to the public rather than to her fellow scientists.

Ben Goldacre, who publishes the Bad Science blog, had this to say back in 2011: "I have one humble question: why, in over five years of appearing in the media raising these grave worries, has Professor Greenfield of Oxford University never simply published the claims in an academic paper?"

She's dismissive of Goldacre, saying she has never met him or spoken to him. "Three times he's had the chance to talk to me and he's bottled out each time."

Goldacre responds: "That's very simply, completely and utterly, entirely untrue. This kind of ad hominem seems to be a theme with Susan Greenfield. Previously she has compared me to the epidemiologists who denied the link between smoking and cancer. That's why I say this professor should simply publish her claims clearly in a peer-reviewed academic paper, with accompanying evidence, so they can be assessed."

On the question of why she has chosen expansive books over a tightly defined research-based academic paper, Greenfield says: "As always in life people want a simple answer. It's like that lovely quote, for every complex problem in life there's always a simple answer and it's always wrong. To have the smoking gun, one experiment? Please. What experiment? Tell me."

Perhaps the most sensitive ground over which she has trampled in her catch-all crusade has been the issue of autism. On several occasions she has drawn a correlation between the growth of digital culture and the growth of autism. When challenged she has responded with the formula "I point to the increase in autism and I point to the internet. That's all."

The psychologist Dorothy Bishop, an expert in developmental disorders, was so infuriated by Greenfield's interventions that she wrote an open letter expressing "dismay" at the way "your public communications have moved increasingly away from science". She reminded Greenfield that parents of autistic children have had to suffer a whole catalogue of false causes, and begged her to "stop talking about autism".

Many observers, including Bishop, think that Greenfield has implied a causal link between the internet and autism.

"Well other people who work on autism have actually said there is one," Greenfield replies boldly, noting that her book has the references. "So they can slug it out with Dorothy if they wish."

In fact a link between technology use and autism has not been established, and nowhere in Mind Change is there evidence to support the claim that it has. Instead, in a chapter decorated with "mights" and "mays" and "ifs", there is a citation that suggests that an increase in online relationships has been accompanied by a decrease in empathy, and a psychoanalyst who says that even though social media users may not have a formal diagnosis of autism, they could still develop "autistic-like traits".

"I also point out the fact that people who have autism are more drawn to the cyber world," Greenfield continues. "Now these are issues that need to be explored rather than just dismissed. If you put it together you have to look at it unflinchingly and think about it, rather than just immediately saying it's wrong and that's not how scientists should work. If you're a scientist, you have to talk about it and think about it. You can't just say it's not true."

Several times Greenfield comes back to the importance of standing up and saying what needs to be said, regardless of the criticism one might receive. She sees her role, it's clear, as an instigator of debate. To that end, she argues, she has published on her website references to the papers she cites, so that everything is available to anyone, scientist or lay person, who wants to question the findings or use them as the foundation for further research.

Greenfield's broad-spectrum approach, it seems, comes back to her instinct for taking science to the public. She first came to attention beyond academia in 1994, when she gave the BBC-broadcast Royal Institution Christmas lectures – the first woman to do so since they began in 1825. The title of her lecture series was Journey to the Centre of the Brain. She says she has a demystifying way of communicating science because she came to it relatively late.

Greenfield didn't study any science at A-level at Godolphin school in west London, and at Oxford she initially read classics. She then switched to philosophy, then psychology and finally to neuroscience. "I went easily into talking to the general public because I knew what they weren't understanding. I know what it's like to be patronised. So I think I had a natural affinity with ordinary human beings."

She's since been a familiar presence on television and radio, all the more so after she was appointed director of the Royal Institution in 1998. She was charged with shaking up the stuffy, Mayfair-based organisation. It was said that the brief was to make it into the "Groucho Club for science".

But after a £22m refurbishment was reported to have left the RI in serious debt, Greenfield was forced out in 2010 amid claims of sex discrimination. She says she can't speak to me about any of it because she settled out of court under terms of strict confidentiality.

It wasn't her first public spat with science's great and good. In 2004 it was widely reported that she was on the list of candidates for a fellowship at the Royal Society, the very pinnacle of the science establishment. It was subsequently even more widely reported that she failed to make the shortlist for nomination. She remains unchosen, a snub that has not gone unnoticed by those who argue that she's a better self-publicist than scientist, nor by those who believe that science is still rife with sexism.

She says she's not really bothered by the rejection, only by its manner. "What made me really angry is that the whole point of those nominations is that they're supposed to be confidential. I'm the only person in history whose nomination was made public. That was a bit shabby."

Did she ever wonder if her public profile affected how she was viewed by her colleagues?

"It would be very disingenuous of me to say 'Oh, no it's made no difference whatsoever.' I'm personally not envious of anyone. Whereas for a lot of people how they're perceived by others and the external awards they get are what defines them. Luckily I was given enough confidence in myself as a child. I'm so happy and privileged to be me."

However, she is damning of scientists who abjure publicity. "If the taxpayers are paying their salaries and the taxpayers are funding their research, then I'm sorry, they have a duty to explain to the general public what they're doing."

Her appetite for performing, she says, was inherited from her mother, who was a dancer. "For me it was second nature to be on stage." Her father was an electrician, and for 13 years, until her younger brother came along, she was the cherished only child of devoted but undemanding parents. She describes them as "wonderful" people who brought her up to do what she believes in "rather than seek approval from the crowd".

The constricting wisdom of the crowd is another threat she sees coming from the internet. Approval-seeking, she asserts, has become the basis of online interaction. "If nowadays everything you are thinking is downloaded and shared, and people comment on it and you respond depending on approbation, surely that will mean you have a less robust sense of who you are."

She believes this development signifies a movement towards "externally constructed identities". But haven't identities always been externally constructed – through families, friendships, school, religion, the workplace, and so on?

She concedes this point but insists that the difference now is that the disconnect between the external self and the "real self" is so great that "you're seeing an increase in narcissism accompanied by a decrease in self-esteem".

The cruel-minded would say that this description exactly fits Greenfield's experience of writing a dystopian novel on the subject of identity, last year's 2121: A Tale From the Next Century. The reviews, she accepts, were "mixed", although that could not be said of the Guardian's. There was nothing mixed about it at all. "It is badly conceived, badly realised, badly characterised, badly paced and above all badly written," concluded the reviewer, Adam Roberts.

Nonetheless, she drew satisfaction from the novel, she says, because she loves setting herself a goal in life and then doing it. "The publishers did put it in for the Booker prize," she confides, a little surprisingly, "so I knew, irrespective of what the reviews said, in terms of writing it couldn't have been that lame. So that was very pleasing, and I have had an approach about a film for it."

But beyond the novel and the books and the internet controversies and the House of Lords and television and all the rest, Greenfield's really big preoccupation is finding a cure for Alzheimer's disease.

She warns against over-optimism but believes that the progression of the condition will be significantly slowed, or perhaps stopped entirely, by drugs that may be 20 years away from appearing in a clinic.

Her own project is pharmaceutical: an attempt to develop a medication that will eventually halt the death of brain cells. "If that was coupled with early identification then you might never have the symptoms."

The best chance there is for a breakthrough, she says, is an approach across neuroscience that allows "a thousand flowers to bloom until there is a clear and accepted theory as to why the cells are lost in Alzheimer's".

There are many people in science – not all of them Greenfield's enemies – who wish she would devote all her time to cultivating her Alzheimer's research. But it's not in her makeup to focus on one thing. Her desire to absorb as much experience as possible in life outstrips even her ambition – which is large – to make a defining scientific mark.

"Unless it's life-threatening or going to cost you a lot of money," she says, "it's always best to take on something rather than look back and say 'if only I'd done this'."

Does she ever look back and say, if only I'd had children?

"You can look back on things and say you've missed out, but it doesn't mean you regret it," she answers. "I remember very clearly the potty training and sleepless nights when my brother came along. Perhaps, unlike most, I didn't have a romantic view of parenthood. In life, you can't do everything."

That's true, of course. It just seems sometimes that Baroness Greenfield can.

Mind Change by Susan Greenfield – an extract

The term "emotional intelligence" has increasingly crept into everyday language to define the "the ability, capacity, skill or a self-perceived ability, to identify, assess, and manage the emotions of one's self, of others, and of groups". Whether or not emotional intelligence is part of, or different from, more general intelligence is an interesting question – but not our immediate priority here. Suffice it to say that if it's something that, like intelligence itself, varies from person to person, then emotional intelligence cannot be a feature that is determined and guaranteed from birth. A survey of 14,000 Michigan college students found that levels of empathy may be declining. While this survey, like all surveys, cannot provide a causal link, the somewhat eerie correlation between the soaring popularity of social networking sites and the decline in empathy is undoubtedly worth considering.

A particularly interesting approach, by Miller McPherson, was to compare ideas of friendship in 1985 with those in 2004. McPherson's team discovered that the 2004 subjects had fewer people they could really talk to, with the number of available confidants down by about a third. Even more alarming, the proportion of those having no one at all with whom they could discuss important matters had nearly tripled. While there were losses both within the family and among friends, the largest deficits in confidants occurred in the community and neighbourhood. McPherson and his colleagues raised the possibility that respondents might have interpreted the question as pertaining strictly to face-to-face discussion; if so, the shift from oral to online communication may account for the apparent decline.

It is easy to see how these two trends, a decrease in empathy and an increase in online relationships, could actually be linked. As the psychologist Larry Rosen has pointed out, if you hurt someone's feelings but cannot see their reaction, you'll lack sufficient cues to understand, apologise or otherwise take compensatory action. The increase in feelings of isolation may be the result of increased opportunity: the ease and speed with which personal information can be posted may encourage thoughtless and possibly damaging information to go out to the world with insufficient forethought. If empathy arises from experience of face-to-face interpersonal communication, and we are good only at what we rehearse, then empathetic connecting in the real world could be a good analogy for the networking between individual neurons which, when they "fire together" – in Hebb's famous words – "wire together". However, if you have no one who you feel cares about you, you might be all the more tempted to be uncaring to others, or at least to care less about being so. And what effect might this indifference have on our own view of what is important and appropriate to share?

Beyond empathy, excessive internet use could lead more generally to a reduced ability to communicate effectively, as it has been associated with a lack of emotional intelligence, including poor performance on interpreting facial expressions. Perhaps it is unsurprising that excessive internet users have deficits in face processing. One particular study used a visual detection system to compare the early stages of the processing of face-related information in young excessive internet users by analysing their EEG. Earlier work presenting subjects with images of faces and objects had established that the brain waves elicited by viewing faces are generally larger and peak sooner than the responses elicited by objects – that is, faces carry more significance for the average observer than objects do. However, the excessive internet users generally had a smaller brain wave response than normal subjects, whether they were looking at faces or at tables. This result suggests that for heavy internet users, faces were of no more importance than everyday inanimate objects.

Meanwhile, three academics at Cornell University, Michael Waldman, Sean Nicholson and Nodir Adilov, have explored possible associations between technology use and the later development of autism. They considered a variety of screen activities, including watching television, watching videos and DVDs, watching films in a cinema and using a computer. A link emerged between early TV watching and autism. If TV can be a factor, it would hardly be surprising if the screen world of the internet turned out to have an impact as well.

So if we accept the broadening of the term "autistic-like trait", the Cornell findings might suggest that we shouldn't, after all, exclude environmental factors in some cases. Rates of autism diagnosis have been increasing rapidly in the past two decades. This increase cannot be attributed solely to genetic causes, and may be partly explained by more astute diagnosis. However, the possibility of triggers in the environment, such as prolonged and early exposure to a world of the screen where no one looks you in the eye, cannot be dismissed out of hand. One study, by Irva Hertz-Picciotto and Lora Delwiche at the University of California, showed that, even after taking into account changes to diagnostic criteria and the broadening of the spectrum, a significant proportion of the rise in autism cases was still unexplained. Even if you are never formally diagnosed, your evolutionary mandate to adapt to an environment that does not rehearse the interpersonal skills essential for empathy might result in the development of autistic-like difficulties with empathy.

To sum up: there is a link between the atypical brain wave responses seen in problematic face recognition, characteristic of autism, and those seen in heavy internet users; a link between autistic spectrum disorders and an under-functioning prefrontal cortex, indicative of a more literal take on the world; a link between early screen experiences and the later development of autism; and a link between autistic conditions and the appeal of screen technologies. While it is impossible to establish cause and effect between these various links, and indeed impossible to draw any firm conclusions, there appear to be parallels between heavy internet use and autistic-like behaviours that deserve further exploration.

This is an edited extract from Mind Change by Susan Greenfield, published on 21 August by Rider Books, £20. To order for £16 with free UK p&p call 0330 333 6846 or go to guardianbookshop.co.uk