The battle isn’t won yet: Why feminism still matters and is relevant to everyone

It is easy for me to be complacent about equal opportunities. I’ve never personally been held back by discrimination. I mean, I’ve had people think it is their right to comment about my appearance, and I’ve even had a few individuals who have bordered on stalking because of my internet presence, and my gender has certainly been a factor in that, but I’ve never not been able to do anything because I’m a woman. Likewise, although I’m a second generation immigrant and my heritage is from a cultural minority, I’ve grown up as a white British atheist and have never experienced discrimination (even if there have been occasional incorrect assumptions about my religion or politics). I’ve had a broad social network, but I’ve never witnessed my friends or colleagues experience overt discrimination either.

I’ve always seen gender stereotypes as something of a challenge, in fact. I was one of three female students who did A-level physics, compared to about 50 males, and got good marks in maths and hard sciences before I went into psychology. As a student I bought a Haynes Manual and replaced the starter motor of my Vauxhall Astra along with an oil and filter change, because I couldn’t afford the quote from the garage. Likewise I have learnt all about the construction of houses, and was involved in the design and manual labour of various home improvements. I’ve been an early adopter of technology and a fan of video games as an emergent art form. And now I lift big weights at the gym, defying the gender pressure to lose fat through cardio rather than build muscle. I’ve encouraged my daughters to be brave and strong as well as kind, and to want more to the story than for the main character to marry the prince and live happily ever after.

So from my position of relative privilege it is hard not to assume that the battle for equal opportunities has already been won. However, as soon as I look a little more broadly at the world this is clearly not the case. So many different examples illustrate how my experience is the exception rather than the rule.

In the UK women on average earn 21% less than men per hour. This is the case in most of the developed world and the disparity is much worse in less developed nations. Although there has been significant progress over the last 50 years towards reducing this disparity, economists estimate the gender gap in wages is likely to take at least another 100 years to close. Even on the most conservative figures, when all the variables that affect wages, such as lower experience due to career breaks and lower levels of qualifications for some population groups, are taken into account, women still earn 5-10% less when equivalently skilled and doing equivalent work. In the most senior roles there are far fewer women, and those that are present earn substantially lower salaries. The earnings gap widens as people get older, and is larger in the higher-earning percentiles of the population, suggesting that choosing to care for children sacrifices status and earnings for the remainder of the woman’s career. These are figures I find appalling.

Thankfully there are movements and books containing advice about how to counter this effect. Sheryl Sandberg’s “Lean In” movement encourages women to take a seat at the table where big decisions are being made in big companies. The excellent “Give and Take” by Adam Grant advises people who are natural givers to advocate for their dependents when making decisions and entering salary negotiations, if they are not assertive or demanding enough when arguing for themselves. And many women and men are advocating helpfully for the value that women bring to senior positions.

In psychology and therapy professions we hit another facet of gender politics, with the dominance of women in the workforce reflecting the idea that empathy and caring are perceived by much of the public as feminine qualities. This message that facts are the male domain and feelings are the female domain is seen to be natural and innate, because of the typical division in gender roles between hunter and home-maker in the origins of our species. However, since industrialisation and the invention of effective contraception, these roles seem to be transmitted more as a story based on past experience than in terms of reflecting the current reality (in which we can purchase food by selling other skills, and few of us would be very good at hunting or gathering our own food if this involved strenuous physical activity). After all, women being naturally suited to be the home-maker was ‘true’ in a time that it was also ‘true’ that the earth was flat, bathing frequently would have been seen as a wasteful fad, nobody understood the connection between hygiene/sanitation and disease, and very few people stayed alive beyond their 40s.

I believe that providing attachment relationships is probably the single most important job in society. That quality of caring about another person, and holding them in mind, is essential for each of us to be happy. It is a powerful gift, whether in terms of parenting, friendship or a therapy relationship. However, I have seen no evidence that efficacy in this role is determined by gender. It may be true that in general women have slightly better ‘folk psychology’ and men have slightly better ‘folk physics’, as Simon Baron-Cohen’s research has shown, but apart from the head start that pregnancy and breast-feeding give to mothers, there is a paucity of evidence that the gender of the parent who takes the primary carer role affects outcomes for children. Certainly, women feel more guilt about returning to work or choosing not to be the primary carer, but does that reflect a genuine concern about attachment security, or the projections of a society where a woman is supposed to ‘have it all’ by balancing work, parenting and her own identity, having gained the expectation of being an equal provider whilst not having handed over an equal share of looking after home and family?

By devaluing caring and empathy for men, we lose a significant proportion of the potential workforce for psychological therapies. Those that remain often have less traditionally masculine qualities than are typical for males (whilst women who gain places in clinical psychology typically have more of the ‘masculine’ qualities of assertiveness, ambition and intelligence than are typical of their gender). We also make it unacceptable for boys and men to express their feelings openly, or to seek help for emotional problems without shame. And of course there is the wider issue of devaluing homosexuality, and through association any gentler or more feminine traits in men (for example with the playground taunt of “gay” for disliked characteristics or outcomes). This leads to lower uptake of psychological therapies or treatments for mental health problems, along with greater rates of completed suicide in young men.

More recently social media has provided new means of networking which have been widely taken up, especially by young people. Mobile phones, text, Facebook, Twitter, chatrooms, Vine, Snapchat, Instagram, Tumblr, forums, multi-player gaming and video chat have allowed people to find those with similar interests and to communicate in new ways, but they have also been media in which new forms of bullying and harassment have emerged, along with pockets of rampant prejudice including misogyny. In these contexts sexism, racism and discrimination have emerged in new forms, and some media are better at moderating this than others. Online video gaming spaces, Facebook and Twitter in particular have proved to be free playgrounds for “trolls” (those who gain enjoyment by harassing others online) due to their reluctance to intervene over abusive content.

There have been remarkably sad examples of what happens when such media allow the predatory minority to find vulnerable targets, such as the tragic story of Amanda Todd, the teenage girl who was encouraged to flash over webcam and then blackmailed with these images by an adult man until the point she committed suicide. There was also the disturbing video manifesto of Elliot Rodger, a college student who killed 6 and injured 13 before committing suicide because of the perceived injustice of not being as attractive to girls as he felt he deserved to be.

In amongst the array of content on the internet a subculture has developed that is profoundly sexist and has disturbing ideas about how to “play the game” in ways that “put women in their place”. Some of the members identify as Pick-Up Artists (PUAs) or Men’s Rights Activists (MRAs), but the idea that women now hold too much power, and that men have seized upon feminist and progressive thinking to impress women, seems to be a common strand. There is great anger from members of these groups towards men who speak up for women’s issues or social issues more broadly, who are often disparagingly labelled “White Knights” or “Social Justice Warriors” (terms which are intended as insults, despite sounding pretty awesome). Many women have learned to use gender-neutral names on social media, and not to speak when playing multi-player online video games, rather than risk the onslaught of comments, which range from “get back in the kitchen” to violent threats of rape and murder against them and their loved ones (especially from male players who have just been defeated by the superior skill of a female player).

The latest iteration of this undercurrent has been the harassment of women who have highlighted the sexist tropes within video games, or otherwise become figureheads of progressive thinking within that culture. Anita Sarkeesian’s highly accessible video series “Tropes vs Women in Video Games” has been a focal point. When her Kickstarter attracted death threats, harassment and attempts to discredit and silence her, the community spoke out by massively over-funding her project and giving it a much bigger audience. However, she has continued to be subject to a variety of death and rape threats for merely casting a light on the fact that a small percentage of the content of many popular video games is a set of tired old tropes in which women are the decoration, the damsel to be rescued, or die as motivation for the hero’s vengeance, rather than being the protagonist of the story. Likewise a bitter ex-boyfriend’s rant about female developer Zoe Quinn made her a target of harassment (with a thin veneer of concern about ethics in games journalism that was not evidenced by similar hounding of the journalists who were wrongly alleged to have given favourable write-ups of her work due to personal relationships with her), and games writer Brianna Wu became a target for writing an article saying that the old stereotype of a gamer has been superseded by a much wider demographic (perceived as a “death threat” to “true gamers”). In each example, the profound sexism of the antagonists is evident, and the impact on the targets has included having to move out of their homes due to the severity of threats to their safety, after their identifying information was discovered and released into the public domain (a harassment tactic known as doxxing).

So whilst I observe from the safe space of being a successful female professional, who to date has had very limited personal experience of sexism, I am reminded that feminism is far from being a battle that has already been won, and equality is far from ubiquitous in the hearts and minds of the whole population. The internet has always been a great leveller, by forcing us to judge people on their words and not on their gender, age, ethnicity, sexuality, disability or any other aspect of their physical self, and I think that is an amazing thing and as close to a meritocracy as we will ever experience. So I am saddened by the resurgence of such hate and vitriol in places where these variables shouldn’t even be relevant, and that there now seem to be topics about which women cannot write without fear of a personal backlash. It shames me that I have a little bit of fear about the repercussions each time I express an opinion online through this blog, or Twitter, or my forays into podcasts and videos. We all need to do our little bit to change this, to speak up for equality and against harassment, and to reclaim those spaces in which prejudice is showing – for the benefit of everyone.

Between a rock and a hard place – when friendship and your professional role overlap

I’ve always tried hard to keep a clear distinction between work and non-work stuff in my life. I expect my friends to be able to offer, on balance, a similar level of support to me as they require from me. If the relationship is too skewed then it will be meeting one person’s needs at the cost of the other, and that isn’t a friendship. Friendships are reciprocal, and allow me to trust enough to show facets of myself that I might not want to reveal in the context of work. In the safety of such a relationship I can have my own vulnerabilities. I can worry that I am less than a perfect parent, or talk about my relationships with other members of my family. I can joke, swear, drink wine, express opinions, or laugh at the contestants on The Apprentice without fear that this will tarnish my professional reputation. The rest of the time I feel like I have my professional hat on. I am in a position of responsibility and power, and I am bound by a code of conduct. When I talk or post online as a psychologist, I run the risk that my comments will be brought back against me when I’m in the witness box, or be taken out of context and misinterpreted by a present, past or future client or colleague.

I am friends with some psychologists and other colleagues from work and via the clinpsy forum. That’s a good thing. We share common values and experiences. We have shared stressors, and we spend time together. I am also friends with other professionals who know me as a psychologist, like lawyers, paediatricians, psychiatrists and social workers. Again, our work overlaps and becomes a topic of mutual interest. I also have non-psychology friends. That’s a good thing, as they bring different ideas and perspectives. They let me relax, share other interests and remind me of the other parts of me outside of being a psychologist. We can cook, eat, play, exercise, explore, talk. We can play video games, make music or art, debate politics and current affairs. As a former supervisor would say, we are people, partners, parents and professionals as well as psychologists, and we need to pay attention to each of those roles. What marks it out as a friendship is that there is trust, and that the relationship is enjoyable or nurturing.

The difficulty comes when you feel like you ‘click’ with someone who you are seeing professionally and feel that had you met outside work it could have been a friendship, as that makes it harder to stay within a work role and remain within the more neutral and guarded boundaries that a professional relationship entails. A therapist needs to respect their clients, be curious about them, accept them, hold them in positive regard and see their potential. The relationship may be very important for the client, who may idealise you and want to bring you into their life. But that doesn’t make it a friendship. The power balance is different in a professional relationship. Within therapy the client is expected to disclose a lot about their life whilst the therapist discloses little. It is not a reciprocal relationship, and the relationship is not there to be enjoyable or nurturing for the therapist. Having started from there it is not possible to reach a place of reciprocity (at least not without a lot of time and distance after the end of the therapeutic relationship). So if you find yourself acting too casually, sharing too much information, or wanting to step outside of your normal professional role, this is definitely something to discuss in supervision.

Likewise, if someone in your personal life starts to use your professional skills, this needs to be handled very carefully. Parents asking for advice about their child’s anxiety or poor sleep may not differentiate whether you are giving advice as a friend and fellow parent or as a professional. A friend who wants guidance on how to access IAPT, or who is feeling suicidal and needs to be taken to A&E, needs to know you can support them as a friend, but not as their psychologist. We may well know the system and the right things to say, or the right people to approach, but it is important not to end up muddling the roles. You can’t ring up the treating clinician of someone you know and say “Hi, this is Dr Silver and I’m wanting to ensure you understand my formulation about my friend Jane”. They are entitled to confidentiality in their therapy and trust within their friendship. But you may also feel a greater obligation to act on concerns about someone’s mental health, or a child protection concern, than a general member of the public.

It is all too easy to get sucked into an uncomfortable place in between. What of someone who approaches you in a way that appeals to both the personal and the professional? They just find you so easy to talk to that they tell half their life story, and the next thing you know you are feeding back a formulation at a dinner party. Where do you go from there? Do you reciprocate and tell the ins and outs of your life, or give them a business card in case they want to follow up the conversation with a formal session? Or the friend who just can’t get an assessment for their dyslexia, but is self-critical about how stupid they are, when you have the psychometrics needed in the office and your assistant has a spare hour on Friday. Surely that’s not so personal? Or the friend of a friend who never seems able to access the services they need. Do you step in and advocate for them? It’s a very difficult call to make sometimes. But in my experience it is these situations that are most likely to fall down around your head.

A colleague of mine was concerned that a friend of a friend (let’s call her Sarah) was discharged from an inpatient stay without proper risk assessment or follow-up. He spoke to the GP and inpatient team to raise concerns, but nothing was done. Sarah later committed suicide, and my colleague was interviewed in the enquiry that followed. The coroner did not seem able to differentiate between a concerned friend who happened to be a professional, and someone with professional responsibility, and he was given a really hard time. This was on top of the guilt he felt for not having been able to prevent Sarah taking her own life.

Another colleague ended up having to drop everything to collect a friend from various complex situations all over the country when she had psychotic episodes, because she would not trust professionals when she was not taking her medication and did not have a good support network.

I ended up writing to the GP of someone I shared an office with early in my career, to report an eating disorder, suicidal ideation and risky behaviour. Because their actions were placing other people at risk, and a supervisor had said it wasn’t their problem, I felt there was little else I could do. I wanted to be supportive, but at the same time I felt like it was unfair to burden me with the information without allowing me to act on it. I was very clear with the person involved that this was what I was going to do if they continued to confide this type of information, and they chose to write down the contact details of their GP knowing that I would share this information. Thankfully, they went on to get appropriate therapy.

When I first met my husband it was evident he was dyslexic. I did some informal assessments so that I was sure my hunch was correct and then pushed him to get formally assessed at university. This confirmed the diagnosis and enabled him to get concessions made about his spelling and handwriting in exams, and I learnt to help by proof-reading his course work. I felt like the assessment needed to be independent to have any authority, and that I could not take on this dual role.

A decade later, I started at a new post and started talking to the IT guy who covered CAMHS, who was concerned about his memory. It was clear he had a specific deficit that had never been assessed, and I owned the WAIS and WMS that were current at the time. With the consent of the directorate manager and my supervisor, I did a full psychometric assessment. We have gone on to be lasting friends, and he credits me with helping him to understand that he is a bright guy with a specific deficit, rather than a guy of mediocre intellect who has done well for himself. However he has never wanted to use the assessment formally.

More recently I spent 24 hours taking an acquaintance to A&E after they confided detailed suicide plans in the wake of a relationship breakdown. After a long time talking in the waiting room before they were seen, they asked me to be with them in the room and share some of their abuse history with the assessing clinician. I agreed, but I had to be very clear to identify as someone from the personal network. Whilst the assessing clinician was keen to make me part of the follow-up plan, I had to set out clear boundaries and decline. I was not a professional to them, and I was not somebody who could take responsibility for this person on discharge, as I lived in a different part of the country.

Each of these has been a learning experience and shown the importance of differentiating the personal from the professional, but it is something I will continue to grapple with, both personally and through supervising others. The nature of our skills and knowledge means that there will always be situations in which people want to use our professional expertise, even when we are not wearing that hat. Whether that is the GP who wants advice about a patient when you go in about your own health, or the business coach who wants to talk about their concerns about their child, or the friend who is giving evidence to a child abuse enquiry. We need to find a way to be both compassionate and pragmatic about the capacity in which we can be involved, to keep ourselves and the individuals safe and ensure they get the right kind of support. Talking to other people on the internet is a role I will blog about at some future point; it brings with it a plethora of new and challenging ethical issues, not least the way that the informality of the medium blurs the line between the personal and the professional.

The double bind of trying to do research as a clinician

In my first job after university I was an Assistant Psychologist on an applied research project evaluating the impact of staff training on the quality of care in old people’s homes. It wasn’t my natural client group, but I loved the fact that we were measuring whether all our fancy psychology ideas actually made a difference to people when applied in practice. My supervisor, Esme Moniz-Cook, was an inspiration in this regard, and set a pattern I have aimed to maintain to this day: innovate, evaluate, disseminate. We wrote lots of papers, and each one worked through many iterations in which Esme would cut up a print-out of my latest hopeful draft and stick it back together and/or annotate it in different coloured pens. I developed a ritual to overcome the frustration: I would look at the result, take a few deep breaths and then set it aside. Then I’d move the text around on screen to match the arrows and sellotape. Then set it aside. Then do a few of the suggested edits that felt too trivial to argue about and set it aside. Then I’d look at the reduced number of suggestions that were left and decide which were worth disputing, and do the rest. Then I’d send it back to Esme and the process would repeat. It was a challenging process, but the papers were always better as a result, and it meant I started training with a set of publications on my CV. My doctoral research felt like a piece of cake by comparison. I developed and evaluated a computer-based training tool to help people (especially those on the autistic spectrum) learn to recognise facial expressions of emotions and predict how people would feel in different situations. I published a paper from it, of course.

As a newly qualified clinical psychologist in 2000, I was all fired up about continuing to do research. I put in bids to two charitable funds to expand my doctoral research to a larger population and to follow up the effects after a period of time. It took me a long time to complete the proposal documents and to get all the signatures, references and endorsements from both academic and clinical hosts. Both were declined. The first said the bid was so strong that it didn’t need autism-specific funding; the second said that we hadn’t accounted for poor uptake or drop-out, because they didn’t know how keen the kids with ASD had been to have extra computer time. It felt like a Catch-22. In retrospect I should have researched the potential funders better; the former probably wanted research that would relate to their patented use of pig secretin in autistic children, as they focused on medical interventions and were not interested in something as ‘soft’ as psychology. I was gutted. In the time I had spent preparing the two bids I could have made significant inroads into the study, but I hadn’t got off the starting blocks.

In my next job, I applied my learning to undertaking small bits of research myself by juggling my clinical time. I explored what it was that defined the young people who challenged multiple agencies. I evaluated how social workers in the adoption support team reacted to having the opportunity for consultations with a psychologist. I measured the Theory of Mind skills of Looked After Children and compared them to children with ASD and a control group. I assessed the cognitive ability and mental health of the children in residential care homes within the catchment. Most importantly, when asked to advise on how to improve adoptive matching, I undertook an audit of factors affecting adoptive outcomes across 116 families. This turned out to be the largest study of risk and resilience factors in adoptive matching ever undertaken in the UK.

I found the answers to my questions, and I learnt a lot of new things that other people need to know. But I had no funded time or academic support to disseminate the results. With the smaller projects, I let my Assistants write up the results for publication in professional newsletters and present them with me at conferences. But with the larger studies I wanted to do them justice, and my NHS role never gave me an opportunity to do so.

Once I left the NHS, I followed up my adoption audit 5 years after the initial data collection, because I was determined not to let such interesting findings disappear into the ether. Then I made another bid for research funding, and by fluke or serendipity secured a £75,000 Shine Award from the Health Foundation to complete a one year study of the impact of adding brief psychological input into the local diabetes service (something I have previously blogged about). We reached the end of the year and applied for further funds to spread the impact, but were told that they do not fund follow-up data collection, or time to write up results for publication, as this should be done in the course of your employment in the NHS. Of course we don’t work in the NHS, and have no funded time as part of a bigger contract or our contract of employment. So the Assistant Psychologist who was doing our data collection and analysis was only funded for the duration of the project, and has since moved on. The £2500 they did offer towards editing our service user feedback into a short video, revamping our website, a press release and a small local event requires a long report before it is released, and that report is now overdue. I’m still trying to work out how to complete and submit the academic papers, but this takes time and means it has to compete against all the other demands of running a business.

Herein lies the big double bind: you need funded time to write bids, and to write up papers to give credibility to future bids, but unless you have academic tenure this time is not funded. For a clinician it competes with more immediate clinical demands, and for someone self-employed it competes with the tasks that actually pay the bills (in my case training, consulting, and up to 4 court assessments per month). There are also other things that demand my time and mental energy (committees, other forms of writing, http://www.clinpsy.org.uk, etc) and I sometimes even have a life outside work. Not everything can fit into the evenings and weekends, even if I allow work to expand into the rest of my life like gap-fill foam.

But I am not defeated. I am determined that the diabetes results will be submitted to peer-reviewed journals by the end of the year. And I’m even more determined that what I have learnt about risk and resilience factors in adoptive matching will make its way into social work practice. I’ve considered all kinds of options, including writing it up as a PhD, or a book, or a series of papers, or all of these things, and seeking funding as a grant, a stipend, or a fellowship. I’ve tried to use my network to find potential funders. But nothing seems to come to fruition in a way that would allow me to have funded time to write it up with input from a statistician, whilst still having at least half my working week to run my company. Perhaps that is just asking the impossible, no matter how close to fruition the research is or how much it could impact on people’s lives. I don’t know.

Meanwhile I keep asking the questions and gathering the data that will answer them, even if I can’t share the results. Innovate, evaluate, disseminate is simply part of what I do. I’ve got grand plans for what the next project will be, but I’ll save them for another blog.

BTW, if you have any ideas where I can secure funding, or want to collaborate with me I’d love to hear from you. Likewise if you can offer an overloaded clinician with big ideas a nominal NHS and/or university home, that would also be very welcome. It is twice as hard to bid for funds when you work in a small company that nobody has heard of!

Dressing for the job: Presentation and the art of neutrality

Everyone seems to have different ideas of how to dress for work as a Clinical Psychologist or therapist. NHS dress codes are often generalised from nurses on the ward and make exclusions, including jewellery, open-toed shoes and nail polish, which seem unduly controlling and irrelevant for a therapist. Others argue that diversity is a good thing, and that a therapist can dress however they want to represent their personality and culture, and should not be judged for it. Many settings exclude jeans, whilst others say that jeans are comfortable, practical and come in a range of colours and cuts that are hard to distinguish from other forms of trouser. Are my dark blue velveteen trousers jeans because they have rivets, rear pockets and a zip fly? Some employers exclude visible tattoos, piercings beyond a single stud in each ear and hair colours that do not occur naturally, whilst these choices are becoming increasingly prevalent in society. Some want staff to dress conservatively, excluding garments that expose the midriff or involve short skirts or low-cut tops for women, but is this just sexist? Some make specifications about being clean and tidy, but this means different things to different people. So I figured I’d reflect on my thoughts on the topic of appropriate dress for work as a Clinical Psychologist.

Firstly at a personal level: When I dress for work, most of the time I try to be as neutral as possible. I don’t want my appearance to lead potential clients and colleagues to make judgements about me that form a barrier between us any more than the factors I can’t avoid such as my gender, age, ethnicity, or accent. I want to appear professional but not intimidating, clean and well presented but not ostentatious. I want my clothes to be serviceable if I want to sit on the floor with a child to play, and not to inhibit me if I want to use paints and felt-tips, and not to cause offence to anyone else. Plus I want to be comfortable and not self-conscious about the way I move or sit. For these pragmatic reasons I normally wear trousers, though I do sometimes wear a long skirt or dress. I also like to wear fairly colourful clothes. I suppose I think they are more cheerful than drab colours, and suit me better than paler colours. So you’ll often see me in navy blue, bottle green, chocolate brown, burgundy, purple, turquoise or multi-colour patterned prints.

Of course I have more scope for relaxed dress, as I am effectively the boss, so no-one can tell me off for what I wear! It also makes a difference that I work with children and families, which is generally a more eclectic and casually presented workforce than settings like neuropsychology or forensics where “power dressing” in suits and business wear is more typical. Likewise my clients tend to be younger and more adventurous about what is acceptable than older adults, and I don’t work in health settings where the stronger dress codes apply. Nonetheless I don’t want to be so casual that people think I don’t take the work seriously, and I know that as a senior professional I need to acknowledge the expectations of others who imbue me with status and power. Certainly when I go to court I will always wear a suit to show I take the responsibility of advising on people’s lives seriously, and when I train other professionals I think carefully about my audience and the message I would like to convey. But here I am thinking particularly about client-facing work.

In terms of others, I’ve had to grapple with all kinds of examples of inappropriate presentation over my career. I have had members of my staff wear ripped jeans and trainers to meetings with professionals and clients. I’ve had some who seem to want to look like surfers or members of an indie band, in slogan T-shirts, bermuda shorts and unkempt facial hair. To me this seems disrespectful towards those who have come to see you. I’ve had staff who have shown too much skin, whether midriff, cleavage, or the dreaded builder’s crack. This can trigger very strong responses in certain people, whether due to their own abuse history, their assumptions about showing more skin indicating sexual availability, or their religious beliefs about modesty. I once had an employee with significant body odour, and had to have a cringe-inducing conversation in supervision to feed this back. Not only is body odour unpleasant for those around you, it also creates a bad impression with clients and colleagues, and would put us in a hypocritical position when observing or trying to improve poor self-care.

At the other end of the scale I’ve had staff who overdress. I’ve had graduates who turn up in power heels and suits for everyday work. Or those who look like they’ve just dropped into a session on the way to the catwalk or tea with a Duchess, wearing designer clothing, expensive jewellery and branded accessories. It isn’t very practical to have shoes that you worry about getting wet, or a coat you can’t put down in a dirty environment, and if a client knows you spent more on your handbag than they get in 10 weeks of Jobseeker’s Allowance this may cause understandable resentment. At a more practical level, doing a home visit in a rough area whilst wearing £1400 of accessories and expensive jewellery, and talking on a £600 smartphone, must surely increase the risk of being a target of crime.

I am acutely aware that many of the families that we see come from high levels of socio-economic deprivation, so I would feel very uncomfortable if I felt that any of my clothes or accessories spoke of excessive wealth. I remember the feeling of visiting a conservative MP at home in a seven-figure mansion full of antique furniture, and a member of the household staff being sent to make the drinks. My mind immediately asked “what can someone who lives like this know of what real life is like for the majority of people in their constituency?” I would hate for a client to think that about me, and to take longer to build trust, or not to be able to confide the whole of their story because of it.

As a result I rarely wear branded clothing, and tend to shop for most of my work wear in department stores or supermarkets. Likewise I wear quite practical and mid-market shoes. Beyond my wedding ring I wear little jewellery. I rarely spend more than £60 on any item apart from a suit, and I don’t wear things to work that I would be too upset to spoil. My approach is also wider than physical appearance: I try to be aware of what I talk about to clients, in terms of whether it reflects my relative wealth and education, or my cultural values. How much of this is my personal taste or my bargainaholic nature, and how much is 20 years of cultivating the most neutral appearance possible, is hard to separate. Also in the mix are the dress styles of the supervisors and mentors that I have most admired during my career (mostly very down to earth people, of humble appearance) and of my own parents, especially as my mum is also a Clinical Psychologist.

I expect my employees to also find this balance between being themselves and appearing professional and neutral for clients. This includes being clean and tidy, not wearing overly revealing clothes, and being smart but not ostentatious. But beyond that I am happy with some personalisation to reflect their own style and culture. I wouldn’t want a team of clones. I also encourage my staff to reflect on what they bring of themselves to a session or meeting, whether in terms of appearance, body language, accent or content in conversation. We are often unaware of how this would be perceived by others outside of our familiar social circle (which is often defined by similar age, socio-economic status, culture, political values, education or other factors we won’t necessarily share with our clients).

For interviews I would always advise presentation that is one step smarter than you’d expect to wear in the job, whilst still being something you feel comfortable wearing and not so bland that you are instantly forgettable.

Finally, I wanted to clarify that although I think it is helpful to dress neutrally in a professional role, I don’t consider a person’s dress to be an excuse for how others react to them. Intolerance of appearance or dress relating to someone’s culture or religion is unacceptable and a form of harassment. Using skimpy clothing as an excuse for sexual harassment or assault is likewise totally unacceptable. A person is always responsible for their own behaviour, no matter how others look or what they do. So what I am talking about in this blog entry in terms of reactions to appearance are thoughts and feelings, whether conscious or subconscious, that may have an impact on the therapeutic or professional relationship, not actions that cause harm or fear.

High on scare, low on science: a tale of charity, politics and dodgy neuroscience

In 2011 when I took a voluntary redundancy from the NHS I was asked to help set up a parenting charity* focusing on the period from conception to age 2. I agreed to be the founding Clinical Director and to help them set policies, sort out pathways of treatment and recruit staff. I worked for them one day per week. After less than six months it was clear that there was a divergence between what I felt was most clinically helpful to say about supporting parents in this critical period and the primary goals of the charity**. This was particularly evident in what was being said to promote the launch event of the charity. The title of the launch conference was the dramatic and pessimistic pronouncement, “Two is too late”. This title was cast in stone despite my repeated protests that parents would feel blamed and might think that there was nothing they could do beyond the age of two if they had not had a perfect attachment relationship before this point (when the evidence suggests that there are in fact many effective strategies for enhancing attachment relationships beyond this point, and many therapies for helping children and even adults to learn to emotionally regulate, mentalise and have successful relationships, even where there have been poor attachments, neglect or maltreatment).

The media were given soundbites to promote the event that suggested a baby is born with only one third of their brain active, and the rest relies on the quality of parenting received to grow. The news coverage in the Telegraph*** said that “a failure to help troubled mothers bond with their babies can stunt the development of the children’s brains”. The BBC coverage*** stated “a growing body of research suggests that the amount a baby is loved in the first few months of its life determines to a large extent its future chances” (when love and the quality of the attachment a parent is able to provide are quite different things, the most critical period is usually cited as 6-18 months of age, and the change in prognosis is most impacted by significant maltreatment).

Although our tiny pilot had successfully kept 5 children out of 6 at home with their parents, despite them being referred on the edge of care, I had some misgivings about the marketing messages. We had feedback from service users and user groups that they felt stigmatised by some of these messages, but the organisation was unwilling to hear that. I am passionate about the value of improving attachment relationships, and I had written a brief literature review on the impact of poor early care to ensure that the project was informed by the evidence. I was also writing a book about attachment and the impact of maltreatment, but I couldn’t match my views up with the politics of the organisation. I felt that to stay would conflict with my professional ethics, and my desire to honour the evidence base and respect the people who needed the service, so I quit before the launch. My colleague decided it would be unsafe to practise in my absence and left at the same time, leaving the charity with no clinical staff. Nonetheless, they decided to hold a very big launch event, which I could only describe as one third professional conference, one third stately home wedding and one third party political broadcast for the blue party. It sold 500 tickets to health professionals and other interested parties, and I went along to see the show.

The speakers included a Conservative Peer, Iain Duncan Smith and Andrea Leadsom, along with Dr Amanda Jones (who shared a case study of parent infant psychotherapy). The fantastic Camila Batmanghelidjh was also present (and made a good job of challenging the lack of empathy from politicians for the people they serve, quipping that this reflects their avoidant attachment styles). I had invited Dr Michael Galbraith (a Consultant Clinical Psychologist who has run community children’s services in Liverpool for many years) to talk about the health economics of early intervention. He did so persuasively, and he also challenged the politics that came before his talk (with genuine zeal, as his entire service had been closed in a cost-saving ‘reorganisation’ a few weeks prior to the conference). But the biggest draw was that Baroness Susan Greenfield was invited to talk about the epigenetic effects of early attachment experience on the infant’s developing brain****. As I had not heard of her work prior to this event I was intrigued.

The talk that Prof Greenfield gave was baffling from the off. It massively overran her time-slot, and the programme was rearranged to give her a second slot in the afternoon to complete what she wanted to say. My recollection was of a chaotic set of shock images and headlines, with provocative statements which appeared to contradict my knowledge of the literature, despite the fact she claimed they were scientifically founded in hard neuroscience research. Thankfully the pdf of the PowerPoint she used was circulated after the event, so you can see the content for yourself (zip file to download here).

Her title was “The mind of the 21st Century Infant”, overlaid on a stock photograph of a baby using a computer. She immediately moved on to dramatic images of a youth celebrating in front of a fire during the recent riots, blaming the riots on the lack of attachment young people have grown up with, which she said had been replaced by technology. She then showed scary images of “artificial intelligence” before trying to define the mind. Then she made a knight’s move to demonstrate that “environment trumps genes” through a single study of rats given genes that cause Huntington’s Chorea, which had fewer symptoms if they lived in a more stimulating environment. Then back to human babies, and images of how neurones proliferate during the first 2 years of life. Then a study showing that the hippocampi of taxi drivers are enlarged, and then some blobs designed to indicate that mental practice of piano activates the brain much like physical practice. Then back to rats, showing more neural connections in a richer environment than when rats are isolated in boring cages. Then a description of how the mind shifts during development, from sensory processing to cognitive experience, giving greater meaning over time, with the view that this is driven by experience.

She then claimed the mind might be “changing in unprecedented ways” due to interaction with technology, and showed alarming headlines circled in red, and book titles reflecting her view that internet use is changing our brains.

Prof Greenfield then showed a study counting children’s hours of screen time reported by parents, according to the child’s age. The source cited turns out to be a report saying that children have always used whatever media are current, mostly watching TV (which has been on for 7 hours per day since the 1970s), and that although digital media are rapidly proliferating, including learning toys, music and phones, total media use by white children had only increased by 38 minutes between 2004 and 2010, though it was higher in low-income families and had increased more in BME families. It says there is no evidence yet about how much is too much when it comes to media consumption, but notes that “media platforms by themselves are neutral; what matters most are the choices made by parents, educators, educational production companies, and other content providers in order to encourage a balanced pattern of consumption”, using the metaphor of needing a balanced diet. This was not reflected in Prof Greenfield’s narrative about this amount of media being harmful, and it is unclear how she extrapolated the figures in her table.

Another leap, and we were onto how dopamine is the reward chemical behind all addictive behaviour. Prof Greenfield said that it changes neural activity, inhibiting the frontal lobes. This, she said, is why children are becoming fat, sedentary and obsessed with technology: these are all addictions, and they disrupt our frontal functioning. Then a leap to schizophrenia not having sufficient frontal lobe activity, reverting the brain to sensory processing which is fragmented and without meaning. Another slide full of brains: the prefrontal cortex is not mature until your 20s. Then a claim that schizophrenia, gambling, over-use of screen technology and over-eating have a common pattern of prioritising our senses over reason, due to dopamine making us mindless rather than able to synthesise meaning. It felt very alarming to have schizophrenia and addictions linked to the same pathways as attachment difficulties and technology use. The implication was that parents could cause these difficulties in how they parented babies, or by allowing children to use digital media. These are claims for which I have never read any scientific evidence, despite being a clinician working in this area and trying to keep abreast of the research literature.

Another leap to social media and how it makes us “alone together”. Prof Greenfield told us how real communication is three dimensional, and little of the meaning is conveyed in the words, whilst 90% is in eye contact, body language, tone of voice, perhaps even touch and pheromones. But online we have only the words. According to her, this is why empathy has dropped over the last 30 years (another newspaper headline, not a scientific study, and with no reflection on the socio-political changes that might explain this). The lack of empathy required is why people with autism are so at home with technology and on the internet. People also have reduced identity, so they have to record their existence online. Prof Greenfield characterised the development of online communication as going from describing your cat sneezing on Blogger, to putting up a photo on Flickr, to a video on YouTube, to live Tweeting the action, saying that such activities reflected the author as a disconnected “nobody” who needs to prove they exist. She postulated that a rise in social networking is the cause of reduced empathy and people having a less robust identity, but it seems to me that even if these two things co-occur the direction of causality could be the reverse.

She then skipped on to the evils of video games, inserting a slide with MRI scans to show reduced listening when looking at something else, before blaming video games for the increase in methylphenidate prescriptions. Prof Greenfield claimed ADHD could be caused by video games because, by activating the dopamine system, they lead to “fragmented attention, shorter attention span and increased recklessness”. Another headline in a red circle said children who love video games have “brains like gamblers”. Then she showed us her own work bringing this together: a proposed cycle in which intense stimulation and immediate feedback lead to high arousal, dopamine release and reward-seeking behaviour, and this produces brain changes which cause “conditions of childhood, schizophrenia, obesity” and a drive for sensation over cognition, increasing the appeal of screen-based stimulation in a continuous cycle. Again, I don’t believe any of these claims have appeared in peer-reviewed publications or have any evidence to substantiate them, and even if there was evidence of co-occurrence the direction of causality is far from certain. There is, however, a growing body of evidence that some symptoms that could be interpreted as ADHD-like are caused by early trauma and maltreatment having an impact on neural development. To end that section, Prof Greenfield juxtaposed the “mindless” brain slide with a shot of World of Warcraft and mocked the lifestyle she believed was typical of those who play the game.

Then Prof Greenfield turned her attention to search engines, claiming they give fragmented information but nothing about meaning. By way of example she claimed that you can’t possibly understand what honour is from the search engine results produced by that term. Again, she claimed digital media is all fragmented content, lacking metaphor, depth and meaning. She strongly asserted that nobody could care about a character in a video game like you care about characters in a novel. Again, I would disagree with this. Like any medium, video games are very diverse in style and quality and you pick ones that fit your taste, just as you would with a book or a film. If you don’t like violence, don’t pick a violent one. You don’t have the same expectations of the latest chick lit/flick as you do of a weighty classic. Some examples are of better quality than others; some focus on special effects over plot, others are low budget and whimsical. In my opinion, if you feel immersed and the story is told well, it feels like time well spent and you care about the characters and outcomes, whether the medium is a video game, a book or a film. It’s disingenuous of her to pick a random game she has probably never played and say nobody could care about a character in it as much as one in War and Peace.

Prof Greenfield then talked a little about the benefits for children of reading with a parent, and how we need to “make up our own minds”. She finished by advertising her books, and claiming that “mind change is the new climate change, the biggest issue facing us in the 21st century”. I would echo the comments on this claim raised here.

The whole thing felt to me like a mishmash of pseudoscience, headlines and speculation that didn’t even address the topic of the conference. Even if there were persuasive scientific research about the impact of using digital media (and I wasn’t persuaded there was), it wasn’t relevant to the conference, as babies don’t use it. Her talk wasn’t about the importance of relationships between conception and two, which was what the conference was designed to highlight. She had come with a single agenda to sell. And it was clear that she was very much an outsider looking in when it comes to technology, judging it with minimal knowledge of social media, the internet, or video games.

As someone fairly immersed in that world, I could pick out numerous examples of violence in TV, film and video games, particularly violence against women and children. I might even be able to make a prima facie case that we are being desensitised to human suffering (and that violence and sexism are being normalised). It is possible that the manufacturers of such products are tapping into various ‘exciting’ neurochemical pathways that deal with arousal and reward (cortisol, adrenalin, dopamine), over those that deal with relationships, empathy, love and the ability to soothe (oxytocin and the work of the prefrontal cortex). But I think Susan Greenfield is making a huge correlation-causality error when she blames new media for people becoming isolated and lacking social skills and healthy relationships. I think there is much more evidence that real life experiences of maltreatment prime certain brain changes that make people more sensitive to later triggers and confer vulnerability for later mental health problems (see the work of Prof Eamon McCrory, for example) than that digital media is the cause of the problem.

I do think that if people lack templates for how to do real relationships in a healthy way, and haven’t learnt empathy and self-soothing skills, then these kinds of media have a stronger attraction and a different effect on their brain, and can perpetuate rather than ameliorate this pattern. However, in the end I figure that people can always fill their time with something that disconnects them from others, or anaesthetises their pain. In other words, it isn’t the availability of the internet or video games that is the problem (any more than the presence of cheap alcohol, or drugs), it is the unhappiness and isolation that creates the void people want to fill with those things. And that has much more complex solutions, though it might generate fewer click-bait headlines.

* It is now nearly 3 years on, and I am confident that the clinicians recruited after I left have been able to establish a high quality service, so I would not wish to imply any concern about the services they provide.

** I felt, cynically perhaps, that there was a second agenda designed to promote the MP who founded the project and her political party which was of more importance than our clinical goals, although this was never explicit.

*** http://www.telegraph.co.uk/women/mother-tongue/familyvideo/9273569/New-post-natal-depression-charity-will-address-huge-gap-in-provision.html

http://www.bbc.co.uk/news/uk-england-northamptonshire-18117945

****The promotional flyer for the event said “We are honoured to announce that Baroness Susan Greenfield, Professor of Synaptic Pharmacology at Oxford University, whose speciality is physiology of the brain will bring you up to date on the Science, Neuroscience and Epigenetics”.