The misrepresentation of evidence

About a week ago I was involved in a heated Twitter debate about this blog post. I felt, as I said on Twitter and in my extensive comments about the blog, that it entirely misrepresented the evidence about Adverse Childhood Experiences by implying that because of risk multipliers within particular population groups, certain negative outcomes were almost inevitable for people with multiple ACEs. The author repeatedly asks rhetorical questions like “If 1 in 5 British adults said they were abused in childhood in the last CSEW (2017), why hasn’t our population literally collapsed under the weight of suicides, chronic illness, criminality and serious mental health issues?” Likewise, she asks how anyone can be successful after childhood abuse if the ACEs research is correct. I replied to explain that this simply isn’t what the data tells us or what risk multipliers mean, so the exceptions are expected rather than proof that the finding is incorrect. For example, the statistic that people with 4 or more ACEs have a 1222% increase in the risk of suicide, which the blog took to mean that these people were doomed, in reality means that the odds increase from 1 in 10,000 to 1 in 92 – in other words, 91 of every 92 people with 4+ ACEs do not die by suicide.
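To make that arithmetic concrete, here is a minimal sketch (in Python, purely for illustration) of how a relative risk multiplier translates into an absolute risk. The 1-in-10,000 baseline below is a hypothetical placeholder rather than a published figure; the 1-in-92 figure is the one quoted above.

```python
# A minimal sketch of the arithmetic behind risk multipliers, to show why
# "exceptions" are expected. The baseline rate is an illustrative placeholder,
# not a published figure; the 1-in-92 figure is the one quoted in the text.

def raised_risk(base_rate: float, percent_increase: float) -> float:
    """Absolute risk after applying a relative (percentage) increase."""
    return base_rate * (1 + percent_increase / 100)

base = 1 / 10_000                   # hypothetical baseline risk
raised = raised_risk(base, 1222)    # a 1222% increase, i.e. roughly 13x the baseline
print(f"raised risk is roughly 1 in {round(1 / raised):,}")
print(f"so {1 - raised:.2%} of that group never experience the outcome")

# Even taking the higher 1-in-92 figure quoted above at face value:
print(f"at 1 in 92, {91 / 92:.1%} of people with 4+ ACEs do not die by suicide")
```

However the baseline is set, the point stands: a large relative increase applied to a rare outcome still leaves the overwhelming majority of the group unaffected.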

ACEs are a very useful population screening tool, and have provided incontrovertible evidence of the links between traumatic experiences in childhood and numerous social, psychological and medical outcomes – evidence that has been highly informative for those of us designing and delivering services. To me it seems like an example of how a simple piece of research can have a massive impact in the world that benefits hundreds of thousands of people. Yet that blog repeatedly implies ACEs are a harmful methodology that “targets” individuals and is used to “pathologise and label children, arguing that those kids with the high ACE scores are destined for doom, drugs, prison, illness and early death”. It has been my experience that ACEs are used not to pathologise individuals, but to highlight increased vulnerability and to identify where there might be additional need for support. For example, I have used this data to argue for better mental health services for Looked After Children.

I felt that the repeated misrepresentation of the maths involved in interpreting risk multipliers undermined the entire message of the blog, to which I was otherwise sympathetic. (For the record, it is entirely appropriate to highlight bad practice where certain professionals are applying ACE scores to individuals inappropriately, and making people feel that their life chances are restricted or their parenting is under scrutiny because of their childhood experiences of trauma.) But unfortunately the author took my polite, professional rebuttal of elements of her blog as a personal attack on her – to the extent that she misgendered and blocked me on Twitter, and refused to publish my response to her comments on my reply on the blog. That’s a shame, as the whole scientific method rests on us publishing our findings and observations, and then learning from the respectful challenge of our ideas by others with knowledge of the topic. But I guess we are all prone to defending opinions that fit with our personal experience, even if they don’t fit with the evidence.

Thinking about how uncomfortable it felt to see someone I considered to be a peer, whose expertise I respected, misrepresenting the evidence and being unwilling to correct their misconceptions when challenged – instead trying to discredit or silence those making the challenge – it struck me that this was an example of a wider issue in the state of the world at the moment. Evidence is being constantly misrepresented all around us. Whether it is the President of the USA saying there is a migrant crisis to justify a wall (or any of the 7644 other false or misleading statements he has made in office), the claims on the infamous big red bus that Brexit would give the NHS £350 million per week, or Yakult telling us their yoghurt drink is full of “science (not magic)” now that they can’t pretend live cultures are good for digestive health – false claims are everywhere.

I stumbled into another example just before I started writing this blog, as I (foolishly) booked accommodation again through booking.com, despite the horrible experience I had last time I tried to use them (which remains unresolved despite the assurances from senior managers that they would reimburse all of my costs). I booked a room in a property in London which they have euphemistically called “Chancery Hub Rooms” to stay over whilst I delivered some training in Holborn. It wasn’t a hostel or a hotel, but just a small terraced house. This time it had keypad entry to the property and to the individual room, a system that I have used successfully several times in Cambridge. Unfortunately it didn’t work so well in London, as they changed the codes twice without informing me. First this locked me out of the room on the night of my arrival (and the beeping of the door as I tried the various codes they sent me woke the lady in the neighbouring room, due to the total lack of sound insulation in the property), and then it locked me out of the property the following evening, when all my stuff was inside. The property also had glass inserts above the room doors that meant your room lit up like Times Square whenever anyone turned the landing light on. I then discovered that the building (which I had already recognised to be small, overcrowded and not compliant with fire regulations) had walls like cardboard, when the couple in the next room had noisy sex, followed by noisy conversation and then a full-blown argument that lasted from 3am to 4am – despite me eventually, in desperation, asking them quite loudly whether they could possibly save it for a time that wasn’t keeping everyone else in the building awake. Of course Booking.com didn’t see it as their problem, and the member of staff I got through to after a 15 minute call queue didn’t even seem to comprehend what the problem was (though I couldn’t tell if the barrier was language, accent or simply not understanding the situation). The property management company just blamed the other guests for being inconsiderate, whilst denying that the building was inherently problematic, and said they had no alternative accommodation or staff intervention to offer.

So I felt I should be able to reflect my negative experience in my review. But oh no, Booking.com don’t let you do that. You see, despite the fact that properties appear to have scores out of ten on every page when you are booking, you can’t actually score the property out of ten. What you can do is choose a smiley, ranging from unhappy to happy, for each of their five rating categories (which don’t, of course, include quality of sleep or feeling safe). So if you think the location was convenient, the property gets a score above five out of ten, no matter what other qualities mean you would never wish to sleep there again. But worse than that, the Booking.com website forces reviewers to write both positive and negative comments of a minimum length, but only displays the positive comments to potential bookers. So my “It was in a quiet, convenient location” gets shown to clients, but you have to work out how to hover over the section that brings up the review score, then click the score to bring up the averages, then click again to access the full reviews, and then switch them from being ranked by “recommended” to showing them in date order to get an objective picture. Only then do you see that at least half the guests have mentioned that the noise from other rooms is a problem, the bathrooms were inadequate or failed to provide hot enough water, the fire exits were locked, the beds were cheap, the pillows flat, there was a strong smell of air conditioner that left people itchy or wheezy, the TVs in the rooms don’t work, and not a single person had got the free wifi to work. It really made me wonder whether Advertising Standards might have something to say about it.

But Boris has faced no consequences for his bus claims (even though he stretched them further still after the ONS said he had misrepresented the truth), Trump has faced no consequences for his lies, and the consultants selling contracts worth hundreds of thousands of pounds of public funds to children’s social care departments proudly told me they just wanted to get on with the doing without that slow process of validation. The world carries on with little more than a tut of disapproval towards people and businesses who intentionally mislead others. Maybe I’m in the minority to even care. But I do care. I feel it is the responsibility of intelligent people and critical thinkers – people in positions of power, in the professions and particularly in the sciences – to ensure that we are genuinely led by the evidence, even if that makes the picture more complicated, or doesn’t confirm our pre-existing beliefs. To counteract this age of misinformation, we all need to be willing to play our part. That is why I have always placed such a focus on evaluations and research, and have developed my screening tools so slowly and thoroughly, even though potential customers probably don’t see this as necessary. I believe that, as much as possible, we should be promoting the value of evidence, educating the public (including children) to think critically and evaluate the evidence for claims, and stepping up to challenge misleading claims when we see them.

 

The elephant in the room: Mental health and children’s social care services

I heard a few months ago that the Housing, Communities and Local Government Select Committee were undertaking an inquiry into the funding of local authorities’ children’s services, and thought that sounded like an interesting topic that might relate to my areas of interest. I therefore met with a local MP about the topic, contributed to the BPS response to the inquiry, and (at the request of the committee) submitted my own response in relation to my innovative work with BERRI. I have subsequently been called to give evidence in person to the inquiry in a few weeks’ time.

Given I’ve been so immersed in this issue, it seemed a good topic for a blog. I’m going to start with the evidence that this sector is in crisis, before thinking more about what a clinical psychologist like myself can contribute to addressing elements of this need. Hopefully I can then write another blog in a few weeks’ time about my experience of giving evidence, and report back on whether the politicians grasp the issues and appear motivated to do something about it.

It didn’t surprise me that this was an issue to which the government wished to give more scrutiny, given the steep increase in need in this area over the last decade, whilst funding for local authorities has been substantially reduced by the government’s austerity agenda. Human distress and unmet need rarely seem to gain political attention unless there is such a crisis that the public are aware of the issues, or there are financial implications for the public purse, and children’s social care has suddenly hit both of those thresholds in the last year or so.

A number of factors have combined to increase need in children’s services. These include growing awareness of child abuse and its impact (particularly emotional abuse, which has long lagged behind the more tangible forms of abuse), along with reduced stigma around disclosing having been abused (due, for example, to the publicity surrounding the Jimmy Savile scandal, the various institutional abuse inquiries, and the #metoo movement) and a reduced tolerance for forms of abuse that had been normalised or ignored in the past (due to cases like Baby P and the Rotherham child sexual exploitation trials, and subsequent prosecutions in many other areas). A lot of teenagers who had been allowed to remain in unsuitable living circumstances, because of the belief that they would “vote with their feet” if removed, are now appropriately protected and brought into Care, perhaps because of some precedent-setting cases in which people have taken successful legal action against local authorities and been compensated for failures to protect them in childhood. This includes an enormous legal settlement for two Care leavers from Jersey, who received tens of millions of pounds in compensation.

Children in Care are also now entitled to stay in their foster placements up to the age of 21 where they want to and it would be beneficial for them, and to have support from a personal advisor after leaving Care until the age of 25. Another pressure is the reduced use of secure units on welfare grounds, and a reduced willingness to incarcerate children in institutions for recurrent minor offending. The increased stress, shame and social hardship caused by benefit changes and increases in the cost of living have led to more children growing up in poverty, and more families developing the risk factors that can cause harm to children, such as drug or alcohol use, mental health problems, domestic violence and family breakdown. This has had a particularly negative impact on families in lower socioeconomic groups.

It is therefore unsurprising that over the same period the demands on social care services have risen steeply. Over the last decade there has been a 9% increase in referrals to social care and in the number of children considered to be in need, but an 84% rise in child protection cases, and 26% more children are in Care. This creates a lot of additional workload for children’s services, with a 122% increase in demand for section 47 enquiries, and a 125% increase in Care Proceedings (as fewer children are now informally Accommodated with parental consent). Yet the budgets have shrunk, so there is no resource available to meet this need.

The financial picture is genuinely shocking, and yet it has hardly made the news (perhaps because looking at the numbers is considered too technical or boring for the lay public, and the political and news agenda has been hijacked by the continuing debacle of Brexit). But the figures make sobering reading. The cuts to local authorities since 2010 are unprecedented. The National Audit Office highlighted the extent of the shortfall in their report on the financial sustainability of local authorities published last year. They point out that central government funding for local authorities has halved. This has been masked by changes in how funding is delivered, and by some additional funds from council tax being made available to spend locally, but the cuts are still enormous and amount to a real-terms reduction of nearly one third of the entire budget for local authorities, with the burden again being disproportionately felt in more deprived areas.

Such cuts are unrealistic and unsustainable, as they make the total budget too small to cover anything other than statutory services, which are legally protected. This means that councils have no way to make ends meet without dipping into their savings. The report shows that two thirds of local authorities had drawn from their reserves by 2016-17, so there is an ever decreasing amount left in the pot for contingencies, and the National Audit Office predicted that 11% of authorities will have emptied that pot by the end of this financial year. Councils are having to sell off properties and come up with increasingly radical plans to try to fulfil their minimum duties. Recently Northamptonshire County Council had to declare itself effectively bankrupt, as it had no means of covering statutory services from the available budget.

This mismatch between demand and resourcing has led to enormous cuts to non-statutory services, with two thirds of the spend on preventative and community children’s services disappearing. This means that, as with mental health, there is a minimal set of brief services for milder or less entrenched difficulties, and then an abyss in which no services are available until families reach the threshold for crisis-focused specialist services – which are expensive and time-consuming to deliver and can’t keep up with demand. The focus has moved from collaborative work to assessments and interventions that are perceived as the end of the line, despite the absence of the precursor interventions that might have enabled change.

To me, the elephant in the room when it comes to children’s social care is mental health need. I don’t just mean the clean, single-condition, diagnosable and treatable mental health need that gets through the doors to CAMHS. That’s the need up on the sterile concrete plains of mental health research that Prof Miranda Wolpert describes so well. I mean the real, messy need down in what Miranda calls the swampy lowlands, where real, complex people live in varied circumstances, where numerous issues intersect to create barriers in their lives that are not straightforward to address, and do not fall into the simple diagnosis-to-treatment pathway that currently gets through the doors to CAMHS. That is the need that determines the outcomes for these children, and the pathway on which they leave Care and try to negotiate adulthood. It is that need which determines whether they can go on to happiness, employment and family life, or whether they become one of the Care leavers who end up facing prison, homelessness, mental health problems, addiction, conflict and/or their own children going into Care.

So what are these broader mental health needs? In my experience, a complex and interwoven picture of trauma, adversity, behaviour problems, attachment difficulties, developmental disorders or delay and mental health needs is typical of children in Care or receiving social care services. As well as the traditional “mental health” needs of anxiety and depression, I see a much broader picture that is expressed in a variety of ways. Some children act out with their behaviour; others withdraw and show signs of emotional difficulties (including low mood, poor self-esteem, and a lack of positive identity or sense of belonging). They often struggle to form healthy relationships/attachments to others, and can present a risk to themselves and others. They have an increased prevalence of conditions like Learning Disability, Autism, ADHD, or psychosis, which add an additional layer of challenge to standard services meeting their needs effectively. That is why my BERRI assessment system attempts to cover all of these areas.

Seen as a group, children who are Looked After have high levels of mental health difficulties (45% have a diagnosable condition, and over two thirds have significant mental health need), so it would be easy to blame the Care system. However, this extraordinary level of need predominantly arises before they come into Care. It is well established that Adverse Childhood Experiences lead to multiple layers of vulnerability, and these are very prevalent for Looked After Children (my own research suggests an average of 4 historic ACEs per child, along with 2 current vulnerability factors at the point they come into Care, such as involvement in gangs, sexual exploitation, school exclusion or the criminal justice system). The vast majority of Looked After Children are traumatised children who have experienced abuse and/or neglect. But these problems don’t occur in isolation. They are contextually embedded. Children in Care come disproportionately from families that experience the adversities of poverty, crime, family breakdown, and poor housing. They are more likely to be born to parents who have lower education, higher risks of unemployment, and a higher incidence of mental health problems, substance misuse, domestic violence and a history of abuse or neglect in their own childhoods. As a result, their parents are less able to provide safe and stable care. Patterns of difficulty often carry through many generations of the family, and the problems they face are a symptom of our increasing social inequality.

However, CAMHS are not really set up to meet these complex and interwoven needs, and cut off at 18 years of age, whilst children can stay in Care until they are 21 and receive leaving care services until the age of 25. These young people also have ongoing needs that will have to be revisited over time, as they develop and different themes emerge when they enter different life stages or face different challenges. It might be that a dental care model, in which there is long-term oversight with responsive services as and when needs emerge, works better than the time-limited episodic care that is currently on offer. Likewise, services need to be embedded so that they collaborate with placements and other support services, rather than standing in isolation.

The underlying contextual and vulnerability factors mean that treating symptoms, or even specific conditions, might be an ineffective model of intervention. We need to think back to Maslow’s hierarchy. These children first and foremost need their basic needs met, and to have reliable food, shelter and warmth. They need safety and security, medical care and an environment that doesn’t contain ongoing risks. They need opportunities for identity and belonging, such as education, employment, hobbies, peer relationships, and family. They need intimacy and trust in their friendships, sexual/romantic relationships and relationships with carers. When that is reliably in place they need opportunities for achievement and being valued, so that they can gain self-esteem, confidence, status, responsibility and individuality. The icing on the cake is then self-actualisation: the chance to explore creativity, set goals, reflect on morals and values, and feel purpose and fulfilment. Mental health needs only fit in mid-way up that pyramid. We cannot expect a child to have a positive outlook, good coping strategies and social skills if they are not in a safe environment, don’t have their basic needs met, or cannot trust those around them. To see the point of going along to a therapist takes enough self-esteem to believe you deserve to feel happier; you then need the organisation and social skills to get there, and the trust to confide your story, or a carer who will advocate for you and help you to achieve these steps. There are many building blocks that need to be put in place by the caregiver and environment before therapeutic interventions are possible, and it may be that when we get these other elements right, the child is able to recover using their own resources and those of their caregivers, without ever seeing a therapist.

My perspective is that if we can identify children’s needs as early as possible and skill up the caregivers and the systems around the child, we can make the most impact. That is why I have increasingly moved from working with individual children to working with their caregivers and the systems that surround them, and have developed the BERRI system to identify needs and help carers understand them, as well as developing and delivering training to help carers and professionals better understand the needs of the children and young people. It doesn’t have the depth of working psychologically with a single individual, but it has the scope to make an impact on a much wider scale, and it fits better with my personal strengths and interests. As I’ve said before, I’m not the most patient therapist to walk a long journey of recovery or personal development with a client, but I do have strengths in assessment and evidence-based practice.

My aims have always been to address human needs. I believe that Clinical Psychology in its simplest form is an attempt to make people happier and more able to lead fulfilling lives, and that is what drew me to this profession. And within that broader mission, my focus is to work with the most vulnerable members of society at the earliest possible point in the lifecycle, which has brought me to working with Looked After Children and the broader population of children and families receiving (or in need of) social care services. Recognising the mismatch between the level of need and the resources available to meet that need has increasingly led me to focus on systemic and population level interventions. Rather than drowning in the burnout that comes with trying to solve an overwhelming problem, I’ve tried to find a niche where my skills can make an impact. Having looked at this population group from multiple perspectives, and tested out projects in various settings, I have become increasingly persuaded that there is scope to make positive changes through the use of better systems to identify need, and increased clinical governance over the choice of placements and interventions. 

I have tried to develop practical, cost-effective ways to make a difference, and to gather evidence of their efficacy. I have then tried to share my findings, and what is already known from research, with the widest and most influential possible audience. That is why I have given so much of my time over to writing best practice papers and contributing to policy. Through these experiences I have gradually learnt to shape the messages I share to make them relevant and understandable to various audiences. After all, whilst most of psychology seems like common sense to those of us working in the profession, once we have learnt the main findings and the methodologies for gathering knowledge, to lay people (and to professionals, commissioners and politicians) it can seem very complex and unfamiliar. Over time I have learnt that being able to articulate the financial benefits of improving people’s lives helps to get decision makers on board. So my goal in responding to the inquiry was to explain both the human and the financial case for greater psychological input for children receiving social care services. I don’t know how well I have achieved that, but I’d be interested in your thoughts and feedback.

Reaching the summit?

For a long time, I’ve had a metaphor in my mind about how it feels to run a small business aiming to change children’s social care. The image is of me rolling a massive boulder up a hill. Progress is slow, it is hard work and I often find it tiring. Even when I rest I have to do so holding the rock in place. At times I feel like I might be reaching the summit, only to see that there is another climb ahead. I sometimes wonder why I’ve taken on this mammoth task, or whether my goals are even possible, but I am stubbornly determined that now I’m so far up the hill I don’t want to give it up. Maybe that is about sunk cost. But I’ve chipped off the worst of the bumps from the rock and got my rolling technique worked out, so I keep telling myself that if anyone can get this thing to the top of the hill, I can. Over the years of my journey I’ve tried to encourage other people to help me to push, so I am not bearing all the weight, but whilst I’ve had good company at times and plenty of encouragement, it has always seemed like the task is mine alone. That has been reinforced by numerous people telling me how I’m uniquely skilled at rock-rolling, even though I know that I was no better than many other people at the start of my journey. In fact I’m pretty sure anyone with some pretty basic skills who rolled a rock for this long could be standing in my shoes.

Of course, that bypasses the fact that I had to be willing to spend a lot of time on this, be resilient in the face of obstacles, and give up other, easier opportunities to stick with it. And the fact that I had the intellectual, social and personal characteristics to work out how to do this, choose a viable route and make improvements along the way. It also omits to mention that had I known at the beginning that the real scope of the task would take me over a decade, I might not have taken it on. On the other hand, perhaps the fact that it was difficult enough that nobody else would take it on was why I did. I think those who know me might point out it isn’t the first time I’ve jumped in at the deep end, and that I don’t do things in half measures. I don’t like taking the easy route in life, and if I set myself a challenge I like doing the task properly. I’ve always thought about what I can do to make the most impact, rather than how to have the easiest life or earn the most money. I prefer to cut my own path rather than take one that is already well-trodden, and to find a way to enjoy the challenges of the journey.

So here I am, pushing my boulder and feeling like I’ve come quite a long way over the years. I might be deluding myself, but the gradient appears less steep these days. In fact, it feels tantalisingly close to reaching level ground, and I am starting to imagine what it might be like to roll my boulder down the other side of the hill. I’m trying not to be complacent that I’ve reached a point at which the boulder is stable enough not to roll back the way we came up, but people are starting to talk about how this boulder is not just on the level, but given one more push might gain enough momentum to create a landslide that will divert the river to irrigate the lands the local population need to farm. That would be beyond my wildest dreams. I mean, the motivation behind all this is to improve the lives of people who are having a tough time, but to think that it could have impact on the scale some people are now anticipating is mind-blowing. It would mean my big gamble of investing so much time and effort into this project could pay off in terms of impact. In a way that’s the great thing about indirect interventions – they can make change that ripples out on a much bigger scale. In my boulder metaphor, I’m trying to make change not by teaching the local farmers new skills one by one, but by trying to address some of the systemic barriers that impair their life chances, so that they have the opportunity to find their own ways to thrive.

So this blog is a marker of me standing at what I hope might be the top of the hill, and crossing my fingers the gaining momentum part happens. The mixture of hope and uncertainty is stressful to balance. When it’s a bit more concrete I’ll write a bit more, and hopefully I’ll not need a metaphor to couch my cautious optimism in, and can tell you about the actual project and the steps I’ve taken to progress it.

Communicating the value of evidence

I presented at a couple of conferences over the last few weeks about my BERRI system. And I was struck, once again, by how little weight is given to evidence when it comes to services that are commissioned in the social care sector. Glossy marketing claims and slick consultants were successfully persuading commissioners and service managers that using their systems and “metrics” (in which people gave entirely subjective ratings on various arbitrarily chosen variables) was equivalent to using validated outcome measures. By validated outcome measures, I mean questionnaires or metrics that have been developed through a methodical process and validated with scientific rigour: exploring whether they are measuring the right things, whether they are measuring them reliably, whether those measures are sensitive to change, and whether the results are meaningful – a pathway that then leads to an established scientific process of critical appraisal when those studies are presented at conferences, published and made subject to peer review.

But outside of the academic/scientific community it is very hard to prove that having a proper process is worth the time and investment it takes. It means that you are running a much longer race than those who work without evidence. At one event last week, I asked a question of a consultancy firm making hundreds of thousands of pounds out of “improving children’s social care outcomes” about their basis for what they chose to measure, how they measured it, and how they had validated their claims. The answer was that they were confident they were measuring the right things, and that having any kind of scientific process or validation would slow down their ability to make impact (aka profit). My answer was that without it there was no evidence they were making any impact.

They couldn’t see that their process of skipping straight to the doing was equivalent to deciding that architects, structural drawings, planning permission and building regulations merely slow down the building of houses, and then selling houses built without all that burdensome process. Thinking anyone can build a house (or a psychometric measure to track outcomes) feels like an example of the Dunning-Kruger effect, the idea that those with the least knowledge overestimate their knowledge the most. But the worst thing was that those commissioning couldn’t see the difference either. They see the language of evidence as the domain of academics and clinicians, and don’t understand it, or its importance. We are in an age where expertise is dismissed in favour of messages that resonate with a populist agenda, and it seems that this applies even when commissioning services that affect the outcomes of vulnerable population groups. I don’t know how we change this, but we need to.

For those who don’t know, I’ve been working on BERRI for 12 years now, on and off, with the goal of being able to map the needs of complex children and young people, such as those living in public care, in a way that is meaningful, sensitive to change and helps those caring for them to meet those needs better. For as long as I’ve worked with Looked After children, there has been a recognition of the fact that this population does worse in life along a wide range of metrics, and a desire to improve outcomes for them for both altruistic and financial reasons. Since Every Child Matters in 2003, there have been attempts to improve outcomes, defined with aspirations in five areas of functioning:

  • stay safe
  • be healthy
  • enjoy and achieve
  • make a positive contribution
  • achieve economic well-being

A lot of services, including the one that I led, tried to rate children on each of these areas, and make care plans that aimed to help them increase their chances in each area. Each area was supposed to be associated with a detailed framework of how various agencies could work together to achieve it. However, whilst the goals are worthy, they are also vague, and it is hard to give any objective score of how much progress a young person is making along each target area. And in my own area of mental health and psychological wellbeing they had nothing specific to say.

As with so much legislation, Every Child Matters was not carried forward by the next government, and with the move of children’s social care and child protection into the remit of the Department for Education, the focus shifted towards educational attainment as a metric of success. But looking primarily at educational attendance and attainment has several problems. Firstly, it assumes that children in Care are in all other ways equivalent to the general population with which they are compared (when in fact in many ways they are not, having both disproportionate socioeconomic adversity and disproportionate exposure to trauma and risk factors, as well as a much higher incidence of neurodevelopmental disorder and learning disability). Secondly, it limits the scope of consideration to the ages in which education is happening (primarily 5-18, but in exceptional circumstances 3-21) rather than the whole life course. Thirdly, it doesn’t look at the quality of care that is being received – which has important implications for how we recruit, select and support the workforce of foster carers and residential care staff, and what expectations we have of placement providers (something I think critical, given we are spending a billion pounds a year on residential care placements, and more on secure provision, fostering agencies and therapy services that at the moment don’t have to do very much at all to show they are effective, beyond providing food, accommodation, and ensuring educational attendance). Finally, it masks how important attachment relationships and support to improve mental health are in this population. I can see that strategically it makes sense for politicians and commissioners not to measure this need – they don’t want to identify mental health needs that services are not resourced to meet – but that is significantly failing the children and young people involved.

In my role as clinical lead for Looked After and adopted children within a CAMH service, I kept finding that children were being referred with behaviour problems, but underlying these were significant difficulties with attachment, and complex trauma histories. I was acutely aware that my service was unable to meet demand, leading us to need some system to prioritise referrals, and that there was a lot of ambiguity about what was in the remit of CAMHS and what was in the remit of social care. I wasn’t alone in that dilemma. There was a lot of defensive boundary-drawing going on in CAMHS around the country, rejecting referrals that did not indicate a treatable mental health condition, even if the child had significant behavioural or emotional difficulties. The justification was that many children were making a normal response to abnormal experiences, and that CAMHS clinicians didn’t want to pathologise this or locate it within the child as if it were an organic condition, so it was best dealt with as a social care issue.

On the other hand, I was mindful of the fact that this population have enormous mental health needs, having disproportionately experienced the Adverse Childhood Experiences that are known to lead to adverse mental and physical health outcomes. Research done by many of my peers has shown that two thirds to three quarters of Looked After children and young people score over 17 on the SDQ (the Strengths and Difficulties Questionnaire – the government-mandated and CORC-recommended measure for screening mental health need in children), meaning they should be eligible for a CAMH service, and various research studies have shown that 45% of LAC have a diagnosable mental health condition, but the resources are not available to meet that need. As The Mental Health Foundation’s 2002 review entitled “Mental Health of Looked After Children” put it:

Research shows that looked-after children generally have greater mental health needs than other young people, including a significant proportion who have more than one condition and/or a serious psychiatric disorder (McCann et al, 1996). But their mental health problems are frequently unnoticed or ignored. There is a need for a system of early mental health assessment and intervention for looked-after children and young people, including those who go on to be adopted.

My initial goal was to develop a new questionnaire to cover the mental health and psychological wellbeing issues that this population were experiencing, as well as considering attachment/trauma history, the child’s ability to trust others and form healthy relationships, and the behaviours through which these issues were often expressed. I was also interested in what issues determined the type of placement given to a child, and the risk of placement breakdown, as well as what opened doors to specialist services such as therapy, and whether those services and interventions really made any difference. I therefore ran two focus groups to explore what concerns carers and professionals had about Looked After children and young people, and asked them what they saw that might indicate a mental health problem, or any related concerns that led people to want my input, or that caused placements to wobble or break down. One group contained foster carers and the professional networks around them (link workers, children’s social workers, the nurse who did the LAC medicals, service managers) and one contained residential care workers and the professional networks around them (home managers, children’s social workers, the nurse who did the LAC medicals, service managers). I wrote their responses down on flip-charts, and then sorted them into themes.

I had initially thought that the items might cluster as behavioural and emotional, or internalising and externalising, but they seemed more complex than that. In the end, five themes emerged:

  • Behaviour
  • Emotional wellbeing
  • Risk (to self and others)
  • Relationships/attachments
  • Indicators (of psychiatric or neurodevelopmental conditions)

The first letters gave me the name for the scale: BERRI. I piloted the scale with various carers, and then with a group of clinical psychologists involved with CPLAAC (the national network within the British Psychological Society of around 300 Clinical Psychologists working with Looked After and Adopted Children, which I chaired for about six years). I then added a life events checklist to set the issues we were identifying in context.

The working group I chaired in 2007 on the state of outcome measurement for Looked After and adopted children (at the invitation of CORC) came to the conclusion that no suitable metrics were available or widely used. We therefore agreed to further develop and validate the various tools that members of the group had home-brewed, including my BERRI. There was acknowledgement that it takes a lot of work to develop a new psychometric instrument in a valid way, but a consensus that this needed to be done. So I resolved to find a way to follow that proper process to validate and norm BERRI, despite the lack of any funding, ring-fenced time or logistical support to do so. The first challenge was to collect enough data to allow me to analyse the items on the measure, and the five themes I had sorted them into. But I didn’t have the resources to run a research trial and then enter all the data into a database.

My way around this barrier was to get my peers to use the measure and give me their data. To do this I took advantage of some of the technically skilled people in my personal network and developed a website into which people could type anonymous BERRI scores and receive back a report with the scores and some generic advice about how to manage each domain. I tested this out and found my peers were quite enthused about it. We then had a formal pilot phase, where 750 BERRIs were completed by Clinical Psychologists about children and young people they were working with. I then talked about it with some young people and care leavers to check that they felt the areas we were covering were relevant and helpful to know about*. Then I started to use the system in a large pilot with residential care providers and developed tools to focus in on particular concerns as goals to work on, and track them day by day or week by week, as well as creating tools to give managers an overview of the progress of the children in their care. We’ve had a lot of feedback about how useful and game-changing the system is, and how it has the potential to revolutionise various aspects of commissioning and decision-making in children’s social care.

But I really wanted the process to be one in which we were truly scientific and based our claims on evidence. I never marketed the BERRI or made claims about what it can do until very recently, when I finally reached a point where we had evidence to substantiate some modest claims**. But to me the process is critical, and there is still a long way to go in making the data as useful as it can be. So from day one a process of iterative research was built into the way we developed BERRI. As soon as it was being used by large numbers of services and we had collected a large data set, we were able to look closely at how the items were used, the factor structure, the internal consistency, and which variables changed over time. We ran a series of validity and reliability analyses, including correlations with the SDQ, the Conners, and the child’s story – including ACEs, placement information and various vulnerability factors in the child’s current situation. But even then I worried about bias, so a doctoral student is now running an independent study of inter-rater reliability and convergent/divergent validity across 42 children’s homes.
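For readers curious what these checks look like in practice, below is a minimal sketch of the kind of internal-consistency and convergent-validity analyses described above, using pandas and scipy. It is an illustration of the approach rather than the actual BERRI analysis pipeline, and the file name and column names (berri_scores.csv, behaviour_item_*, berri_total, sdq_total) are hypothetical placeholders.

```python
# A minimal sketch of internal consistency (Cronbach's alpha) and convergent
# validity (correlation with the SDQ). File and column names are hypothetical.

import pandas as pd
from scipy.stats import pearsonr

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of item columns (one row per child)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

df = pd.read_csv("berri_scores.csv")                  # hypothetical data export
behaviour_items = df.filter(like="behaviour_item_")   # one subscale's items
print("Behaviour subscale alpha:", round(cronbach_alpha(behaviour_items), 2))

# Convergent validity: does the overall BERRI score track the SDQ total?
r, p = pearsonr(df["berri_total"], df["sdq_total"])
print(f"BERRI total vs SDQ total: r = {r:.2f}, p = {p:.3f}")
```

The same data frame could then feed a factor analysis to check whether the items really do cluster under the five headings, which is the kind of iterative refinement described in the next paragraph.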

BERRI will always be developed hand in hand with research, so that there is an ongoing process of refining our outputs in light of the data. The first step in that is getting age and gender norms. But the data can also indicate what we need to do to improve the measure, and the usefulness of the output reports. For example, it seems that it might be meaningful to treat two aspects of “Relationships” as distinct from each other. If the evidence continues to show this, we will change the way we generate the reports from the data, to talk about social skills deficits and attachment difficulties separately. We might also tweak which items fall into which of the five factors. We also want to check that the five-factor model is not an artefact of the a priori sorting of the items under the five headings, so we are planning a study in which the item order is randomised on each use, to repeat our factor analysis. We also want to explore whether there are threshold scores in any factor, or critical items within factors, that indicate which types of placements are required or predict placement breakdown. We might also be able to model the risk of child sexual exploitation (CSE).
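As an illustration of that last question, here is a hedged sketch of the kind of exploratory model that could test whether subscale scores predict placement breakdown. It is not the planned study design; the file and column names are hypothetical, and a real analysis would need careful handling of confounders, missing data and sample size.

```python
# A hedged sketch: do the five subscale scores carry any signal about
# placement breakdown? Column names and the outcome variable are hypothetical.

import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("berri_with_outcomes.csv")   # hypothetical data export
X = df[["behaviour", "emotional", "risk", "relationships", "indicators"]]
y = df["placement_breakdown"]                  # 0 = placement held, 1 = broke down

model = LogisticRegression(max_iter=1000)
auc_scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {auc_scores.mean():.2f} (+/- {auc_scores.std():.2f})")
```

An AUC meaningfully above 0.5 would suggest the subscales carry predictive information worth investigating further; anything weaker would be a warning against using scores to make individual predictions.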

The results to date have been really exciting. I have begun to present them at conferences and we are currently preparing them to submit for publication. For example, I am currently writing up a paper about the ADHD-like presentation so many traumatised children have, and how we have learnt from our BERRI research that this reflects early life ACEs priming readiness for fight-or-flight rather than proximal events or a randomly distributed organic condition. But the findings depend on all the groundwork of how BERRI was developed, our rigorous validation process and the data we have collected. It is the data that gives us the ability to interpret what is going on, and to give advice at the individual and organisational level.

So you’ll forgive me if I’m somewhat cynical about systems that request a subjective Likert rating of five domains from Every Child Matters, or an equally subjective score out of 100 for twelve domains pulled from the personal experience of the consultant when working in children’s social care services, and that then claim to be able to map needs and progress without any validation of their methodology, the areas they rate, their sensitivity to change or the meaning of their scores. Going through the process the long way, rather than going straight to marketing, might put me at a commercial disadvantage, but I like my houses built on foundations of good evidence. I can feel confident that the load-bearing beams will keep the structure sound for a lifetime when they are placed with precision and underpinned by the calculations and expertise of architects, structural engineers, surveyors and building control, rather than cobbled together as quickly as possible, marketed with amorphous claims and sold on rapidly to anyone who will pay for them. After all, I’m not in it to make a quick buck. I know my work is a slow and cumulative thing, and BERRI still has a long way to go before it can create the greatest impact. But my goals are big: I want to improve outcomes for children and young people who have experienced adversity, and I want that impact to influence the whole culture of children’s social care provision in the UK and to continue to be felt through the generations. And to do that, I need to build the thing properly.

*I’m still intending to act on the advice to also have a strengths scale to recognise resilience and positive factors, so that it doesn’t feel like we see the children purely as a list of problems. However, I didn’t want to duplicate the work of others, so I am following up a potentially exciting lead in terms of a collaboration with the Mulberry Bush School, who have explored the positive factors they have seen as markers of progress in their environment.
** that carers, therapists and managers find it useful and easy to use, that using the BERRI pathway demonstrated an improvement of 14% over 6 months for the first 125 children placed on the system, and that BERRI has the basic statistical qualities that suggest sufficient validity for use. We also have some testimonials, including a commissioner who used BERRI to map the needs of 15 high tariff children and found four suitable to move to foster or family placements with support, saving nearly half a million pounds per year from his budget – a finding we would like to replicate with a much larger study, given the opportunity.

 

 

Solve for happiness: Some thoughts on big data/AI and mental health

We are hearing a lot about the use of big data at the moment, mostly in terms of it being an underhand way to manipulate people politically – used by those with no ethical compunctions to get people to vote against their own best interests*, and in favour of Brexit and Trump. Cambridge Analytica and AIQ seem to have commercially exploited academic research and breached data protection rules to try to nudge political behaviour with targeted messaging. Whether or not that was successful is up for debate, but to the public the narrative is about big data being bad – something technocrats are exploiting for nefarious reasons. I can understand that, because of the associations between gathering data on people and totalitarian political regimes, and because of concerns about privacy, data protection and consent. There is increasing awareness of what had previously been an unspoken deal – that websites harvest your data and show you targeted advertising, rather than charging you directly for services – and the new GDPR means that we will be asked to explicitly consent to these types of data collection and usage.

But what about the potential for big data to do good? I know that DeepMind are doing some data crunching to look at whether AI algorithms can help identify indicators that determine outcomes in certain health conditions and point doctors towards more effective treatments. Their work to identify warning signs of acute kidney injury was criticised because of breaches to data protection when they were given access to 1.6 million medical records without individual patient consent, but whilst the data issues do need to be sorted out, the potential for projects like this to improve health and save lives is undeniable. Computers can look through huge amounts of detailed data much more quickly and cost-effectively than humans. They can also do so consistently, without fatigue or bias, and without a priori assumptions that skew their observations.

Research often highlights findings that seem counterintuitive to clinicians or human researchers, which means that letting the data generate the patterns can find things that we overlook. One example I read about today was the fact that admitting offending behaviour does not reduce the risk of recidivism in sexual or violent offenders (in fact those who show most denial offend less, whilst those who demonstrate more disclosures and shame are more likely to reoffend). The same is true of telling people they are being given a placebo (which will still produce positive placebo effects), using positive mantras to enhance self-esteem (which seem to trigger more negative thoughts and have a net negative impact on mood and self-esteem) or expressing anger (rather than being cathartic and leading to a reduction in anger, it actually increases it). Various fascinating examples are listed here. There is also the well-known Dunning-Kruger effect, whereby ignorance also includes a lack of insight into our own ignorance. As a population, we consistently overestimate our own ability, with people in the bottom percentiles often ranking themselves well above average.

I often refer to the importance of knowing the boundaries of your own competence, and identifying your own “growing edges” when it comes to personal and professional development. We talk about the stages of insight and knowledge developing from unconscious incompetence through conscious incompetence to conscious competence, and finally to unconscious competence, where we can use the skill without conscious focus. Confucius said “Real knowledge is to know the extent of one’s ignorance.” And it may well be that when it comes to solving some of the big problems we are limited by our own frame of reference, what we think of as relevant data, our preconceptions and our ability to build complex models. Using giant data sets and setting technology to sift through and make sense of them using various paradigms of AI might help open up new possibilities to researchers, or find patterns that are outside of human observation. For example, certain medications, foods or lifestyle traits might have significant impact on certain specific health conditions. I am reminded of a recent article about how a third of antidepressant prescriptions are for things other than their primary function (for example, one antidepressant can seemingly help with an inflammatory bowel disease that has very limited treatment options). A computer sifting through all the data can pick up both these unintended positive effects and also rare or complex harmful side-effects or interactions that we may not be aware of.

What difference could this make in mental health? Well, I think quite a lot. Of course many predictors of mental health are sociopolitical and outside the control of the individual, but we also know that some small lifestyle changes can have very positive impacts on mental health – exercising more, for example, or having a healthy diet, getting more sleep, using mindfulness, even just getting outdoors more, learning something new, doing something for others, or spending more time with other people (and less time on social media). There are also many therapy and therapist variables that may make an impact on mental health for people who engage in some form of talking therapy, although much of the variance in outcomes seems to boil down to feeling heard and believed by a therapist who respects the individuality and cultural context of the client. And of course there are many medical treatments available.

So is there a way of using big data to look at what really works to help people feel happier in their lives? I think the potential for apps to collect mass data and test out what makes an impact is enormous, and there is a proliferation of apps in the happiness niche, and more that claim to help wellbeing in a broader way. They seem to have found a market niche, and to offer something positive to help people make incremental life changes that are associated with happiness. What I’m not sure of is whether they reach the people that need them most, or whether they are evaluating their impact, but presumably this is only a matter of time, as real-life services get stripped back and technology tries to fill that gap.

I think there is huge need to look at what can make positive change to people’s wellbeing at a population scale, and I think we need to be tackling that at multiple levels. First and foremost, we need to make the sociopolitical changes that will stop harming the most vulnerable in society, and encourage greater social interconnectedness to prevent loneliness and isolation. We need to increase population knowledge and tweak the financial incentives for healthy lifestyle choices (eg with much wider use of free or subsidised gym memberships, and tax on unhealthy food options). And we need to invest in preventative and early intervention services, as well as much more support during pregnancy and parenting, and in mental health and social care. But I can also see a role for technology. Imagine an app that asked lots of questions and then gave tailored lifestyle recommendations, and monitored changes if the person tried them. Imagine an app that helped people identify appropriate local sources of support to tackle issues with their health and wellbeing, and monitored their impact when people used them. As well as having a positive immediate impact for users, I’m sure we’d learn a lot from that data that could be applied at the population level.

*I think the evidence is strong enough that the demographics who voted for these people/policies in the greatest numbers are the very people who have come out the worst from them, so I am just going to state it as a fact and not divert into my personal politics in this blog, given I have covered them in previous topics about Brexit, my politics, “alternative facts”, Trump, why and what next, the women’s march, and Grenfell and the Manchester bomb.

I am not a therapist

I’ve always been someone who likes to keep busy and has a lot of ideas about places where psychological thinking can make a positive impact. The aspect of my character that I now identify as entrepreneurial, and put to good use in my business, has always led me to want to try new things and create innovative solutions to problems. I like a lot of things about being a clinical psychologist, particularly our ability to turn our hand to multiple types and levels of work. However, unlike many other clinical psychologists, I don’t really see myself as a therapist. In fact, I haven’t seen more than a handful of clients for individual therapy over the last decade, and even before that therapy made up a pretty small proportion of my qualified jobs. I’ve always had more of a focus on the other facets of being a clinical psychologist. I think the picture of a clinical psychologist as a therapist is so strong that a lot of people will now be wondering how I fill my time!

So I will answer that question: I have done loads of highly specialist assessments (of neurodevelopmental concerns, attachment, parenting capacity, mental health, life skills, self-esteem, wellbeing etc) and lots of formulating and report-writing – some in collaboration with psychiatric or medical colleagues or within a wider MDT, but more as an external expert or second opinion. I have advised the family courts as an expert in care proceedings and complex custody disputes, and completed numerous pre-court assessments for local authorities to help inform their care planning. I’ve managed teams and services, and supervised anywhere from 2 to 20 other staff at a time, along with sitting in various organisational/management structures. I have designed and delivered training to parents, carers and professionals, done lots of consultancy to various organisations and professionals (mainly those providing health and social care services, or involved in the family courts), and helped placement providers to improve their services. I design and deliver group programmes (eg Managing Behaviour with Attachment in Mind), but then rapidly cascade-train other staff to continue to deliver them. I wrote a book about attachment/developmental trauma, and lots of papers and policy documents about Looked After Children and about acting as an expert witness to the family court. I sat on a BPS committee and I contributed to NICE and SCIE guidelines. I’ve designed, managed and evaluated therapy services (but employed others at lower bands to deliver the therapy). I’ve been an expert advisor to the HCPC in a fitness to practise case and to the team investigating a death in public care. I’ve done loads of practice-led research about each client group I’ve worked with, from looking at the psychological and health economic impacts of offering brief therapy to hospital users with diabetes, to commissioned evaluations of other services. So I have plenty to fill my days despite not having a therapy caseload!

I have reflected on why it is that I don’t feel drawn to therapy, and reached the conclusion that, whilst I see it as a very worthwhile endeavour, I don’t really have the patience for resolving difficulties one person at a time over sessions spanning many months. I’m always more interested in grappling with the bigger questions of why people are in distress, and what we can do to most effectively prevent or ameliorate those difficulties. When I’ve solved the riddle (or at least, reached a plan that improves upon existing solutions) I like to evaluate its efficacy, modify it if necessary and then disseminate the learning and/or train others to replicate the solution. I try to step outwards from the individual issue to the broader themes and ways that we can intervene on a wider scale. To use a visual metaphor, if dealing with mental health problems is like bailing out a ship, then rather than scooping out water one cup at a time, I am trying to work out how to plug the leaks, and to design boats that won’t have the same vulnerabilities to leakage in the future. It also helps me to avoid feeling hopeless about factors outside my control and demand exceeding supply, or burned out by an accumulation of traumatic stories.

Jenny Taylor, a past chair of the Division of Clinical Psychology, once described our profession as the structural engineers of the therapy world. Unlike a therapist trained in a single modality of therapy, we can survey the landscape and assess the need, then design the intervention that best meets that need – even if we are not always best placed to deliver it. We can base that recommendation on our knowledge of the current evidence base, which can change as new information comes to light. If we consider the challenges people face as a river they need to cross, a therapist trained in a single model of therapy might be a bridge-maker. A psychodynamic therapist might be a mason who can build traditional stone bridges and claims that this design best stands the test of time. A CBT therapist might be a carpenter with a set of designs for wooden arched bridges that he claims are cheaper and quicker to erect. Each sees their own skill as either suitable to solve the challenge or not, but also has some incentive to sustain their own livelihood by continuing their tradition. A clinical psychologist can survey the land either side of the river, the span required to cross it, and the materials available in the locality. They can then advise on the various options, including the relative costs and the evidence of how they fare in different conditions. They may or may not feel that the bridge required is within their own skill-set to erect, but they will have a reasonable overview of the other bridge-builders in the area to recommend. If new designs of metal suspension bridges are developed, this is not threatening to the structural engineer, who can adjust their recommendations to incorporate the emerging evidence base.

I really like this metaphor and strongly identify with the role of structural engineer rather than bridge-builder. I had always thought that this was instilled in me by my first graduate job, where I was an assistant psychologist on a research project about improving quality of life in residential care homes for older people, and I could see how the research and clinical work were closely tied together and built on each other reciprocally. But now I think my love of data and the scientific method runs deeper than that, and I can see it infused throughout my whole approach to life since childhood. When it comes to my work I am a scientist-practitioner down to my bones, as I always collect data as I go along. Where I don’t feel like I understand the situation well enough, I first look to the literature and then to gathering data and doing my own analysis to try to gain insight. When I develop something new to try, wherever possible I evaluate what we are doing, and refine it through an iterative process until we can demonstrate maximum efficacy. I see that process as being part of the USP of a clinical psychologist – that we think like scientists and gather data to inform our interventions.

But I’m not sure that we communicate this mindset well enough, or that it is universal amongst the profession. It certainly isn’t what draws people into the profession, in my experience. Too many clinical course application forms I review could be paraphrased as “I want to learn to be a good therapist” with an afterthought of “and do/use research” because applicants think that is what selectors want to hear – but in my view therapy can be done by lots of cheaper professionals, who might do an equally good, if not better, job of it. I believe that clinical psychologists should be more than well-paid therapists. We should know the evidence base and be able to take on the most complex assessments and formulations (even if others then deliver part or all of the treatment), but also be able to develop, refine and evaluate novel therapeutic interventions, supervise other staff, improve services, consult, train and manage – things that extend beyond the skillset of most therapists. I’m sure it is clear by now that this is where my own interests lie. And I think it shows through in everything I do.

For example, when I was asked to lead the CAMHS service providing neurodevelopmental assessments I started with a literature review and the current policy and best practice guidance. I then conducted an audit of the existing pathways, and then tried to make things better. We set up a new clinic system with more rapid throughput and more thorough assessments, and then re-audited, showing a reduction from an average of 18 months of input to five months, with increased clinician confidence in the service and higher client satisfaction. I also wrote a booklet to help provide information to parents whose child received a diagnosis of an Autistic Spectrum Condition. Although it required dedicated clinician time for the multi-disciplinary clinic and for the psychometric assessments generated, overall the new pathway freed up capacity because fewer cases were being held open by other clinicians whilst waiting for assessment, or kept open for prolonged periods afterwards to help the family understand the diagnosis and connect up to local sources of support. I also sat on a multiagency strategy group to look at establishing best practice standards for the county.

I had the same approach when I was asked to support the adoption and permanence service. I initially set up a consultancy clinic, where social workers could bring cases to discuss or book in families to see jointly. I found that I was explaining similar information about attachment, trauma and neuroscience to multiple professionals, parents and carers in the consultations. So I designed a group to share this content. I called it “Managing Behaviour with Attachment in Mind”, and developed some “doodles” I would draw on flipchart paper to explain the concepts more accessibly. I evaluated the impact and showed it to be an effective format for supporting parents in this situation. The groups were popular and over-subscribed, so I trained others to deliver the group to keep up with demand, first in my service and then more widely. Many people in the groups liked to photograph the doodles to remind them of the topic, so I decided to write a book to share them and Attachment: In Common Sense and Doodles was born.

But I also wanted to know about how we could achieve permanence for more children. I started by looking at the literature about what makes effective adoptive matches. Very little information was available, so I systematically audited the paperwork from 116 adoptive matches and followed them up over 7 years to see what factors influenced the placement outcomes. I was able to look at whether the innovative adoption project to place children with more complex needs had better or worse outcomes, and to explore the impact of different motivations for adopting. Whilst to me this was just a natural process of answering the question as an evidence-based practitioner, it transpired that these studies of adoption risk and resilience factors were amongst the largest ever done, and I have discovered unique findings that I really should publish*.

You could argue that I was using a sledgehammer to crack a nut by doing all this research and trying to change process when organisations are notoriously slow to change, and that I could have spent my time more productively working with more individual adoptive families. But that’s not how I’d see it. The research I did helped me to understand what the key variables are when considering whether a child can achieve permanence, what kind of family we need to look for to place them successfully, and what kinds of support might ensure that the placement succeeds. I hope that I have fed that knowledge back through my court work, and into various organisational and policy work over the last decade. I have also disseminated it at conferences. However, I would still like to spread it further, because it is my belief that such knowledge can have positive impact at multiple levels – it can help to inform individual placement decisions, service-wide strategies for helping optimal numbers of children to access permanence, and national policy about adoption.

That work led naturally on to developing our services for Looked After Children when I left the NHS and set up my own company, LifePsychol Ltd. We provide training and consulting to foster carers and residential care staff, the social care organisations that support them, and the wider professional networks surrounding them, including education and health staff, police, lawyers, magistrates and judges. As I started to get more immersed in working with children in and on the edge of Care, it led me to recognise that there was a lack of validated and reliable tools to identify the needs in these populations, no outcome measurement tools that could reliably measure change over time in a way that was sensitive to the context and type of life events these young people experience, and a dearth of clinical governance in terms of the efficacy of both placements and interventions for this group of children. That seemed shocking to me, given their highly complex needs, and massively elevated incidence of mental health problems, challenging behaviour, risk to self and others, and prevalence of intellectual or neurodevelopmental difficulties.

As well as the human cost of not being able to identify the best choices for people, it seemed unacceptable that huge amounts of money were being spent on placements and specialist services for this group without any evidence of them changing their wellbeing or life course for the better. Placements seemed to struggle to identify what to work on and how, and there was little objective indication of what defined a successful placement, beyond annual visits from Ofsted (who were predominantly focused on process and procedure). The high level of need and the lack of clinical governance in the sector has allowed various specialist therapists and services to spring up that are virtually unregulated, and many placements have adopted terms like “therapeutic” without these having a consistent definition or meaning. So I wanted to see whether I could make any headway in changing that.

Meanwhile there is pressure from the government to improve outcomes for children in public Care, because they are seen to fare badly compared to the general population of children the same age. The difficulty is that this isn’t comparing like for like – children in Care face many more adversities, both organic and in terms of their life experiences, which mean they often deviate from the norm. For example, I found a 20-point downward skew in the IQ distribution of children in residential care compared to population norms, meaning that 20-25% of children in this setting had a learning disability, compared to around 2% in the general population. Likewise the incidence of Autistic Spectrum Conditions and other neurodevelopmental difficulties amongst children in Care is more than triple that in the wider population. The same is true of young offenders. If we don’t acknowledge that, then the sector is being asked to seek impossible goals and will inevitably be seen as failing, even if placements and services are performing optimally and adding a lot of value to the lives of the children they work with.
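For anyone who wants to see the working behind those percentages, here is a rough back-of-the-envelope check. It is a simplification that assumes IQ is normally distributed with a mean of 100 and a standard deviation of 15, takes IQ below 70 as the learning disability threshold, and treats the residential care population as the same bell curve shifted down by 20 points (the real distribution is unlikely to be so neat):

```python
from math import erf, sqrt

def proportion_below(threshold, mean, sd=15):
    """Proportion of a normal distribution falling below a given threshold."""
    return 0.5 * (1 + erf((threshold - mean) / (sd * sqrt(2))))

print(f"General population below IQ 70: {proportion_below(70, mean=100):.1%}")      # about 2%
print(f"Distribution shifted down 20 points: {proportion_below(70, mean=80):.1%}")  # about 25%
```

A 20-point shift takes the proportion below the threshold from roughly 2% to roughly 25%, which is where the 20-25% figure comes from.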

To state the obvious, children in care are not just randomly drawn from the population – by definition their needs have not been met, and this can mean both the presence of additional challenges and exposure to harm or deficits in care. I believe that to look at the needs of this population and the degree to which these are met by placements or interventions, we need to either compare them to carefully matched controls or ensure that outcomes are always considered relative to baseline. The latter seems more pragmatic. Scores for young people also need to be considered in the context of what is going on in their lives – as changes in placement, daily routine, contact arrangements, or the arrival or departure of other children from the home can make big impacts on the child’s functioning.
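As a minimal sketch of what “considered relative to baseline” could mean in practice (this illustrates the general idea only, not how any particular tool such as BERRI actually works, and all the names and numbers are hypothetical), each child’s follow-up scores would be compared against their own starting point, with major life events recorded alongside so that any dip can be read in context:

```python
from dataclasses import dataclass, field

@dataclass
class ChildRecord:
    name: str
    baseline: float                                # score at entry (higher = more difficulties)
    reviews: list = field(default_factory=list)    # (score, life_events) pairs

    def add_review(self, score, life_events=()):
        self.reviews.append((score, list(life_events)))

    def progress(self):
        """Change relative to the child's own baseline, annotated with context."""
        return [(score - self.baseline, events) for score, events in self.reviews]

# Hypothetical example: overall improvement, with a blip around a change in contact.
child = ChildRecord(name="A", baseline=42)
child.add_review(35)
child.add_review(39, life_events=["change in contact arrangements"])
child.add_review(30)
for change, events in child.progress():
    note = f" ({'; '.join(events)})" if events else ""
    print(f"Change from baseline: {change:+.0f}{note}")
```

Even this toy version makes the point that a score which looks static or worse in absolute terms can still represent a placement doing a good job, once the starting point and the surrounding events are taken into account.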

So I’ve been beavering away exploring these issues and developing systems to measure needs and make the data meaningful for those providing care and services. The impact might not be as obvious as delivering psychological therapy directly, but I’d like to think that over time it can improve services for thousands (or even tens of thousands) of children, and make a greater net change in the world.

 

*Maybe I’ll write more about this in a future blog. But the short version is that I have been trying to secure some funding to complete the statistical analysis and disseminate this information, and would still like to do so, so if you have any ideas or useful connections to assist with this please let me know. Failing that I hope I’ll find enough time to write a book on making better adoptive matches at some point in the future.

Six degrees of separation

My brother, David Silver, is turning out to be one of the significant players in the world of artificial intelligence. His PhD topic was applying reinforcement learning to the ancient East Asian strategy game of Go, and he has gone on to be the lead researcher on AlphaGo at Google DeepMind. That is the program that beat the world champion human player last year and became the best computer player of Go. More recently AlphaZero has taught itself to play Go from scratch (AlphaGo started by learning from thousands of top-level human games) and has also taught itself to play chess and shogi, all to unprecedented levels of excellence. It has been very exciting following his progress, and going to the premiere of the documentary film about AlphaGo (which is a lovely human drama, even if you don’t know or care much about the technology, so do give it a watch on netflix/prime/google play/itunes if you get the chance).

It is no surprise to me that David has gone on to find a niche that is intellectually impressive, as he has always been a pretty smart guy and done exceptionally well in education (though reassuringly he isn’t all that practical, makes the same silly mistakes as the rest of us, and has remained quite down to earth). I’ve always been glad to be the older sibling, as I think it would have been difficult to follow in his footsteps. As it was, I could be proud of my relative achievements before he came along and beat them all! He has always had a very analytical mind and enjoys solving logical puzzles. I guess I do too in some ways, but I’m much more interested in how people work than in complicated mathematical calculations, and in how we can reduce suffering and help people recover from trauma rather than pushing the boundaries of technology. We’ve chosen quite different career directions, but I think we still have quite similar underlying values and ethics.

Although I’m proud of him, I’m not mentioning my brother’s achievements to show off (after all, I can take no credit for them) but because they’ve given me cause for reflection. Firstly, it would be easy to feel inadequate by comparison. After all, he is making headlines and working on the frontiers of technology, whilst I’m just a clinician running a tiny company who has made relatively little impact to date. It would be easy to be jealous of the financial security, publications and plaudits that he has gained. He has made the news all around the world, and even has a wikipedia page! But I think I’d find that spotlight uncomfortable, and I suspect I’d find his job pretty stressful, as well as finding all the maths and computing pretty boring and unfulfilling. So whilst there is plenty to admire, I don’t really envy him and wouldn’t want to swap places.

Secondly, and perhaps more interestingly in terms of this blog, it has made me think about what my goals are. Making the best possible AI to play Go is quite a narrow and specific goal, and within that he selected a specific methodology, reinforcement learning, and focused on it for the past decade before looking at what other applications the same system might have. Yet in that same time period I’ve been pulled in many different directions. I’ve been an NHS CAMHS clinician and service manager. I’ve been an at-home mum. I’ve helped to found a parenting charity. I’ve set up and evaluated a project to improve outcomes for diabetes patients. I’ve bid for grants. I’ve tried to help recruit psychologists and improve clinical services within a children’s home company. I’ve undertaken specialist assessments of complex cases. I’ve been an expert witness to the family courts. I’ve delivered training. I’ve run a small therapy service. I’ve conducted research. I’ve tried to influence policy, and sat on committees. I’ve written a book about how to care for children affected by poor attachments and trauma. And I’ve developed outcome measures. Most of the time I’ve done several of these things in parallel. It is hard to keep so many plates spinning, and it means I have not been able to invest my full energy in the things I most want to do. I’ve also had hesitations about investing in entrepreneurial ideas, because of guilt about saying no to other things, or fear that it won’t pay off – hesitations that have taken a really long time to shake off.

Greg McKeown says, in his brilliant articles for Harvard Business Review about ‘essentialism’, that success can bring demands that cause you to diversify, ultimately reducing your focus on your primary goal and causing failure – and that is exactly what I’ve experienced. It reminded me of a reflective exercise I did as a trainee on a workshop about creative methods, where I made an amoeba shape out of clay to represent the pulls I felt in different directions. The amoeba was a resonant image for me, as it can’t spread too thin without losing its depth at the centre, and it can’t travel in two directions at once. Finding the right direction of travel and resisting other pulls on my time is something I am still working on 20 years later! It has been a steep learning curve, working out what to say ‘no’ to so that the company does not become overloaded or incoherent*. There are also other forces that influence what a small business can deliver – we have to do work that we are passionate about, uniquely skilled to deliver, and that there is a market for. There is no point offering services that nobody wants to buy, or that other people can provide better, or that you are not enthusiastic about, so we need to stick to things that we can deliver brilliantly and build a positive reputation for. However, with the breadth of clinical psychology there will always be multiple demands and opportunities, and it is necessary to find a focus so that we have a single defined goal** in order to attain the most success.

I’ve taken time to refine my goal from “applying clinical psychology to complicated children and families facing adversity” (which is actually quite a broad remit, and includes a wide range of neurodevelopmental, mental health, physical health and social aspects of adversity, being applied to all sorts of different people) to “applying clinical psychology knowledge to improving services for Looked After and adopted children” to “using outcome measurement tools developed through my knowledge of clinical psychology with placement providers and commissioners to improve outcomes for Looked After and adopted children”. Likewise, it has taken me time to clear space in my head and in my diary, and to be in good enough physical health to give it sufficient time and energy. But I am finally able to dedicate the majority of my working time to making people aware of BERRI, doing the statistical analysis to validate and norm it, and supporting/training those who subscribe to it. I have secured an honorary research fellowship at UCL and some data analyst support, and a trainee from Leicester is making it the subject of her doctoral research, so I very much hope that 2018 will be the year that we publish a validation of the measure and methodology, and can then roll it out more widely. I believe that is my best chance to make a difference in the world – to improve the standards of care for children living outside of their family of origin by encouraging universal psychological screening, regular outcome measurement, and the ability to identify and track needs over time.

Finally, my brother’s achievements have given me pause for thought because his working at Google has made me feel somehow distantly connected to Silicon Valley, and all the technological and entrepreneurial activity that goes on there. Suddenly the people who founded Google, Facebook and Tesla/SpaceX are no longer as abstract as Hollywood actors or international politicians, but are now three steps away in a technology game of six degrees of Kevin Bacon. It makes the world feel a little smaller, and making an impact seem more possible, when your kid brother is connected (however peripherally) to the technology giants who are changing the world.

Alongside this, in my ImpactHub coaching peer group several people have gone on to build successful social businesses that have rapidly scaled and made an impact on the world. Proversity, for example, have expanded massively into the digital education space. Old Spike Roastery & Change Please have expanded their coffee businesses that employ homeless people, and School Space have scaled up a project they started at the age of 17, helping their school rent out its premises out of hours, into a thriving business that has generated £350,000 of income for participating schools. Code Club have partnered with the Raspberry Pi Foundation to teach children to program computers in 10,000 clubs across 125 countries. And Party for the People have built a competitor to Ticketmaster and SeeTickets where the fees go to a good cause, and have set up arts spaces in old factory buildings.

In this context, it seems possible to dream big, to think that an idea could become a reality that has an impact on the world. So whilst my main vocation remains to bring the process of regular outcome measurement to services for Looked After Children (and that is making some really positive steps at the moment), I’ve started to work out how to make my back-burner project a reality. This one is a proper entrepreneurial idea in the digital space, tied in a little to my previous blog topic of how the public understand the evidence for different kinds of interventions. I’m hoping I can develop a pilot and then seek some investment, so watch this space – I’ll report back on how it goes.

In the meanwhile, I still want to make some changes in my personal life. I’m generally feeling quite upbeat about the future at the moment, and I’ve sorted out the issues I mentioned in a prior blog about disappointment. We’ve also pulled in payment for many of the outstanding invoices, and the business is the best organised it has ever been. But after reviewing how I spend my time and who I interact with the most, I have become much more aware of my various networks, and of the degree to which I feel able to express myself authentically within them. I am being a bit more thoughtful about my networks, both in real life (where I want to make greater efforts to meet like-minded people locally) and online, where I need to spend less time. I have realised that I haven’t been choosing the company I keep carefully enough, so I am trying to connect more with those who are positive influences on my life, and to pull away from people who are a drain on my emotional resources. I am also choosing to engage more with people in the social entrepreneurial space. As Jim Rohn is so often quoted as saying, “you are the average of the five people you most associate with”, and hanging out with inspiring people allows us to be more creative and entrepreneurial ourselves.

So hopefully 2018 will be the year where I make a success of BERRI, complete the validation research and get some publications out. I’d also like to get a pilot of my entrepreneurial idea up and running. And in my personal life I’d like to get back to the gym, to get the planning permissions sorted out for my house, and most importantly to make more real life social connections with people who share my values. If I’m only a few degrees of separation from people who have achieved all of these things, then maybe I can too.

 

*I wrote more about developing my business model and setting up a social enterprise in Clinical Psychology Forum number 273, September 2015.

**or failing that, a primary goal, secondary goal and fall-back plan, in ranked order of preference (with an awareness that only exceptional polymaths like Elon Musk can achieve in more than one area at the same time).