Heart and Head

It is an interesting fact that most people make decisions based on their emotional “gut feeling” but then justify them logically in retrospect. So you pick a house or a car that feels right, but tell people you chose it because of the miles per gallon or the lovely neighbourhood. Maya Angelou put it rather perfectly when she said:

I’ve learned that people will forget what you said, people will forget what you did, but people will never forget how you made them feel.

However, if you ask me why I think BERRI can benefit a placement provider or a local authority, then my first port of call is facts and explanation, rather than finding the emotional hook that will persuade people that it feels like the right thing to do. Maybe it’s the scientist-practitioner thing showing through, but I feel it is more valid, or legitimate, to make the case in the language of logic and facts than to try to pull on people’s heart strings, and I am very loath to make any claims I can’t prove with quantitative evidence.

I think that is partly because of my personal style. I have previously blogged about how I am not a typical clinical psychologist, in that I don’t see myself as primarily a therapist. I wonder whether that is because I tend to start with a more intellectual approach, and address the language of emotions less than many people in my profession. For example, I like to operationalise how I conduct assessments and the information I share in consultations, so that they can be replicated more easily, and I evaluate everything I deliver to check that it is effective. It doesn’t mean I’m not mindful of feelings that arise for people I interact with, or in myself, or that I’m emotionally closed off. In fact, quite the opposite, as I quite often find myself becoming tearful when discussing the stories of the children and families that I’ve worked with (or when reading the news or discussing the current political situation, and even when watching films or reading novels for that matter). I’m likewise prone to a giggle or a belly laugh, sometimes about entirely inappropriate things. I’d say I’m pretty comfortable expressing both positive and negative feelings, and responding to those in others. But when communicating information to others, particularly in writing or as a presentation, science is my starting point.

Maybe it comes from eight years of academic study and writing before I qualified as a clinical psychologist, or maybe it is because I am used to presenting to my professional peers through journal articles and conference presentations. But ask me to explain a topic or project and my first instinct is to give a summary of the context and then share my methods, results and conclusions. However, I am mindful that to be an effective salesperson I need to be able to pick out and focus on the message that will be emotionally resonant with the listener, and to capture hearts as well as heads.

To that end, I’ve been working on how we communicate the impact of BERRI on individual children, as well as its benefits for organisations and commissioners. To do this we have started to capture some stories from cases where BERRI has made a difference, and then to anonymise these enough to present them in the materials that I present at conferences and on social media. I had always thought that the lovely animation that Midlands Psychology commissioned to show how they had changed the autism service was a brilliant example of this. I felt like they managed to tell the important elements of their story in an engaging way, and that this couldn’t help but make people see the improvements they had achieved. We didn’t have their budget, but I wanted to capture something similar in our case study animations.

Luckily for me, I had recently reconnected with an old school friend, Joe Jones, who is brilliant at this kind of thing. His “explanimations” for the renewable energy sector had been great at simplifying complex ideas, and he knew all about my business, as he had been helping us to look at our comms. We talked about what I wanted from an animation, the tone we wanted to convey, and the stories that we could tell about different case examples. Joe showed me how The Girl Effect had been able to capture an emotionally compelling story in very simple graphics, where the character created represented every girl, because she was effectively anonymous and culture free.

The result is a short animation that I think captures what we are trying to achieve with BERRI. Obviously, there is more information that will set the simple story into context, which I can tell people in the rest of my presentation, or as they enquire. But as a hook that helps people to see the impact it can make, I think he has done a great job.

BERRI Case Study – Daniel from Joe Jones, Archipelago.co.uk on Vimeo.

What do you think? Does it explain what we are offering? Does it appeal to both heart and head?

Can you make things better for children and young people in Care whilst saving money?

That seems to be the critical question in an age in which there is no money in the budget to try anything innovative just because it will create improvement. To be able to try anything new that involves spending any money we have to evidence that double win of also saving costs. A few years ago when I was in the NHS, I found that really frustrating – I had so many ideas about how we could do things better by creating new services or better collaborations with other agencies, or reaching out to do the proactive and preventative work that would save money down the line, but it was almost impossible to get them off the ground because the budgets were so tight. Since then I’ve tried various things to unlock the spend-to-save deadlock, but it was only once we started looking at the economic impacts of some projects using BERRI that we had clear evidence that we could save money whilst making services better, and on a fairly substantial scale. Our pilot in Bracknell Forest saved £474,000 in the first 12 months whilst making services better and improving the outcomes for the young people involved. And that was just a small scale pilot within a single local authority.

After so many years of being told that improving outcomes whilst saving costs would be impossible, it sounds unlikely, but it is true. We made life better for the children involved – in some cases in ways that entirely changed the trajectory of their lives – whilst reducing costs for the local authority. The savings generated would be enough to fund services to address the mental health needs of all Looked After Children whilst still lowering the overall cost of Care. I’m not prone to hype, but that feels pretty extraordinary! Importantly, we did it whilst also making life easier for the carers, professionals and placement providers involved. So it is no great surprise that we are now working with many Local Authorities to scope out and deliver wider scale projects.

So, what are we doing that is different? And where do the savings come from? Using BERRI we are identifying psychological needs effectively, and then addressing them early. For some young people that leads to significant change in their behaviour, risks or mental health, which then opens the door to different placement options, and for a small proportion of children the placement costs are substantially reduced. I’m not talking about forcing children in residential care to move to foster placements for financial reasons. I’m talking about better identifying the types of placements and services that young people need. For some, that will mean that they get to access residential care without having to go through a long series of foster placement breakdowns first. For others it will mean that they get access to much greater mental health input, or specialist services. For many it will mean helping their carers to better understand their needs so they can make minor adjustments to day-to-day care. But for some children it can open (or reopen) the doors to a family placement.

It may also have an impact on their longer-term trajectory, as it is well known that addressing mental health needs in childhood is easier and more cost effective than trying to address the difficulties people go on to develop in adulthood if those needs are left unmet. Using BERRI helps carers to see behind the presenting behaviours, to recognise emotional, relational or attachment needs, and to feel empowered to support these more empathically. Importantly, it can evidence the impact of the great work that many carers and organisations are doing already to support children, by showing the changes they are making over time. It can help to set goals to work on, and to monitor what is and isn’t working effectively to create positive change. BERRI also helps to pick up learning difficulties, neurodevelopmental difficulties and disorders, so that children can then be more thoroughly assessed and care and education can be pitched appropriately.

We are also learning from our increasing data set what scores are typical in different settings, how individual children compare to the general population, and which variables are important in preventing negative outcomes in adulthood.

I sometimes use the metaphor of the cervical cancer screening programme. At a cost of around £500 per woman every 3-5 years, the screening programme prevents 2000 deaths per year. About 5% of women screened have abnormal cells, and 1-2% have the type of changes that are treated to reduce risk. As a result, women who are screened are 70% less likely to get cervical cancer, which has an enormous human cost, but also costs £30,000+ to treat. Screening has saved the NHS £40 million. Most importantly, it has led to the discovery that the human papillomavirus is significant in the development of cervical cancer. This has led to preventative treatment programmes, with 10 million girls in the UK receiving the HPV vaccination. This has reduced the rates of cervical cancer (with 71% fewer women having pre-cancerous cervical disease), as well as preventing genital warts (down 91% in immunised age groups). It also has the potential to reduce other forms of cancer, as HPV is responsible for 63% of penile, 91% of anal, and 72% of oropharyngeal cancers, which, along with the importance of herd immunity, led to the decision to immunise boys as well as girls in many countries.

I would argue that the case for psychological screening, particularly in population groups that have experienced trauma, abuse or neglect, is even stronger. More than half of children in Care have a diagnosable mental health condition, and half of the remainder have significant mental health needs that don’t reach diagnostic thresholds or don’t fit into a diagnostic category. They also face higher risks of a range of negative outcomes than the general population, including a higher risk of heart disease, cancer, strokes, fractures and numerous other health conditions, as well as a more than fifty times higher risk of homelessness, addiction, imprisonment, requiring inpatient mental health care, or having their own children removed into Care. Like cancer, these have an enormous human cost for the individual and their network, and they also have a huge financial cost for the public purse (some estimates suggest £2-3 million per young person leaving Care, when including lower contributions to tax, increased benefits and the cost of services). If we can understand the issues that lead some young people down these more negative paths, and address those needs as early as possible in their lives, hopefully we can increase the proportion of young people who survive difficult early lives and go on to healthy, happy adult lives.

If you want to learn more about BERRI and the impact it can have on your services feel free to get in touch. Or you can come and learn more about the pilot in Bracknell Forest and the larger scale projects we have started to expand on it, as I am presenting at the NCCTC next month with Matt Utley from the West London Alliance.

Communicating the value of evidence

I presented at a couple of conferences over the last few weeks about my BERRI system. And I was struck, once again, by how little weight is given to evidence when it comes to services commissioned in the social care sector. Various glossy marketing claims and slick consultants were successfully persuading commissioners and service managers that using their systems and “metrics” (in which people give entirely subjective ratings on various arbitrarily chosen variables) was equivalent to using validated outcome measures. By validated outcome measures, I mean questionnaires or metrics that have been developed through a methodical process and validated with scientific rigour: exploring whether they measure the right things, whether they measure them reliably, whether those measures are sensitive to change, and whether the results are meaningful. That pathway then leads into an established scientific process of critical appraisal, as the studies are presented at conferences, published and subjected to peer review.

But outside of the academic/scientific community it is very hard to prove that having a proper process is worth the time and investment it takes. It means that you are running a much longer race than those who work without evidence. At one event last week, I asked a consultancy firm making hundreds of thousands of pounds out of “improving children’s social care outcomes” about their basis for what they chose to measure, how they measured it, and how they had validated their claims. The answer was that they were confident they were measuring the right things, and that having any kind of scientific process or validation would slow down their ability to make an impact (aka profit). My answer was that without it there was no evidence they were making any impact.

They couldn’t see that skipping straight to the doing bit was equivalent to deciding that architects, structural drawings, planning permission and building regulations approval just slow down house-building, and then selling houses built without any of that burdensome process. Thinking anyone can build a house (or a psychometric measure to track outcomes) feels like an example of the Dunning-Kruger effect, the idea that those with the least knowledge overestimate their knowledge the most. But the worst thing was that those commissioning couldn’t see the difference either. They see the language of evidence as the domain of academics and clinicians, and don’t understand it or its importance. We are in an age where expertise is dismissed in favour of messages that resonate with a populist agenda, and it seems that this applies even when commissioning services that affect the outcomes of vulnerable population groups. I don’t know how we change this, but we need to.

For those who don’t know, I’ve been working on BERRI for 12 years now, on and off, with the goal of being able to map the needs of complex children and young people, such as those living in public care, in a way that is meaningful, sensitive to change and helps those caring for them to meet those needs better. For as long as I’ve worked with Looked After children, there has been a recognition of the fact that this population does worse in life along a wide range of metrics, and a desire to improve outcomes for them for both altruistic and financial reasons. Since Every Child Matters in 2003, there have been attempts to improve outcomes, defined with aspirations in five areas of functioning:

  • stay safe
  • be healthy
  • enjoy and achieve
  • make a positive contribution
  • achieve economic well-being

A lot of services, including the one that I led, tried to rate children on each of these areas, and make care plans that aimed to help them increase their chances in each area. Each was supposed to be associated with a detailed framework of how various agencies could work together to achieve it. However, whilst the goals are worthy, they are also vague, and it is hard to give any objective score of how much progress a young person is making in each target area. And in my specific area of mental health and psychological wellbeing they had nothing specific to say.

As with so much legislation, Every Child Matters was not carried forward by the next government, and with the move of children’s social care and child protection into the remit of the Department for Education, the focus shifted towards educational attainment as a metric of success. But looking primarily at educational attendance and attainment has several problems. Firstly, it assumes that children in Care are in all other ways equivalent to the general population with which they are compared (when in fact in many ways they are not, having both disproportionate socioeconomic adversity and disproportionate exposure to trauma and risk factors, as well as a much higher incidence of neurodevelopmental disorder and learning disability). Secondly, it limits the scope of consideration to the ages at which education is happening (primarily 5-18, but in exceptional circumstances 3-21) rather than the whole life course. Thirdly, it doesn’t look at the quality of care that is being received – which has important implications for how we recruit, select and support the workforce of foster carers and residential care staff, and what expectations we have of placement providers (something I think is critical, given we are spending a billion pounds a year on residential care placements, and more on secure provision, fostering agencies and therapy services that at the moment don’t have to do very much at all to show they are effective, beyond providing food, accommodation, and ensuring educational attendance). Finally, it masks how important attachment relationships, and support to improve mental health, are in this population. I can see that strategically it makes sense for politicians and commissioners not to measure this need – they don’t want to identify mental health needs that services are not resourced to meet – but that is significantly failing the children and young people involved.

In my role as clinical lead for children in Care and adopted children within a CAMH service, I kept finding that children were being referred with behaviour problems, but underlying those were significant difficulties with attachment, and complex trauma histories. I was acutely aware that my service was unable to meet demand, leading us to need some system to prioritise referrals, and that there was a lot of ambiguity about what was in the remit of CAMHS and what was in the remit of social care. I wasn’t alone in that dilemma. There were a lot of defensive boundaries going on in CAMHS around the country, rejecting referrals that did not indicate a treatable mental health condition, even if the child had significant behavioural or emotional difficulties. The justification was that many children were making a normal response to abnormal experiences, and that CAMHS clinicians didn’t want to pathologise this or locate it, like an organic condition, inside the child, so it should best be dealt with as a social care issue.

On the other hand, I was mindful of the fact that this population have enormous mental health needs, having disproportionately experienced the Adverse Childhood Experiences that are known to lead to adverse mental and physical health outcomes. Research done by many of my peers has shown that two thirds to three quarters of Looked After children and young people score over 17 on the SDQ (the Strengths and Difficulties Questionnaire – the government mandated and CORC recommended measure for screening mental health need in children) meaning they should be eligible for a CAMH service, and various research studies have shown that 45% of LAC have a diagnosable mental health condition, but the resources are not available to meet that need. As The Mental Health Foundation’s 2002 review entitled “Mental Health of Looked After Children” put it:

Research shows that looked-after children generally have greater mental health needs than other young people, including a significant proportion who have more than one condition and/or a serious psychiatric disorder (McCann et al, 1996). But their mental health problems are frequently unnoticed or ignored. There is a need for a system of early mental health assessment and intervention for looked-after children and young people, including those who go on to be adopted.

My initial goal was to develop a new questionnaire to cover the mental health and psychological wellbeing issues that this population were experiencing, as well as considering attachment/trauma history, the child’s ability to trust others and form healthy relationships, and the behaviours through which these difficulties are often expressed. I was also interested in what issues determined the type of placement given to a child, and the risk of placement breakdown, as well as what opened doors to specialist services such as therapy, and whether those services and interventions really made any difference. I therefore ran two focus groups to explore what concerns carers and professionals had about Looked After children and young people, and asked them what they saw that might indicate a mental health problem, or any related concerns that led people to want my input, or that caused placements to wobble or break down. One group contained foster carers and the professional network around them (link workers, children’s social workers, the nurse who did the LAC medicals, service managers) and one contained residential care workers and the professional network around them (home managers, children’s social workers, the nurse who did the LAC medicals, service managers). I wrote their responses down on flip-charts, and then sorted them into themes.

I had initially thought that the items might cluster as behavioural and emotional, or internalising and externalising, but they seemed more complex than that. In the end, five themes emerged:

  • Behaviour
  • Emotional wellbeing
  • Risk (to self and others)
  • Relationships/attachments
  • Indicators (of psychiatric or neurodevelopmental conditions)

The first letters gave me the name for the scale: BERRI. I then piloted the scale with various carers, and then with a group of clinical psychologists involved with CPLAAC (the national network within the British Psychological Society of about 300 Clinical Psychologists working with Looked After and Adopted Children, which I chaired for about six years). I then added a life events checklist to set the issues we were identifying in context.

The working group I chaired in 2007 on the state of outcome measurement for Looked After and adopted children (on the invitation of CORC) came to the conclusion that no suitable metrics were available or widely used. We therefore agreed to further develop and validate the various tools that members of the group had home-brewed, including my BERRI. There was acknowledgement that it takes a lot of work to develop a new psychometric instrument in a valid way, but a consensus that this needed to be done. So I resolved to find a way to follow that proper process to validate and norm BERRI, despite the lack of any funding, ring-fenced time or logistical support to do so. The first challenge was to collect enough data to allow me to analyse the items on the measure, and the five themes I had sorted them into. But I didn’t have the resources to run a research trial and then enter all the data into a database.

My way around this barrier was to get my peers to use the measure and give me their data. To do this I took advantage of some of the technically skilled people in my personal network and developed a website into which people could type anonymous BERRI scores and receive back a report with the scores and some generic advice about how to manage each domain. I tested this out and found my peers were quite enthused about it. We then had a formal pilot phase, where 750 BERRIs were completed by Clinical Psychologists about children and young people they were working with. I then talked about it with some young people and care leavers to check that they felt the areas we were covering were relevant and helpful to know about. Then I started to use the system in a large pilot with residential care providers and developed tools to focus in on particular concerns as goals to work on, and track them day by day or week by week, as well as creating tools to give managers an overview of the progress of the children in their care. We’ve had a lot of feedback about how useful and game-changing the system is, and how it has the potential to revolutionise various aspects of commissioning and decision-making in children’s social care.

But I really wanted the process to be one in which we were truly scientific and based our claims on evidence. I’ve never marketed the BERRI or made claims about what it can do until very recently, when I finally reached a point where we had evidence to substantiate some modest claims*. But to me the process is critical and there is still a long way to go in making the data as useful as it can be. So from day one a process of iterative research was built in to the way we developed BERRI. As soon as it was being used by large numbers of services and we had collected a large data set, we were able to look closely at how the items were used, the factor structure, internal consistency and which variables changed over time. We ran a series of validity and reliability analyses, including correlations with the SDQ, the Conners, and the child’s history – ACEs, placement information and various vulnerability factors in the child’s current situation. But even then I worried about bias, so a doctoral student is now running an independent study of inter-rater reliability and convergent/divergent validity across 42 children’s homes.
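For readers curious what an internal-consistency check like this involves in practice, here is a minimal sketch in Python. The item names and scores are simulated for illustration – they are not real BERRI data or the actual item set – but the calculation is the standard Cronbach’s alpha used to gauge whether the items in a domain hang together.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency of a set of items (rows = children, columns = items)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative only: simulated ratings for 200 children on five hypothetical
# "Behaviour" items, generated from one shared latent score plus noise so the
# items hang together the way a coherent subscale should.
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
ratings = pd.DataFrame(latent + rng.normal(scale=0.7, size=(200, 5)),
                       columns=[f"B{i}" for i in range(1, 6)])
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.2f}")
```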

BERRI will always be developed hand in hand with research, so that there is an ongoing process of refining our outputs in light of the data. The first step in that is getting age and gender norms. But the data can also indicate what we need to do to improve the measure, and the usefulness of the output reports. For example, it seems that two aspects of “Relationships” may be distinct from each other. If the evidence continues to show this, we will change the way we generate the reports from the data, to talk about social skills deficits and attachment difficulties separately. We might also tweak which items fall into which of the five factors. We also want to check that the five-factor model is not an artefact of the a priori sorting of the items under the five headings, so we are planning a study in which the item order is randomised on each use before we repeat our factor analysis. We also want to explore whether there are threshold scores in any factor, or critical items within factors, that indicate which types of placements are required or predict placement breakdown. We might also be able to model the risk of child sexual exploitation (CSE).
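To make the factor-analysis step a little more concrete, the snippet below is a minimal sketch of the kind of re-analysis described, using scikit-learn on simulated data. The number of items, the five-domain structure and the scores are all invented for the example; the real analysis runs on the anonymised BERRI data set.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Illustrative only: 500 simulated children rated on 25 hypothetical items,
# five per assumed domain, standing in for an anonymised export of BERRI scores.
rng = np.random.default_rng(1)
latent_domains = rng.normal(size=(500, 5))          # latent level on each domain
loadings = np.kron(np.eye(5), np.ones((1, 5)))      # each item loads on one domain
ratings = latent_domains @ loadings + rng.normal(scale=0.5, size=(500, 25))

# Fit a five-factor model and check whether each item's strongest loading
# falls on its intended factor; a clean structure groups items back into
# their original domains regardless of the order they were presented in.
fa = FactorAnalysis(n_components=5, rotation="varimax").fit(ratings)
strongest_factor = np.abs(fa.components_).argmax(axis=0)
print(strongest_factor.reshape(5, 5))
```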

The results to date have been really exciting. I have begun to present them at conferences and we are currently preparing some articles to submit for publication. For example, I am currently writing up a paper about the ADHD-like presentation so many traumatised children have, and how we have learnt from our BERRI research that this reflects early life ACEs priming readiness for fight-or-flight rather than proximal events or a randomly distributed organic condition. But the findings depend on all the groundwork of how BERRI was developed, our rigorous validation process and the data we have collected. It is the data that gives us the ability to interpret what is going on, and to give advice at the individual and organisational level.

So you’ll forgive me if I’m somewhat cynical about systems that request a subjective Likert rating of five domains from Every Child Matters, or an equally subjective score out of 100 for twelve domains pulled from the personal experience of the consultant when working in children’s social care services, that then claim to be able to map needs and progress without any validation of their methodology, areas to rate, sensitivity to change or the meaning of their scores. Having gone through the process the long way, rather than going straight to marketing, might put me at a commercial disadvantage, but I like my houses built on the foundations of good evidence. I can feel confident that the load-bearing beams will keep the structure sound for a lifetime when they are placed with precision and underpinned by the calculations and expertise of architects, structural engineers, surveyors and building control, rather than cobbled together as quickly as possible, marketed with amorphous claims and sold on rapidly to anyone who will pay for them. After all, I’m not in it to make a quick buck. I know my work is a slow and cumulative thing, and BERRI still has a long way to go before it can create the greatest impact. But my goals are big: I want to improve outcomes for children and young people who have experienced adversity, and I want that impact to influence the whole culture of children’s social care provision in the UK and to continue to be felt through the generations. And to do that, I need to build the thing properly.

* that carers, therapists and managers find it useful and easy to use, that using the BERRI pathway demonstrated an improvement of 14% over 6 months for the first 125 children placed on the system, and that BERRI has a robust factor structure, good reliability between raters, and the basic statistical qualities that suggest sufficient validity for use. We also have some testimonials, including a commissioner who used BERRI to map the needs of 15 high tariff children and found four suitable to move to foster or family placements with support, saving nearly half a million pounds per year from his budget – a finding we would like to replicate with a much larger study, given the opportunity.

 

 

I am not a therapist

I’ve always been someone who likes to keep busy, and who has a lot of ideas about places where psychological thinking can make a positive impact. The aspect of my character that I now identify as entrepreneurial, and put to good use in my business, has always led me to want to try new things and create innovative solutions to problems. I like a lot of things about being a clinical psychologist, and particularly our ability to turn our hand to multiple types and levels of work. However, unlike many other clinical psychologists, I don’t really see myself as a therapist. In fact, I haven’t seen more than a handful of clients for individual therapy over the last decade, and even before that it was a pretty small proportion of my qualified jobs. I’ve always had more of a focus on the other facets of being a clinical psychologist. I think the picture of a clinical psychologist as a therapist is so strong that a lot of people will now be wondering how I fill my time!

So I will answer that question: I have done loads of highly specialist assessments (of neurodevelopmental concerns, attachment, parenting capacity, mental health, life skills, self-esteem, wellbeing etc) and lots of formulating and report-writing – some in collaboration with psychiatric or medical colleagues or within a wider MDT, but more as an external expert or second opinion. I have advised the family courts as an expert in care proceedings and complex custody disputes, and completed numerous pre-court assessments for local authorities to help inform their care planning. I’ve managed teams and services, supervised anywhere from 2 to 20 other staff at a time, and sat in various organisational/management structures. I have designed and delivered training to parents, carers and professionals, done lots of consultancy to various organisations and professionals (mainly those providing health and social care services, or involved in the family courts), and helped placement providers to improve their services. I design and deliver group programmes (eg Managing Behaviour with Attachment in Mind), but then rapidly cascade-train other staff to continue to deliver them. I wrote a book about attachment/developmental trauma, and lots of papers and policy documents about Looked After children and about acting as an expert witness to the family court. I sat on a BPS committee and I contributed to NICE and SCIE guidelines. I’ve designed, managed and evaluated therapy services (but employed others at lower bands to deliver the therapy). I’ve been an expert advisor to the HCPC in a fitness to practise case and to the team investigating a death in public care. I’ve done loads of practice-led research about each client group I’ve worked with, from looking at the psychological and health economic impacts of offering brief therapy to hospital users with diabetes, to commissioned evaluations of other services. So I have plenty to fill my days despite not having a therapy caseload!

I have reflected on why it is that I don’t feel drawn to therapy, and reached the conclusion that, whilst I see it as a very worthwhile endeavour, I don’t really have the patience for resolving difficulties one person at a time over sessions spanning many months. I’m always more interested in grappling with the bigger questions of why people are in distress, and what we can do to most effectively prevent or ameliorate those difficulties. When I’ve solved the riddle (or at least, reached a plan that improves upon existing solutions) I like to evaluate its efficacy, modify it if necessary and then disseminate the learning and/or train others to replicate the solution. I try to step outwards from the individual issue to the broader themes and ways that we can intervene on a wider scale. To use a visual metaphor, if dealing with mental health problems is like bailing out a ship, then rather than scooping out water one cup at a time, I am trying to work out how to plug the leaks, and to design boats that won’t have the same vulnerabilities to leakage in the future. It also helps me to avoid feeling hopeless about factors outside my control, or about demand exceeding supply, or burnt out by an accumulation of traumatic stories.

Jenny Taylor, a past chair of the Division of Clinical Psychology, once described our profession as the structural engineers of the therapy world. Unlike a therapist trained in a single modality of therapy, we can survey the landscape and assess the need, then design the intervention that best meets that need – even if we are not always best placed to deliver it. We can base that recommendation on our knowledge of the current evidence base, which can change as new information comes to light. If we consider the challenges people face as a river they need to cross, a therapist trained in a single model of therapy might be a bridge-maker. A psychodynamic therapist might be a mason who can build traditional stone bridges and claims that this design best stands the test of time. A CBT therapist might be a carpenter with a set of designs for wooden arched bridges that he claims are cheaper and quicker to erect. Each sees their own skill as either suitable to solve the challenge or not, but also has some incentive to sustain their own livelihood by continuing their tradition. A clinical psychologist can survey the land either side of the river, the span required to cross it, and the materials available in the locality. They can then advise on the various options, including the relative costs and the evidence of how they fare in different conditions. They may or may not feel that the bridge required is within their own skill set to build, but they have a reasonable overview of other bridge-builders in the area to recommend. If new designs of metal suspension bridges are developed, this is not threatening to the structural engineer, who can adjust their recommendations to incorporate the emerging evidence base.

I really like this metaphor and strongly identify with the role of structural engineer rather than bridge-builder. I had always thought that this was instilled in me by my first graduate job, where I was an assistant psychologist on a research project about improving quality of life in residential care homes for older people, and I could see how the research and clinical work were closely tied together and built on each other reciprocally. But now I think my love of data and the scientific method runs deeper than that, and I can see it infused throughout my whole approach to life since childhood. When it comes to my work I am a scientist-practitioner down to my bones, as I always collect data as I go along. Where I don’t feel like I understand the situation well enough, I first look to the literature, and then to gathering data and doing my own analysis to try to gain insight. When I develop something new, wherever possible I evaluate what we are doing, and refine it through an iterative process until we can demonstrate maximum efficacy. I see that process as being part of the USP of a clinical psychologist – that we think like scientists and gather data to inform our interventions.

But I’m not sure that we communicate this mindset well enough, or that it is universal amongst the profession. It certainly isn’t what draws people into the profession, in my experience. Too many clinical course application forms I review could be paraphrased as “I want to learn to be a good therapist”, with an afterthought of “and do/use research” because they think that is what selectors want to hear – but in my view therapy can be done by lots of cheaper professionals, who might do an equally good, if not better, job of it. I believe that clinical psychologists should be more than well paid therapists. We should know the evidence base and be able to take on the most complex assessments and formulations (even if others then deliver part or all of the treatment), but also be able to develop, refine and evaluate novel therapeutic interventions, supervise other staff, improve services, consult, train and manage – things that extend beyond the skill set of most therapists. I’m sure it is clear by now that this is where my own interests lie. And I think it shows through in everything I do.

For example, when I was asked to lead the CAMHS service providing neurodevelopmental assessments, I started with a literature review and the current policy and best practice guidance. I then conducted an audit of the existing pathways, and then tried to make things better. We set up a new clinic system with more rapid throughput and more thorough assessments, and then re-audited, showing a reduction from an average of 18 months of input to five months, with increased clinician confidence in the service and higher client satisfaction. I also wrote a booklet to help provide information to parents whose child received a diagnosis of an Autistic Spectrum Condition. Although it required dedicated clinician time for the multi-disciplinary clinic and for the psychometric assessments generated, overall the new pathway freed up capacity, because fewer cases were being held open by other clinicians whilst waiting for assessment, or kept open for prolonged periods afterwards to help the family understand the diagnosis and connect up to local sources of support. I also sat on a multi-agency strategy group to look at establishing best practice standards for the county.

I had the same approach when I was asked to support the adoption and permanence service. I initially set up a consultancy clinic, where social workers could bring cases to discuss or book in families to see jointly. I found that I was explaining similar information about attachment, trauma and neuroscience to multiple professionals, parents and carers in the consultations. So I designed a group to share this content. I called it “Managing Behaviour with Attachment in Mind”, and developed some “doodles” I would draw on flipchart paper to explain the concepts more accessibly. I evaluated the impact and showed it to be an effective format for supporting parents in this situation. The groups were popular and over-subscribed, so I trained others to deliver the group to keep up with demand, first in my service and then more widely. Many people in the groups liked to photograph the doodles to remind them of the topic, so I decided to write a book to share them and Attachment: In Common Sense and Doodles was born.

But I also wanted to know how we could achieve permanence for more children. I started by looking at the literature about what makes effective adoptive matches. Very little information was available, so I systematically audited the paperwork from 116 adoptive matches and followed them up over 7 years to see what factors influenced the placement outcomes. I was able to look at whether the innovative adoption project to place children with more complex needs had better or worse outcomes, and to explore the impact of different motivations for adopting. Whilst to me this was just a natural process of answering the question as an evidence-based practitioner, it transpired that these studies of adoption risk and resilience factors were amongst the largest ever done, and they produced unique findings that I really should publish*.

You could argue that I was using a sledgehammer to crack a nut by doing all this research and trying to change process when organisations are notoriously slow to change, and that I could have spent my time more productively working with more individual adoptive families. But that’s not how I’d see it. The research I did helped me to understand what the key variables are when considering whether a child can achieve permanence, what kind of family we need to look for to place them successfully, and what kinds of support might ensure that the placement succeeds. I hope that I have fed that knowledge back through my court work, and into various organisational and policy work over the last decade. I have also disseminated it at conferences. However, I would still like to spread it further, because it is my belief that such knowledge can have positive impact at multiple levels – it can help to inform individual placement decisions, service-wide strategies for helping optimal numbers of children to access permanence, and national policy about adoption.

That work led naturally on to developing our services for Looked After Children when I left the NHS and set up my own company, LifePsychol Ltd. We provide training and consulting to foster carers and residential care staff, the social care organisations that support them, and the wider professional networks surrounding them, including education and health staff, police, lawyers, magistrates and judges. As I started to get more immersed in working with children in and on the edge of Care, it led me to recognise that there was a lack of validated and reliable tools to identify the needs in these populations, no outcome measurement tools that could reliably measure change over time in a way that was sensitive to the context and type of life events these young people experience, and a dearth of clinical governance in terms of the efficacy of both placements and interventions for this group of children. That seemed shocking to me, given their highly complex needs, and massively elevated incidence of mental health problems, challenging behaviour, risk to self and others, and prevalence of intellectual or neurodevelopmental difficulties.

As well as the human cost of not being able to identify the best choices for people, it seemed unacceptable that huge amounts of money were being spent on placements and specialist services for this group without any evidence that they changed the children’s wellbeing or life course for the better. Placements seemed to struggle to identify what to work on and how, and there was little objective indication of what defined a successful placement, beyond annual visits from Ofsted (who were predominantly focused on process and procedure). The high level of need and the lack of clinical governance in the sector have allowed various specialist therapists and services to spring up that are virtually unregulated, and many placements have adopted terms like “therapeutic” without these having a consistent definition or meaning. So I wanted to see whether I could make any headway in changing that.

Meanwhile there is pressure from the government to improve outcomes for children in public Care, because they are seen to fare badly compared to the general population of children the same age. The difficulty is that this isn’t comparing like for like – children in care have many more adversities to face, both organic and in terms of their life experiences, which mean they often deviate from the norm. For example, I found that there was a 20-point downward skew in the IQ distribution of children in residential care compared to population norms, meaning that 20-25% of children in this setting had a learning disability, compared to 2% in the general population. Likewise the incidence of Autistic Spectrum Conditions and other neurodevelopmental difficulties amongst children in Care is more than triple that in the wider population. The same is true of young offenders. If we don’t acknowledge that, then the sector is being asked to seek impossible goals and will inevitably be seen as failing, even if placements and services are performing optimally and adding a lot of value to the lives of the children they work with.

To state the obvious, children in care are not just randomly drawn from the population – by definition their needs have not been met, and this can mean both the presence of additional challenges and exposure to harm or deficits in care. I believe that to look at the needs of this population and the degree to which these are met by placements or interventions, we need to either compare them to carefully matched controls or ensure that outcomes are always considered relative to baseline. The latter seems more pragmatic. Scores for young people also need to be considered in the context of what is going on in their lives – as changes in placement, daily routine, contact arrangements, or the arrival or departure of other children from the home can make big impacts on the child’s functioning.

So I’ve been beavering away exploring these issues and developing systems to measure needs and make the data meaningful for those providing care and services. The impact might not be as obvious as delivering psychological therapy directly, but I’d like to think that over time it can improve services for thousands (or even tens of thousands) of children, and make a greater net change in the world.

 

*Maybe I’ll write more about this in a future blog. But the short version is that I have been trying to secure some funding to complete the statistical analysis and disseminate this information, and would still like to do so, so if you have any ideas or useful connections to assist with this please let me know. Failing that I hope I’ll find enough time to write a book on making better adoptive matches at some point in the future.

Seeking collaborator to change the world

LifePsychol Ltd is a company with a clear social purpose – to improve outcomes for people who have experienced adversity through the application of clinical psychology, particularly children who are Looked After in public care after trauma or maltreatment. We deliver effective psychological services for Looked After and adopted children by providing assessments, formulations, therapeutic interventions, consultation, training and outcome measurement tools for placement providers. And we are very much in demand. But at the moment we are clinician led, and we really need a COO with complementary business skills as the company scales up, to ensure that we make the maximum impact going forward.

We are at a very exciting time, with the potential of rapid growth and the first evidence of efficacy for our pathway emerging. We have started the process of applying for DfE Innovation Programme funding, and we have great support from key people: Sir Martin Narey, the government advisor who recently reviewed the future of children’s homes in the UK, described our pathway and tools as “the missing link for the sector”; Jonathan Stanley at the Independent Children’s Homes Association described them as “the new gold standard for our members”; and Lord Listowel said the government should fund part of the cost to ensure there is input from a clinical psychologist in every residential care home. Despite having done no marketing, we have more enquiries about joining our system than we can keep pace with. We are already used in over 100 children’s homes, and we have a growing number of local authorities who wish to roll out our pathway across their entire catchment. We are looking at how we train and license other clinicians to deliver the model both in the UK and internationally.

We have a great clinical team, a graduate project manager/administrator, a fantastic professional network and a great product set. What is important to us now is getting the right person to drive the business side forward at this critical time. To do that we really need someone with business skills and experience, combined with a passion for making social change, to take on a leadership role on the financial/business side of the company. We are therefore seeking an extraordinary COO who will help us achieve extraordinary things.

Who are we looking for?

You need to genuinely care about making the world a better place, and to share our goal of making a measurable difference to the lives of vulnerable children and young people. As a clinician CEO it is vital for me to have someone I trust to bounce ideas around with, who will ensure that we are on a sound financial footing to enable us to deliver our ambitious plans. You will be familiar with all aspects of the finances for running a business, have a good working knowledge of the UK social care system and be a dynamic manager, but with a willingness to turn your hand to other aspects of the business (from fundraising to recruitment to CRM) until we are large enough to take on a full team. You understand the value of evidence-based practice and you have a good awareness of the financial demands of the social impact sector. You are the kind of person that can nail down complex ideas and grand ambitions into concrete and achievable plans that will make genuine social change.

You will ideally be based in Derbyshire at our new Matlock office and will help to develop a team there, but with some travel to other sites. However, we already have a base in Milton Keynes that I visit fairly regularly, along with existing relationships and use of shared working space in North London (Kings Cross), so if you are the right person then these might be possible alternative locations, provided you are prepared to travel regularly to meet with me in Matlock and are comfortable using video chat in between times.

How to apply

If what we are looking for sounds like you, and you are looking for a new challenge, please get in touch and we can set up a meeting. Or if you know someone that might be the right fit, please pass this information along to them. Email lifepsychol@gmail.com to express an interest. No agencies or recruiters please.

Background information:

LifePsychol currently consists of a small clinical team who provide assessment and therapy services, particularly for children and families, and services commissioned by local authorities to support Looked After Children, adoption or families at the edge of care. Our Clinical Psychologists also provide expert assessments for the family court and for local authorities considering entering proceedings. We provide consultation and advice on service development, and service evaluations, for social enterprise and third sector organisations. Our main specialist area is attachment, trauma and maltreatment, and how this evidence base can inform the care of children who do not live in their family of origin. We therefore provide training for adoptive, foster and residential carers, as well as health, social care and legal professionals, and have a network of associates who provide regular consultation into organisations.

However, our primary goal at present is nothing less than to improve the quality of placements for all Looked After Children in the UK. LAC are a particularly vulnerable group of children and young people because their needs are complex, and often include mental health and developmental difficulties and problems with relationships and behaviour. We hope to achieve this ambitious goal by training carers, implementing a new set of standards for care providers (PRIME) and through regular use of outcome measures (BERRI).

The PRIME standards are about ensuring that strategies carers use are evidence-based, individualised to the background and needs of each child, evolve as the child’s needs change, and are based on a thorough psychological assessment and a multi-faceted formulation of the child’s needs. We believe that having advice from a clinical psychologist to inform the care of all Looked After Children (and other children with complex needs) will both reduce stigma and improve outcomes, whilst helping carers to feel better equipped to meet the children’s needs. We have developed a training program and care pathway as one means to implement these standards for placements.

We have also developed a set of online tools for commissioners and placement providers to use to identify and track the needs of children in their care. The tools are known by the acronym ‘BERRI’ because they explore Behaviour, Emotional well-being, Risk to self and others, Relationships and Indicators of psychiatric or neurodevelopmental conditions that may require further assessment or diagnosis. We want every young person with complex needs to have a service that meets their needs in an effective and evidence-based way. We have therefore developed tools that allow us to gain a more holistic picture of children’s needs, to track how this changes over time and to target particular concerns and monitor the effectiveness of interventions to address them.

Our first data suggests that we can reduce concerns about children significantly within six months of using the pathway and tools we provide, and our services gain exceptional feedback from carers and professionals, but we hold ourselves to tough standards of evidence, and gather data about our effectiveness every step of the way.

Note: The BERRI questionnaire and online tools were developed to improve the outcomes for children Looked After in public care in the UK. However, the system is also applicable to those receiving other forms of intensive or multi-agency input, such as those on the edge of care, attending special schools, placed in inpatient services, secure units or involved with services for young offenders. The system would also be equally applicable in other countries, and could be adapted to other populations (eg adults using mental health inpatient services, people with learning disabilities, or those within the criminal justice system).

Gaining Influence

Quite a long time ago, I identified that I get the most satisfaction when work offers me five I’s: Intellectual challenge, Independence, Innovation, Income and Influence. This month I have really been working on the last of those, trying to connect with the right people to make change within the Looked After Children sector as a whole, rather than individual by individual or company by company.

It transpires that over time I have accidentally built up a wide professional network, and a credible platform from which to connect with higher-level influencers. It seems that all the time I’ve invested into unpaid work helps when it comes to connecting with new people and looking like I know what I’m talking about. This is helpful for me to hold in mind, as committee work can all too often feel like a drain on my time that is almost invisible to anyone else, and may produce little tangible outcome for what can be quite an onerous process. Logically I know that this type of activity is rewarded by the innate satisfaction of contributing to important work that needs doing, but that is something I find easier to recognise when I first put up my hand to volunteer, and after all the graft is done, than whilst I’m in the middle of it.

Being chair of CPLAAC, sitting on the national CYPF committee for the BPS, and being part of the NICE guidance development group and the BPS/FJC standards group has let me contribute to various publications that will hopefully reach wider audiences and influence practice. Whether that is the support and interventions offered for children with attachment problems, the standards that should be expected of psychologists who act as experts to the family courts, the chapter on best practice for psychological services for children and families with high social care needs in What good looks like in psychological services for children, young people and their families, the paper I wrote about Social Enterprises as a vehicle for delivering psychological services, or the CFCPR issue I edited on good clinical practice around attachment difficulties, I feel I have been part of some good work that establishes professional standards and reference points.

And with those things on my CV and a network of allies who share my goals about improving outcomes for Looked After Children, I have been able to meet with various decision makers and influencers about my ideas. The first important contact I made was with Jonathan Stanley, the chair of the Independent Children’s Homes Association. He has been fantastic at promoting my work to residential care providers and helping me to gain a seat at the table. I then met Almudena Lara at the DfE, although she was very new to the role of being LAC lead, and moved on before she was able to pick up our discussion again. I have also met with Social Finance. More recently I was able to meet with Sir Martin Narey, the government advisor (and ex-chair of Barnardos) conducting a review of children’s homes in the UK, and a representative of the DfE. And latterly I had the opportunity to meet Lord Listowel at a recent conference and hope to speak with him further soon.

In all of these meetings, I have been promoting the value of clinical governance in the social care sector. That is, the importance of being able to evidence clinical outcomes and substantiate that you are doing what you claim to do – in this case, that placement providers are improving outcomes for children and young people in their care. My wider goal is to allow commissioners, social workers and Ofsted to be able to see what kind of placement a child needs, whether a placement is making positive change for a child, and who can provide the most suitable and effective placement. I’m also keen that the idea of “therapeutic care” is better defined, and that therapists working within care organisations are qualified, supervised, regulated by a professional body, and practising within their areas of competence. But my main goal is to stop the situation in which placements are paid to provide care for the most complex and vulnerable young people in society, and do so by providing accommodation, food and transport to education, but do nothing to address their emotional, behavioural, mental health or developmental/learning needs, their risk to self and others, or their ability to form healthy relationships with others. I think the tools I have been developing, like http://www.BERRI.org.uk, and the training I provide for staff and carers can help with that, but my goal is nothing less than to change the culture of care in the UK.

Evidence has shown that money invested in the most complex children during their childhood is repaid tenfold in savings to the public purse, through reductions in their use of mental health, social care and criminal justice services over their lifetime. So why is it that the placements for the most complex children and young people are primarily provided by carers with very low levels of qualification and training? The first steps to improving standards are to ensure that all carers in the foster and residential sector get training about managing the impact of trauma and disrupted attachments, and that all children in public care are regularly monitored on outcome measures. But these need to be meaningful, and linked into practice, rather than done as hoops to jump through that are disconnected from daily care.

I can think of nothing more worthwhile to do with my professional life than to improve care for Looked After Children in the UK, and I hope that I can achieve enough reach and influence to make a genuine difference.