By Richard Devine, Social Worker for Bath and North East Somerset Council
Several years ago, I read a book that blew my mind. It forever changed how I thought about how we, as social workers, evaluate risk and make decisions.
The book was called ‘Thinking, Fast and Slow’ by Daniel Kahneman (2011).
After a few years of practising social work, I believed I was a pretty good judge of character. I thought I could confidently assess the risk a child was experiencing and the probability of that risk occurring in the future.
I was utterly oblivious to the fact that the way that my mind, and indeed, any human mind, makes sense of the world is riddled with errors.
These errors, referred to as heuristics or cognitive biases, are 'mental shortcuts that allow us to use our finite cognitive capacities to comprehend and manage an infinitely complex world'. And, for the most part, they do a good job.
Surprisingly, these ‘mental shortcuts’ are predominantly unconscious, and we don’t even know we use them. I had no idea of their existence until I read this book, nor of the pervasive impact they had on my decision-making.
This is what makes them effective. They are automatic, fast and don’t require much cognitive effort, thus helping us navigate the myriad of everyday decisions we have to compute throughout the day.
They are also prone to error. And in reliable and systematic ways. In day-to-day life, the impact of these errors is minimal. However, if left unchecked in child protection social work, these errors can have disastrous, life-changing consequences for children and their families.
The evidence for the role and extent of these biases in decision-making is extensive.
There are dozens of them. I will explore some that I think are most relevant as a social worker.
Overconfidence bias is the tendency to believe that our abilities are greater than they are.
For example, I believed, after a couple of years of working in child protection, that I was a reliable judge of character, capable of effectively evaluating risk.
My first manager, Tanya Keating, used to tell me ‘confidence over competence is dangerous’. It was and still is one of the most important lessons I have learned in social work.
I think that is especially true in child protection when our actions and decisions can profoundly alter the lives of children and families.
Availability bias is the tendency to think of examples that spring to mind (familiar facts, recent and vivid memories), rather than thinking representatively when making estimations about the future.
For example, the shocking nature of a plane crash almost always gets reported in the news. The images and dreadful nature of a plane crash, widely reported in the media, spring to mind much more readily than a car crash, which happens far more frequently and causes substantially more injury and death (Pinker 2018). Therefore, many believe that planes are a more dangerous form of transport than cars.
Kahneman (2011, p.8) writes: ‘People tend to assess the relative importance of issues by the ease with which they are retrieved from memory—and this is largely determined by the extent of coverage in the media’.
Following the death of Peter Connelly, there was a significant and relatively prolonged upsurge in children being made subject to care proceedings and placed into alternative care (See Baby P effect).
Arguably, one potential cause of this was availability bias.
Social workers were suddenly and quite shockingly confronted with the possible outcome of a failure to intervene swiftly and decisively. It would have been easy to conclude that a failure to intervene leads to child death and highly critical public, media, and government attention.
We also give disproportionate value to sensory input or information presented vividly, compared with plain, dryly presented information.
For example, over the years, I have read many reports of parents shouting and screaming at their children from neighbours or other professionals.
However, I found watching a video of this once incredibly impactful, making me feel intensely protective of the child subject to the abuse.
The same information presented in different formats can dramatically alter our view, leading us to underestimate or overestimate the level of risk.
Confirmation bias is the strong tendency to find or filter only the information that supports our pre-existing view or a conclusion we have made.
In other words, once we have formed a view, we find it extremely difficult to accept information contrary to this view.
We may also avoid gathering information from sources that may cause us to question our view, or discredit the reliability of information from certain sources.
In social work, this means that once we have assessed a child to be low risk, we find it hard to accept new information that a family is at higher risk. Equally, once we have assessed a child as high risk, we find it hard to accept new information that would prove the risk to the child has been reduced.
As pointed out by Munro (2008, p.125): ‘The single most important factor in minimising errors [in child protection] is to admit you may be wrong’.
There have been countless times during my career when I have needed to adjust my evaluation of risk, which I wouldn’t have done without the support of my supervisor. Effective supervision, in my opinion, is one of the most effective correctives to confirmation bias (and underestimating or overestimating risk).
The halo effect is the tendency to judge a person’s entire character based on one aspect of their appearance or presentation.
In addition, the first trait we observe in another person alters how we interpret the traits we observe afterwards.
Kahneman found: ‘The sequence in which we observe characteristics of a person is often determined by chance. Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted’ (2011, p.83).
One parent I worked with was incredibly welcoming, pleasant, and engaging to work with. She was extremely likeable. During my visits, she showed plenty of emotional warmth to her children. My first impression of her, and our interpersonal relationship, made it hard to fully acknowledge the harm her substance misuse was causing her children. I had read the extensive chronology documenting substance misuse and criminal behaviour. I also subsequently received information from various reliable sources that her children were being neglected in life-endangering ways and that she shouted and swore at them. Yet, I found this nearly impossible to imagine because of how inconsistent it was with my experience of spending time with her.
Because she was nice to me, I assumed she was always nice to everyone.
Fundamental attribution error is the tendency to explain other people’s behaviour as the result of fixed, unchangeable personality traits rather than the context or environment.
We ignore the situational factors that play a role, instead assuming that someone who does terrible things is a terrible person.
Ironically, when explaining our behaviour, we more readily explain it as a result of the environment.
I think we are more inclined to consider a child’s behaviour in light of the context and environment.
However, we often struggle to extend this perspective to adults.
The fundamental attribution error is a deceptively powerful mechanism, and I think it is why we constantly need tools or practice frameworks, such as Signs of Safety and systemic practice, that deliberately facilitate our ability to think about the person in context. They help counteract the default setting of viewing parents’ behaviour as stemming from their personality.
In a fascinating, thought-provoking book (recommended to me by Munro), titled ‘The End of Average’ (2015), the author Todd Rose argued that it is meaningless to talk about human behaviour without context.
He highlighted research that shows that our behaviour is highly context-dependent, coining the term ‘the context principle’ to describe this.
‘The context principle challenges us to think about ourselves and others in a way that is counter to how we’ve been taught to think about personality most of our lives… Most of us believe that, when it comes right down to it, we are optimists at heart – or cynics. That we are nice – or rude. That we are honest – or dishonest. The idea that who we are changes according to the circumstances we find ourselves in – even if those changes are unique to our own self – seems to violate the fundamental tenet of identity: to us, our personality feels stable and steadfast’ (Rose 2015, p.118).
To take a slight detour, Munro argued that the fundamental attribution error is one factor that undermines the process of learning from decision-making errors. She found that when a social worker’s decisions are analysed, the context that the social worker made the decision in is often stripped away. Munro argued we should focus on the context, as much, if not more, than on the individual.
‘It is usually easy to identify the people close to the tragedy who made mistakes and target them for improvement whereas a study of the wider organisational context would take considerably more effort’ (Munro 2010, p.1143).
She pointed out, quoting Reason, that ‘we cannot easily change human cognition, but we can create contexts in which errors are less likely and, when they do occur, increase their likelihood of detection and correction … situations can be more or less error-provoking’ (Reason 2009, p.32, cited in Munro 2010, p.1140).
Echoing this view, Harry Ferguson (2011), in Child Protection Practice, considered the emotional impact of developing relationships with parents. He persuasively argued that most reviews of decision-making in social work that have resulted in tragic outcomes overlook the relational dynamics between social workers and families.
He writes, ‘the way that practice is written about in these enquiry reports is that the emotions and lived experience of practice are ignored’ (2011, p.60).
A social worker confronted with a challenging, resistant, or hostile parent can be made to feel fearful, scared, or intimidated. This can influence their thought processes, emotions, and actions, such that they make decisions they wouldn’t otherwise.
For example, I have frequently experienced fear and apprehension in asking difficult questions when faced with a parent who acts intimidatingly. I have also been made to feel so uncomfortable in a family home that I would prefer not to visit. On some visits, I experienced trepidation, fear, and anxiety while I approached and spent time in the house.
If I were unaware of these feelings, I could quite easily have failed to ask essential questions or avoided spending time with children in their family home, which could have resulted in me not having access to vital information that would aid an assessment of risk (see Ferguson et al 2020 for an excellent exposition of working with anxiety, hate and conflict).
The impact of the context on decision-making was illustrated in a paper by Ferguson, How Children Become Invisible in Child Protection Work (2017, p.1017).
He described how practitioners who were capable of effective direct work with most families could, in their work with some families, be ‘overcome by the sheer complexity of the interactions they encounter, the emotional intensity of the work, parental resistance and the tense atmospheres in the homes’, and this could result in them unintentionally losing sight of the child.
Our capacity to be effective practitioners and make sound decisions can be context-dependent. Under certain conditions, our thinking ability can be undermined.
What You See Is All There Is (WYSIATI) is a bias in which we make evaluations based on the coherence of an explanation we have conjured up, not on the quality or quantity of the information.
In other words, we build a story from the available information, even when it’s poor quality. If it’s a good story, we believe it. However, we don’t consider the information we don’t know or might be missing.
‘The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. We often fail to allow for the possibility that evidence that should be critical to our judgement is missing – what we see is all there is’ (Kahneman 2011, p.87).
WYSIATI, perhaps because it incorporates many biases, is one bias I think about the most.
I find it hard to imagine a child subject to significant harm when I visit the family and observe emotionally warm, positive, and playful interactions between the parent and child.
What I see is all there is.
I find it hard to imagine a parent acting abusively and violently towards their child when they present as entirely amenable and accommodating.
What I see is all there is.
I find it hard to imagine the parent’s chronic and severe drug use impacts their parenting when I visit the family and they live in a well-furnished, tidy, and clean home.
What I see is all there is.
Hindsight bias heavily distorts any review of past decisions and responses to risk, especially when the outcome is undesirable. Once you know the outcome, it looks inevitable; in fact, ‘the worse the outcome, the greater the hindsight bias’ (Munro 2020, p.50).
Kahneman notes, ‘Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight‘ (2011, p.203).
Hindsight bias is related to ‘outcome bias’: the tendency to blame decision makers for good decisions that turned out badly and to give too little credit for successful decisions that appear obvious only after the fact.
This points to an important debate on how social workers should evaluate decisions (Wilkins and Forrester, 2020).
On the one hand, good, thoughtful decisions can result in harmful, unwanted outcomes. Therefore, we should assess decision-making by the quality of the decision-making process, not the outcome.
On the other hand, poorly thought-through decisions can lead to positive outcomes. Therefore, it is argued that decision-making should be judged by the outcome, not the quality of the decision-making process.
I am inclined to prefer the former approach. A child can be assessed as low risk, yet an unexpected turn of events could result in a fatal outcome. Low risk doesn’t preclude the possibility of future harm; it only means that, based on the available information at the time, harm was less likely than not. Therefore, the decision could be assessed as wholly reasonable, even though the outcome was tragic.
With that said, I have seen cases where social workers didn’t act at a point of high risk that would have warranted it, yet, surprisingly, the family subsequently turned things around. From an ethical point of view, it would be hard to argue against this being a favourable outcome. The child was protected from the trauma of being placed in alternative care, and the circumstances were rectified, even though, at one point, the child was clearly and evidentially at risk of harm sufficient to warrant removal.
Being aware of hindsight and outcome bias is essential when undertaking audits.
Towards the end of Effective Child Protection (2020, 3rd edition), Munro outlines 9 principles to judge professional decision-making.
One includes the need to factor in the effect of hindsight bias.
‘Hindsight bias can seriously affect people’s appraisal of a decision when it is followed by an adverse outcome, but the quality of decisions is inevitably affected by the many influences that decision makers are subjected to. When a decision is being reviewed, the full conditions and influences existing at the time should be identified and examined to determine whether the action taken was reasonable in those circumstances. Implementing this principle provides a rich source of learning about how the organisation is functioning and identifying strengths as well as areas of weakness’ (Munro 2020, p.200).
In Thinking, Fast and Slow (2011), Kahneman invokes the metaphor of system 1 and system 2.
System 1 is fast, automatic, effortless, and mostly unconscious. System 1 effectively helps us navigate the world and our day-to-day life. Constantly evaluating our decisions throughout the day would be exhausting and paralysing. However, it is predictably prone to errors as we have discovered.
System 2 is slow, deliberate, effortful, and conscious. It tends to be more accurate, especially with complex decisions. However, it is cognitively costly and time-consuming. Therefore, it can’t be relied upon as a substitute for making routine decisions.
Taking into consideration the advantages and limitations of each system, Kahneman recommends: ‘The best we can do is compromise: learn to recognise situations in which mistakes are likely and try harder to avoid significant mistakes when stakes are high’ (2011, p.28).
System 1 is efficient yet reliably vulnerable to distortions. System 2 is cognitively demanding and time-consuming yet more accurate.
So, how do we use our understanding of these systems in our practice?
I have several ideas from practice; however, I should note that these are not evidential.
Firstly, recognise that our minds predictably use heuristics and mental shortcuts to make sense of the world. We don’t get to opt in or out of them.
They are an unavoidable feature of decision-making that affects not just social workers but everyone.
Recognising this requires knowledge of the biases and self-awareness to notice them taking effect.
Secondly, in my opinion, supervision is one of the most effective antidotes to the harmful excesses of cognitive biases. I have frequently been supported and encouraged to adjust my evaluation following a discussion with my supervisor.
Even after 12 years of practice, I need a supportive yet challenging space to evaluate and assess risk. I would be severely compromised without it.
Thirdly, use discrepancies as a vital source of information and make hypotheses explicit.
A young child I visited was always happy and smiling whenever I interacted with them. Yet, the home conditions were impoverished, with no toys. The mother was a chronic drug user with severe depression and rarely interacted with the child. When she did, it wasn’t warm or playful.
This highlighted a discrepancy.
Perhaps, the child was accessing alternative care from a family member who was providing high levels of warmth, stimulation and care to compensate for the lack of this at home? Or perhaps the child had learned to not express negative emotions, having learned that it doesn’t yield a useful or comforting response? Therefore, the child falsely displays happiness to maximise the chance of getting some type of reaction.
By noticing the discrepancy and generating hypotheses, I could use that as a basis for collating information. Both hypotheses were reasonable in light of the evidence at the time. Holding two in mind and looking to prove or disprove also counteracts confirmation bias.
Fourthly, learn to play devil’s advocate. I have been cross-examined several times, and while this is always uncomfortable, you learn that the same information can be interpreted differently. Therefore, every claim I make has to be evidentially robust and take into consideration counterarguments.
As a result of these experiences, when I write assessments now, I often think, ‘what’s an alternative explanation for this?’, or ‘what would someone who disagreed with this point say?’.
This can be difficult if you’re a newly qualified social worker. Still, you can ask your manager or a critical friend to read your assessment. You can ask them:
Are there any essential factors I haven’t considered (WYSIATI)?
Have I given too much weight to some evidence and overlooked other evidence (Availability bias)?
Is the child or parents’ behaviour better explained through the environment and circumstantial factors instead of personal and psychological issues (Fundamental Attribution Error)?
Fifthly, if you can, work in an environment and culture that facilitates free thinking and hypothesis generation. You need a supportive environment where you can share your worries and feelings, so these don’t unduly influence your assessment of the family.
You also want to explore multiple perspectives and different hypotheses, and receive healthy challenge to your thinking that helps you notice what you haven’t seen or revise your judgement.
When we feel stressed, unsupported, or unable to freely express ourselves, our thinking capacity becomes even more constrained. In these conditions, we are more likely to rely upon System 1.
Sixthly, where possible, evaluate your decisions after the fact to see whether or not you were accurate.
It is crucial to have a feedback loop to understand whether you are underestimating (false negative) or overestimating (false positive) risk.
Follow Munro’s advice and admit when you are wrong. I know from experience that this can be painfully difficult. I have found some solace in realising that ‘it is one of the most effective ways to reduce error in child protection’ (ibid).
Lastly, buy Eileen Munro’s book Effective Child Protection. It’s one of the most informative books I have read, especially on the topic of assessing risk.
I have focussed on biases that impact the individual social worker, and I have reflected on some ways I have sought to minimise the impact of cognitive biases.
It is worth noting, however, that the child protection system, culture of the organisation and management support all have a considerable impact.
Indeed, some have argued that improving decision-making is best achieved through providing the systems and support that facilitate the use of System 2 thinking. At the very least, given what we know about how difficult it is to make objective decisions, we all (managers, service managers, organisations, etc.) need to take responsibility for supporting social workers to make effective decisions.
In a future blog, I intend to write about how to effectively assess risk. For example, I think a chronology is the most effective tool. However, it requires that we know what information we need, why, and how to use it.
If you have found this interesting/useful, you may wish to join a growing community of 750+ others by signing up for free blogs to be sent directly to your inbox.
I intend to write every fortnight about matters related to child protection, children and families, attachment, and trauma. Or you can read previous blogs here.
By Richard Devine (26.07.2022)
Wilkins, D. & Forrester, D. (2020) Predicting the Future in Child and Family Social Work: Theoretical, Ethical and Methodological Issues for a Proposed Research Programme. Child Care in Practice, 26(2), 196-209. DOI: 10.1080/13575279.2019.1685463
Ferguson, H. (2011) Child Protection Practice. Basingstoke: Palgrave.
Ferguson, H. (2017) How Children Become Invisible in Child Protection Work: Findings from Research into Day-to-Day Social Work Practice. The British Journal of Social Work, 47(4), 1007-1023. https://doi.org/10.1093/bjsw/bcw065
Kahneman, D. (2011) Thinking, Fast and Slow. London: Penguin Books.
Munro, E. (2010) Learning to Reduce Risk in Child Protection. British Journal of Social Work, 40(4), 1135-1151. ISSN 1468-263X
Munro, E. (2020) Effective Child Protection (3rd edition). Los Angeles: Sage Publications.
Pinker, S. (2018) Enlightenment Now: The Case for Reason, Science, Humanism and Progress. London: Penguin Books.
Rose, T. (2015) The End of Average: How to Succeed in a World that Values Sameness. London: Allen Lane.