Are you a practitioner, keen to practice in an evidence-based way but with little time to keep up with the research? Maybe you find scientific papers unreadable? Or perhaps you support the aim in principle, but find it hard to set up gold-standard science-based evaluations of your interventions with your clients? You are not alone.
Psychology, following medicine and other applied disciplines, has become very keen on the idea of evidence-based practice. And I’m all for it, in principle: it’s a hard idea to argue against. I religiously peruse the contents pages of the academic journals that thud onto my doormat, rarely finding a title that gets my juices flowing. I thought the failing was mine, until I read Joanna Wilde’s book ‘The Social Psychology of Organizations’.
In this book she brilliantly explains how ‘scientific knowledge’ and ‘helping practice’ relate to each other. She suggests it is somewhat optimistic to hope that laboratory methods and facts can be just plonked down in the field and have positive impact; rather, there is a translation process involved if we are to get the best from the research.
Exploring this further, Joanna mounts a spirited defence of the evidence base that practitioners can call upon; an evidence base that is different from, but no less valid than, the scientific evidence base.
We are not scientists: we have to problem-solve, not experiment
She offers a number of interesting ideas to help us be evidence-based in our practice in complex system fields.
She notices that we are in a subtly different business to science: we aren’t seeking primarily to establish knowledge, we are seeking primarily to help. We are working in a different context, to different ends, than scientists.
Given this, the intervention is judged by impact, and not by the facts it generates. This shifts the focus of the evaluation question subtly from ‘does it work?’ to ‘does it help?’
Our practice is client-centric, not knowledge-accumulation-centric. She suggests that practice is the process by which knowledge from one situation is converted into a different form designed to be effective for the particular situation at hand, which is often a WICKED problem.
A WICKED problem is defined as ‘a complex problem that is evolving and cannot be completely solved.’ WICKED problems offer a sharp contrast to the type of bounded problem required in scientific work. What works in one context may not work in another, and what can be tightly investigated in one context may not be tractable in another. The practice is specific to the context.
She notes that in contrast to conducting experiments, what practitioners do is:

- Engage with WICKED problems, with an awareness of problem mutation
- Access and use a wide range of evidence from multiple sources
- Work in relationships
- Design interventions, monitoring emergence, enabling course correction
- Focus on the impact in context
She suggests that field knowledge is based on broad observation and ‘evidence by experience’. Our evidence base exists, but it extends beyond experimental results.
Some examples of ‘immediate and evolving (sources of) evidence’ that are field specific include:

- Emerging events in talk and context
- Practitioner experience and authentic intuition
In other words, we are cognisant of data emerging in the moment and attempt to form hypotheses about ‘what is going on here?’ against which we can select our possible next action.
We are not detached observers
This is a key difference: how we engage with and work with our clients is key to our practice. Scientists, on the other hand, generally work to keep themselves out of the science. We, or at least I, are well aware that I am monitoring the effectiveness of my practice almost on a moment-to-moment basis. In my head I have a set of criteria against which I am evaluating the conversation: Is it moving productively forward? Is it enhancing, or at least not damaging, relationships? Are they ‘hearing’ what I’m saying? Am I ‘getting’ what they are saying? And, fundamentally, ‘Does this seem to be making a positive difference? Is it helping the situation?’
Sadly, the answer to these questions isn’t always yes. But that’s OK, because I can try something else. After all, as Wilde so succinctly notes, ‘Intervention practice requires the capacity to work in real time with uncertainty.’ And: ‘For those of us that have built a career as practitioners, it is the dynamic nature of translating emerging knowledge into changing complex environments that makes the work engaging and rewarding.’ And all the while I’m building up my practice evidence base.
This isn’t to say that laboratory work isn’t valuable. It is, and we need to be able to work with trust in the scientific disciplines we draw from. But few of us have the time, patience or skill to critique the papers. To be honest, we rely on the academic refereed-paper system to ensure that for us. We want to be able to take a finding and run with it: ‘This sounds interesting; how can I apply it here? How might it help?’
I love Joanna’s work and regard this book very highly. What I have presented here is a much simplified and reduced part of a much richer and more complex argument about the relationship between science and practice. If you are interested, I encourage you to invest in the book. It’s great.
Sarah Lewis is the owner and principal psychologist of Appreciating Change. She is author of ‘Positive Psychology in Business’, ‘Positive Psychology at Work’ and ‘Positive Psychology and Change’, published by Wiley. She is also the lead author of 'Appreciative Inquiry for Change Management'.
APPRECIATING CHANGE CAN HELP
Appreciating Change is skilled and experienced at supporting leaders in working in this challenging, exciting and productive way with their organisations. Find out more by looking at how we help with Leadership, Culture change and Employee Engagement.
If you want to know more about implementing approaches and processes that positively affect people’s happiness, engagement, motivation, morale, productivity and work relationships, see Sarah’s positive psychology books