Dr Kristian Hudson is an implementation specialist working within the NIHR Yorkshire and Humber Applied Research Collaboration (YHARC). Kristian provides implementation support to many stakeholders, including researchers, programme teams and healthcare staff. He is interested in empowering and facilitating these stakeholders to generate local implementation knowledge so they can implement things that matter to them, overcome implementation barriers as they arise and generate ‘within system learning’. Kristian and his team have developed an innovative, rapid, responsive and relevant approach to evaluating implementation, designed to capture the all-important practical implementation knowledge that arises from within system learning.
Kristian runs a podcast called Essential Implementation where he talks to implementation specialists and researchers around the world.
Unscheduled care coordination hubs (USCCHs) are a potential solution to overburdened ambulance services and pressures on accident and emergency (A&E) departments in the UK. They offer a single point of access for unscheduled care: a multidisciplinary team takes calls off the ambulance service call stack and, rather than sending an ambulance, attempts to provide care to patients in their normal place of residence in less time. The aim is to reduce ambulance conveyance rates and improve patient experience. Implementation science principles, along with improvement practices (‘tests of change’ and ‘plan, do, study, act’ (PDSA) cycles), were used to implement the care hub. A rapid, relevant and responsive evaluation of the implementation was carried out, aiming to capture the complexity of the implementation process, generate transportable findings and facilitate ‘within system learning’ and implementation success.
The aims of this study were:
• To develop a process map of the implementation of the USCCH model (including the engagement process and the tests of change).
• To capture in detail the complexity of the implementation journey, i.e. the interaction between the USCCH, the ever-changing context and the test of change approach.
• To understand what worked well and what didn’t work well, and to identify key practical insights and transportable findings for implementation elsewhere.
• To use the Consolidated Framework for Implementation Research (CFIR) to understand multi-level contextual determinants of implementation.
For the implementation, a process of engagement was followed by an initial 5-day test of change and then by three one-month tests of change. For the evaluation, rapid qualitative analysis techniques (Stanford Lightning reports) were used to capture ‘within system learning’ that occurred across the test of change period. Baseline and end-of-study interviews were also conducted. The aim was to capture contextual evidence about what worked well, what didn’t go well and any key insights from participants or from our research team on ‘how to’ practically implement a USCCH. Results were consolidated into transportable findings, and the Consolidated Framework for Implementation Research was used to understand multi-level contextual determinants of implementation. The methodology therefore combines improvement science practices with implementation science techniques, with the aim of producing transportable findings suitable for use across contexts, systems and cultures.
The initial engagement period and the tests of change proved to be an effective approach to implementation. The evaluation was useful in capturing ‘within system learning’ and producing transportable findings, and it also facilitated the implementation effort. High tension for change, an external change agent, key stakeholder engagement and having an ambulance service member present in the care hub were strong facilitators of implementation. Commitments, ownership and governance, the learning environment, reflecting and evaluating, and political drivers had mixed or negative effects on implementation. Lightning reports proved useful to both the researchers and the unscheduled care coordination hub team.
Unscheduled care coordination hubs have the potential to improve unscheduled care provision through a single point of access. However, the main objectives of the hub need to be agreed from the start, and the learning environment needs to encompass individuals and teams outside of the hub. Tests of change appear to be a highly effective approach to implementation. The combination of improvement practices and implementation science evaluation techniques offered an effective way to implement and evaluate unscheduled care provision, with engagement an important precursor to this approach. Implementation researchers may benefit from moving from traditional, top-down research approaches towards participatory, embedded implementation research evaluations, which appear better suited to capturing practical implementation knowledge.