14 strategies to implement digital services into routine work: article review

In this blog, Dr Ben Newton, Senior Research Evaluator at Health Innovation West of England, shares his reflections on a qualitative interview study among health and social care professionals on their experiences of digital service implementations.

I came across an article called 'How to implement digital services in a way that they integrate into routine work' in the Journal of Medical Internet Research whilst writing a survey report led by our Evaluation and Insight team.

Our survey sought to understand NHS staff views and feelings on a digital app being implemented across two NHS trusts, and I was curious about how other researchers have understood the role of staff capability, opportunity and motivation (the COM-B model [1]) in implementing digital tech.

What I liked about the article was its relevance to a broad church of staff working to deliver, implement and evaluate digital technology.

If you work, as I do, in the health innovation space, you will definitely want to read the full article. The future appears to be digital; a shift which Covid-19 has accelerated in the health sector – a point the authors make.

The article is a primary research piece set in the Finnish healthcare system. The researchers conducted group interviews with health and social care professionals working in primary care. One of the key inclusion criteria was that participants had recent experience of implementing digital services.

Some of you will be familiar with Normalisation Process Theory (NPT) [2], a framework that helps us understand what enables interventions to be successfully implemented AND integrated into routine work. Nadav and colleagues used the NPT framework as an analytical tool, which I felt was a novel approach to qualitative analysis.

Anyway, I’m sure you are keen to read the headline findings, so here goes. The authors identified 14 strategies that they embedded within the four key concepts of NPT – coherence (sense-making); cognitive participation (relational work); collective action (enacting work) and reflexive monitoring (appraisal work).

I’m not going to list all these strategies here – you can read the open-access paper yourself. However, here are the headline results that struck me:

  1. People have had bad experiences of interventions being implemented (you might relate to this!), which has led to a loss of faith and trust. Therefore, implementation processes must be consistent, whilst communication must be multi-channel and continuous. Alongside this, to motivate staff (I’m seeing the link with COM-B here), they should be given a clear justification of why the service is needed.
  2. Staff need to have a positive attitude to using the digital service – or they won’t use it. The authors highlight the importance of giving voice to staff to influence (you could say co-design!) the implementation.
  3. Thinking about support, this should be ‘close and readily available’. In some instances, this can be through a physical presence on site. Staff should have time to get familiar with the service and have opportunities to practise. Having sufficient knowledge of the intervention gives staff the confidence to use it.
  4. Some of you will have come across the UTAUT [3] (Unified Theory of Acceptance and Use of Technology) and I would recommend Marikyan and Papagiannidis’ easier explanation of it [4]. The UTAUT seeks to explain a person’s intention to use technology. Two of its key concepts are performance expectancy (how the technology will help the individual perform at work) and effort expectancy (how easy the technology is to use). The final theme in Nadav’s paper contains clear links to the UTAUT concepts (although Nadav et al. don’t make these links themselves).

Participants in Nadav’s study reported that bad experiences of usability, where the technology is not easy to use, can jeopardise successful implementation. This is a clear echo of the effort expectancy concept in UTAUT. Moreover, participants said the service should be useful to them; an echo of performance expectancy. Finally, participants said they should have the opportunity to offer feedback, a point the authors relate to evaluation factors.

Concluding thoughts? For staff experienced in programme work, the factors identified in this article are unlikely to surprise you. They should, however, reinforce good practice points that, when project timelines come under pressure, can all too easily get lost.

For the novice (or as my son likes to say in relation to my Minecraft skills: the ‘newb’), you would do well to start with this article and reflect on some of the key pointers and models that can support successful implementation (and evaluation) of programme work.

Read the full article here.

References
1. Michie, S., van Stralen, M. M., & West, R. (2011). The behaviour change wheel: A new method for characterising and designing behaviour change interventions. Implementation Science, 6(1), 42. https://doi.org/10.1186/1748-5908-6-42
2. Murray, E., Treweek, S., Pope, C., MacFarlane, A., Ballini, L., Dowrick, C., Finch, T., Kennedy, A., Mair, F., O’Donnell, C., Ong, B. N., Rapley, T., Rogers, A., & May, C. (2010). Normalisation process theory: A framework for developing, evaluating and implementing complex interventions. BMC Medicine, 8(1), 63. https://doi.org/10.1186/1741-7015-8-63
3. Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478. https://doi.org/10.2307/30036540
4. Marikyan, D., & Papagiannidis, S. (2021). Unified Theory of Acceptance and Use of Technology: A review. In TheoryHub Book. http://open.ncl.ac.uk.


Posted on October 10, 2023 by Dr Ben Newton, Senior Research Evaluator, Health Innovation West of England


