
EmpRess: an eHealth implementation readiness checklist for dementia developed through an interview study of stakeholder needs

Pages 791-800 | Received 17 Jun 2023, Accepted 20 Feb 2024, Published online: 11 Mar 2024

Abstract

Objective

This study aimed to create a tool to assess eHealth interventions for dementia by adapting an existing implementation readiness (ImpRess) checklist that assessed manualised interventions.

Methods

In Part 1, online semi-structured interviews with individual stakeholders (N = 9) with expertise in eHealth and dementia were conducted (response rate 83%). The Nonadoption, Abandonment, and challenges to the Scale-Up, Spread, and Sustainability of Health and care technologies (NASSS) framework was applied, both to guide the construction of the interview guide and to provide the subdomain codes used in the deductive qualitative thematic analysis. Respondents were industry professionals (n = 3), researchers (n = 3), policy officers (n = 2), and a clinician (n = 1). In Part 2, the items of the original ImpRess checklist were supplemented with items covering determinants discussed in the interviews that were not included in the original checklist.

Results

The main findings from the interviews included: Participants’ preference for a non-dementia-specific, more general approach to the checklist; the importance of searching for shared values with implementers; and the need for more systematic monitoring of implementation.

Conclusions

The EmpRess checklist applies an inclusive design approach. The checklist will help evaluate the implementation determinants of eHealth interventions for dementia and provide up-to-date information on what is, and is not, working in eHealth for dementia care.

Introduction

Background

eHealth for dementia

Increasingly, policy makers are worried about the ageing population and the rising social and economic cost of care for people with dementia; eHealth interventions have been proposed as one response to these challenges. eHealth is ‘an emerging field of medical informatics, referring to the organization and delivery of health services and information using the Internet and related technologies’ (Boogerd et al., Citation2015). Indeed, eHealth interventions can provide information on dementia, guide people with dementia through the care process, reduce carers’ symptoms of depression, anxiety, burden, and stress, and increase carers’ sense of self-efficacy, competency, and dementia knowledge (Boots et al., Citation2014). Moreover, eHealth interventions are low-cost, low-threshold, easy to personalise, and adaptable to changing needs. The COVID-19 pandemic has emphasised the need to offer people with dementia and carers remote support options (Alzheimer Nederland, Citation2020). Among caregivers of people with long-term and chronic conditions, dementia is the most researched area in eHealth interventions (Sin et al., Citation2018). Relevant examples of eHealth interventions for dementia include online interventions to help informal carers adapt to their new role (Boots et al., Citation2018) or apps to help networks organise dementia care (Christie et al., Citation2022).

The implementation issue

However, very few of these eHealth interventions for dementia are implemented from the research context into practice. One of the most important reasons for this is the lack of knowledge on barriers and facilitators to implementation, outside of the research context (Christie et al., Citation2018).

Implementation readiness

Previous research by Streater et al. has resulted in the development of the ImpRess checklist, to evaluate the implementation readiness (ImpRess) of manualised interventions for people with dementia (Streater et al., Citation2016). Manualised interventions are replicable and systematised treatment approaches or therapies documented in a manual or guide, providing detailed instructions for consistent delivery (Forbat et al., Citation2015). The ImpRess checklist helps form a picture of the barriers and facilitators to bringing evidence-based manualised interventions for dementia into practice. However, there is currently no checklist to evaluate the ImpRess of eHealth interventions for dementia. Previous work has identified several aspects of the ImpRess checklist that would require modification to assess characteristics specific to the implementation of eHealth interventions for dementia (Christie et al., Citation2019), such as the provision of resources for (long and short term) software/hardware maintenance and updates, compliance with data privacy and security laws (such as the General Data Protection Regulation [GDPR]), and the assessment of the intervention’s fit with the health care context (including compatibility with its existing digital systems).

Aim

The aim of this study is to adapt the ImpRess checklist to assess the ImpRess of eHealth interventions for dementia in a new (eHealth) version of the checklist: the EmpRess checklist. An updated checklist was needed to allow researchers to better map the ImpRess of eHealth interventions, in order to identify common failings and help researchers improve the sustainable implementation of dementia eHealth interventions.

Methods

Study design

The first part of this study consisted of online, semi-structured individual interviews based on an interview guide. In the second part, the findings from these interviews informed the development of the new EmpRess checklist.

Part 1: interviews

Recruitment

Participants were invited via email by authors HC and GA after being identified through the authors’ own networks and snowball sampling. A total of 15 people were contacted to participate in the study, to achieve recruitment of 8–10 participants, based on previous recommendations of sample sizes for in-depth, exploratory qualitative studies (Boddy, Citation2016). Thirteen people responded (response rate of 87%), though four of these 13 had to decline participation due to busy schedules (n = 2) and lack of alignment with their goals and objectives in their current positions (n = 2).

Study population and sample size

In total, nine participants took part in the interview study. Interview participants were professionals, with the inclusion criteria that they were (1) researchers, policy makers, clinicians, or other stakeholders and (2) working within the field of dementia, eHealth, and/or implementation. The sole exclusion criterion was the absence of availability, interest, or response from the potential participant. Table 1 depicts this study’s participant backgrounds and different types of expertise. Participants originated from a variety of (Western) countries: United Kingdom (n = 4), Spain (n = 1), the Netherlands (n = 2), France (n = 1), and Canada (n = 1). In this study, people with dementia and informal carers were not approached to participate, as its focus was on implementation determinants (including organisational and contextual aspects), which were judged to be more relevant to the professional stakeholder groups described above.

Table 1. Participant backgrounds.

Data collection

Nine online, semi-structured interviews with relevant stakeholders were conducted between May and June 2022. The interviews took place with one respondent at a time, with the interview being conducted by author HC, while author GA observed and facilitated audio recording. The interviews had an average duration of 38 min, were audio recorded, and transcribed verbatim. The interviews took place in English using MS Teams.

Informed consent

All participants had received an information letter explaining the aims of the study, which also guaranteed the anonymous processing of their data and responses, in addition to the option of discontinuing study participation at any point. Informed consent was obtained from all participants by author HC. Ethical approval for the study was granted by Maastricht University’s Medical Ethical Oversight Commission (approval number 2022-3176).

Interviews

The interview guide was based on the Nonadoption, Abandonment, and challenges to the Scale-Up, Spread, and Sustainability of Health and care technologies (NASSS) framework (Greenhalgh & Abimbola, Citation2019), which is intended for (1) informing the design of a new technology; (2) identifying technological solutions that (perhaps despite policy or industry enthusiasm) have a limited chance of achieving large-scale, sustained adoption; (3) planning the implementation, scale-up, or rollout of a technology programme; and (4) explaining and learning from programme failures. It includes questions in seven domains: (1) Condition or illness, (2) Technology, (3) Value proposition, (4) Adopter system, (5) Organisation(s), (6) Wider context and (7) Embedding and adaptation over time. Reviews of the NASSS framework have identified it as a valuable tool for understanding the challenges associated with the implementation of health and care technologies, though its limitations include its complexity and a lack of emphasis on individual factors versus contextual factors (Greenhalgh et al., Citation2018; Shin et al., Citation2023). Hence, the NASSS framework was chosen over other established implementation frameworks, such as the Consolidated Framework for Implementation Research (Damschroder et al., Citation2009) or the RE-AIM framework (Glasgow et al., Citation1999), due to its specific emphasis on technology and contextual factors such as care support. This emphasis was well-suited to this study’s focus on implementing eHealth in dementia care and increasing knowledge of organisational and contextual determinants of eHealth for dementia. The interview guide consisted of questions grouped into the seven NASSS framework domains (Appendix 1). All participants were asked the entire set of questions in the interview guide.

Data analysis

Authors HC and GA independently coded the semi-structured interviews using deductive thematic analysis (Evers, Citation2015) in Atlas.ti version 8.3 for Macintosh (Atlas.ti Scientific Software Development GmbH, Berlin, Germany). Due to the design of the interview guide, it was expected that the interviews would reflect the thematic domains of the NASSS framework rather than new inductive groups. The deductive thematic analysis used codes based on the NASSS subdomains. HC and GA applied the NASSS codes to the interview transcriptions and compared interview segments with the same deductive codes across interviews. HC and GA then compared the independently applied codes and subsequently held a consensus meeting with author MdV to resolve any differences of opinion.

Table 2. Deductive NASSS framework codes.

Part 2: checklist development

The ImpRess checklist

The original ImpRess checklist consists of nine themes: Motivation, Theory of change, Implementation context, Experience, Planning consultations, Delivery collaborations, Manager Support, Resources, and Population characteristics. These themes contain a total of 26 questions. An intervention is given a score of 0, 1, or 2 for each question, resulting in a potential minimum score of 0 and a maximum score of 52. The ImpRess checklist was derived from a set of criteria to appraise quality of reporting of the implementation of workplace interventions and its development was guided by the Medical Research Council (MRC) Framework for complex interventions (Craig et al., Citation2008).
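To make the scoring scheme concrete, the sketch below tallies 0/1/2 item scores into a checklist total. This is a hypothetical illustration rather than software described in the paper: the theme names are taken from the text, while the grouping of items into themes is left to the caller (see Streater et al., Citation2016, for the actual items).

```python
# Hypothetical illustration of the ImpRess scoring scheme described above:
# 26 items across nine themes, each item scored 0, 1, or 2, giving a total
# between 0 (minimum) and 52 (maximum). Theme names follow the paper; the
# assignment of items to themes is supplied by the caller.
from typing import Dict, List

IMPRESS_THEMES = [
    "Motivation", "Theory of change", "Implementation context", "Experience",
    "Planning consultations", "Delivery collaborations", "Manager support",
    "Resources", "Population characteristics",
]

def impress_total(item_scores: Dict[str, List[int]]) -> int:
    """Sum 0/1/2 item scores over all themes and return the checklist total."""
    total = 0
    for theme, scores in item_scores.items():
        if theme not in IMPRESS_THEMES:
            raise ValueError(f"Unknown theme: {theme}")
        if any(score not in (0, 1, 2) for score in scores):
            raise ValueError(f"Scores in '{theme}' must be 0, 1, or 2")
        total += sum(scores)
    return total  # at most 52 when all 26 items score 2
```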

The EmpRess checklist

Based on the input from the interviews, a new version of this checklist was developed to specifically assess the ImpRess of eHealth interventions for dementia, in addition to the existing checklist that assessed the ImpRess of manualised interventions. As with the interviews, the NASSS framework guided the development of the new checklist and helped authors HC and GA identify areas that were not yet included in the nine domains of the original ImpRess checklist. In a consensus meeting, the authors (including MO, a developer of the original ImpRess checklist) worked collaboratively to finalise the new checklist items, phrasing, and scoring.

Results

Overview

In the first part of this results section, the deductive interview themes are discussed, illustrated with quotes for each numbered respondent (R). The second part describes how these findings were integrated into an eHealth version of the checklist.

Part 1: deductive interview themes

Condition

Nature of the condition

The majority of the participants (across professional backgrounds) voiced a preference for a more general, non-dementia-specific approach to the checklist. Multiple participants said that they thought the issues of eHealth implementation would largely be the same across populations. One participant (an industry professional) also mentioned, however, that dementia is associated with specific challenges: it is a progressive disease, often involving multiple phases with changing needs, and diminished cognitive capacity can affect eHealth-related skills, including motor ability and learning new digital processes.

For example, just with communication and language, if it is on something on the Internet, it has to be easy, because probably in six months or one year, they will have less capacities to understand. R4, industry professional (Spain)

One participant (a policy officer) expressed that proven suitability of an eHealth intervention in the dementia care context could function as an indicator of the accessibility of the eHealth intervention for a variety of target groups, due to the unique challenges of eHealth implementation in dementia care.

I think if you get it right for people with dementia, you get it right for everybody. Looking at environments for people with dementia and all those kinds of things. If you get it right for them, anybody else can then access and use those things. R8, policy officer (United Kingdom)

Socio-cultural influences and comorbidities

Several participants (largely the policy makers and one researcher) described that it was important to consider whether the evidence provided for the eHealth intervention had been collected from and evaluated in diverse samples of users and contexts. What works for one group does not necessarily work for another, especially when it comes to eHealth for dementia, where there are specific cognitive, digital, and health literacy considerations (see above).

Technology

Material features

Participants across backgrounds emphasised the importance of keeping the eHealth intervention as simple as possible, especially given the target group of people with dementia and (often older) informal carers.

Simple instructions and just taking all those key areas of poor cognition into account, so you don’t set people up to fail. You don’t want to create something that is confusing, or is really challenging to use, because they wouldn’t be able to use it. R5, policy officer (United Kingdom)

Industry professionals and the clinician also mentioned that there should be real-world interactions (such as face-to-face meetings or telephone calls) incorporated into the eHealth intervention, so the intervention does not only take place online. Furthermore, participants mentioned accessibility (in terms of reading level, disabilities such as vision and hearing problems, and affordability) and the ability to personalise the intervention as key eHealth intervention features.

Type of knowledge generated through the technology

Participants (often from industry) described that it is often unclear for people with dementia and their carers whether an intervention is evidence-based and should be seen as valid, especially when many of the users are not accustomed to evaluating the validity and origin of an online tool. Moreover, one participant (a researcher) emphasised the potential importance of continued access to the knowledge generated through the eHealth intervention, and how losing access (for instance, through a failed implementation) to the generated data can cause serious distress. It is important to be aware of potential harm in these cases.

I spoke with a lot of developers and users and one of the things that they really highlighted was the loss of trust when something doesn’t work or when something disappears, and so one of them was talking about - she’d been using this app to monitor her moods for two years and then all of a sudden the app just disappeared with all of her data. And it was because the app company weren’t making any money and just closed down. R2, researcher (United Kingdom)

Knowledge needed to use

The participants across backgrounds described how there were many types of knowledge involved in implementing eHealth for dementia – from the digital skills necessary to download the app, to confidence in navigating potential hurdles related to data security, especially in environments, such as care homes, where there is often not much technical support.

If you compare users to non-users, it’s mostly younger people who are higher educated and have more tech skills - you need to ensure that your products get to more people than just that group. And it’s actually used by them, and that it works for them too. R7, researcher (the Netherlands)

Technology supply model

For sustainability, multiple participants (with policy and clinical backgrounds) highlighted the factor of interoperability, in that clinicians do not have the time to learn to use many different technological applications. Therefore, eHealth interventions must be able to operate within existing structures.

I suppose things like interoperability, well, we’ve got so many different systems in place you know, that we’re putting in the workable with other systems and processes. Because you don’t want to be doubling up on things either. R5, policy officer (United Kingdom)

An added benefit is that intervention users can then more easily understand how to access the intervention, and hopefully implementation costs are minimised.

Technology is only as good as people know what it is and that it’s available to them, and it’s affordable to them. – R6, clinician, occupational therapist (Canada)

Value proposition

Supply-side value

An industry professional in the field of health technology provided valuable insights into this topic, highlighting the importance of finding out who is involved in the development process, who is paying, and how different partners in the business model can work together to create shared value for everyone.

So before you start implementing, do you have clear view of who is paying for what and why? I would say try to figure out who gets the benefits. And make that as tangible as possible, and that’s the difficult part, because it’s often difficult to make it tangible, but make it as tangible as possible. Find out where the benefits are and find out according to the systematic rules who should be paying, and then try to get all these people in one room to discuss. Because there needs to be some kind of middle road. It cannot be that one party pays and the other benefits. And when the party that benefits sees how much they benefit, hopefully that convinces them also that they should also look into the finances. R9, industry professional (the Netherlands)

This participant also emphasised how, in his experience, this perspective is often lacking in academic research contexts.

Researchers say: “Yeah, we hope to get a big subsidy or grant and then everybody can use it.” It’s not a commercial view, and that’s what’s lacking. Think about how this is commercially going to work, if that’s the intention. If it’s purely research, fine, then it’s scientific…You want to offer things for free. The earlier you think about it the better. R9, industry professional (the Netherlands)

Demand-side value

First, multiple policy makers mentioned that eHealth interventions can provide significant added value to an organisation by providing eHealth training to the implementing staff, even when they are aimed at a different target group, such as informal carers.

Is there some way that eHealth could upskill the workforce so that might be something that they could see value in? We were like, OK this can be delivered by anyone, but you’re upskilling the people who are delivering it, so they’re learning how to use this intervention and deliver this intervention. They’re learning therapeutic skills. There’s lots of transferable stuff from using eHealth. R5, policy officer (United Kingdom)

Therefore, having an aspect of the eHealth intervention that is aimed at upskilling the implementing staff could be a significant implementation facilitator. These participants said that eHealth developers could consider offering certification for training personnel in how to use their intervention. However, two participants (an industry professional and a policy officer) mentioned that the turnover in residential dementia care settings is extremely high. This was named as a barrier for organisations to invest in training staff, as a great number of them move on to other positions within just a few years.

Also, one participant (a researcher) suggested applying these insights about upskilling to the eventual checklist itself. By offering training on eHealth implementation and using the checklist as a tool in this training, the checklist would become embedded in a process. This, in turn, could result in the checklist itself being used more widely and sustainably.

I imagine that if you disseminated the checklist through, like, training courses and things like that, that would be attractive - like certification in implementation readiness or something like that. Particularly if you can offer that for free, if it’s going to benefit the development in the long run, why not? Why not offer it as like a free online course or something like that? Or in computer science engineering, having it as a class that people can take. R7, researcher (the Netherlands)

Second, several policy makers mentioned seeing eHealth for dementia as having the potential to ease pressure on dementia care services and even revitalise them.

The adopter system

Staff

As described in the NASSS framework, changes in staff roles, practices, and identities are important determinants of eHealth implementation. One participant (a researcher) pointed out that the evaluation of eHealth in practice takes a specific skillset that is often not present in the implementing staff.

So there’s a lot of charities in the UK that are offering a lot of activities for people with or without dementia, dementia cafés and more innovative things. But they are struggling because they have to ask for funding all the time and they have to show to policymakers that what they are doing is useful. It’s difficult for them to explain, because sometimes it’s very qualitative and it’s not their job. R2, researcher (United Kingdom)

Another participant (an industry professional) also cautioned that, in their experience, health care professionals in the field of dementia sometimes saw the implementation of eHealth interventions in residential settings as a threat to their job, as it might make their face-to-face services redundant.

People with dementia and carers

Another factor is what is expected of the person with dementia and the extended informal care network. One participant (a researcher) emphasised that without buy-in from local dementia carer support networks, it had been next to impossible to implement a government-funded dementia intervention in her region. Moreover, multiple participants (across backgrounds) described a perceived distrust of online tools among formal carers. However, some also expressed that they felt these more negative attitudes were slowly becoming more positive over time.

I suppose there’s been such massive changes with digital over that since the pandemic hasn’t there? I mean early on in the pandemic when we moved very quickly to doing digital consultations via MS Teams and other formats, I mean that was something completely new for us and not something that we’d ever done with our patients or ever considered…I think there’s a lot of scope to do a lot more digitally for our patients and carers for sure. R8, policy officer (United Kingdom)

Organisation

Capacity and readiness to innovate

Almost all participants mentioned the lack of digital capacity in residential dementia care settings. Factors include the previously described negative attitudes of some staff towards online tools, but also a lack of digital infrastructure in residential dementia care settings. For example, one participant mentioned that many care homes do not have a Wi-Fi network.

Well you have some people who don’t like tech and so they will stop tech from coming into their organisation, because they either think that it is, for lack of a better word, callous. It just loses the personality that is needed to implement dementia care. Some people would see the efficiency gains as a threat to their livelihood. And then you also have people who see who see the benefits and how it can make their lives easier. R1, industry professional (United Kingdom)

Nature of the adoption and funding decision

Crucial in organisational eHealth implementation is the board or management level adoption decision on whether to allocate funds to a particular intervention. However, the data needed for organisational decision makers to be fully informed in this process is often lacking in interventions originating in an academic research context. One participant (an industry professional) expressed the issues with this dearth of information as follows.

How much time does the eHealth intervention need to complete? How much time does the coach need to invest in training to get to know the tool? How much time do they need to support a client? Do they have other administration tasks on top of it? Those questions are always asked. They’re asked first. R9, industry professional (the Netherlands)

Work needed to implement change and new routines

Participants (mostly industry professionals) emphasised that it is very important to consider the work involved in making the intervention function well within the new implementation context. For instance, one participant mentioned that for an online dementia coaching platform, the coaches themselves must also be supervised and monitored, to ensure that they feel sufficiently confident to carry out the coaching work, as this can also sometimes have an additional emotional toll.

Wider system

Political/policy

One participant (an industry professional) mentioned how difficult it was to scale up a successful eHealth intervention, when different regions within a country employ different political and health care systems.

Regulatory/legal

One industry professional described how time-consuming and expensive it was to apply for eHealth accreditation (with a national body overseeing digital tools) and said this was a significant barrier that hampered quick implementation. A researcher explained how, in his experience, a lack of knowledge among researchers about intellectual property was a barrier, as owning the intellectual property proves necessary to scale up eHealth interventions, even in a public health care context.

I’m certainly not in favour of privatising health and care or anything like that in this country, but the fact is that the National Health Service has always bought products and services from companies from industry vendors, whether that be drugs, scanners, whatever it is, and the same is true for digital. R5, policy officer (United Kingdom)

Professional

Policy officers said that many of the implementing staff are not familiar with digital tools and in many cases do not speak the local language as a first language, due to an immigration background. This professional dementia care context makes eHealth implementation challenging.

There’s former mining communities in the North of the city that are very White, very traditional. In the city centre, we’ve got a huge Asian diaspora, huge Afro Caribbean diaspora, Asian - both South-East and Far East. Having the ability to engage with various different types of language and cultural references, I think is really important and also factoring in that a lot of the people working in the care home workforce are possibly also not from a traditional White British background. R5, policy officer (United Kingdom)

Socio-cultural

Finally, several participants across backgrounds noted that eHealth interventions for dementia often only reach a very specific, affluent section of society, as these care contexts have more resources to innovate and their users show a higher level of engagement with their own health needs. This current inequity in the potential scope of eHealth for dementia should be taken into account.

Social economics determines directly the kind of patients that are going to use the eHealth. I mean all the private clinical centres? They’re full of high-up social economic people so normally these are the kind of people who are worried about their health. Now the lower-medium class are not attending - they are covered by the Social Security system, but not the private part. R4, industry professional (Spain)

Embedding and adaptation over time

Scope over time

One participant (an industry professional) suggested continuing to collect evidence on the eHealth implementation, generating new data on the medium- and long-term feasibility of implementing the intervention. Industry professionals and researchers expressed that being able to update the platform (and communicate to users about updates, for instance through a mailing list) according to this continuing feedback was crucial.

It’s such a practical thing we have built into the process we used to recruit users, that they sign up for a mailing list. Because it’s such a practical barrier, but it it goes to what we were saying about supervision and communication and the time and infrastructure. R2, researcher (United Kingdom)

Organisational resilience

In line with this need for continued monitoring and implementation data collection is the importance of being able to respond to problems with the intervention. In this sense, industry professionals emphasised that this is a particular strength of eHealth interventions compared with more traditional, offline interventions, as this sort of instantaneous feedback and status monitoring is easier to build into eHealth processes.

With an online service it’s easy, because the algorithms are working for you. But when dealing with all offline material - how to ensure that the whole process is achieved from the beginning to the end? You can spend several months without any result and you don’t know and after months you don’t know where it’s going on. R4, industry professional (Spain)

Part 2: integrated checklist items

The findings from the interview themes were integrated into the new EmpRess checklist. Items 1–26 were kept from the original ImpRess checklist (i.e. all of the items). Items were added to cover any of the interview themes that were not covered by the original ImpRess items. The final checklist, with revised item numbers, is included as Appendix 2. It contains 45 items, which results in a maximum checklist score of 90 (with items being scored as either 0, 1, or 2, as with the original checklist). There are subtheme scores for each of the NASSS domains: Condition (maximum subtotal 4), Technology (maximum subtotal 40), Value proposition (maximum subtotal 10), Adopters (maximum subtotal 4), Organisation (maximum subtotal 16), Wider system (maximum subtotal 4) and Embedding and adaptation over time (maximum subtotal 14).
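As a hypothetical illustration of how the scoring described above could be tallied (the actual items and their domain assignments are given in Appendix 2 and are not reproduced here), the sketch below computes per-NASSS-domain subtotals and an overall EmpRess total from 0/1/2 item scores.

```python
# Hypothetical sketch of EmpRess score tallying; the item-to-domain assignment
# comes from Appendix 2 and is supplied by the caller. Each item is scored
# 0, 1, or 2; subtotals are reported per NASSS domain alongside the total.
from typing import Dict, List, Tuple

EMPRESS_DOMAINS = [
    "Condition", "Technology", "Value proposition", "Adopters",
    "Organisation", "Wider system", "Embedding and adaptation over time",
]

def empress_scores(item_scores: Dict[str, List[int]]) -> Tuple[Dict[str, int], int]:
    """Return (subtotal per NASSS domain, overall total) from 0/1/2 item scores."""
    subtotals: Dict[str, int] = {}
    for domain in EMPRESS_DOMAINS:
        scores = item_scores.get(domain, [])
        if any(score not in (0, 1, 2) for score in scores):
            raise ValueError(f"Scores in '{domain}' must be 0, 1, or 2")
        subtotals[domain] = sum(scores)
    return subtotals, sum(subtotals.values())

# Example with illustrative (not actual) item counts:
# subtotals, total = empress_scores({"Condition": [2, 1], "Technology": [2, 0, 1]})
```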

Discussion

Main findings

This study successfully applied the NASSS framework to develop and analyse semi-structured interviews with nine stakeholders in the field of dementia and technology. The findings from the resulting analysis were integrated into a new eHealth version of the ImpRess checklist: the EmpRess checklist. Instruments for monitoring and improving the ImpRess of eHealth interventions are greatly needed, as previous research has shown that only 3% of evidence-based eHealth interventions for dementia are implemented into clinical practice (Mair et al., Citation2012). In particular, eHealth interventions for informal carers of people with dementia have shown high dropout rates. A large part of the challenges reported by these carers in accessing and using eHealth can be attributed to the current implementation limitations (Sin et al., Citation2018). A previous review of implementation determinants of eHealth for dementia identified that common barriers to implementation include user-friendliness, personal contact, and the digital skills of the target group (Christie et al., Citation2018). However, very little is known about organisational and contextual determinants. The new EmpRess checklist will help shed light on these determinants.

The additions to the EmpRess checklist mainly target the NASSS themes Technology, Embedding and adaptation over time, and Organisation(s). This was due to insights into how eHealth technologies have specific, additional implementation determinants (such as the usability of the eHealth intervention, or how up-to-date it is), the need to ensure sustainable use (for instance through diligent longer-term financial planning and communication strategies), and the unique identified needs of the dementia care context (such as high staff turnover and the high level of digital skills required for implementation). This section explores two important themes for the future of eHealth implementation in dementia care, as well as directions for future research on the EmpRess checklist and methodological considerations.

Inclusive eHealth design versus customised design

Previous research has explored the tension between so-called ‘easy-to-use’, inclusive design and specifically targeted, customised design (Bianchin & Heylighen, Citation2017). On the one hand, frameworks such as the NASSS framework (Greenhalgh & Abimbola, Citation2019) and the International Organisation for Standardisation (ISO) standard for health and wellbeing apps (Standardization I. O. F., Citation2021) call for as little customisation as possible to facilitate easy implementation in a variety of contexts. One respondent (R8) in this study stated: ‘If you get it right for dementia you get it right for everyone.’ As such, by designing for the dementia health care context, with its described challenges in the declining capacity of the target group and organisational pressures regarding language and high staff turnover, interventions might be applicable in a wide range of contexts.

On the other hand, many studies have highlighted the importance of personalisation in eHealth, mentioning it as a contributing factor to effectiveness and successful implementation (Christie et al., Citation2018; Gibson et al., Citation2019). Taking the more general approach runs the risk of not addressing the needs of the whole target group, which can differ greatly. For instance, eHealth interventions to support carers of people with Alzheimer’s disease often do not have the same benefits for carers of people with frontotemporal dementia (Bruinsma et al., Citation2021). Moreover, ‘easy-to-use’, general approaches may fail to take into account diversity characteristics and reduce self-efficacy when users do find the interventions difficult to use (Neal et al., Citation2022). Finally, there are many populations with disease-specific needs (e.g. tremor in Parkinson’s disease), which most likely could not be met by dementia-specific eHealth interventions and would not benefit from the more generalised approach.

For the EmpRess checklist, the participants advocated an approach that was not dementia-specific. However, items relating to accessibility were added to minimise the risk of overlooking determinants related to the representation of the entire, diverse target group. Language such as ‘simple’ and ‘easy to use’ was avoided. Combining the existing NASSS framework and ImpRess checklist has resulted in a tool that is suitable for both the dementia care and eHealth contexts, facilitating the measurement of ImpRess in a variety of care settings. As such, the EmpRess checklist meets the need for a tool that can address the specific challenges of the dementia care context and its particular organisational challenges, as well as the need for a tool that can evaluate eHealth interventions that are suitable for – but not specific to – the dementia population.

Offline versus online dementia interventions

All 26 items from the original ImpRess checklist were kept in the EmpRess checklist, as they covered themes discussed by the interview participants. In addition, 19 items were added to the EmpRess checklist. As there are more items in this eHealth version, this raises the question: ‘Are eHealth interventions more complicated to implement than offline interventions?’ While there has been an increasing push from policy makers to improve the digitalisation of health care (Brătucu et al., Citation2020), there has also been a call to critically examine its (lack of) sustainable implementation (Christie et al., Citation2019, Citation2021; Svendsen et al., Citation2021). The EmpRess checklist could be used to identify what (aspects of) eHealth interventions are not working, potentially reducing inefficient spending of research funding. Indeed, non-technological (or offline) interventions are still important and deserve attention for their potential to improve dementia care. Previous research has shown that personal contact is an important determinant of implementation in eHealth for dementia (Boots et al., Citation2014; Chiu & Eysenbach, Citation2011; Cristancho-Lacroix et al., Citation2015; Schaller et al., Citation2015). However, participants in this study described how difficult it is to sustainably monitor effective implementation in purely offline dementia interventions. As such, the EmpRess checklist places an emphasis on incorporating both online and offline interactions to facilitate implementation.

Directions for future research

This study provided many potential avenues for future research. First, there seems to be a need for different versions of the EmpRess checklist, based on the unique challenges associated with specific types of eHealth interventions. This would provide information for the NASSS subdomain ‘Type of knowledge’, which remains underaddressed. In addition, future research could develop a broader dementia eHealth research toolkit, in line with previous research on the NASSS complexity assessment tool (NASSS-CAT) (Greenhalgh et al., Citation2020), which aids in understanding, guiding, monitoring, and researching technology innovations in the health care context (though this was not specific to eHealth). An EmpRess toolkit could guide researchers through the eHealth implementation process and avoid leaving implementation as an issue to be discussed at the very end of a research project, as is implied by the concept of a one-time ‘checklist’. Next, the authors wish to collect evidence from researchers on the use and implementation of the EmpRess checklist, to validate, evaluate, and improve its items. This could be done through vignette studies (Keane et al., Citation2012) and by applying the existing NASSS-CAT tools to this eHealth context. Finally, future research on the next iterations of the EmpRess checklist will ask people with dementia and informal carers to provide input on this first version of the checklist, which was based on the stakeholders’ initial views.

Strengths and limitations

A first strength of this study is its interdisciplinary approach, which incorporates perspectives from academia, industry, policy, and clinical practice. Second, it has a solid theoretical base in the NASSS framework, which is used to bring research insights into sustainable practice. This enabled a detailed discussion with respondents that added deep insight from users into the new EmpRess tool. However, there are also important limitations to this study. First, there is a risk of socially desirable responses, which could lead to respondents describing their attitudes towards eHealth implementation more positively. Second, this study focused only on Western implementation contexts, and as such its findings may not be applicable to other regions. Third, this study interviewed a small sample of stakeholders, and as a result the participants may not represent the full spectrum of stakeholders involved in the implementation of eHealth for dementia. It is possible that interviewing more respondents would have increased the findings’ generalisability to more contexts and represented more points of view. Finally, this study did not involve people with dementia or informal carers, as the topic of organisational and contextual implementation was thought to be more relevant to stakeholders with positions in those contexts. However, this study constitutes a necessary first step in mapping the determinants of ImpRess of eHealth for dementia, to be validated by a wide range of groups and updated in future research.

Conclusions

This study constitutes a first step in the development of the EmpRess checklist. The main findings include participants’ preference for a non-dementia-specific approach to the checklist; the importance of searching for shared values with implementers (for instance, through upskilling the workforce through the eHealth intervention); and the need for more systematic monitoring of implementation (both to improve adoption decisions and to increase self-efficacy in health care professionals). As a result, the EmpRess checklist applies an inclusive design approach that also takes into account the importance of blending online and offline intervention elements to facilitate implementation. The final EmpRess checklist will help evaluate the implementation determinants of eHealth interventions for dementia and provide up-to-date information on what is, and is not, working in eHealth for dementia care.

Authors’ contributions

A designed the study, with input from co-authors MC, MO, and MdV. HC and GA collected the data and performed the analyses. A prepared the initial manuscript for publication. MO and MdV supervised the data collection and analysis. All co-authors reviewed and approved this manuscript.

Acknowledgements

We would like to thank all participants involved in these interviews for their time and support. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR or the Department of Health and Social Care.

Disclosure statement

None declared.

Data availability statement

The data that support the findings of this study are available on request from the corresponding author, HC. The data are not publicly available due to restrictions (e.g. their containing information that could compromise the privacy of research participants).

Additional information

Funding

This research was made possible by Alzheimer Nederland, under grant agreement WE.15-2021-02. C is also supported by NIHR Nottingham Biomedical Research Centre and NIHR MindTech MedTech Co-operative.

References

  • Alzheimer Nederland. (2020). Resultaten Corona peiling onder mantelzorgers van mensen met dementie. https://www.alzheimervrijwilligers.nl/sites/default/files/users/user912/resultaten-coronapeiling-alzheimer-nederland%20(1).pdf
  • Bianchin, M., & Heylighen, A. (2017). Fair by design. Addressing the paradox of inclusive design approaches. The Design Journal, 20(1), S3162–S3170. https://doi.org/10.1080/14606925.2017.1352822
  • Boddy, C. R. (2016). Sample size for qualitative research. Qualitative Market Research: An International Journal, 19(4), 426–432. https://doi.org/10.1108/QMR-06-2016-0053
  • Boogerd, E. A., Arts, T., Engelen, L. J., & van De Belt, T. H. (2015). “What is eHealth”: Time for an update? JMIR Research Protocols, 4(1), e4065. https://doi.org/10.2196/resprot.4065
  • Boots, L. M., de Vugt, M. E., Van Knippenberg, R., Kempen, G., & Verhey, F. (2014). A systematic review of Internet‐based supportive interventions for caregivers of patients with dementia. International Journal of Geriatric Psychiatry, 29(4), 331–344. https://doi.org/10.1002/gps.4016
  • Boots, L. M., de Vugt, M. E., Kempen, G. I., & Verhey, F. R. (2018). Effectiveness of a blended care self-management program for caregivers of people with early-stage dementia (partner in balance): Randomized controlled trial. Journal of Medical Internet Research, 20(7), e10017. https://doi.org/10.2196/10017
  • Brătucu, G., Tudor, A. I. M., Dovleac, L., Sumedrea, S., Chițu, I. B., & Trifan, A. (2020). The impact of new technologies on individuals’ health perceptions in the European Union. Sustainability, 12(24), 10349. https://doi.org/10.3390/su122410349
  • Bruinsma, J., Peetoom, K., Boots, L., Daemen, M., Verhey, F., Bakker, C., & de Vugt, M. (2021). Tailoring the web-based ‘Partner in balance’intervention to support spouses of persons with frontotemporal dementia. Internet Interventions, 26, 100442. https://doi.org/10.1016/j.invent.2021.100442
  • Chiu, T. M., & Eysenbach, G. (2011). Theorizing the health service usage behavior of family caregivers: A qualitative study of an internet-based intervention. International Journal of Medical Informatics, 80(11), 754–764. https://doi.org/10.1016/j.ijmedinf.2011.08.010
  • Christie, H., Martin, J., Connor, J., Tange, H., Verhey, F., de Vugt, M., & Orrell, M. (2019). eHealth interventions to support caregivers of people with dementia may be proven effective, but are they implementation-ready? Internet Interventions, 18, 100260. https://doi.org/10.1016/j.invent.2019.100260
  • Christie, H. L., Bartels, S. L., Boots, L. M., Tange, H. J., Verhey, F. R., & de Vugt, M. E. (2018). A systematic review on the implementation of eHealth interventions for informal caregivers of people with dementia. Internet Interventions, 13, 51–59. https://doi.org/10.1016/j.invent.2018.07.002
  • Christie, H. L., Boots, L. M., Hermans, I., Govers, M., Tange, H. J., Verhey, F. R. J., & de Vugt, M. (2021). Business models of eHealth interventions to support informal caregivers of people with dementia in the Netherlands: Analysis of case studies. JMIR Aging, 4(2), e24724. https://doi.org/10.2196/24724
  • Christie, H. L., Dam, A. E. H., van Boxtel, M., Köhler, S., Verhey, F., & de Vugt, M. E. (2022). Lessons learned from an effectiveness evaluation of inlife, a web-based social support intervention for caregivers of people with dementia: Randomized controlled trial. JMIR Aging, 5(4), e38656. https://doi.org/10.2196/38656
  • Craig, P., Dieppe, P., Macintyre, S., Michie, S., Nazareth, I., & Petticrew, M. (2008). Developing and evaluating complex interventions: The new Medical Research Council guidance. BMJ, 337, a1655. https://doi.org/10.1136/bmj.a1655
  • Cristancho-Lacroix, V., Wrobel, J., Cantegreil-Kallen, I., Dub, T., Rouquette, A., & Rigaud, A. S. (2015). A web-based psychoeducational program for informal caregivers of patients with Alzheimer’s disease: A pilot randomized controlled trial. Journal of Medical Internet Research, 17(5), e117. https://doi.org/10.2196/jmir.3717
  • Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 1–15. https://doi.org/10.1186/1748-5908-4-50
  • Evers, J. (2015). Kwalitatief interviewen: Kunst én kunde. Boom Lemma Uitgevers Amsterdam.
  • Forbat, L., Black, L., & Dulgar, K. (2015). What clinicians think of manualized psychotherapy interventions: Findings from a systematic review. Journal of Family Therapy, 37(4), 409–428. https://doi.org/10.1111/1467-6427.12036
  • Gibson, G., Dickinson, C., Brittain, K., & Robinson, L. (2019). Personalisation, customisation and bricolage: How people with dementia and their families make assistive technology work for them. Ageing and Society, 39(11), 2502–2519. https://doi.org/10.1017/S0144686X18000661
  • Glasgow, R. E., Vogt, T. M., & Boles, S. M. (1999). Evaluating the public health impact of health promotion interventions: The RE-AIM framework. American Journal of Public Health, 89(9), 1322–1327. https://doi.org/10.2105/ajph.89.9.1322
  • Greenhalgh, T., & Abimbola, S. (2019). The NASSS framework-a synthesis of multiple theories of technology implementation. Studies in Health Technology and Informatics, 263, 193–204. https://doi.org/10.3233/SHTI190123
  • Greenhalgh, T., Maylor, H., Shaw, S., Wherton, J., Papoutsi, C., Betton, V., Nelissen, N., Gremyr, A., Rushforth, A., Koshkouei, M., & Taylor, J. (2020). The NASSS-CAT tools for understanding, guiding, monitoring, and researching technology implementation projects in health and social care: Protocol for an evaluation study in real-world settings. JMIR Research Protocols, 9(5), e16861. https://doi.org/10.2196/16861
  • Greenhalgh, T., Wherton, J., Papoutsi, C., Lynch, J., Hughes, G., A’Court, C., Hinder, S., Procter, R., & Shaw, S. (2018). Analysing the role of complexity in explaining the fortunes of technology programmes: Empirical application of the NASSS framework. BMC Medicine, 16(1), 66. https://doi.org/10.1186/s12916-018-1050-6
  • Keane, D., Lang, A. R., Craven, M., & Sharples, S. (2012). The use of vignettes for conducting healthcare research. Advances in human aspects of healthcare (pp. 451–460). Independent Publisher.
  • Mair, F. S., May, C., O’Donnell, C., Finch, T., Sullivan, F., & Murray, E. (2012). Factors that promote or inhibit the implementation of e-health systems: An explanatory systematic review. Bulletin of the World Health Organization, 90(5), 357–364. https://doi.org/10.2471/BLT.11.099424
  • Neal, D., Engelsma, T., Tan, J., Craven, M. P., Marcilly, R., Peute, L., Dening, T., Jaspers, M., & Dröes, R.-M. (2022). Limitations of the new ISO standard for health and wellness apps. The Lancet Digital Health, 4(2), e80–e82. https://doi.org/10.1016/S2589-7500(21)00273-9
  • Schaller, S., Marinova-Schmidt, V., Gobin, J., Criegee-Rieck, M., Griebel, L., Engel, S., Stein, V., Graessel, E., & Kolominsky-Rabas, P. L. (2015). Tailored e-Health services for the dementia care setting: A pilot study of ‘eHealthMonitor’. BMC Medical Informatics and Decision Making, 15(1), 58. https://doi.org/10.1186/s12911-015-0182-2
  • Shin, H. D., Hamovitch, E., Gatov, E., MacKinnon, M., Samawi, L., Boateng, R., Thorpe, K., & Barwick, M. (2023). The NASSS (non-adoption, abandonment, scale-up, spread and sustainability) framework use over time: A scoping review. medRxiv 2023.11.22.23298897. https://doi.org/10.1101/2023.11.22.23298897
  • Sin, J., Henderson, C., Spain, D., Cornelius, V., Chen, T., & Gillard, S. (2018). eHealth interventions for family carers of people with long term illness: A promising approach? Clinical Psychology Review, 60, 109–125. https://doi.org/10.1016/j.cpr.2018.01.008
  • Standardization I. O. F. (2021). Health software—part 2: Health and wellness apps—quality and reliability (ISO/TS 82304-2). https://www.iso.org/standard/78182.html
  • Streater, A., Spector, A., Aguirre, E., Stansfeld, J., & Orrell, M. (2016). ImpRess: An Implementation Readiness checklist developed using a systematic review of randomised controlled trials assessing cognitive stimulation for dementia. BMC Medical Research Methodology, 16(1), 167. https://doi.org/10.1186/s12874-016-0268-2
  • Svendsen, M. T., Tiedemann, S. N., & Andersen, K. E. (2021). Pros and cons of eHealth: A systematic review of the literature and observations in Denmark. SAGE Open Medicine, 9, 20503121211016179. https://doi.org/10.1177/20503121211016179

Appendix 1.

Interview questions

Appendix 2.

EmpRess checklist.