BACKGROUND INFORMATION

What might European general practice learn from New Zealand experience of practice accreditation?

Pages 40-44 | Published online: 11 Jul 2009

Introduction

As evidence grows that organisational characteristics can influence patient outcomes in general practice Citation[1], Citation[2], many health systems are demonstrating increased interest in externally assessing and improving the quality of practice management. This development has coincided with European unification and the opportunity to develop a common set of quality indicators to organise and manage practices across Europe. For instance, a European Practice Assessment (EPA) instrument has been developed and validated Citation[3], Citation[4]. Key questions now are how best to refine and use such instruments for assessing organisational performance in general practice.

For almost a decade, New Zealand (NZ) has been developing a programme to accredit general practices while supporting their quality improvement and achievement of organisational excellence Citation[5], Citation[6]. This paper considers potential lessons for Europe. Comparing the NZ experience with Europe as a whole (rather than with a single European country) is consistent with the development of European instruments for quality assessment, maximises our target audience, and respects the ability of readers in different European countries to assess the applicability of our discussion to their own health system.

In NZ (and, for example, the United Kingdom [UK] and the Netherlands), the profession leads a voluntary accreditation process aimed at developing practice organisations. This contrasts with the trend in some insurance-based health systems in Europe towards contracting for the “regulation” of health providers Citation[7], Citation[8]. As these systems signal a shift towards accreditation that is more summative than formative, it is appropriate to look at accreditation systems with a formative commitment to quality improvement, such as those in NZ and the Netherlands. First, therefore, we describe the experience of NZ and Europe in measuring performance at the practice level, using the EPA instrument as a reference. Then, we recommend possible directions for further development of the accreditation process, regarding both what to assess and how to assess it.

NZ experience

NZ general practices are predominantly independently owned businesses providing first-contact care in a health system centrally funded from general taxation. The Royal NZ College of General Practitioners (RNZCGP) manages practice accreditation using its Cornerstone programme. As a voluntary process, Cornerstone combines quality assurance with quality improvement by practice teams. This complements the College's Maintenance of Professional Standards programme for general practitioners (GPs) and the government's Performance Management Framework for improving the health of enrolled populations and managing unplanned expenditure growth. An audit agency, Health and Disability Auditing NZ (HDANZ), independently oversees Cornerstone and externally validates the final report of the outreach assessment of each participating practice. Using this report, HDANZ recommends whether the RNZCGP should accredit the practice. Decisions to accredit are based on evidence of achievement of minimum standards, and do not affect the remuneration available to practices.

These minimum standards for accreditation are made explicit in a booklet titled Aiming for Excellence Citation[6]. First developed in 1999 by general practice teams and service users, they aim to put patients first while reflecting different perspectives on quality in general practice care. Following a revision in 2000, a validation trial in 2001, and further revisions in 2002 and 2006, the standards are now being redeveloped through informal consensus for release in early 2008. The revised standards attempt to bridge personal and population healthcare, and to give increased emphasis to performance over competence and to intermediate outcomes over processes. Nevertheless, the standards will still specify how care is delivered, for example to give confidence that practice achievements will be maintained or bettered, and because intermediate outcomes cannot always be easily measured.

The current version of Aiming for Excellence has five sections: factors affecting patients; physical factors affecting the practice; practice systems; practice and patient information; and quality improvement and professional development. Eleven indicator groups span the sections. Forty-six indicators identify individual standards, and each indicator is assessed by five measurable criteria. There are three categories of criteria: legal and safety criteria, which identify and manage risk; other essential criteria, as determined by the College; and “desirable” criteria, signifying opportunities for quality improvement.

For the purpose of quality assurance, practices first complete a self-assessment (an internal audit cycle) against the standards in Aiming for Excellence. This stage (and the local provider organisation to which practices may belong) helps to prepare them for the second stage: an external assessment, against the same standards, by two trained peer assessors: a GP and either a nurse or a practice manager. The external assessment involves interviews with team members and a review of clinical records, policies, protocols, and practice systems. Guide notes support the assessors. Each criterion in Aiming for Excellence is recorded as “not met”, “partially met”, “met”, “exemplary”, or “not applicable”. For criteria not met or partially met, assessors can record gaps or areas for improvement and make recommendations. For all criteria, the methods of verification and feedback can be recorded. Collected data are managed electronically using an application management utility called GDSL_Gateway, which can upload assessment reports and combine input from both assessors but cannot provide benchmarking data to stimulate practice improvements.
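
To make this recording scheme concrete, the sketch below models a single criterion as it might be captured during assessment, combining the three criterion categories described in the previous paragraph with the five-point recording scale. It is a minimal illustration only; the class and field names are our own assumptions, not a published schema of GDSL_Gateway.

```python
from dataclasses import dataclass, field
from enum import Enum

class Category(Enum):
    LEGAL_AND_SAFETY = "legal and safety"   # criteria that identify and manage risk
    ESSENTIAL = "other essential"           # as determined by the College
    DESIRABLE = "desirable"                 # opportunities for quality improvement

class Rating(Enum):
    NOT_MET = "not met"
    PARTIALLY_MET = "partially met"
    MET = "met"
    EXEMPLARY = "exemplary"
    NOT_APPLICABLE = "not applicable"

@dataclass
class Criterion:
    indicator: str                 # one of the 46 indicators (individual standards)
    category: Category
    rating: Rating = Rating.NOT_MET
    verification: str = ""         # method of verification, recordable for all criteria
    recommendations: list[str] = field(default_factory=list)  # for criteria not (fully) met

# Example: an assessor records a partially met safety criterion (wording hypothetical).
c = Criterion(
    indicator="Emergency equipment is available and maintained",
    category=Category.LEGAL_AND_SAFETY,
    rating=Rating.PARTIALLY_MET,
    verification="Direct observation during practice visit",
)
c.recommendations.append("Document a maintenance schedule for the defibrillator")
```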

To support quality improvement, a debriefing session is held with the whole practice team. It is used to acknowledge the team's successes, identify areas for improvement, recommend changes, and help the practice to develop appropriate action plans for quality improvement under a team leader. A draft report then goes to the RNZCGP, followed by post-assessment dialogue on outstanding items. Given the sparse literature on target setting Citation[9], the College requires each practice to meet all the legal, safety, and other essential criteria to qualify for accreditation. This is reasonable if each criterion labelled as “essential” is indeed necessary, and not merely appropriate. Once satisfied that the accreditation standard has been met, the College sends the report to HDANZ to verify and, as noted above, accredits practices only on the recommendation of HDANZ. Of the 950 to 1000 practices in NZ, one-third are participating in Cornerstone; 267 had been accredited as of 19 September 2007.
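
The accreditation rule itself is simple to state: every legal, safety, and other essential criterion must be met, while desirable criteria inform improvement without blocking accreditation. Below is a minimal sketch of that decision logic, under the assumption that each assessed criterion carries a category and a rating as described above; the data layout is illustrative, not the College's implementation.

```python
# Categories that must be satisfied for accreditation, per the rule above.
MUST_MEET = {"legal and safety", "other essential"}
# Ratings treated as satisfying a criterion (assumption: "not applicable" does not block).
PASSING = {"met", "exemplary", "not applicable"}

def eligible_for_accreditation(criteria: list[dict]) -> bool:
    """criteria: [{'category': ..., 'rating': ...}, ...] for the whole practice."""
    return all(
        c["rating"] in PASSING
        for c in criteria
        if c["category"] in MUST_MEET
    )

assessment = [
    {"category": "legal and safety", "rating": "met"},
    {"category": "other essential", "rating": "exemplary"},
    {"category": "desirable", "rating": "partially met"},  # does not block accreditation
]
print(eligible_for_accreditation(assessment))  # True
```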

European experience

Since the 1990s, instruments for assessing practice management have been developed and used in various European countries, including Denmark, the Netherlands, and the UK. The 2005 EPA instrument was developed by GPs, researchers, and other experts on quality in primary care from six European countries between 2001 and 2004. Its development and content are described elsewhere Citation[4], Citation[10], and only briefly summarised here. It has content analogous to Aiming for Excellence but specifies no particular structure of use, such as practice accreditation, quality improvement, public reporting, or pay for performance. Instead, it offers a standardised list of indicators of practice management to use as an independent activity or combine with practice-specific feedback, benchmarks, and suggestions for quality improvement. It can also be used to compare the performance of practices within and between countries.

Supported by a definition of practice management as “systems (structures and processes) meant to enable the delivery of good-quality patient care”, excluding clinical processes and clinical outcomes, a conceptual framework was developed for the EPA project. This framework comprises 171 indicators in five domains (infrastructure, information, people, quality and safety, and finance), which are divided into dimensions Citation[11]. These indicators resulted from a modified Delphi procedure with GPs. The indicators have been used to generate questions for the principal GP or practice manager in each assessed practice, for all GPs in the practice, and for all clinical and non-clinical staff. The questions are administered via questionnaires that are self-completed before the practice visit. In addition, at least 30 patients per practice complete the EUROPEP questionnaire for evaluating general practice care Citation[12]. During a practice visit, a trained outreach visitor administers a structured interview to the (principal) GP or practice manager and applies an observer checklist.
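
As a concrete illustration of this domain-dimension-indicator layering, the sketch below builds the skeleton of such a framework in Python. Only the five domain names come from the EPA framework; the dimension names and indicator wordings are invented for illustration.

```python
# The five EPA domains, per the framework described above.
EPA_DOMAINS = ["infrastructure", "information", "people", "quality and safety", "finance"]

# domain -> dimension -> list of indicators
framework: dict[str, dict[str, list[str]]] = {domain: {} for domain in EPA_DOMAINS}

def add_indicator(domain: str, dimension: str, indicator: str) -> None:
    """Register an indicator under its domain and dimension."""
    if domain not in framework:
        raise ValueError(f"Unknown EPA domain: {domain}")
    framework[domain].setdefault(dimension, []).append(indicator)

# Illustrative entries only (hypothetical dimensions and wording):
add_indicator("information", "medical records",
              "A record is kept for every patient contact")
add_indicator("quality and safety", "critical incidents",
              "The practice analyses critical incidents")

total = sum(len(inds) for dims in framework.values() for inds in dims.values())
print(f"{total} of 171 indicators entered")  # 2 of 171 indicators entered
```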

A computerised feedback instrument, “Visotool”, is available for individualised, immediate feedback and benchmarking. So far, it has been used in Germany and Switzerland. The feedback is available throughout the assessment process and in a post-assessment session with the practice team in some European countries. It includes benchmarking data in various formats to cater to the different learning styles of team members, and it enables comparisons with past performance by the practice, with strategic goals, and with an appropriate reference group. The EPA instrument has been validated in 10 European countries Citation[4], Citation[10]. Individuals from another 10 countries (as well as from outside Europe) have expressed interest in using it.
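
The three comparisons that such feedback supports can be expressed as simple signed gaps on each indicator score. The sketch below illustrates that logic only; it is not Visotool's implementation, and the function name and example numbers are invented.

```python
from statistics import mean

def benchmark(current: float, previous: float, goal: float,
              reference_group: list[float]) -> dict[str, float]:
    """Signed gaps for one indicator score (e.g. % of criteria met):
    change against past performance, distance to a strategic goal,
    and distance to the mean of a reference group of practices."""
    return {
        "change_vs_past": current - previous,
        "gap_to_goal": current - goal,
        "gap_to_reference_mean": current - mean(reference_group),
    }

# Example: a practice scoring 72% on an indicator, aiming for 80%,
# compared with last year's 65% and a peer reference group.
print(benchmark(72.0, previous=65.0, goal=80.0,
                reference_group=[60.0, 70.0, 75.0, 82.0]))
# {'change_vs_past': 7.0, 'gap_to_goal': -8.0, 'gap_to_reference_mean': 0.25}
```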

Moving forward

There are plans to evaluate and update the EPA instrument every 3 years. Its initial development was informed by consensus and a literature review that included the 2000 version of Aiming for Excellence Citation[13]. However, many indicators in this NZ instrument have since been rewritten, and the method of delivering Cornerstone to NZ practices has also been refined. These developments give Europe an opportunity to revisit the NZ instrument and seek lessons for the EPA instrument and similar instruments. These lessons relate to what content to include and how to apply it, notwithstanding that the compromises that have been “made to develop an international instrument may limit the possibilities for its improvement” Citation[12] and “overlook or mask national issues” Citation[10].

What content to include in the EPA instrument

Comparison of the content of the EPA instrument with Aiming for Excellence reveals various differences. One reason may be the need for compromise occasioned by the complex task of reaching agreement in the European Union. Other reasons are that different priorities might be given to specific aspects in NZ compared to Europe, and there might not be agreement on the key principles of general practice care and dimensions of quality against which to map the content of accreditation instruments. The RNZCGP is now making explicit its position in these areas.

In this context, the EPA instrument pays greater attention to information management and finances but is less comprehensive in other practice domains constituting Aiming for Excellence. For example, in the NZ instrument, the domain “factors affecting patients” consists of indicators in content areas not covered by the EPA instrument. These areas include evidence that practices meet patients’ rights; incorporate patient input in service planning; enable patients to make informed choices about their care; and have policies to support appropriate prescribing outside consultations. Many content areas, such as confidentiality and privacy, are common to both instruments, but the NZ instrument covers them in comparatively greater depth.

The EPA instrument lists indicators on medical equipment, including drugs, but its developers could not agree on the “essential basic equipment needed in a practice” Citation[10]. This reflects, at least in part, the difficulty of reaching consensus across countries. In contrast, the NZ instrument specifies which basic and emergency equipment are essential, while also listing the additional off-site equipment required by rural practices and requiring evidence of systems to maintain equipment and drugs.

The domain that Aiming for Excellence calls “practice systems” includes indicators on surveillance systems, diagnosis, disease management, and health promotion, which the EPA does not fully cover. These indicators include the use of appropriate clinical guidelines and an effective system to identify and record immunisations. While the EPA instrument includes indicators of preventive care and chronic disease management, it could relate more explicitly to models such as the Chronic Care Model Citation[14].

However, all accreditation instruments face the issue of whether accreditation should cover clinical as well as organisational domains. The RNZCGP has decided to increase the number and type of clinical standards as a stand-alone module of Aiming for Excellence. Similarly, the EPA project, which explicitly excluded clinical indicators, is now developing clinical indicators for cardiovascular risk management. In favour of including clinical indicators is the increasing difficulty of distinguishing the aggregate performance of individual professionals from team-based or practice performance. However, clinical indicators can shift focus away from the variables influencing quality at the level of the team and practice. If included, these indicators should usually describe intermediate clinical outcomes close to the end of the causal pathway from process to outcome Citation[15]. They should also reflect priority population health objectives and be based on good research evidence.

In the Aiming for Excellence domain of “practice and patient information management”, content areas not covered by the EPA instrument include indicators of service integration, continuity of care, and palliative care. This probably reflects a lack of consensus in European general practice on how much integration to aim for and with whom. NZ's single health system facilitates such consensus.

In the domain of quality improvement and professional development, the indicators distinctive of Aiming for Excellence include practices’ use of a Significant Event Management system, a documented strategic plan, and a range of educational resources that the practice team can use for reference purposes. In the EPA instrument, patient safety is not explicitly addressed and is an area needing development.

Lastly, we offer two observations about the content to include. First, in both the EPA instrument and Aiming for Excellence, measures could eventually be ordered to reflect levels of increasing organisational development, as in the Organisational Maturity Matrix. This would offer a simple and straightforward way to engage the majority of practices, which “are neither at the remedial end nor leading edge” Citation[16]. Secondly, the EPA project uses the EUROPEP instrument, which was slightly revised in 2007 and will be tested in 2008. Use of this instrument exceeds the current Cornerstone requirement that practices complete a satisfaction survey and include patient input in planning service delivery. Only seven of the 23 items in EUROPEP focus on the practice rather than the practitioner, which reflects the difficulty of developing such an instrument for use across Europe. Nevertheless, EUROPEP is probably still the instrument most widely used in European general practice to elicit patient preferences Citation[17]. Cornerstone is likely to develop its own patient survey instrument.

How to use the EPA instrument

Cornerstone combines an assessment instrument and a system for managing accreditation. Different European countries will use the EPA instrument in their own way to help meet the needs of their own performance management system. Optional uses of this instrument include: educational feedback to practices; control on meeting minimum quality standards; practice accreditation and certification; public reporting on scores to facilitate patient choice; and input into pay-for-performance systems.

For example, the Netherlands has integrated the EPA instrument into its national system for practice accreditation, in which a practice visit by a trained visitor also helps practices to develop improvement plans, as required for accreditation. In Germany and Switzerland, the EPA instrument (with Visotool for feedback) is one of several competing instruments for practice accreditation. The UK is considering whether to integrate the EPA instrument, or aspects of it, into its revised Quality and Outcomes Framework Citation[17].

It may be felt that choosing one specific use excludes other potential uses or assessment instruments. However, Cornerstone shows that one programme for assessing practice management can combine various aims. In particular, summative assessment (to meet minimum standards) and formative aims (quality improvement) can coexist. A lesson from Cornerstone is that incorporating a formative ethos of organisational development at all stages dissolves, or at least weakens, the potential distinction between quality assurance and quality improvement Citation[18]. Cornerstone enables practice teams to work with external assessors before their practice visit, specifically from the self-assessment stage onwards. This promotes a shared understanding of, and commitment to, the assessment process; leads practices to anticipate improvement; and can help to prevent problems during the practice visit. Practices still require, and receive, post-assessment support to reach the standard. Indeed, although only five of the 267 accredited practices met the standard at the time of the practice visit, only one practice ultimately failed to achieve accreditation (under exceptional circumstances). This indicates the importance of the ongoing support enjoyed by participating practices.

Yet there is still scope to improve the feedback and support offered to practices in Cornerstone. For example, a programme such as the German Visotool could be used to provide benchmarking data as part of individualised feedback. Support could also be facilitated by a practice visitor trained in knowledge transfer Citation[19]. Our point stands, nevertheless, that the near-universal achievement of practice accreditation in NZ indicates that current structures for supporting practices are meeting the needs of the key stakeholder groups. This is a credit to the leadership demonstrated by the profession. The key driver for quality improvement by practices is, however, the professionally or economically motivated desire for accreditation. The challenge now is to motivate practices to continue to improve the quality of their management after accreditation. The Netherlands exhibits a similar need: a randomised trial found that, compared with usual care, BIK-VIP (an intervention to guide team-based quality improvement after a practice visit) increased the number of improvement projects but not improvement on dimensions of practice management, and, following the withdrawal of the outreach visitor, only just over half of the practices were still using the continuous quality improvement model 6 months later Citation[3]. One solution, in both NZ and the Netherlands, could be to raise the standard of accreditation continually over time, having enabled practices to become increasingly independent of outreach support.

Conclusion

A Lancet editorial recently emphasised the need for a “new discipline” of “comparative health systems studies” because of the “global value of projecting a new and stronger voice for country experiences in the increasingly complex international architecture of health” Citation[20]. This is relevant to the health policies used by different health systems to externally assess and improve the quality of practice management. We see “considerable benefits in using work from other settings” to help modify existing indicators of quality (rather than develop indicators de novo) Citation[21] and to refine the structures within which quality can be assessed and improved. This paper has therefore considered the lessons that NZ's experience of externally assessing practices for accreditation, while encouraging quality improvement, might hold for European general practice. In particular, a formative ethos of organisational development can be seamlessly incorporated at all stages of the systems review for quality assurance and the approach taken to improving quality. It is desirable and feasible for all practices to anticipate and achieve accreditation.

References

  • Grol R, Baker R, Moss F. Quality improvement research: the science of change in health care. In: Grol R, Baker R, Moss F, editors. Quality improvement research: understanding the science of change in health care. London: BMJ Publishing Group; 2004. p. 1–5
  • Linzer M, Manwell L, Mundt M, Williams E, Maguire A, McMurray J, et al. Organizational climate, stress and error in primary care: the MEMO Study. Rockville, MD: Agency for Healthcare Research and Quality; 2005
  • Engels Y. Assessing and improving management in primary care practice in the Netherlands and in Europe. Nijmegen: Centre for Quality of Care Research; 2005
  • Engels Y, Campbell S, Dautzenberg M, van den Hombergh P, Brinkmann H, Szecsenyi J, et al. Developing a framework of, and quality indicators for, general practice management in Europe. Fam Pract 2005; 22: 215–22
  • Royal New Zealand College of General Practitioners (RNZCGP). Aiming for excellence: an assessment tool for general practice. Wellington: RNZCGP; 2002
  • Royal New Zealand College of General Practitioners (RNZCGP). Aiming for excellence: an assessment tool for general practice. Revised edition. Wellington: RNZCGP; 2006
  • Shaw C. Accreditation in European health care. Jt Comm J Qual Patient Saf 2006; 32: 266–75
  • Elwyn G, Buetow S, Tapp L, Grol R. Developing accreditation standards for primary medical care organisations: towards the involvement of multiple stakeholders. An occasional report. London: RCGP; 2007
  • van Herten LM, Gunning-Schepers LJ. Targets as a tool in health policy. Part I: lessons learned. Health Policy 2000; 53: 1–11
  • Engels Y, Dautzenberg M, Campbell S, Broge B, Boffin N, Marshall M, et al. Testing a European set of indicators for the evaluation of the management of primary care practices. Fam Pract 2006; 23: 137–47
  • Dautzenberg M, Engels Y. European practice assessment: an overview. In: Grol R, Dautzenberg M, Brinkmann H, editors. Quality management in primary care. Leipzig: Die Deutsche Bibliothek; 2004
  • Wensing M. A standardised instrument for patient evaluations of general practice care in Europe. Eur J Gen Pract 2000; 6: 82–7
  • Royal New Zealand College of General Practitioners (RNZCGP). Aiming for excellence in general practice: standards for general practice. Wellington: RNZCGP; 2000
  • Wagner E. Chronic disease management: what will it take to improve care for chronic illness? Eff Clin Pract 1998; 1: 2–4
  • Chassin MR. Does paying for performance improve the quality of health care? Med Care Res Rev 2006; 63: 122S–5S
  • Elwyn G, Rhydderch M, Edwards A, Hutchings H, Marshall M, Myres P, et al. Assessing organisational development in primary medical care using a group based assessment: the Maturity Matrix. Qual Saf Health Care 2004; 13: 287–94
  • Grol R, Wensing M. Measuring performance quality in general practice: is international harmonization desirable? Br J Gen Pract 2007; 57: 691–2
  • Buetow SA, Wellingham J. Accreditation of general practices: challenges and lessons. Qual Saf Health Care 2003; 12: 129–35
  • Rhydderch M. Developing a facilitation model to promote organisational development in primary care practices. BMC Fam Pract 2006; 7: 38
  • Horton R. A new discipline is born: comparative health-systems studies. Lancet 2006; 368: 1949–50
  • Marshall M, Shekelle PG, McGlynn EA, Campbell S, Brook RH, Roland M. Can health care quality indicators be transferred between countries? Qual Saf Health Care 2003; 12: 8–12
