
Risk, safety and organizational change in health care?

Pages 315-318 | Published online: 15 Aug 2006

Abstract

The increased emphasis on risk assessment and management in health care systems has major implications for the overall management and delivery of health care. Concerns with ensuring safety and minimizing harm are driving ‘modernization’ of health care systems. In this editorial I use recent articles published in Health, Risk and Society to consider the impact of such changes on professional decision-making, especially the pressure to move from decisions grounded in tacit knowledge to those based on knowledge encoded in clinical guidelines or computer-based decision support systems. These articles indicate that the current changes are unlikely to have the desired impact: they tend to disregard the reality of professional decision-making, especially the ways in which professionals must draw on tacit knowledge even when using decision-support systems, and they fail to recognize that professionals can exploit risk and uncertainty to override formal decision-making systems.

Introduction

As Kemshall (2002) has noted, governments in welfare democracies are increasingly moving from need to risk as the basis of health, welfare and criminal justice systems. There are a number of factors driving this shift:

• Cost. Needs can never be fully met; in contrast, resources can be targeted at high-risk individuals or groups.

• Control. Needs are professionally defined and controlled; risk can be independently assessed using actuarial technologies.

• Safety. Needs cover a broad range of issues; risk provides a clear focus on protecting individuals from harm.

This shift has major implications for the ways in which health care is organized and managed. For example, in the past decade governments have sought to reengineer health care systems to ensure that they effectively identify risk and do not harm the very people they are meant to protect and cure. Within these restructurings it is possible to identify the influence of organizational research, especially Turner and Pidgeon's (1997) analysis of man-made disasters and the organizational factors leading to collective failure to identify and deal with ‘incubating’ hazards, and Hood and his colleagues' (1992) analysis of the different ways in which organizations can structure their response to hazards (e.g., by using internal incentive systems to identify and manage hazards).

The move towards risk is evident both in the restructuring of the health care system and in the development of specific systems to identify and counteract errors. The systemic changes focus on the clinical decision-making process through clinical governance, in particular by ensuring that professional practice is research- or evidence-based. Flynn has noted the ways in which clinical governance is designed to shift clinical decision-making from various forms of tacit knowledge to knowledge encoded within clinical guidelines, thereby changing the type of knowledge used in clinical practice (Flynn 2002, p. 168). Alongside clinical governance are new error-reporting systems that emphasize the importance of learning from (minor) errors and near misses rather than blaming staff for mistakes (Department of Health 2001, Chapter 3).

Research on the development of systems to manage risk more effectively shows that the transfer of organizational models from industries such as air travel into health care is not simple or straightforward. The delivery of health care involves a diverse range of professional and occupational groups, and such models tend to underestimate the complexity of health care, both in terms of the range of skills and expertise required to deliver health care safely and the difficulty of identifying ‘near misses’ and even accidents. While formal organizational structures seek to control the relationship between groups and the ways in which risks are identified and managed, actual practice often bears little relationship to formal prescription. Lankshear and her colleagues (2005) explore the social processes that impeded the introduction of structured decision-making systems in NHS delivery suites. Such decision-making systems are grounded in formal organizational charts (i.e., a hierarchy of decision-makers with consultants at the peak and midwives at the base). In their study Lankshear and her colleagues found that formal prescriptions of the ways in which decisions should be made were a very flawed guide to the reality of decision-making. Midwives played a key role in decision-making: not only did they have close relationships with women in labour, but they could also restrict access to such women to protect ‘privacy’. Midwives effectively decided when a labour was no longer ‘normal’ and therefore when the level of threat to the well-being of the mother and her baby had risen. They often tested their judgements within a community they trusted (i.e., with their midwife colleagues). Only when the attendant midwives had effectively classified a labour as ‘abnormal’ or ‘risky’ did others become involved.

As Horlick-Jones (2003) has demonstrated, formal systems designed to structure or control professional decision-making do play a role, but more in terms of justifying a course of action which a professional has decided to take on the basis of ‘experience-based practical reasoning’ than in controlling practice (p. 225). Indeed, risk can be invoked to justify overriding the formal system. Horlick-Jones noted how in borderline cases or ambiguous conditions:

Professionals routinely ‘play safe’ and use a variety of accounting practices—the need to ‘seek clarification’ or an observation that the assessment ‘only just don't fit’—to rationalize their decision to override the formal criteria (Horlick-Jones 2003, p. 225).

The shift to encoded knowledge in decision-making and risk management may not achieve the desired policy goals. French (2005), for example, examined the ways in which nurses used research evidence and identified unarticulated rules of risk management that influenced the uptake of evidence. Risk was unacceptable ‘if it is unpredictable, avoidable, if the nurse causes the damage, if they are held responsible without authority, or if there is no support system for dealing with the consequences’ (French 2005, p. 188).

McDonald and her colleagues (2005) explore the ways in which different professions respond to risk in a highly structured environment in which their actions are both visible and open to scrutiny: the hospital operating theatre. They note that the key decision-makers in hospital operating theatres do not conceptualize or seek to objectively measure risk; rather, they see it in terms of a ‘failure to behave in an appropriate manner’. However, the professions differ in the ways in which they define an appropriate manner. For nurses, ‘appropriateness’ is seen as the maintenance of order and routine, which they regard as a necessary condition for the safe performance of nursing duties. They sought to prevent disruptions to their routines through careful management and planning. They did use risk management techniques such as checklists to identify potential hazards, but these tended to form part of a safety ritual. They did not question the evidence or knowledge on which these guidelines were grounded; they accepted them at face value and did not seek to use their professional judgement to disregard protocols when they felt this was necessary. While nurses used risk management techniques to avoid uncertainty, doctors not only recognized uncertainty but also used it to justify disregarding risk management. Since, in their view, accidents were bound to happen, it was best to accept them as inevitable rather than waste time and energy trying to prevent them. As McDonald and her colleagues noted, doctors exploited the ambiguity of medical practice, invoking ‘science as a legitimizing discourse, but also invoke craft mystery to justify its departure from science and its claims to particular forms of knowledge. Some elements of medical work can be clearly specified, while others must be left to the judgement of the physician’.

While there are few areas of health care in which there has been such a complete shift to encoded knowledge, in many areas of practice there is increased use of computer-based systems designed to ‘support’ clinical decision-making. Such systems provide a new context and set of resources, but they do not remove judgement or negotiation from decision-making. Prior and his colleagues examined the ways in which clinicians used Cyrillic, a computer-based programme, to estimate patients' risk of cancer. Cyrillic made risk ‘visible’ by using inputted data on relatives to draw a family tree of cancer and by providing a numerical estimate of personal risk (Prior et al. 2002, p. 248). Prior et al. found that clinicians had to make ‘sense’ of results and images, and this involved craftwork, especially in the laboratory. Such craftwork meant that there was ‘always a large chunk of “tacit knowledge” embedded in professional decision-making’ (Prior et al. 2002, p. 256).

Researchers have identified the move towards evidence-based practice and the use of encoded knowledge in clinical practice, but this shift is unlikely to improve the quality of clinical decision-making (however that is judged) unless there is recognition of the organizational contexts which shape responses to new knowledge. As Ferlie (2005) noted in his review of the development of evidence-based practice, knowledge by itself does not change practice. The application of new knowledge requires changes in current relationships between staff and the development of new ones. Without organizational leadership the necessary changes in work practice will not take place and the anticipated benefits of the new knowledge will not be realized. In particular, change needs to acknowledge the complexity of such decision-making, the influence of informal power relations and the use of tacit knowledge.

Comment

Risk is a field of study in which empirical studies can be used both to generate theory and to guide practical action. The current restructuring of health care systems is influenced by the desire of governments to create, inter alia, safer systems which can identify and neutralize hazards. In developing such systems they have drawn on studies of the ways in which organizations manage or fail to manage risk, as well as on some relatively common-sense assumptions about ensuring that professional decision-making is grounded in evidence (i.e., shifting from the use of tacit knowledge to knowledge encoded in guidelines). As articles published in Health, Risk and Society demonstrate, such changes are unlikely to achieve their objectives. Risk researchers have shown that managing risk involves complex social processes operating at individual, group and societal levels. It is not a simple, straightforward process of objectively measuring the probability and consequences of specific hazards and using rational decision-making systems to take action to minimize harm.

Acknowledgements

This editorial draws on a chapter (Alaszewski 2006) due to be published in a study of risk and social science edited by Peter Taylor-Gooby and Jens Zinn (2006). I thank them for permission to reproduce some of that chapter.

References

• Alaszewski, A. 2006 (forthcoming). “Health and risk”. In Risk in Social Science, edited by P. Taylor-Gooby and J. Zinn. Oxford: Oxford University Press.
• Department of Health. 2001. Building a Safer NHS for Patients: Implementing an Organisation with a Memory. London: Department of Health.
• Ferlie, E. 2005. “Conclusion: From evidence to actionable knowledge?”. In Knowledge to Action? Evidence-based Health Care in Context, edited by S. Dopson and L. Fitzgerald, 182-197. Oxford: Oxford University Press.
• Flynn, R. 2002. Clinical governance and governmentality. Health, Risk & Society, 4: 155-173.
• French, B. 2005. Evidence-based practice and the management of risk in nursing. Health, Risk & Society, 7: 177-192.
• Hood, C., Jones, D. and Pidgeon, N. 1992. “Risk management”. In Risk: Analysis, Perception and Management, Report of a Royal Society Study Group. London: The Royal Society.
• Horlick-Jones, T. 2003. Managing risk and contingency: Interaction and accounting behaviour. Health, Risk & Society, 5: 221-228.
• Kemshall, H. 2002. Risk, Social Policy and Welfare. Buckingham: Open University Press.
• Lankshear, G., Ettorre, E. and Mason, D. 2005. Decision-making, uncertainty, and risk: Exploring the complexity of work processes in NHS delivery suites. Health, Risk & Society, 7: 361-377.
• McDonald, R., Waring, J. and Harrison, S. 2005. ‘Balancing risk, that is my life’: The politics of risk in a hospital operating theatre department. Health, Risk & Society, 7: 397-411.
• Prior, L., Wood, F., Gray, J., Pill, R. and Hughes, D. 2002. Making risk visible: The role of images in the assessment of (cancer) genetic risk. Health, Risk & Society, 4: 241-258.
• Taylor-Gooby, P. and Zinn, J. 2006 (forthcoming). Risk in Social Science. Oxford: Oxford University Press.
• Turner, B. A. and Pidgeon, N. 1997. Man-made Disasters. Oxford: Butterworth-Heinemann.
