ORIGINAL ARTICLE

Health centres' view of the services provided by a university hospital laboratory: Use of satisfaction surveys

Pages 24-28 | Received 23 Apr 2009, Accepted 03 Feb 2010, Published online: 08 Mar 2010

Abstract

Customer orientation has gained increasing attention in healthcare. A customer satisfaction survey is one way to identify areas and topics for quality improvement. However, it seems that customer satisfaction surveys have not resulted in quality improvement in healthcare. This article reports how the authors' university hospital laboratory has used customer satisfaction surveys targeted at the health centres in its hospital district. The closed-ended statements of the questionnaire were designed to cover the essential aspects of laboratory services. In addition, an open-ended question asked what was considered to be the most important problem in the services. The questionnaires were sent to the medical directors of the health centres. The open-ended question proved to be very useful because the responses specified the main problems in the services. Based on the responses, selected dissatisfied customers were contacted to specify their responses, and possible corrective actions were taken. It is concluded that a satisfaction survey can be used as a screening tool to identify topics of dissatisfaction. In addition, further clarifications with selected customers are needed to specify the causes of their dissatisfaction and to undertake proper corrective actions.


  • A customer satisfaction survey can be used as a screening tool to identify topics of dissatisfaction.

  • After the survey, further clarifications are often needed to discover customer-specific causes of dissatisfaction.

  • Customer satisfaction surveys cannot result in quality improvement if proper corrective actions are not carried out.

Healthcare professionals are encouraged to make improvements in the healthcare processes in their own local settings and to learn at the same time [Citation1,Citation2]. Improvement and learning occur when the cycle of planning, doing, studying, and acting is implemented in day-to-day work [Citation1].

Customer orientation has gained increasing attention in healthcare. A customer satisfaction survey is one way to identify areas and topics for quality improvement. However, it seems that customer satisfaction surveys have not resulted in quality improvement in healthcare [Citation3–5]. Lack of specificity of questions [Citation6], difficulties in interpreting results [Citation7], and insufficient discussion of the results [Citation8] of satisfaction surveys have made it difficult to proceed to improvement actions. How the information obtained from customers can be utilized to improve services therefore poses a great challenge.

Quality standards, such as EN ISO 15189 [Citation9] and ISO/IEC 17025 [Citation10], emphasize the use of the customer's perspective in clinical laboratories. In this article, we report how a university hospital laboratory has used customer satisfaction surveys targeted at the health centres in our hospital district to uncover problems in services for corrective actions.

Material and methods

Setting

The laboratory of Oulu University Hospital, Finland, provides services in clinical chemistry and haematology (including blood banking service) at a total rate of 3 million investigations annually. Some 60% of the requests come from units within the same hospital, while 40% come from external customers representing hospitals within the central hospital district and the public primary health centres in the district. Our laboratory has ISO 17025 accreditation for the majority of routine tests.

The satisfaction surveys conducted in 2002 and 2006 were targeted at the health centres in the Northern Ostrobothnia Hospital District that buy laboratory services from the university hospital laboratory. The results of satisfaction surveys conducted in the city of Oulu were not included in this report because the arrangement of laboratory services in Oulu differs greatly from those of the other health centres in the hospital district. In 1998, the laboratory of Oulu City Hospital and the laboratory units of the seven health stations were combined with the laboratory of Oulu University Hospital.

Surveys

The questionnaire used in both surveys was planned by the chief physician and the associate chief physician, both specialists with long experience in clinical chemistry, together with the planning officer of the laboratory [Citation11,Citation12]. The statements were designed to reflect the essential aspects of the services the laboratory provides to the regional health centres (see Table I). The design of the questionnaire was reviewed by the customer service working group and the managing board of the laboratory. The purpose of the questionnaire was stated in the covering letter, which was signed by the chief physician of the laboratory [Citation13,Citation14].

Table I. Distribution of responses to the statements exploring health centres' views of laboratory services in 2002 and 2006.

The respondents were asked to rate their perceptions on a five-point Likert scale: strongly agree, agree, neither agree nor disagree, disagree, and strongly disagree [Citation12], or “not applicable” if appropriate. In addition to the closed statements, the respondents were asked an open-ended question: “What is the most important problem in laboratory services between your health centre and the university hospital laboratory?” Moreover, the respondents were asked to give their contact information, both to make them accountable for their responses and to allow clarification of problems that might arise [Citation12].

In 2002, the questionnaire was sent to the medical directors of 42 health centres. Administrative changes had taken place at the health centres. For this reason, in 2006, the questionnaire was sent to the medical directors of 19 independent health centres and to the medical directors of six joint municipal health centres covering 19 health centres. Medical directors were asked to respond together with their senior laboratory technologist. A follow-up letter was sent to non-respondents in both years. After the 2006 survey, respondents who were persistently dissatisfied were contacted by telephone and they were asked to specify their responses to find out the problems behind their dissatisfaction.

Frequency and percentage distributions of the responses were calculated. A combined percentage of the two disagreement levels (disagree and strongly disagree) of 20% or higher was considered to represent a high level of dissatisfaction. The statistical significance of the differences in distributions between the 2002 and 2006 surveys was calculated by Fisher's exact test using the SPSS 16.0 statistical package for Windows (SPSS Inc., Chicago, IL, USA).
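As an illustration of this analysis, the sketch below (in Python, with made-up counts; the function names and example figures are ours, not the study's data) computes the combined disagreement percentage against the 20% threshold and a two-sided Fisher's exact p-value for a dichotomised 2×2 table (disagree vs. not-disagree in the two survey years):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table [[a, b], [c, d]]."""
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    denom = comb(n, col1)
    def pmf(x):
        # Hypergeometric probability of a table with 'x' in the top-left cell.
        return comb(row1, x) * comb(row2, col1 - x) / denom
    p_obs = pmf(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # Sum the probabilities of all tables as likely as or less likely than
    # the observed one (with a small tolerance for floating-point ties).
    return sum(pmf(x) for x in range(lo, hi + 1) if pmf(x) <= p_obs * (1 + 1e-12))

def combined_disagreement(counts):
    """Percentage of 'disagree' + 'strongly disagree' among substantive responses."""
    total = sum(counts.values())
    return 100 * (counts.get("disagree", 0) + counts.get("strongly disagree", 0)) / total

# Illustrative (invented) counts for one statement in one survey year.
counts_2006 = {"strongly agree": 4, "agree": 10, "neither": 4,
               "disagree": 5, "strongly disagree": 2}
pct = combined_disagreement(counts_2006)
print(f"combined disagreement: {pct:.0f}%, flagged: {pct >= 20}")

# Invented disagree / not-disagree counts for 2002 vs. 2006.
p = fisher_exact_2x2(6, 23, 7, 18)
print(f"Fisher exact p = {p:.3f}")
```

Note that the study compared full five-category distributions (an R×C exact test, as offered by SPSS); the 2×2 version above is a simplification for illustration.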

The responses to the open-ended question were classified into categories by content analysis performed by two coders (PO, AP), and the frequencies of the categories were calculated. Unclear cases were discussed and the final categories were formed by consensus.
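As a toy illustration only (the actual coding was done manually by the two coders; the keyword rules, category names, and example responses below are invented), the category-frequency step of such a content analysis might be sketched as:

```python
from collections import Counter

# Hypothetical keyword rules; real coding relied on human judgement
# with consensus discussion for unclear cases.
CATEGORY_KEYWORDS = {
    "data transfer": ("electronic", "connection", "number code", "transfer"),
    "phlebotomy services": ("phlebotomy", "sample collection"),
    "instructions": ("instruction", "preparation", "handling"),
}

def categorise(response: str) -> str:
    """Assign a free-text response to the first category whose keywords match."""
    text = response.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return category
    return "other"  # unclear cases would go to coder discussion

responses = [
    "No electronic connection to the hospital laboratory",
    "Patients arrive without instructions on preparation",
    "Test number codes differ between systems",
]
frequencies = Counter(categorise(r) for r in responses)
```

Here `frequencies` tallies how often each problem category was mentioned, mirroring the calculated frequencies reported for the open-ended question.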

Results

What did the surveys reveal?

The response rate to the closed statements was 83% in 2002 and 100% in 2006. In 2002, 29 responses covered 35 health centres and in 2006, 25 responses covered 38 health centres. The response rate to the open-ended question was 62% in 2002 and 76% in 2006. In 2002, 21 respondents reported 27 problems, and in 2006, 17 respondents reported 19 problems.

The distributions of the responses to the statements on laboratory services are shown in Table I. When comparing the responses to the statements between the 2002 and 2006 surveys, the only statistically significant changes (reduced satisfaction) were seen in the responses regarding sent-out information on changes in laboratory services (Table I, statement 2) and the usability of the reference value booklet (statement 3b), both caused, at least partly, by increases in neutral judgements.

In 2002, the highest percentage of dissatisfaction (71%) was related to electronic data transfer of test requests and reports between health centres and the university hospital laboratory (Table I, statement 4a). Only seven health centres responded to this statement. In 2002, all health centres had electronic patient records, including requisition of laboratory tests from their own laboratory. However, in 2002 only five health centres had data transfer connections with the university hospital laboratory. In 2006, the dissatisfaction percentage was still high (43%). Both in 2002 and in 2006, the personnel in health centres needed further instructions on the preparation of patients for laboratory tests (28% and 33%, respectively) (statement 8a) and on the collection and handling of samples (24% and 29%, respectively) (statement 8b). Many respondents, up to 33% in 2002 and 28% in 2006, were not aware of the frequency of analysis of different laboratory tests at the university hospital laboratory (statement 11a). In 2006, 25% of the respondents had needed to contact the laboratory because of delayed test results (statement 10a), and 23% did not consider the paper request form suitable for use (statement 4b).

In 2006, further enquiries revealed that the additional need for instructions meant that laboratory technicians in some health centres needed instructions for specimen handling for certain laboratory tests, e.g. parathyroid hormone, cortisol, or prostate-specific antigen. General instructions for patient preparation and specimen collection and handling were also requested. Two health centres could not specify the instructions they reported they needed. Further enquiries concerning delays in test results revealed that, in two health centres, delays were related to the newly assembled electronic data transfer system between these health centres and the university hospital laboratory, while two other health centres reported single cases of delays.

The most important problems in laboratory services based on the responses to the open-ended question in 2002 and in 2006 are summarized in Table II. Both in 2002 and in 2006, the most frequently reported problem was related to data transfer between health centres and the university hospital laboratory. These problems included lack of electronic data transfer connections, problems in existing connections, or differences in the number codes of laboratory tests between health centres and the hospital laboratory. Health centres also reported various problems in the practice of decentralised phlebotomy services for university hospital outpatients. For example, laboratory test orders were lacking or incomplete, or the patients had not received instructions on preparation for the tests.

Table II. Classification of the most important problems in laboratory services based on responses to the open-ended question in 2002 and 2006.

Corrective actions

Only five health centres had an electronic data transfer connection with the university hospital laboratory in 2002. Before the 2006 survey, electronic data transfer connections were established with an additional 13 health centres. The construction and remodelling continued until 2008, when 37 health centres out of 38 used electronic data transfer for test requisition and reporting with the university hospital laboratory. In 2004, the number codes of the laboratory tests in the information system of the university hospital laboratory were changed to conform with the standardised national codes that were already in use in the health centres. In addition, information on the frequency of analysis of laboratory tests was added to the Laboratory Users' Handbook in 2004, and the usability of the handbook was improved by an internet application in 2006 after the survey. A reference value booklet was no longer printed after the 2006 survey, since the booklet had become superfluous after the attachment of reference values to each laboratory result. To reduce problems in the decentralised phlebotomy services, instructions were produced in 2003 and delivered both to the laboratories of the health centres and to the requesting clinics at the university hospital. Reflecting this improvement, only one health centre mentioned defects in these services as the most important problem in 2006.

Discussion

What did we learn from the customer satisfaction surveys?

The purpose of our satisfaction surveys was not to measure the satisfaction level of the health centres as a whole, but to find topics of dissatisfaction. We tried to plan the closed-ended statements to be specific enough and to cover subjects that are essential in the service process and observable by the customers, as well as subjects that may be prone to problems. In the present study, the responses to the open-ended question strengthened the view of the main problems based on the responses to the closed-ended statements, and the problems concerned could be further specified in the responses to the open-ended question. Responses to an open-ended question may also reveal problems not covered by closed-ended statements: in our survey, the problems in the practice of decentralised phlebotomy services would not have been revealed on the basis of the closed-ended statements alone. Thus, the open-ended question concerning the most important problem proved to be very useful in these surveys. Open-ended questions have been recognized as particularly useful in other studies as well, because the responses may contain detailed information [Citation6,Citation8]. However, if open-ended questions are used, the responses should be analysed properly. Boynton and Greenhalgh [Citation15] point out that if expertise to analyse qualitative data is not available, free-text responses should not be invited.

The response rates of the surveys were very good. This evidently reflects both the traditional collaboration between the regional health centres and the university hospital laboratory and the existence of motivation to participate when important aspects of laboratory services are assessed. The follow-up letters sent to the non-respondents in both years have evidently also contributed to the high response rates.

In our earlier study [Citation16], concerning satisfaction of the university hospital's clinical units with laboratory services, it appeared that additional contacts with several clinical units were needed after the survey to specify the causes of their dissatisfaction and to target corrective actions properly. This also proved to be necessary after the present surveys targeted at health centres. For example, we contacted customers to clarify what kind of additional instructions they needed and what they meant by delayed test results. It appeared that the instructions they needed were to be found in the Laboratory Users' Handbook. These customers were guided on how to use the handbook. Some customers were not able to specify what instructions they had meant when responding to the statement. It appeared that delays in test results were not systematic but, rather, single cases, or due to a transient disorder of the electronic data transfer system. In cases of delays, erroneous test results, or other such incidents they were guided to use immediate spontaneous feedback. It is important to note that without the respondents' contact information further clarifications and customer-specific corrections would not have been possible and the information obtained from the surveys would have remained more inconclusive.

Corrective actions are not necessarily reflected as a decrease in dissatisfaction level. In the 2002 survey, one-third of the respondents reported that they were not familiar with the frequency of analysis of different laboratory tests. Although this information was added to the Laboratory Users' Handbook in 2004, the dissatisfaction remained at the same level in the 2006 survey.

It is characteristic of this kind of satisfaction survey that problems of a single customer may remain unobserved in satisfaction ratings. On the other hand, customer-specific corrections are not necessarily seen as improvements in satisfaction rates. In customer satisfaction surveys, information is obtained afterwards and there are usually long intervals between repeated surveys. However, many problems in services cannot wait for the next survey to be uncovered. Direct contacts between the customer and the service provider are an efficient way to bring out and deal with problems. In addition, spontaneous customer feedback is a useful way to report single cases or problems which can then be investigated immediately [Citation17].

It can be concluded that a satisfaction survey can be used as a screening tool to identify topics of dissatisfaction among customers. In addition, further clarifications with selected customers are needed to discover customers' specific problems and to undertake targeted corrective actions.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.

References

  • Berwick DM. Developing and testing of changes in delivery of care. Ann Intern Med 1998;128:651–6.
  • Kilo CM. Educating physicians for systems-based practice. J Contin Educ Health Prof 2008;28:S15–8.
  • Cleary PD. Editorials: The increasing importance of patient surveys. BMJ 1999;319:720–1.
  • Apolone G, Mosconi P. Editorial. Satisfaction surveys: Do we really need new questionnaires? Int J Qual Health Care 2005;17:463–4.
  • Coulter A. Editorials. Can patients assess the quality of health care? Patients' surveys should ask about real experiences of medical care. BMJ 2006;333:1–2.
  • Reeves R, Seccombe I. Do patient surveys work? The influence of a national survey programme on local quality-improvement initiatives. Qual Saf Health Care 2008;17: 437–41.
  • Wensing M, Vingerhoets E, Grol R. Feedback based on patient evaluations: A tool for quality improvement. Patient Educ Couns 2003;51:149–53.
  • Boyer L, Francois P, Doutre E, Weil G, Labarre J. Perception and use of the results of patient satisfaction surveys by care providers in a French teaching hospital. Int J Qual Health Care 2006;18:359–64.
  • EN ISO 15189. Medical laboratories: Particular requirements for quality and competence. Geneva: International Organization for Standardization; 2007.
  • ISO/IEC 17025. General requirements for the competence of testing and calibration laboratories. Geneva: International Organization for Standardization; 2005.
  • Kelley K, Clark B, Brown V, Sitzia J. Good practice in the conduct and reporting of survey research. Int J Qual Health Care 2003;15:261–6.
  • Streiner DL, Norman GR. Health measurement scales: A practical guide to their development and use. 3rd ed. Oxford: Oxford University Press; 2003.
  • Bourque LB, Fielder EP. How to conduct self-administered and mail surveys. Thousand Oaks, CA: Sage Publications; 1995.
  • Gillham B. Developing a questionnaire. 2nd ed. London: Continuum; 2007.
  • Boynton PM, Greenhalgh T. Hands-on guide to questionnaire research: Selecting, designing, and developing your questionnaire. BMJ 2004;328:1312–15.
  • Oja PI, Kouri TT, Pakarinen AJ. From customer satisfaction survey to corrective actions in laboratory services in a university hospital. Int J Qual Health Care 2006;18:422–8.
  • Oja PI, Kouri TT, Pakarinen AJ. Utilisation of customer feedback in a university hospital laboratory. Accred Qual Assur 2009;14:193–7.
