
Development and Validation of a Knowledge Scale About the Behavioral and Psychological Symptoms of Dementia (KS-BPSD) Among Chinese Formal Caregivers


ABSTRACT

Objectives

The current study aimed to develop a scale assessing knowledge about behavioral and psychological symptoms of dementia (KS-BPSD) among Chinese formal caregivers and to investigate its psychometric properties and factorial structure.

Methods

The scale was generated with a systematic development process, and 229 formal caregivers working at nursing homes were recruited to construct and assess the psychometric properties of the scale. The preliminary scale was reviewed by an expert panel and items were selected based on item discrimination, difficulty, and item-total correlation.

Results

The final KS-BPSD version consisted of 12 items, loaded into three factors (i.e., Disease Characteristics, Care and Risks, and Treatment Needs) following principal component analysis (PCA). The KS-BPSD showed good test-retest reliability, internal consistency, as well as construct and concurrent validity.

Conclusions

The 12-item KS-BPSD showed high reliability and preliminary validity in assessing the level of knowledge about patients' BPSD among Chinese formal caregivers in nursing homes.

Clinical Implications

The KS-BPSD is a reliable tool for identifying knowledge gaps and support needs among dementia caregivers, helping to develop and evaluate educational programs on the management of patients' BPSD.

Introduction

According to the latest report of global prevalence (Guerchet, Prince, & Prina, 2020), over 50 million people were living with dementia in 2020, and this number is predicted to double in the next 20 years. China, home to the largest population of people living with dementia (about 25% of the worldwide dementia population; Prince et al.), inevitably faces a heavy burden in dementia healthcare.

Although behavioral and psychological symptoms of dementia (BPSD) are present in up to 90% of patients with dementia (Gauthier et al., 2010) and their frequency increases as dementia progresses (Hessler et al., 2018), BPSD has received less attention than cognitive symptoms in dementia care research. BPSD refers to symptoms of disturbed perception, thought content, mood, or behavior that frequently occur in patients with dementia, such as agitation, aggression, sleep disturbance, depression, anxiety, and apathy (Finkel, E Silva, Cohen, Miller, & Sartorius, 1997). BPSD can result from complex interactions between factors at the individual level (e.g., dementia severity) and the contextual level (e.g., the relationship with caregivers; Gauthier et al., 2010; Sampson et al., 2014). Importantly, BPSD is associated with the breakdown of home care, premature institutionalization, increased hospitalization, greater healthcare costs, and significant loss of quality of life for patients and their families (Cerejeira & Lagarto, 2012; Finkel et al., 1997). It is also a major source of burden and distress for formal caregivers (Miyamoto, Tachimori, & Ito, 2010). Despite the significant impact of BPSD, awareness and knowledge about BPSD have been relatively overlooked, probably owing to a lack of validated instruments assessing knowledge about BPSD.

The World Health Organization has called for greater dementia awareness and education in response to the increasing global prevalence of this syndrome (World Health Organization, 2006). As a result, a variety of measures have been developed to assess dementia knowledge, such as the Dementia Knowledge Assessment Scale (DKAS; Annear et al., 2015), the Alzheimer's Disease Knowledge Scale (ADKS; Carpenter, Balsis, Otilingam, Hanson, & Gatz, 2009), the Knowledge of Memory Loss and Care Test (Kuhn, King, & Fulton, 2005), and the Dementia Knowledge Assessment Tool (DKAT-2; Toye et al., 2014). These measures have been validated (Annear et al., 2015; Carpenter et al., 2009; Spector, Orrell, Schepers, & Shanahan, 2012) and adopted in various dementia education programs (Annear et al., 2016; Eccleston et al., 2019). However, most existing knowledge assessments focus on the cognitive symptoms of dementia, especially memory loss, while non-cognitive symptoms are relatively neglected (Schindler, Engel, & Rupprecht, 2012).

Understanding BPSD is essential for its appropriate management, as it shapes how caregivers respond to these symptoms. For instance, caregivers who are well informed about BPSD can better cope with the distress these symptoms cause (Bessey & Walaszek, 2019; Devor & Renvall, 2008; Trivedi et al., 2019). For professional caregivers, understanding patients' BPSD is particularly important yet challenging. First, BPSD becomes more common as dementia progresses: the Cache County study reported that the prevalence of BPSD increased significantly across a five-year period (Steinberg et al., 2008). In addition, patients in nursing homes usually have moderate to severe dementia and exhibit a wider variety of BPSD (Selbæk, Kirkevold, & Engedal, 2007; Zuidema, Derksen, Verhey, & Koopmans, 2007). Therefore, it is necessary to assess formal caregivers' level of BPSD knowledge to better understand how their caregiving experiences are affected and what their training needs are.

In summary, BPSD is very common among patients with dementia (Brodaty et al., 2001; Selbæk et al., 2007; Steinberg et al., 2008; Zuidema et al., 2007) and has a significant impact on the caregiving burden of formal caregivers. Caregivers' knowledge of BPSD may affect their caregiving experience and efficacy. However, little attention has been given to the level and role of BPSD knowledge in caregiving, probably because no specific assessment of BPSD knowledge has been developed. To fill this research gap, the present study develops a knowledge scale assessing the understanding of BPSD (KS-BPSD) among formal caregivers. Drawing on data from 229 formal caregivers at nursing homes, the present study also evaluates the psychometric properties of the KS-BPSD.

Methods

Participants

Formal caregivers were recruited from four nursing homes in Guangzhou and Nanchang, China. The recruitment criteria were: 1) being over 18 years old; 2) having provided care for patients with dementia for more than one month; and 3) having no difficulties communicating, understanding, and completing the questionnaire. The study was approved by the Ethics Committee of Jinan University, and all participants provided written informed consent before taking part in the study.

Scale development

Six experts (three clinicians, two researchers in psychology and rehabilitation, and one researcher in aging and psychology) were invited to form an expert panel for scale development. All experts had rich experience in scale development, validation, and cultural adaptation. Using previous scales assessing knowledge of dementia or Alzheimer's disease (Sullivan & Mullan, 2017), guidelines for dementia care, and the opinions and knowledge of the expert panel as references, five graduate students with nursing or gerontology backgrounds each generated a set of 15 preliminary items. The items covered knowledge of the causes, characteristics, care, treatment, risks, and health promotion of BPSD. All items were reviewed and discussed by three experts in the panel; redundant items were removed, and items that were ambiguous, confusing, or jargon-laden were revised for clarity. The goal of this process was to enhance the content validity of the scale (Trochim & Donnelly, 2007) and to ensure the coverage and relevance of the content (Streiner, Norman, & Cairney, 2015). After item generation and expert review, the preliminary version included 27 items, with all discrepancies resolved. Adopting the response format of existing instruments (Annear et al., 2017; Annear et al., 2015), a 5-point Likert-type scale was used (1 = false, 2 = probably false, 3 = probably true, 4 = true, and 5 = I do not know).

Three other experts in the panel were invited to participate in a three-round Delphi study to evaluate the 27 items. In the first round, they were asked to evaluate each item on: 1) how important the item is for understanding BPSD, using a 6-point scale (1 = not important at all to 5 = very important); 2) whether the wording was appropriate; and 3) whether the items covered important knowledge about BPSD. Items were considered to have high consensus if they received an importance rating of 5 at least once and a median importance rating of 4 or higher across experts. Items with high ratings and high consensus were retained. A total of 20 items entered the second round of review, in which the experts re-rated the importance of the remaining items. Again, items with high consensus and high importance were selected for the third round of review (Annear et al., 2015). To increase consensus, the median rating for each item from the previous round was provided to the experts in the second and third rounds. With all discrepancies resolved, 14 items were included in the scale. A graduate student translated the Chinese version of the scale into English with the help of two bilingual speakers. For details, see Table 1.
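
For readers who want to reproduce the retention rule, the sketch below applies the stated consensus criteria to a set of expert importance ratings; the function name and example ratings are illustrative only.

```python
import statistics

def retain_item(importance_ratings: list[int]) -> bool:
    """Apply the Delphi retention rule described above: an item is kept if it
    received a rating of 5 from at least one expert and its median rating is >= 4."""
    return 5 in importance_ratings and statistics.median(importance_ratings) >= 4

# Hypothetical ratings from the three experts
print(retain_item([5, 4, 4]))  # True: one rating of 5 and a median of 4
print(retain_item([4, 4, 4]))  # False: no expert gave a 5
```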

Table 1. Experts' ratings of the 14 items.

Measurements

Participants' demographic information was collected, including gender, age, marital status, education, and care experience. Care experience was measured with two questions: "How long have you been providing care for people with dementia?" and "Have you participated in any education programs related to dementia?"

The 25-item Dementia Knowledge Assessment Scale (DKAS, version 2.0) was used to assess dementia-related knowledge (Annear et al., 2015). It includes four domains: Causes and Characteristics (7 items), Communication and Behavior (6 items), Care Considerations (6 items), and Risk and Health Promotion (6 items). Subscale and overall scores were generated by summing the correct responses, with the overall DKAS 2.0 score ranging from 0 to 50. This version had acceptable internal consistency, with a Cronbach's alpha of 0.85 for the overall scale (Annear et al., 2017), and significant differences between cohorts supported its discriminative validity (Annear et al., 2017). The Chinese version of DKAS 2.0 has also shown acceptable reliability and validity among healthcare providers (Zhao et al., 2020).

Data analytic methods

Correct answers on the KS-BPSD (e.g., "true" or "probably true" to a true statement) were scored 2 or 1, respectively. Incorrect answers (i.e., "false," "probably false," or "I do not know" to a true statement) were scored 0. After three rounds of review, the initial scale consisted of 14 items with a total score ranging from 0 to 28; the higher the score, the higher the participant's level of knowledge about BPSD.
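
To make the scoring rule concrete, the following minimal sketch applies it to a single respondent. The handling of items keyed as false statements mirrors the rule quoted above for true statements and is an assumption, as are the function names.

```python
# Minimal scoring sketch for the KS-BPSD response rule described above.
# Response codes: 1 = false, 2 = probably false, 3 = probably true,
# 4 = true, 5 = I do not know.

def score_item(response: int, keyed_true: bool) -> int:
    """2 points for a fully correct answer, 1 for a 'probably' correct answer, 0 otherwise.
    The mapping for items keyed as false mirrors the true-keyed rule (an assumption)."""
    if keyed_true:
        return {4: 2, 3: 1}.get(response, 0)
    return {1: 2, 2: 1}.get(response, 0)

def score_scale(responses: list[int], key: list[bool]) -> int:
    """Sum the item scores; for the 14-item preliminary scale the total ranges from 0 to 28."""
    return sum(score_item(r, k) for r, k in zip(responses, key))

# Example: answering "true" to every item of a hypothetical all-true 14-item key
print(score_scale([4] * 14, [True] * 14))  # 28
```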

Data analysis was conducted in three steps with IBM SPSS Statistics V22.0. First, individual items were analyzed by calculating discrimination indexes, item difficulty indexes, and item-total correlations. Second, an exploratory factor analysis (EFA) using principal component analysis (PCA) with varimax rotation (Chan & Idris, 2017; Costello & Osborne, 2005) was performed to ascertain the underlying structure of the scale. Third, following the recommendations of Trochim and Donnelly (2007), the psychometric properties of the final scale were evaluated, including test-retest reliability, internal consistency, concurrent validity, discrimination between groups, and convergent and discriminant validity. For test-retest reliability, intra-class correlation coefficients (ICCs) were calculated. Cronbach's alpha was used as a measure of internal consistency. Concurrent validity was reflected by the correlation between the DKAS 2.0 and the KS-BPSD. A t-test was used to examine whether KS-BPSD scores differed significantly between participants with and without a history of attending an education workshop on dementia. Convergent validity was supported by satisfactory item-subscale correlations (r ≥ 0.40), and discriminant validity was indicated by item-subscale correlations higher than the correlations between the items and the other subscales.

Results

In total, 229 valid responses were collected (22.3% male). About half of the participants were older than 45 years (48%), and a vast majority were married (81.7%). Almost half had completed college (47.2%; for details, see Table 2). Regarding caregiving experience, over half of the participants had been caregivers for more than 3 years (54.6%), and the vast majority had taken part in dementia-related education programs (80.8%).

Table 2. Demographic information (N = 229).

The number of respondents who chose "I do not know" (option 5) ranged from 1 (0.4%) to 28 (12.2%) per item on the KS-BPSD and from 3 (1.7%) to 50 (21.8%) per item on the DKAS 2.0.

Item analysis

In the first step of item screening, the discrimination index, difficulty index, and item-total correlation of each item were analyzed. The purpose of the discrimination index analysis was to eliminate items that were ineffective in discriminating between participants with high and low overall scores. Using the overall score of the 14 items, a high-scoring group (top 27%, n = 62) and a low-scoring group (bottom 27%, n = 62) were identified. For each item, the percentage of participants in each group who answered the item correctly was calculated, and the difference between these two percentages formed the discrimination index (D), which reflects how sensitive the item is in differentiating formal caregivers' levels of BPSD knowledge (D between 0 and 0.19 indicates poor discrimination; 0.20 to 0.29, acceptable; 0.30 to 0.39, good; and D > 0.40, excellent; Pande, Pande, Parate, Nikam, & Agrekar, 2013; Rao, Prasad, Sajitha, Permi, & Shetty, 2016; Sharma, 2021). The D scores of the 14 items ranged from 0.19 to 0.74, indicating that the discrimination capability was acceptable.
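
As an illustration of how the discrimination index can be computed from raw data, the sketch below forms the top and bottom 27% groups on the total score and takes the difference in the proportion answering the item correctly; the arrays are simulated and the variable names are ours, not the authors'.

```python
import numpy as np

def discrimination_index(item_correct: np.ndarray, total_score: np.ndarray,
                         tail: float = 0.27) -> float:
    """D = proportion correct in the top 27% minus proportion correct in the bottom 27%,
    with the groups formed on the overall scale score."""
    k = int(round(len(total_score) * tail))   # 27% of 229 respondents is 62
    order = np.argsort(total_score)
    low, high = order[:k], order[-k:]
    return item_correct[high].mean() - item_correct[low].mean()

# Simulated data purely to show the call signature
rng = np.random.default_rng(0)
total = rng.integers(0, 29, size=229)
item = (rng.random(229) < 0.6).astype(float)
print(round(discrimination_index(item, total), 2))
```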

A difficulty index was also generated for each item, representing the percentage of respondents who answered the item correctly. A correctness rate that is too high or too low suggests that the item's difficulty level is inappropriate; for example, a difficulty index of 0.95 indicates that 95% of participants answered correctly, so the item provides little useful information and detracts from the validity of the scale (Streiner et al., 2015). Therefore, items were retained only if their difficulty index was between 0.05 and 0.95.

The item-total correlation, defined as the correlation between each item and the total score of the remaining items (omitting that item), was calculated for each item. It is recommended that this correlation be above 0.20 for item retention (Tabachnick & Fidell, 2013), and item 9 was removed on this basis (for details, see Table 3).
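
The difficulty index and the corrected item-total correlation described above can be computed in the same way; the sketch below assumes an N x 14 matrix of item scores and uses simulated data only to demonstrate the calls.

```python
import numpy as np

def difficulty_index(item_correct: np.ndarray) -> float:
    """Proportion of respondents answering the item correctly;
    items were retained only if this fell between 0.05 and 0.95."""
    return float(item_correct.mean())

def corrected_item_total(scores: np.ndarray, item: int) -> float:
    """Correlation between an item's score and the total of the remaining items
    (the item itself omitted); values below 0.20 flag the item for removal."""
    rest = np.delete(scores, item, axis=1).sum(axis=1)
    return float(np.corrcoef(scores[:, item], rest)[0, 1])

# Simulated 229 x 14 matrix of item scores (0, 1, or 2)
rng = np.random.default_rng(1)
scores = rng.integers(0, 3, size=(229, 14))
print(round(corrected_item_total(scores, 8), 2))  # item 9 is column index 8
```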

Table 3. Properties of the individual items.

Principal component analysis

The EFA was conducted using PCA with varimax rotation. Preliminary analysis confirmed that the data were suitable for factor analysis (KMO = 0.79; Bartlett's test of sphericity, p < .001). The cutoff for each factor loading was 0.40 (Peterson, 2000). Because item 4 had strong cross-loadings (cross-loadings > 75%) and the lowest absolute maximum loading across all the factors, it was removed and the PCA was re-run (Samuels, 2017). The PCA of the resulting 12-item scale revealed a three-component structure (eigenvalues > 1.0) explaining 56.9% of the cumulative variance. After varimax rotation, three to six items loaded significantly on each component. The items in each component were inspected and interpreted, yielding three labels: Disease Characteristics (3 items), Care and Risks (6 items), and Treatment Needs (3 items). The loadings and interpretations indicated that the factorial validity of the 12-item scale was acceptable (for details, see Table 4). The final scale consisted of 12 items with a total score ranging from 0 to 24. The mean (SD) of the total scale score was 16.15 (5.29), and the means (SD) of the subscale scores were 4.45 (2.00), 7.97 (3.08), and 3.73 (1.96) for Disease Characteristics, Care and Risks, and Treatment Needs, respectively. The correlation coefficients among the three subscales ranged from 0.28 to 0.39 (p < .01).
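
The PCA itself was run in SPSS; for readers working in Python, a rough equivalent can be sketched with the third-party factor_analyzer package, as below. The variable `data` (an N x 13 item-score data frame before item 4 was dropped) and the function name are illustrative assumptions.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def pca_varimax(data: pd.DataFrame, n_components: int = 3) -> pd.DataFrame:
    """Principal components with varimax rotation, plus the usual suitability checks."""
    chi2, p = calculate_bartlett_sphericity(data)   # the paper reports p < .001
    _, kmo_model = calculate_kmo(data)              # the paper reports KMO = 0.79
    fa = FactorAnalyzer(n_factors=n_components, method="principal", rotation="varimax")
    fa.fit(data)
    # Loadings below 0.40, or strong cross-loadings, mark items as removal candidates
    return pd.DataFrame(fa.loadings_, index=data.columns).round(2)
```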

Table 4. EFA item loadings.

Reliability and validity

The test-retest reliability of the 12-item scale was assessed with an interval of one week. A minimum retest sample size of 30 is acceptable when the ICC is used to measure test-retest reliability (Bujang & Baharum, 2017). The "RANDBETWEEN" function in Excel was used to randomly select one of the four nursing homes, from which 34 caregivers were randomly invited to take part in the retest. The test-retest reliability coefficient was 0.86 (p < .001), suggesting good test-retest reliability. Cronbach's alpha of the 12-item scale was 0.79, indicating acceptable internal consistency. The Cronbach's alphas of the subscales were 0.79, 0.74, and 0.65 for Disease Characteristics, Care and Risks, and Treatment Needs, respectively.
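
Cronbach's alpha can be reproduced directly from an item-score matrix with a few lines of NumPy, as in the sketch below (the ICC for test-retest reliability would require a dedicated routine from a statistics package and is not shown). Variable names and the simulated data are illustrative.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score),
    for an N x k matrix of item scores."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Simulated 229 x 12 matrix of item scores (0, 1, or 2), just to show the call
rng = np.random.default_rng(2)
scores = rng.integers(0, 3, size=(229, 12))
print(round(cronbach_alpha(scores), 2))
```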

As a recent tool for measuring dementia knowledge, the DKAS 2.0 has been found to correlate highly with well-established knowledge scales, has good reported reliability and validity, and includes three items on patients' psycho-behavioral symptoms; it was therefore considered appropriate for assessing concurrent validity. In our sample, the overall score of the 12-item KS-BPSD was highly correlated with the total score of the DKAS 2.0 (r = 0.56, p < .001). As expected, participants who had attended an education workshop on dementia scored higher (16.62 ± 4.91) than those who had not (14.18 ± 6.34; t = 2.38, p < .05). The score of each item was significantly correlated with its corresponding subscale (r > 0.40, p < .01), and each item's correlation with its own subscale was higher than its correlations with the other subscales. These results show that the 12-item KS-BPSD has good convergent and discriminant validity.
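
For completeness, the concurrent-validity correlation and the known-groups t-test reported above can be computed with SciPy as sketched below; the array names (ks_total, dkas_total, attended) are assumptions used only for illustration.

```python
import numpy as np
from scipy import stats

def validity_checks(ks_total: np.ndarray, dkas_total: np.ndarray, attended: np.ndarray) -> dict:
    """Pearson correlation with the DKAS 2.0 (concurrent validity) and an independent-samples
    t-test comparing caregivers with and without dementia workshop experience."""
    r, p_r = stats.pearsonr(ks_total, dkas_total)                       # the paper reports r = 0.56
    t, p_t = stats.ttest_ind(ks_total[attended], ks_total[~attended])   # the paper reports t = 2.38
    return {"r": r, "p_r": p_r, "t": t, "p_t": p_t}
```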

Discussion

The present study reported the development and psychometric properties of a 12-item scale assessing formal caregivers' knowledge about patients' BPSD (KS-BPSD). To develop the scale, a three-round Delphi method was adopted, and a total of 15 items were removed according to the exclusion criteria at each stage. The PCA results showed that the scale fit a three-factor solution: Disease Characteristics, Care and Risks, and Treatment Needs. Specifically, Disease Characteristics refers to the understanding of key information about the clinical manifestations, progression, and correlates of BPSD; Care and Risks focuses on the caregiving factors that influence the development of BPSD; and Treatment Needs captures the diagnosis, treatment, and prognosis aspects of BPSD. The three factors cover the fundamental information about BPSD and showed good factorial validity.

The reliability and validity of the 12-item KS-BPSD were good, with sound content, concurrent, and construct validity. In addition, comparing the scores of those who had participated in dementia-related education programs with those who had not suggested that such programs could increase formal caregivers' knowledge about their patients' BPSD, as the former group scored significantly higher than the latter.

The current study developed a scale assessing formal caregivers' knowledge of patients' BPSD, which offers a critical complement to existing measures related to BPSD. Despite the correlation and overlap with existing dementia knowledge scales, such as the DKAS 2.0, their focuses differ. In fact, the DKAS includes only two items related to depressive symptoms ("People with dementia are unlikely to experience depression" and "Symptoms of depression can be mistaken for symptoms of dementia"), even though depressive symptoms are an important cluster of BPSD, and clusters of behavioral and psychological symptoms other than depression can also have a significant impact on caregiving burden. The KS-BPSD thus captures an important yet often neglected component of dementia care. Previous studies have shown that nurses' knowledge plays an important role in their competence in providing care and in care outcomes (Josefsson, Sonde, & Wahlin, 2008), as well as in caregivers' mental health and well-being (Evripidou, Charalambous, Middleton, & Papastavrou, 2019; Marx et al., 2014). Therefore, a knowledge scale covering more comprehensive aspects of BPSD is necessary for understanding the relationships among caregivers' knowledge, attitudes, behavior, and care outcomes.

The development of this scale has important implications for designing dementia care training and education for formal caregivers. Knowledge assessments of nursing staff can help identify their knowledge gaps and training needs and inform tailored training programs that improve care quality, thus benefiting both patients and caregivers. It is noteworthy that although our sample consisted of formal caregivers from nursing homes, their level of BPSD knowledge was well below the maximum possible score (mean = 16.15, SD = 5.29), suggesting no ceiling effect. This is probably because the educational level of our sample was relatively low and the training programs on BPSD offered to caregivers have remained limited. It is also expected that, when applied to family caregivers, the lay public, or patients, the KS-BPSD could serve as an effective tool for assessing their knowledge of BPSD. Moreover, different populations may show discrepancies across the three dimensions of the KS-BPSD (i.e., Disease Characteristics, Care and Risks, and Treatment Needs), which can help diagnose the specific education needs of each group. The scale may also draw attention to BPSD and prompt early preparations for managing these symptoms as dementia deteriorates.

Limitations and future directions

A major limitation of the current study is that the sample used to develop this scale consisted only of formal caregivers working at nursing homes. Their level of knowledge about BPSD could therefore be higher than that of other populations, restraining the representativeness of the sample and the generalizability of the scale. Further work is warranted to test the scale with family caregivers, the public, or patients with dementia to evaluate its psychometrics and norms across different populations. In addition, with a larger sample, confirmatory factor analysis (CFA) should be conducted to evaluate the validity and structure of the scale. Finally, future studies should also examine the role of BPSD knowledge in caregiver burden and care outcomes, to provide more evidence for the scale's capacity to assess BPSD knowledge and to better understand its clinical relevance.

Clinical implications

  • The KS-BPSD is a reliable measure of formal caregivers' knowledge about patients' BPSD, comprising three factors: Disease Characteristics, Care and Risks, and Treatment Needs.

  • Adopting the KS-BPSD could help identify caregivers' knowledge gaps about BPSD and inform tailored training programs that help them provide better care.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Data availability statement

The data are available from the corresponding author upon reasonable request (http://dx.doi.org/10.17632/f38m748pz9.1).

Additional information

Funding

Financial support: This study was supported by the National Key R&D Program of China under Grant [2020YFC2005802; 2021ZD0203103]; and the Science and Technology Program of Guangzhou under Grant [202007030012].

References

  • Annear, M. J., Eccleston, C. E., McInerney, F. J., Elliott, K. E. J., Toye, C. M., Tranter, B. K., & Robinson, A. L. (2016). A new standard in dementia knowledge measurement: Comparative validation of the dementia knowledge assessment scale and the Alzheimer’s disease knowledge scale. Journal of the American Geriatrics Society, 64(6), 1329–1334. doi:10.1111/jgs.14142
  • Annear, M. J., Toye, C. M., Eccleston, C. E., McInerney, F. J., Elliott, K. E. J., Tranter, B. K., … Robinson, A. L. (2015). Dementia knowledge assessment scale: Development and preliminary psychometric properties. Journal of the American Geriatrics Society, 63(11), 2375–2381. doi:10.1111/jgs.13707
  • Annear, M. J., Toye, C., Elliott, K.-E. J., McInerney, F., Eccleston, C., & Robinson, A. (2017). Dementia knowledge assessment scale (DKAS): Confirmatory factor analysis and comparative subscale scores among an international cohort. BMC Geriatrics, 17(1), 1–11. doi:10.1186/s12877-017-0552-y
  • Annear, M. J., Toye, C., McInerney, F., Eccleston, C., Tranter, B., Elliott, K.-E., & Robinson, A. (2015). What should we know about dementia in the 21st Century? A Delphi consensus study. BMC Geriatrics, 15(1), 1–13. doi:10.1186/s12877-015-0008-1
  • Bessey, L. J., & Walaszek, A. (2019). Management of behavioral and psychological symptoms of dementia. Current Psychiatry Reports, 21(8), 1–11. doi:10.1007/s11920-019-1049-5
  • Brodaty, H., Draper, B., Saab, D., Low, L. F., Richards, V., Paton, H., & Lie, D. (2001). Psychosis, depression and behavioural disturbances in Sydney nursing home residents: Prevalence and predictors. International Journal of Geriatric Psychiatry, 16(5), 504–512. doi:10.1002/gps.382
  • Bujang, M. A., & Baharum, N. (2017). A simplified guide to determination of sample size requirements for estimating the value of intraclass correlation coefficient: A review. Archives of Orofacial Science, 12(1), 1-11.
  • Carpenter, B. D., Balsis, S., Otilingam, P. G., Hanson, P. K., & Gatz, M. (2009). The Alzheimer’s disease knowledge scale: Development and psychometric properties. The Gerontologist, 49(2), 236–247. doi:10.1093/geront/gnp023
  • Cerejeira, J., Lagarto, L., & Mukaetova-Ladinska, E. B. (2012). Behavioral and psychological symptoms of dementia. Frontiers in Neurology, 3(73). doi:10.3389/fneur.2012.00073
  • Chan, L. L., & Idris, N. (2017). Validity and reliability of the instrument using exploratory factor analysis and Cronbach’s alpha. International Journal of Academic Research in Business and Social Sciences, 7(10), 400–410. doi:10.6007/IJARBSS/v7-i10/3387
  • Costello, A. B., & Osborne, J. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research, and Evaluation, 10(1), 7.
  • Devor, M., & Renvall, M. (2008). An educational intervention to support caregivers of elders with dementia. American Journal of Alzheimer’s Disease & Other Dementias®, 23(3), 233–241. doi:10.1177/1533317508315336
  • Eccleston, C., Doherty, K., Bindoff, A., Robinson, A., Vickers, J., & McInerney, F. (2019). Building dementia knowledge globally through the understanding dementia Massive Open Online Course (MOOC). Npj Science of Learning, 4(1), 1–6. doi:10.1038/s41539-019-0042-4
  • Evripidou, M., Charalambous, A., Middleton, N., & Papastavrou, E. (2019). Nurses’ knowledge and attitudes about dementia care: Systematic literature review. Perspectives in Psychiatric Care, 55(1), 48–60. doi:10.1111/ppc.12291
  • Finkel, S. I., E Silva, J. C., Cohen, G., Miller, S., & Sartorius, N. (1997). Behavioral and psychological signs and symptoms of dementia: A consensus statement on current knowledge and implications for research and treatment. International Psychogeriatrics, 8(S3), 497–500. doi:10.1017/S1041610297003943
  • Gauthier, S., Cummings, J., Ballard, C., Brodaty, H., Grossberg, G., Robert, P., & Lyketsos, C. (2010). Management of behavioral problems in Alzheimer’s disease. International Psychogeriatrics, 22(3), 346–372. doi:10.1017/S1041610209991505
  • Guerchet, M., Prince, M., & Prina, M. (2020). Numbers of people with dementia worldwide: An update to the estimates in the World Alzheimer Report 2015.
  • Hessler, J. B., Schäufele, M., Hendlmeier, I., Junge, M. N., Leonhardt, S., Weber, J., & Bickel, H. (2018). Behavioural and psychological symptoms in general hospital patients with dementia, distress for nursing staff and complications in care: Results of the General Hospital Study. Epidemiology and Psychiatric Sciences, 27(3), 278–287. doi:10.1017/S2045796016001098
  • Josefsson, K., Sonde, L., & Wahlin, T.-B. R. (2008). Competence development of registered nurses in municipal elderly care in Sweden: A questionnaire survey. International Journal of Nursing Studies, 45(3), 428–441. doi:10.1016/j.ijnurstu.2006.09.009
  • Kuhn, D., King, S. P., & Fulton, B. R. (2005). Development of the knowledge about memory loss and care (KAML-C) test. American Journal of Alzheimer’s Disease & Other Dementias®, 20(1), 41–49. doi:10.1177/153331750502000108
  • Marx, K. A., Stanley, I. H., Van Haitsma, K., Moody, J., Alonzi, D., Hansen, B. R., & Gitlin, L. N. (2014). Knowing versus doing: Education and training needs of staff in a chronic care hospital unit for individuals with dementia. Journal of Gerontological Nursing, 40(12), 26–34. doi:10.3928/00989134-20140905-01
  • Miyamoto, Y., Tachimori, H., & Ito, H. (2010). Formal caregiver burden in dementia: Impact of behavioral and psychological symptoms of dementia and activities of daily living. Geriatric Nursing, 31(4), 246–253. doi:10.1016/j.gerinurse.2010.01.002
  • Pande, S. S., Pande, S. R., Parate, V. R., Nikam, A. P., & Agrekar, S. H. (2013). Correlation between difficulty and discrimination indices of MCQs in formative exam in physiology. South-East Asian Journal of Medical Education, 7(1), 45–50. doi:10.4038/seajme.v7i1.149
  • Peterson, R. A. (2000). A meta-analysis of variance accounted for and factor loadings in exploratory factor analysis. Marketing Letters, 11(3), 261–275. doi:10.1023/A:1008191211004
  • Rao, C., Prasad, H. K., Sajitha, K., Permi, H., & Shetty, J. (2016). Item analysis of multiple choice questions: Assessing an assessment tool in medical students. International Journal of Educational and Psychological Researches, 2(4), 201. doi:10.4103/2395-2296.189670
  • Sampson, E. L., White, N., Leurent, B., Scott, S., Lord, K., Round, J., & Jones, L. (2014). Behavioural and psychiatric symptoms in people with dementia admitted to the acute hospital: Prospective cohort study. The British Journal of Psychiatry, 205(3), 189–196. doi:10.1192/bjp.bp.113.130948
  • Samuels, P. (2017). Advice on exploratory factor analysis. Birmingham, UK: Birmingham City University.
  • Schindler, M., Engel, S., & Rupprecht, R. (2012). The impact of perceived knowledge of dementia on caregiver burden. GeroPsych, 25(3), 127–134. doi:10.1024/1662-9647/a000062
  • Selbæk, G., Kirkevold, Ø., & Engedal, K. (2007). The prevalence of psychiatric symptoms and behavioural disturbances and the use of psychotropic drugs in Norwegian nursing homes. International Journal of Geriatric Psychiatry, 22(9), 843–849. doi:10.1002/gps.1749
  • Sharma, L. R. (2021). Analysis of difficulty index, discrimination index and distractor efficiency of multiple choice questions of speech sounds of English. International Research Journal of MMC, 2(1), 15–28. doi:10.3126/irjmmc.v2i1.35126
  • Spector, A., Orrell, M., Schepers, A., & Shanahan, N. (2012). A systematic review of ‘knowledge of dementia’ outcome measures. Ageing Research Reviews, 11(1), 67–77. doi:10.1016/j.arr.2011.09.002
  • Steinberg, M., Shao, H., Zandi, P., Lyketsos, C. G., Welsh‐Bohmer, K. A., Norton, M. C., … Tschanz, J. T. (2008). Point and 5‐year period prevalence of neuropsychiatric symptoms in dementia: The cache County study. International Journal of Geriatric Psychiatry, 23(2), 170–177. doi:10.1002/gps.1858
  • Streiner, D. L., Norman, G. R., & Cairney, J. (2015). Health measurement scales: A practical guide to their development and use. USA: Oxford University Press.
  • Sullivan, K. A., & Mullan, M. A. (2017). Comparison of the psychometric properties of four dementia knowledge measures: Which test should be used with dementia care staff? Australasian Journal on Ageing, 36(1), 38–45. doi:10.1111/ajag.12299
  • Tabachnick, B., & Fidell, L. (2013). Using multivariate statistics (6th ed.). Northridge, CA: California State University.
  • Toye, C., Lester, L., Popescu, A., McInerney, F., Andrews, S., & Robinson, A. L. (2014). Dementia knowledge assessment tool version two: Development of a tool to inform preparation for care planning and delivery in families and care staff. Dementia, 13(2), 248–256. doi:10.1177/1471301212471960
  • Trivedi, D. P., Braun, A., Dickinson, A., Gage, H., Hamilton, L., Goodman, C., … Manthorpe, J. (2019). Managing behavioural and psychological symptoms in community dwelling older people with dementia: A systematic review of the effectiveness of interventions. Dementia, 18(7–8), 2925–2949. doi:10.1177/1471301218762851
  • Trochim, W., & Donnelly, J. (2007). The research methods knowledge base (3rd ed.). Mason, OH, USA: Thompson Publishing Group.
  • World Health Organization. (2006). Neurological disorders: Public health challenges. Geneva, Switzerland: Author. ISBN 92-4-156336-2.
  • Zhao, Y., Eccleston, C. E., Ding, Y., Shan, Y., Liu, L., & Chan, H. Y. (2020). Validation of a Chinese version of the dementia knowledge assessment scale in healthcare providers in China. Journal of Clinical Nursing, 31, 1776–1785. doi:10.1111/jocn.15533.
  • Zuidema, S. U., Derksen, E., Verhey, F. R., & Koopmans, R. T. (2007). Prevalence of neuropsychiatric symptoms in a large sample of Dutch nursing home patients with dementia. International Journal of Geriatric Psychiatry, 22(7), 632–638. doi:10.1002/gps.1722