
Addressing the “Other” Health Literacy Competencies—Knowledge, Dispositions, and Oral/Aural Communication: Development of TALKDOC, an Intervention Assessment Tool

Pages 160-175 | Published online: 03 Oct 2012

Abstract

Most health literacy assessments evaluate literacy skills including reading, writing, numeracy, and interpretation of tables, graphs, diagrams, and charts. Some assess understanding of health systems and the ability to apply one's skills to specific health-related tasks or demands in health situations. However, to achieve functional health literacy, the ability to "obtain, process, and understand basic health information and services needed to make appropriate health decisions," other health literacy dimensions should be assessed: a person's knowledge of and attitudes about a health issue affect his or her ability and interest in participating in his or her own care. In patient care settings, the abilities to listen, ask questions, and check one's understanding are crucial to making appropriate decisions and carrying out instructions. Although literacy is a skill associated with educational attainment and therefore difficult to change in a short time, health education interventions can address health literacy domains such as knowledge, attitudes, and oral communication skills. For this reason, an instrument that can assess these constructs is a valuable part of a health educator's toolbox. The authors describe the development process and outcomes of testing TALKDOC, a novel instrument targeted to assess HPV and cervical cancer health literacy competencies, including its validation against the Health Activities Literacy Scale.

[Supplemental materials are available for this article. Go to the publisher's online edition of Journal of Health Communication for the following free supplemental resource: Appendix: Result of Internal Consistency Analysis: For Each Domain, Remaining Questions, Alpha, Response Options, and Correct Answer (in Bold). This appendix shows the remaining questions (and correct answers in bold) after 32 questions were eliminated.]

Strong evidence points to a link between literacy and health; low literacy is associated with both higher health costs and unfavorable health outcomes (Parker & Ratzan, 2010). Consumers are expected to make decisions with their providers and to manage their own care. It is unfortunate that more than one third (36%) of U.S. adults have only a basic or below-basic level of health literacy; up to 9 out of 10 adults cannot understand or use everyday health information effectively (Benjamin, 2010; Ratzan, 2010).

The acknowledged definition of health literacy is the "degree to which individuals have the capacity to obtain, process, and understand basic health information and services needed to make appropriate health decisions" (Nielsen-Bohlman, Panzer, & Kindig, 2004, p. 4; Ratzan & Parker, 2000). Constructs underlying this definition indicate that a functionally health literate person should have a complex, multidimensional set of dispositions, knowledge, and skills that include reading, writing, numeracy (interpreting and analyzing numbers or data in lab test results, tables, and graphs), listening to get information, oral and visual communication, problem solving, and decision making. This person can understand health systems and topics and can successfully handle health-related tasks or demands in health care situations (Dubow, 2004; Nielsen-Bohlman et al., 2004; National Network of Libraries of Medicine, 2004; Paasche-Orlow, Wilson, & McCormack, 2010; Ratzan, 2010).

Rudd, Zobel, and colleagues (2004) noted that context influences how well individuals are able to use their health literacy abilities. Because patients need to navigate our increasingly complicated health system, health literacy tasks include knowing and acting on patient rights and responsibilities; completing medical, insurance, and informed consent forms while recognizing the implications for care; clearly describing symptoms; negotiating care; and then understanding, recalling, and following physician instructions for self-care. Beyond skills, motivation, attitudes, and self-efficacy contribute strongly to whether individuals handle health literacy-related tasks; individuals must be not only informed but also willing to seek the care they need (Benjamin, 2010; Nutbeam, 2008; Paasche-Orlow & Wolf, 2010; McCormack, Bann, Squiers, Berkman, & Squire, 2010; Rudd & Anderson, 2006).

In addition to context, functional health literacy is not only affected by individual abilities but is also influenced by environmental factors such as culture and assimilation, race/ethnicity, education level, language facility, the mass media, technology access, the health condition, and health care settings (Gee, 1996; Nielsen-Bohlman et al., 2004). Being health literate assumes not just that generic literacy skills are applied in a health setting but that individuals must often have multiple health condition-specific literacy competencies. An individual may be health literate about diabetes, for example, but health illiterate about cervical cancer.

Health Literacy Assessment for Intervention Research

The Institute of Medicine (Nielsen-Bohlman et al., 2004) called for new health literacy indicators and tools to better assess intervention-specific health literacy outcomes, noting that without a way to measure health literacy competencies, it is difficult to design and evaluate interventions meant to improve those skills. Health promotion/disease prevention programs designed without addressing related health literacy factors may well find their impact undermined.

Multiple assessment tools, whether for research, clinical use, or field interventions, have been developed to assess various aspects of health literacy (Chew, Bradley, & Boyko, 2004; Clayman et al., 2010; Davis et al., 1993; Gazmararian, Beditz, Pisano, & Carreón, 2010; Hahn, Choi, Griffith, Yost, & Baker, 2011; McCormack, Bann, Squiers, Berkman, & Squire, 2010; Parker, Baker, Williams, & Nurss, 1995; Wallace, Rogers, Roskos, Holiday, & Weiss, 2006; Weiss et al., 2005). Yet none of these general health literacy assessments has an exclusive set of indicators to evaluate interventions focused on improving a broad range of health literacy abilities, especially oral communication and listening. Few assessment tools are specific enough to adequately evaluate interventions focused on context- and health condition-specific cultural and conceptual factors that affect health literacy, or that explore the health literacy demands of a particular disease, such as cervical cancer (Helitzer, Hollis, Cotner, & Oestreicher, 2009).

The National Adult Literacy Survey (NALS), although considered the most comprehensive measure of literacy, requires a precise set of data analysis skills and is not available for general use (McCormack et al., 2010). Given this, the Health Activities Literacy Scale (HALS), created from selected NALS and International Adult Literacy Survey items (Rudd, Kirsch, & Yamamoto, 2004), identifies respondents' levels of certain health literacy skills in five areas: health promotion, health protection, disease prevention, health care, and systems navigation.

In each area, the HALS measures how well respondents process tasks (survey items), ranging in degree of difficulty from locating information (the simplest, Level 1 skill), through cycling, integrating, and generating information, to analyzing information and formulating and calculating responses (Level 5). Task difficulty depends on the amount and location of relevant information in the text, the length and complexity of the surrounding text, and the level of cognitive analysis needed. The HALS evaluates three factors influencing health literacy level: (a) context and content of the information to be grasped; (b) presentation of information (prose, tables, charts, maps); and (c) cognitive/analytic strategies needed to handle a task. Developed using a two-parameter logistic model from item response theory, the HALS provides mean scores correlated with categories of demographic variables: ethnicity/race, gender, education level, country of birth, health status, and income (Rudd, Kirsch, & Yamamoto, 2004). Although the instrument can measure functional health literacy in specific contexts, it has not been used to evaluate health literacy interventions. We theorized that if the HALS were effectively combined with an intervention- and/or disease-specific health literacy evaluation tool, we could better assess health literacy skills and specific intervention-related knowledge and attitudes.
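To make the item response theory machinery concrete, the sketch below shows the two-parameter logistic (2PL) item response function on which a scale of this kind is built. It is an illustration only, not the HALS scoring code, and the item parameters are hypothetical.

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) item response model: probability that a
    respondent with latent proficiency `theta` answers an item correctly,
    where `a` is the item's discrimination and `b` is its difficulty."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical items spanning a HALS-style difficulty range: an easy
# "locate information" task and a hard "analyze and calculate" task.
easy_item = {"a": 1.2, "b": -1.5}
hard_item = {"a": 1.0, "b": 2.0}

for theta in (-1.0, 0.0, 1.0, 2.0):
    print(f"theta={theta:+.1f}  "
          f"P(easy)={p_correct_2pl(theta, **easy_item):.2f}  "
          f"P(hard)={p_correct_2pl(theta, **hard_item):.2f}")
```

As proficiency increases, the probability of answering the easy item correctly saturates quickly while the hard item continues to discriminate, which is what allows items of graded difficulty to locate respondents on a single proficiency scale.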

Cervical Cancer

Cervical cancer kills approximately 310,000 women annually; most (85%) cervical cancer cases occur in women living in less-developed countries. Though widespread screening with the Pap test has decreased the U.S. cervical cancer mortality rate by approximately 75%, at least 12,000 cases of invasive cervical cancer are still diagnosed and 4,200 women still die from the disease annually in this country (American Cancer Society, 2007, 2010).

HPV infection, a common STI, is the prime risk factor for cervical cancer (Waller, McCaffery, Forrest, & Wardle, 2004; Wood, Shin, Duval, & Schmitt, 2006), and continues to be a major threat to women's (and men's) health. Its treatment expenses—economic and medical—cost the U.S. health care system up to $33,000 per person per year.

Extensive HPV vaccination coverage with either Gardasil or Cervarix, combined with effective health education and regular Pap test screening, could reduce the high rates of cervical cancer and its high treatment costs. Several factors—many related to health literacy—may impede achievement of this goal. Both vaccines are expensive, require a three-dose administration at different, specifically set times, and specify a young recommended age (11–16 years) for vaccination. In 2010, 32% of female adolescents 13–17 years of age had received all three doses (Centers for Disease Control and Prevention, 2011). Low utilization and new recommendations about the HPV vaccine for boys increase the need for HPV vaccine-specific health education interventions.

Parents'/guardians' perceptions of the HPV vaccine are critical for gaining consent to administer the vaccine to minor children. Studies examining parental and young adults' views indicate that prior experience with abnormal Pap test results and/or HPV infection is associated with greater knowledge of HPV; however, misunderstandings are widespread (even about the Pap test), and parent knowledge of HPV does not necessarily encourage vaccine acceptability. Major reasons parents give for rejecting the vaccine include feeling that their children are not at risk for HPV and fearing that it might cause children to begin sexual activity early. In addition, parents with lower education levels and/or those receiving public health services were more favorable toward vaccination (Black, Zimet, Short, Sturm, & Rosenthal, 2009).

When deciding whether to vaccinate their children, parents and guardians must apply health literacy skills to obtain trustworthy information, analyze or reject noncredible data, and possibly overcome certain attitudes or perceptions. Decision makers should be positively disposed toward vaccination, ready to implement action, and confident enough to discuss it with their physician. To assess these health literacy-related knowledge, dispositional, and oral/aural communication factors, this study designed, tested, and empirically validated a web-based assessment tool called TALKDOC, intended to measure disease-specific (HPV and cervical cancer) health literacy.

Method

Instrument Development

All research activities and the TALKDOC assessment instrument, developed in phases, were approved by the University of New Mexico Health Sciences Center's Human Research Review Committee. In the formative stage, we identified the basic sets of knowledge and skills that an individual should have to understand, prevent, and screen for cervical cancer, to participate in treatment, and handle health/health system procedures to be considered health literate in cervical cancer. We defined the cervical cancer health literacy knowledge and tasks, engaged patients/community members in formative work, recorded patient/provider encounters, and consulted with an expert advisory board.

First, we outlined the cervical cancer prevention-related tasks/behaviors that an individual would have to undertake; these included identifying the knowledge and skills needed to accomplish core tasks. Some tasks, such as making appointments for Pap screening (a health system navigation action), require general health literacy skills (e.g., those needed to make any medical appointment). The team then refined the key attitudes, beliefs, and/or dispositions, such as self-efficacy and social norms supportive of vaccination, that an individual needs to undertake essential cervical cancer prevention and treatment tasks.

Second, we conducted formative interviews with five racially/ethnically diverse women with a past history of cervical dysplasia to gain their insight into the cervical cancer experience and into how emotional, cultural, racial/ethnic, and contextual factors influence cervical cancer health literacy. We asked them to reinforce or negate the list of key tasks we identified, and we elicited factors that inhibited or promoted decisions and actions. We learned that Pap tests are painful for some women (a task-related physical and/or emotional aversion that must be overcome); that an empathetic physician can help in communication; and that concern about a family history of one disease, such as diabetes, may lead to neglecting prevention actions related to other diseases, such as having regular Pap tests to prevent cervical cancer.

Third, with the informed consent of doctors and patients, we audiotaped five provider–patient discussions about Pap tests and STIs, which took place during gynecological visits. Five different providers agreed to be taped, and we asked for the consent of one patient from each provider, attempting to include patients from different age groups. Review of the tapes provided cues to the language used in these conversations and emotions expressed. Follow-up discussions with providers and patients provided greater detail about the dynamics of such health care talks, and the transcripts served as a basis for developing our instrument's oral/aural health literacy section.

Last, to structure and refine a core, underlying set of HPV/cervical cancer health literacy constructs/competencies for generating items, and to address instrument content validity, we convened an advisory board composed of medical experts, researchers and health care providers. Members included a microbiologist, adolescent medicine and STI specialists, and cervical cancer prevention researchers.

The advisory board first considered and categorized the major constructs or competencies needed to obtain, process, and understand basic HPV and cervical cancer prevention information into three domains—disease-specific knowledge, dispositions (attitudes and existential competencies such as self-efficacy), and communication skills. We used a modified Delphi technique (Cline, 2000) to prioritize the domain concepts to serve as the basis for generating assessment tool items.

The advisory board recommended that (a) the knowledge domain should include measurement of individual's knowledge about HPV transmission, infections, and vaccine; women's anatomy; cervical cancer risks and causes; Pap test screening and results; and safe sex and other forms of cervical cancer prevention; and understanding of how to navigate the health system; (b) dispositions should address self-efficacy and attitudes to prevention; and (c) communication skills should cover asking questions and listening abilities. On the basis of this input, our team agreed upon eight core informational categories to be addressed in each domain. These included definitions and concepts; causes and risk factors; information-seeking capacity; prevention and protection factors; secondary prevention/disease screening; treatment; cultural or emotional factors; and health system navigation.

Advisory board members suggested topics to be covered in the draft instrument. We prioritized essential constructs to measure and created a set of proposed questionnaire/instrument items, which were subjected to a second Delphi process.

The Nomological Network

Using the formative research findings, the research team created a nomological network (or chart; see Figure 1) that indicated the entire set of constructs (competencies) and categories—and their relationships—in all three health literacy domains to be addressed in the final instrument. Conceptualized by Cronbach and Meehl (1955) as a way of providing evidence for construct validity, a nomological network embodies the conceptual framework underpinning the design of the assessment tool. It gives each factor or construct meaning, distinguishes each factor from the others, and demonstrates how they are related. A nomological network must contain at least two empirical constructs to be measured, theoretical propositions specifying linkages among constructs, and rules to operationalize measurements.

Figure 1 Nomological chart indicating constructs, categories, and their relationships in TALKDOC health literacy domains. (Color figure available online.)

Item Generation

The TALKDOC health literacy measurement tool (designed from the aforementioned process) consists of three interconnected categorical domains of indicators, measured through scaled or multiple-choice item responses that examine (a) context-specific health literacy knowledge; (b) dispositions of self-efficacy and prevention; and (c) communication/comprehension abilities.

We developed four scripts for the listening and oral communication section of TALKDOC using the knowledge gained from our original formative interviews and the review of transcripts of real-life physician–patient interactions. The scripts were acted out by University of New Mexico actors trained to be standardized patients for medical students; all scenarios were professionally audiotaped. One script modeled a woman making a Pap test appointment (Tape A); the second, third, and fourth scripts presented different aspects of patient–provider clinical interactions—dealing with a genital wart diagnosis (Tape B), going through a routine Pap test (Tape C), and discussing an abnormal Pap test result (Tape D).

For each audio script, the research team prepared and cognitively tested a brief set of multiple-choice questions to test listening recall and understanding of the interaction being enacted. During administration of this fifth component, unlike the other sections, in which participants entered their own responses, data entry for the oral/aural competencies was completed by the researcher/interviewer. The interviewer would play the audiotaped scenarios (following along with the script text on the computer screen, which the participant could not see) and stop the tape at specifically indicated break points. Each research interviewer would read aloud the standard multiple-choice questions from the computer screen to each participant, as well as the four response options. Upon request of the participant, the interviewer would repeat the questions and potential responses as needed; however, the tape was never replayed. Once each participant stated aloud her response choice (i.e., A, B, C, or D), the interviewer would enter that response into the computer. Each interviewer thus orally asked each respondent the same nine multiple-choice questions prepared for each of the four scenarios. Multiple-choice answers were quantitative in nature; only one response per item was correct.

Some questions examined comprehension (“What was going on in this conversation?”). Others confirmed knowledge (“What is a pelvic exam?”), some checked recall of orally/aurally received information, and others determined whether a respondent could identify the most appropriate question a patient should ask at particular points in a conversation (“What question is Sandra most likely to ask at the end of this discussion?”). Additional questions such as “Not related to the tape, have you, personally, ever asked to have a medical appointment rescheduled?” and “Not related to the tape, have you, personally, ever asked a health care provider to explain a lab result to you?” explored respondents' own behaviors; responses used a 4-point Likert scale ranging from 1 (never) to 4 (always).

Our first instrument draft included more than 300 items. We conducted cognitive tests of the instrument (Willis, 2004) with five women, who reviewed the draft questionnaire, listened to the audiotaped scripts, and responded to a series of think-aloud questions posed by an interviewer. Their responses elicited their cognitive analytical process in answering survey questions. The cognitive testing helped us reduce the number of items in the TALKDOC to 109 (see Table 1 for the topic areas and the numbers of questions included in each section).

Table 1. Item content areas and numbers of questions per domain of the instrument that was tested

We used the HALS to assess convergent validity. The HALS is a computerized, web-based assessment. To ensure that computer literacy factors did not bias our results, we created the TALKDOC in web-based format using the University of New Mexico-approved Opinio software (Object Planet, 1998–2011). In the testing process, participants self-administered the HALS first and then the TALKDOC; interviewers orally asked the communication section questions and entered the responses into the computer.

Participant Recruitment

We used different recruitment methods to identify female participants from various educational levels. We also wanted to oversample for the less-than-basic-literacy group (the U.S. Census indicates that one third of New Mexicans have a high school education or less). Our goal was a sample in which 35% of the women had less than an 8th-grade education, 35% had an 8th- to 12th-grade (including GED) education, and 30% had achieved some college or more. We attended community health fairs; handed out study brochures; placed ads in the local newspaper and on craigslist.org; and posted flyers in community sites such as women's health clinics (especially those serving low-income, racially diverse women), laundromats, beauty parlors, nail salons, and the University of New Mexico hospital. In addition, we contacted English as a second language and adult literacy programs, encouraging women in the sessions to take part. Our brochures and flyers, written at a fourth-grade reading level, described the study and participant eligibility (ages 18–70 years) and gave contact information. Participant compensation ($25 gift cards) was offered, and those recruited were encouraged to tell a friend about the study. We also used a snowball sampling process—a raffle—to increase the number of recruited participants; women who referred other participants became eligible to have their name entered into a drawing for a $100 gift card.

Screening and Data Collection

We screened potential participants for education level, age, race/ethnicity, computer familiarity, and English language capability. Exclusion criteria included being younger than 18 years of age, having no computer skills, and not being conversant in English. Education and ethnicity data were used to create the sampling frame.

Participants provided demographic information (age, race/ethnicity, place of birth, education, employment, income, partnership status, cancer exposure, use of health information sources, and Pap test experience) and answered cancer-related questions adapted from the National Cancer Institute's Health Information National Trends Survey (hints.cancer.gov; 2007 version). A research assistant helped respondents, as needed, navigate through both computerized assessments. Four of the five TALKDOC sections were self-administered by the respondent, while the fifth section, oral/aural communication, required the research assistant to play the four recordings. The research assistant stopped the audiotapes at specific points, asked the respondent the relevant questions, and entered the responses into the computer. The total amount of time required for an individual to complete the assessment varied depending on several factors: comfort with the material and the computer, and level of education. This was especially true with the HALS, because the items became more difficult to answer as the participants progressed. Most participants did not take more than 30 to 35 minutes to complete the self-administered part of the TALKDOC; the oral/aural component (the tape was never replayed, but questions could be repeated) took anywhere from 28 to 40 minutes to complete (7–10 minutes per tape), for a total of 60–75 minutes.

Analysis of Reliability and Validity

We used Cronbach's alpha as a measure of internal consistency and reliability as well as to eliminate questions from the analysis. Convergent validity between HALS test scores and TALKDOC knowledge, communication, and disposition questions was assessed with Spearman's correlations.
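For illustration (this is not the authors' analysis code), the sketch below computes Cronbach's alpha for a block of scored items and a Spearman correlation between HALS scores and a TALKDOC subscore; the data and variable names are hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical scored responses: 1 = correct, 0 = incorrect, one row per respondent.
knowledge_items = np.array([
    [1, 1, 0, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 0, 1, 1, 1],
])
alpha = cronbach_alpha(knowledge_items)

# Convergent validity: Spearman correlation between HALS scores and a TALKDOC subscore.
hals_scores = [225, 180, 310, 150, 270]                  # hypothetical HALS scale scores
knowledge_subscore = knowledge_items.mean(axis=1) * 100  # percent correct per respondent
rho, p_value = spearmanr(hals_scores, knowledge_subscore)

print(f"alpha = {alpha:.2f}, Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

Items whose removal raises alpha are candidates for elimination, which is how internal consistency analysis can be used both to gauge reliability and to trim a draft instrument.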

We combined the knowledge, communication, and disposition questions remaining after the analysis of internal consistency into four subscores. The knowledge subscore is the percentage answered correctly on the 17 remaining knowledge questions; the communication subscore is the percentage answered correctly on the 15 remaining communication questions.

The disposition self-confidence subscore is the average of responses to four questions reported on a 6-point scale ranging from 1 (not at all confident) to 6 (totally confident). The disposition attitudes to prevention subscore is the average of responses to two questions with responses ranging from 1 (strongly disagree) to 6 (strongly agree). For both disposition subscores, respondents were stratified into two groups: those whose average was less than 5 and those whose average was 5 or greater. We used cross-tabulations, Mantel-Haenszel chi-square tests, and t tests to assess the significance of associations between the scores and demographic characteristics, Pap test experience, exposure to information, and trust in information sources.
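The sketch below illustrates this scoring and testing workflow under hypothetical data and column names; a plain chi-square test on a 2 × 2 cross-tabulation stands in here for the Mantel-Haenszel statistic reported in the paper.

```python
import pandas as pd
from scipy.stats import ttest_ind, chi2_contingency

# Hypothetical scored data for eight respondents (all column names are invented).
df = pd.DataFrame({
    "some_college":   [0, 0, 0, 0, 1, 1, 1, 1],
    # percent correct on the 17 remaining knowledge questions
    "knowledge_pct":  [47, 59, 53, 65, 71, 76, 65, 82],
    # average of the four 6-point self-confidence questions
    "confidence_avg": [4.25, 5.00, 3.75, 5.50, 5.25, 4.50, 5.75, 5.00],
})

# Stratify the disposition subscore at an average of 5, as described above.
df["confidence_high"] = (df["confidence_avg"] >= 5).astype(int)

# t test of the knowledge subscore by education group.
t_stat, p_t = ttest_ind(df.loc[df.some_college == 1, "knowledge_pct"],
                        df.loc[df.some_college == 0, "knowledge_pct"])

# Cross-tabulation and chi-square test of education versus the dichotomized
# disposition group (standing in for the Mantel-Haenszel statistic in the paper).
table = pd.crosstab(df["some_college"], df["confidence_high"])
chi2, p_chi, dof, expected = chi2_contingency(table)

print(table)
print(f"t = {t_stat:.2f} (p = {p_t:.3f}); chi-square = {chi2:.2f} (p = {p_chi:.3f})")
```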

Results

Participant Characteristics

A total of 161 women participated in testing the TALKDOC (see Table 2 for the demographic characteristics of the sample). Participants ranged in age from 18 to 60 years. The ethnic and racial distribution of the sample was similar to that of the state of New Mexico. Most women were born in the United States. The level of educational attainment was higher than hoped for: about one third of the sample had graduated high school or passed a GED, another third had attended some sort of college, technical, or vocational school, and one third had graduated college or attained a graduate degree. Most respondents were employed for wages, although the median annual income was less than $35,000. Less than half of the sample reported being married or living with a partner.

Table 2. Participant demographics

Table 3 shows the participants' cancer screening experience and access to and use of health information sources. Almost all respondents had had at least one Pap test, and about two thirds had had their last Pap test within the past year. More than half (52%) of those who had had a Pap test had had an abnormal Pap test result. Most women reported having their last Pap test as a routine test. Motivations for getting their last Pap test included responding to a health system reminder, having had a prior abnormal test, and/or a recommendation from a doctor or family member. Although half of the sample said they would definitely have a Pap test within the next year, one quarter said they were only somewhat likely or unlikely to have one. More than half of the women reported cancer in themselves or a close family member. Most of the respondents reported getting information from the health section of a newspaper or magazine, the Internet, TV news, or other sources. In addition, 126 (79%) had seen advertisements for the HPV vaccine, and 78 (62%) of these had asked a health provider about the vaccine.

Table 3. Descriptive statistics

Internal Consistency

We aimed to reduce the number of items in the TALKDOC. Tests of internal consistency indicated that 32 questions could be eliminated.

Tests of association showed that education was significantly related (p < .001) to the percentage of knowledge questions answered correctly. Respondents with a high school education or less answered 57% (SD = 17) of the knowledge questions correctly, whereas those respondents with some college education answered 73% (SD = 11) of the knowledge questions correctly. Education was also significantly related (p < .001) to the percentage of oral communication questions answered correctly. Respondents with a high school education or less answered 77% (SD = 17) of the oral communication questions correctly, while respondents with some college education answered 89% (SD = 10) of the oral communication questions correctly.

No significant differences were found in either the disposition self-confidence or the attitudes to prevention average scores by education (see Table 4 for the means and medians of the self-confidence and attitudes to prevention subscores). There were no significant differences in knowledge, disposition, or oral communication scores by exposure to media, influence of information sources, having had a Pap test, or having had abnormal Pap tests.

Table 4. Descriptive statistics for average responses to questions for the disposition subscores (self-confidence and attitudes to prevention)

Convergent Validity

The HALS scores are divided into five levels of proficiency. Table 5 shows the definition of each of the five levels and the distribution of HALS test scores of the study respondents. Approximately one third of the respondents had a score that was lower than would be expected of those graduating from high school; one third had minimum-level proficiency; and the remaining third had high proficiency.

Table 5. Scores on Health Activities Literacy Scale (N = 150)

Spearman's correlations between HALS test scores and the TALKDOC subscores were as follows: knowledge, .51 (p < .01); oral communication, .41 (p < .01); disposition self-confidence, .09 (p = .24); and disposition attitudes to prevention, –.16 (p = .05).

Discussion

The study results show that the TALKDOC is a health literacy tool that can be used to measure constructs of knowledge, attitudes, and oral communication that are not currently addressed in other assessment tools but are necessary for functional health literacy. This type of health literacy assessment tool can be used to evaluate the effect, efficacy, and effectiveness of an intervention that addresses knowledge, dispositions, and/or oral communication.

The convergent validity assessment, comparing the TALKDOC with HALS, showed moderate correlations. This suggests that the TALKDOC is not redundant with HALS and that knowledge, dispositions, and oral communication are valid health literacy constructs to assess. Interventions that address these constructs can use the TALKDOC model with appropriate or needed revisions for health content.

The results of this study underscore the importance of considering other constructs of health literacy in intervention assessment tools. Tests of association showed a relation between education levels and scores on our health literacy knowledge and oral communication tests. Literacy itself has been shown to be correlated with education, and literacy levels can be improved with training and education (Parker & Kreps, 2005). Health literacy—especially certain constructs such as knowledge and oral communication skills—can be integrated into focused interventions. Taken together, the findings of this study suggest that the TALKDOC is an instrument suitable for measuring change from such an intervention. In this first test of the instrument, there was a sufficient knowledge gap that improvement could be detected after an intervention. For oral/aural communication competence, however, there were ceiling effects, and we saw no change over time in this variable.

Additional research is needed to assess how conceptually different aspects of health literacy contribute to the overall construct of health literacy. In this study, we created and tested a tool that measures other indicators of functional health literacy: knowledge of a specific health condition or disease; the oral/aural communication skills needed to learn about a condition, negotiate treatment, and follow directions to manage it; and the attitudes or dispositions that facilitate or hinder health decision making and action. The study findings suggest that there is value in enhancing existing health literacy measures with those that measure different constructs of health literacy, such as those in TALKDOC.

Implications for Intervention Research

Health literacy is broader than literacy about health. The definition of literacy is "the ability to use texts to retrieve information" (White & McCloskey, Forthcoming). The definition of health literacy requires individuals to use information to make appropriate health decisions. Other domains of health literacy are important in real-life settings: knowledge of the subject matter; attitudes toward prevention and treatment; self-efficacy about asking questions, negotiating complex health systems, or filling out forms; and the ability to listen and pose questions to get more information in a dialogue about a health topic. These areas and skills should be the focus of health education interventions.

The research reported here is a first step toward understanding the opportunities for measuring other health literacy competencies in addition to literacy. Findings suggest that this model instrument can be used to evaluate disease-specific interventions that aim to improve relevant constructs of health literacy, those that influence individuals' ability to make appropriate decisions about their health and health care.

Limitations

The most challenging aspect of testing the TALKDOC was participant recruitment. Our goal was to achieve a sample that mirrored the population of New Mexico, which includes a significant proportion of individuals who have below-basic literacy. However, these individuals are notoriously difficult to recruit. We tried to address common barriers by administering the instrument in evenings and on weekends, recruiting in places where these women might congregate, and inviting women to bring their friends. In addition, the total length of time needed to complete the consent process, demographic questionnaire, HALS, and TALKDOC ranged from 40 minutes up to 2 hours. It was tiring for those women who required more time to finish; their responses may have reflected fatigue. Despite the length of time needed, most participants remained interested, engaged, and committed to completing the survey.


Acknowledgments

The work described in this manuscript was generously supported by the Eunice Kennedy Shriver National Institute of Child Health and Human Development (1 R03 HD050402-01). We wish to thank the staff and student interns who participated in developing and implementing this assessment tool; the members of the advisory board who provided the content and ensured the scientific validity of the information in the instrument; and the members of the community who participated in the testing of the instrument. The authors' sincere thanks also go to Irwin Kirsch of the Center for Global Assessment, Educational Testing Service, Princeton, NJ, and to Rima Rudd of the Department of Society, Human Development and Health at the Harvard School of Public Health, Boston, MA, for their guidance, consultation, and moral support and, in particular, for providing access to the Health Activities Literacy Scale.

Notes

*For example, I have had abnormal Pap test results in the past; it was time and I asked for it; school physical; to get birth control pills renewed; heavy bleeding and painful cramps; new sex partners.

References

  • American Cancer Society. (2007). Global cancer facts and figures 2007. Retrieved from http://www.cancer.org/Research/CancerFactsFigures/GlobalCancerFactsFigures/index
  • American Cancer Society. (2010). Cancer facts and figures 2010. Retrieved from http://www.cancer.org/Research/CancerFactsFigures/index
  • Baur, C., Brooks, C., Harris, L., Locke, J., Neuhaus, C., Robinson, S., & Hilfiker, S. (2010). National action plan to improve health literacy, May 2010. Washington, DC: U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion.
  • Benjamin, R. (2010). Health literacy improvement as a national priority. Journal of Health Communication, 15(Suppl 2), 1–3.
  • Black, L. L., Zimet, G. D., Short, M. B., Sturm, L., & Rosenthal, S. L. (2009). Literature review of human papillomavirus vaccine acceptability among women over 26 years. Vaccine, 27, 1668–1673. doi:10.1016/j.vaccine.2009.01.035
  • Centers for Disease Control and Prevention. (2011, August 26). National and state vaccination coverage among adolescents aged 13 through 17 years, United States, 2010. Morbidity and Mortality Weekly Report, 60(33), 1117–1123. Retrieved from http://www.cdc.gov/vaccines/stats-surv/nisteen/data/tables_2010.htm
  • Chew, L. D., Bradley, K. A., & Boyko, E. J. (2004). Brief questions to identify patients with inadequate health literacy. Family Medicine, 36, 588–594.
  • Clayman, M., Pandit, A., Bergeron, A., Cameron, K., Ros, E., & Wolf, M. (2010). Ask, understand, remember: A brief measure of patient communication self-efficacy within clinical encounters. Journal of Health Communication, 15(Suppl 2), 72–79.
  • Cline, A. (2000). Prioritization process using Delphi technique. Retrieved from http://www.carolla.com/wp-delph.htm
  • Cronbach, L. J., & Meehl, P. E. (1955). Construct validity in psychological tests. Psychological Bulletin, 52, 281–302.
  • Davis, T. C., Long, S. W., Jackson, R. H., Mayeaux, E. J., George, R. B., Murphy, P. W., & Crouch, M. A. (1993). Rapid estimate of adult literacy in medicine: A shortened screening instrument. Family Medicine, 25, 391–395.
  • Dubow, J. (2004). Adequate literacy and health literacy: Prerequisites for informed health care decision making. Issue Brief, June (IB70), pp. 1–11. Washington, DC: AARP Public Policy Institute. Retrieved from http://www.assets.aarp.org/rgcenter/health/ib70_literacy.pdf
  • Gazmararian, J., Beditz, K., Pisano, S., & Carreón, R. (2010). The development of a health literacy assessment tool for health plans. Journal of Health Communication, 15, 93–101.
  • Gee, J. (1996). Social linguistics and literacies: Ideology in discourses (2nd ed.). Philadelphia, PA: Falmer Press.
  • Hahn, E., Choi, S., Griffith, J., Yost, K., & Baker, D. (2011). Health Literacy Assessment Using Talking Touchscreen Technology (Health LiTT): A new item response theory-based measure of health literacy. Journal of Health Communication, 16(Suppl 3), 150–162.
  • Helitzer, D., Hollis, C., Cotner, J., & Oestreicher, N. (2009). What health literacy demands do health information materials place on users? An assessment of cervical cancer prevention materials. Cancer Control, 16, 70–78.
  • McCormack, L., Bann, C., Squiers, L., Berkman, N., & Squire, C. (2010). Measuring health literacy: A pilot study of a new skills-based instrument. Journal of Health Communication, 15(Suppl 2), 51–71.
  • National Cancer Institute. (2007). Health Information National Trends Survey (HINTS). Retrieved from http://www.hints.cancer.gov
  • National Network of Libraries of Medicine. (2004). Understanding health literacy and its barriers (Current Bibliographies in Medicine 2004–1). Retrieved from http://www.nlm.nih.gov/pubs/cbm/healthliteracybarriers.html
  • Nielsen-Bohlman, L., Panzer, A., & Kindig, D. A. (Eds.). (2004). Health literacy: A prescription to end confusion. Institute of Medicine. Washington, DC: National Academies Press.
  • Nutbeam, D. (2008). The evolving concept of health literacy. Social Science and Medicine, 67, 2072–2078.
  • Object Planet. (1998–2011). Opinio [Computer software]. Oslo, Norway: Author.
  • Paasche-Orlow, M. K., Wilson, E., & McCormack, L. (2010). The evolving field of health literacy research. Journal of Health Communication, 15(Suppl 2), 5–8.
  • Paasche-Orlow, M. K., & Wolf, M. (2010). Promoting health literacy research to reduce health disparities. Journal of Health Communication, 15(Suppl 2), 34–41.
  • Parker, R., Baker, D. W., Williams, M. V., & Nurss, J. R. (1995). The test of functional health literacy in adults: A new instrument for measuring patients' literacy skills. Journal of General Internal Medicine, 10, 537–541.
  • Parker, R., & Kreps, G. (2005). Library outreach: Overcoming health literacy challenges. Journal of the Medical Library Association, 93(4 Suppl), S81–S85.
  • Parker, R., & Ratzan, S. (2010). Health literacy: A second decade of distinction for Americans. Journal of Health Communication, 15(Suppl 2), 20–33.
  • Ratzan, S. C. (2010). The national health literacy action plan: The time has come for action. Journal of Health Communication, 15, 575–577.
  • Ratzan, S. C., & Parker, R. M. (2000). Introduction. In C. R. Selden, M. Zorn, S. C. Ratzan, & R. M. Parker (Eds.), National Library of Medicine current bibliographies in medicine: Health literacy (NLM Pub. No. CBM 2000–1). Bethesda, MD: National Institutes of Health, U.S. Department of Health and Human Services.
  • Rudd, R., & Anderson, J. (2006). The health literacy environment of hospitals and health centers. Boston, MA: National Center for the Study of Adult Learning and Literacy and Health and Adult Literacy and Learning Initiative, Harvard School of Public Health.
  • Rudd, R., Kirsch, I., & Yamamoto, K. (2004, April). Literacy and health in America: Policy information report. Princeton, NJ: Educational Testing Service.
  • Rudd, R., Zobel, E. K., Fanta, C. H., Surkan, P., Rodriquez-Louis, J., Valderrama, Y., & Daltroy, L. H. (2004). Asthma: In plain language. Health Promotion Practice, 5, 334–340.
  • U.S. Department of Health and Human Services. (2010). National action plan to improve health literacy. Washington, DC: Author. Retrieved from http://health.gov/communication/HLActionPlan
  • U.S. Department of Health and Human Services. (2007–2008). National health care disparities report. Washington, DC: Author. Retrieved from http://archive.ahrq.gov/qual/qrdr07.htm
  • Wallace, L. S., Rogers, E. S., Roskos, S. E., Holiday, D. B., & Weiss, B. D. (2006). Brief report: Screening items to identify patients with limited health literacy skills. Journal of General Internal Medicine, 21, 874–877. doi:10.1111/j.1525-1497.2006.00532.x
  • Weiss, B. D., Mays, M. Z., Martz, W., Castro, K. M., DeWalt, D. A., Pignone, M. P., & Hale, F. A. (2005). Quick assessment of literacy in primary care: The newest vital sign. Annals of Family Medicine, 3, 514–522.
  • White, S., & McCloskey, M. (Forthcoming). Framework for the 2003 National Assessment of Adult Literacy (NCES 2005–531). Washington, DC: U.S. Department of Education, National Center for Education Statistics. Retrieved from http://nces.ed.gov/naal/fr_definition.asp
  • Willis, G. B. (2004). Cognitive interviewing: A "how to" guide. Research Triangle Institute. Retrieved from http://fog.its.uiowa.edu/~c07b209/interview.pdf
  • Wood, D., Shin, J., Duval, B., & Schmitt, H. (2006). Chapter 22: Assuring the quality, safety and efficacy of HPV vaccines: The scientific basis of regulatory expectations pre- and post-licensure. Vaccine, 24, S3/187–S3/192.