Abstract
Background: One of the most devastating consequences of aphasia is the disruption to normal conversation. The Conversation Partner Programme emphasises communicative competence and life participation. Currently there is no recognised system for evaluating this intervention. Following policy imperatives for patient and public involvement, it is important to include service users in the development of evaluation criteria. However, people with aphasia are often excluded from such research and service development initiatives because of their communication disability. This study was designed to include people with aphasia and other key stakeholders as co-researchers in the development of evaluation criteria for a Conversation Partner Programme.
Aims: To describe the multi-perspectival co-generation of Conversation Partner Programme evaluation criteria using a participatory research approach.
Methods & Procedures: Following a pilot study, the generation and analysis of qualitative data involved a Participatory Learning and Action (PLA) approach based on the interpretive paradigm. Using purposeful sampling, participants (n = 20) included: people with aphasia (n = 5); speech and language therapists (n = 5); speech and language therapy graduates and undergraduates (n = 9); and a university coordinator (n = 1). Across 18 individual and inter-stakeholder data generation episodes (PLA focus groups and interviews) using participatory techniques (Flexible Brainstorming, Card Sort, Direct Ranking, Seasonal Calendar), evaluation criteria were identified. The principles of thematic analysis guided the co-analysis of data with participants. Data generated in Ireland were presented to an international inter-stakeholder group at Connect, UK, for preliminary exploration of the transferability of findings.
Outcomes & Results: Conversation Partner Programme evaluation criteria agreed and prioritised by co-researchers in order of importance included: (1) shared understanding of structure, (2) clarity about the programme, (3) agreed evaluation mechanism, (4) linking with other organisations, and (5) feedback. “Shared Understanding of Structure” was ranked the most important criterion and related to the nature and number of participants, opportunities for group meetings, socialising, and stakeholder interaction. “Feedback”, the criterion ranked least important, detailed responsibilities about summarising programme experiences and sharing this information between stakeholders.
Conclusions: People with aphasia and other key stakeholders were meaningfully involved in the identification of evaluation criteria for a Conversation Partner Programme. The outcomes of this collaborative work bridge the gap between policy imperatives around involvement and actual practice, and will impact the design, delivery, and evaluation of the programme for all stakeholders. Findings will be of interest to professionals in this clinical area and to those exploring innovative methodologies to include marginalised service users, especially people with communication disabilities, in research.
Acknowledgements
We are grateful to the following organisations for their collaboration and support: The Department of Speech and Language Therapy, PCCC, Health Services Executive (HSE) West, Galway, Ireland (http://hse.ie/eng/services/list/1/LHO/Galway/Therapy/) and Connect (The Communication Disability Network; www.ukconnect.org).
ORCID
Ruth Mc Menamin http://orcid.org/0000-0002-9485-3148
Edel Tierney http://orcid.org/0000-0001-6393-8539
Anne Mac Farlane http://orcid.org/0000-0002-9708-5025
Notes
1. The authors trained as PLA facilitators at the Centre for Participatory Strategies (CPS) Ross Wood, Clonbur, Co. Galway, Ireland. Training provided by Mary O’Reilly-de Brún and Tomas de Brún.
2. The PWA had the largest representation around the inter-stakeholder table, as 4/5 continued their participation in Phase 2; 4/5 SLTs participated in Phase 2, with two SLTs present at each inter-stakeholder session; 2/9 of the graduate/undergraduate group participated in Phase 2—both were undergraduate students.
3. A full procedural account of using the Flexible Brainstorm, Card Sort, and PLA Interviewing techniques with PWA is described in Mc Menamin et al. (2015).
4. The Card Sort categories are the emergent themes/CPP evaluation criteria and are described in detail later.