
Using software to tell a trustworthy, convincing and useful story

Pages 355-372 | Published online: 11 Jul 2012

Abstract

This paper discusses the potential of specialist software to develop category construction in qualitative data analysis and considers how the uses of software may best be reported to substantiate researchers’ claims. Examples are examined from two recent projects: a consultation of pupils’ perceptions of assessment for learning strategies and an exploratory enquiry on employing music as a tool for inclusion in post-conflict Northern Ireland. From this experience, a number of suggestions on how to support the researchers’ claims are made and a model of knowledge generation is put forward. Some of the practical implications outlined are discussed within the context of social research, but it is acknowledged that the suggestions also apply to any field in which knowledge is generated from qualitative data.

Introduction

The use of specialist software for qualitative data analysis is a recurrent theme at international conferences (e.g. International Congress of Qualitative Enquiry, www.icqi.org) and in research methods handbooks (e.g. Davidson & di Gregorio, 2011; Myers, 2009; Richards, 1997, 2009; Richards & Richards, 1994; Silverman, 2009). Many social researchers around the globe use software packages to analyse qualitative data and software use is a debated topic in research journals (e.g. Cisneros-Puebla, Faux, Moran-Ellis, García-Álvarez, & López-Sintas, 2009; Davidson & Jacobs, 2008; Erasmus & de Wet, 2005; Evers, Mruck, Silver, & Peeters, 2011; Fielding & Lee, 1991; Gibbs, Friese & Mangabeira, 2002; Knoblauch, Flick & Maeder, 2005). Over the last two decades, competing claims on the value of computers for qualitative data analysis have been put forward by advocates and sceptics (a comprehensive overview of these claims is provided by Lu & Shulman, 2008). For sceptics, one undesired effect of software usage is that researchers may be misled into focusing on quantity (frequency counts in transcripts) rather than meaning, whether frequent or not. Software packages, it has been argued, may come to define the analysis processes they should merely support, ‘presupposing a way of doing research’ and de-contextualising the data-set (Lu & Shulman, 2008, p. 108). By contrast, advocates describe the numerous advantages of using software, such as keeping track of developing ideas and an increased power for querying the data-sets and for making links between their parts (Evers, 2011; Hutchison, Johnston & Breckon, 2010; Konopásek, 2008; Lewins & Silver, 2007).

Advocates or not, users of specialist software sometimes mention the package and methodology used while not fully disclosing the particular analysis processes. This may perhaps be due to the word limit in journal articles as well as to the unstated assumption that readers will be familiar with such analysis processes. It appears that in some instances computers may be employed in a superficial way, to facilitate data management without making full use of the software’s possibilities, which may affect the way processes such as category construction are undertaken and subsequently explained to readers (e.g. Bond & Paterson, 2005; Pinson, 2007). In the only systematic study of patterns of use available to date, Fielding and Lee (1998) reported that most researchers used qualitative software in rather undemanding ways, with little use of the more sophisticated features such as Boolean retrievals.

The purpose of this paper is twofold: to examine some of the possibilities of using software for qualitative data analysis and to discuss how its uses may be best reported to substantiate the researchers’ claims. After reviewing a selection of relevant literature in the first section, the paper considers some examples of developing insights into qualitative data-sets from two recently completed projects. The first focused on perceptions of the increased participation in assessment activities by pupils in ‘Assessment for Learning classrooms’ (Leitch, Gardner et al., 2008; Leitch, Lundy, Clough, Gardner, & Odena, 2008). The second explored the potential of music education as a tool for inclusion in a post-conflict context (Odena, 2009a, 2010). It is argued that the software assisted in the development of ideas and the testing of hypotheses by allowing the data-set and its parts to be seen from different angles. In the discussion, it is suggested that fully disclosing the analysis processes can assist researchers to substantiate and add rigour to their claims for end-users. A model of knowledge generation through systematic analysis is outlined in the conclusions, which seems of particular relevance at a time when there is a plethora of organisations commissioning enquiries often tailored to fulfil the funders’ aims (Sugrue, 2007; Thomas, 2011).

Using computers in qualitative data analysis

The use of computers for qualitative data analysis has been a feature of social research since the 1980s, for instance with the commercialisation of The Ethnograph, which facilitated the computer transfer and management of transcripts (see a review of these early years in Friese, 2011). In the 1990s and 2000s, other programmes appeared – e.g. NUD∗IST, HyperResearch, ATLAS.ti, MAXqda – and today the use of specialist software for qualitative data analysis is part of many research methods courses (Darmody & Byrne, 2006; Davidson & Jacobs, 2008). The Computer Assisted Qualitative Data Analysis (CAQDAS) Networking Project, based at the University of Surrey, hosts a website that includes guides for all the leading packages organised following a set of standard topics, so potential users can compare approaches (caqdas.soc.surrey.ac.uk). When analysing text, as with any type of qualitative data analysis, there are several ways (and steps) of carrying out analysis processes that may be assisted by a software package. Apart from assisting with the managing and retrieving of different types of data (e.g. transcripts, notes) across a number of data-sets, the software may be employed in the process of category construction. This process may be located on a continuum depending on the degree of openness/closeness of the themes to be explored as well as the inductive/deductive methodological approach. At one end of the continuum, with few preconceived expectations, we would find ‘grounded theory’ (Birks & Mills, 2010; Glaser & Strauss, 1967). In grounded theory, the categories emerge through a process of inductive reasoning, rather than the data being allocated to predetermined categories. Ideally, the researchers start without any defined ideas on what they will find. The analyses are undertaken following ‘a constant comparative method’, which would include the following steps:

Immersion: producing detailed transcriptions (from diaries, interviews, observations, etc.).

Categorisation: assigning categories.

Reduction: grouping categories in ‘themes’.

Triangulation: checking themes against all transcripts, preferably with other people.

Interpretation: making sense of data with a new model or established theory (Lincoln & Guba, 1985; Seddon, 2005).

At the other end of the analysis continuum, we would find studies in which researchers have to identify predetermined categories using a deductive process, making use of, for instance, Boolean operators and set theory. In qualitative comparative analysis, the approach requires the data to be manipulated as variables in order to maximise the number of comparisons that can be made across a number of cases (Ragin, 1987; Rihoux, 2006). Somewhere in the middle of the inductive/deductive continuum, we would find enquiries in which closely defined themes have to be explored from the outset but which do not require data manipulation – for instance, exploratory studies looking at perceived hindering and facilitating factors in the implementation of a programme (e.g. Hayden & Odena, 2007; Miller, Connolly, Odena & Styles, 2009; Odena, Miller & Kehoe, 2009). Regardless of the degree of inductive/deductive processing, qualitative data analysis, with or without the assistance of software, needs to go through a process of reading, categorising, testing and refining, which is repeated by the researchers until all categories are compared against all the participants’ responses and the analysis is validated with other individuals. The same process has previously been labelled ‘recursive comparative analysis’ (Cooper & McIntyre, 1993) and thematic/content analysis (Kvale, 1996; Odena, 2001, 2007; Odena, Plummeridge & Welch, 2005; Odena & Welch, 2007, 2009).
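The Boolean retrievals mentioned above can be illustrated with a minimal sketch. The data model, segment texts and category names below are invented for illustration and are not taken from any particular package:

```python
# Minimal sketch of a Boolean retrieval over coded segments.
# Each segment records the set of categories a researcher has assigned to it.
segments = [
    {"text": "Peer marking helped me see my mistakes.",
     "codes": {"assessment", "participation"}},
    {"text": "The teacher explains the success criteria first.",
     "codes": {"teachers_style", "assessment"}},
    {"text": "We feel relaxed asking questions in this class.",
     "codes": {"classroom_climate"}},
]

def retrieve(segments, all_of=(), none_of=()):
    """Return segments coded with ALL categories in `all_of`
    and NONE of the categories in `none_of` (a Boolean AND/NOT query)."""
    return [s["text"] for s in segments
            if set(all_of) <= s["codes"] and not (set(none_of) & s["codes"])]

hits = retrieve(segments, all_of={"assessment"}, none_of={"teachers_style"})
# hits → ["Peer marking helped me see my mistakes."]
```

Even this toy query shows why such retrievals matter for deductive work: the intersection and exclusion logic, not the researcher’s memory, determines which segments qualify.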

The use of computers for qualitative data analysis, also known as CAQDAS (Fielding & Lee, 1991), appears to have a number of practical advantages in comparison with more traditional methods such as cutting out quotations and sorting them into boxes. Printing, highlighting and cutting text by hand is viable with tens of pages, but with hundreds of pages the researchers’ memory may be aided by the software, as the number of categories and the relations between them are likely to develop with each additional reading of the transcripts. A number of software packages are currently available on the market, and although each has particular features that are constantly being developed by their manufacturers, their baseline capabilities are similar. For instance, researchers do not need to print interview transcripts each time they decide to make a substantial change to the categorisation. They can identify relevant quotations on the computer screen and code them using virtual coloured stripes. Quotations can be included in virtual categories or subcategories, and coding can be changed if needed. As the emerging ideas become clearer, whole categories, containing dozens of quotations, can be easily merged or renamed.

Packages can produce category reports at any stage during the analysis process and have a word search engine. Most packages now have autocoding features, and in some cases (e.g. Qualrus) these are both sophisticated and designed to relieve the researcher of much of the assignment work. Autocoding does make some qualitative researchers uncomfortable: it has been argued that although autocoding allows for fast exploration of all the answers to a question, it ‘might not create a deep understanding of the issues raised’ and has the potential of encouraging ‘code fetishism’ when ‘the act of coding becomes an end in itself’ (Richards, 2002, p. 269). Others warn of the perils of de-contextualising the data-sets (Sinkovics, Penz & Ghauri, 2005) and suggest that overreliance on autocoding is ‘very dangerous because it assumes that specific constructs are addressed in the field notes in the exact same way’ (Alkin, 2010, p. 182). Nevertheless, it is the researcher who defines the autocoding parameters, amends the allocation of quotations assigned to categories and derives meaning from them. Some programmes have the option of counting the characters coded within each category, which can then be used to obtain the percentage of transcripts coded. Packages also have the option of writing memos and linking them to transcripts or other data, and of exporting numerical results to other programmes (e.g. Bazeley, 2007; Birks & Mills, 2010; Evers, 2011; Lewins & Silver, 2007). Other possibilities include saving interim categorisations – allowing for analysis replication and tracing back/revising thinking paths – and the sharing of coded files (which aids collaborative work).

With all the above capabilities, these packages may reduce the time spent managing data and ensure that no relevant quotations are overlooked. Nevertheless, there is some reticence regarding the use of this type of software, especially surrounding the perceived change in the researchers’ role. Crowley, Harré and Tagg (2002) observe that many social scientists with little practical knowledge of CAQDAS believe that the software ‘drives’ the analysis. Some researchers think computers can distinguish the relevant information in data-sets and develop the ideas, in order to meet the research project’s requirements (Gahan & Hannibal, 1998; Lu & Shulman, 2008). In fact, the researchers are still in charge of building up the analysis, having the ideas, engaging with the data and making all the decisions about the study. Computers may save time locating a piece of text within a large data-set, such as an interviewee’s answer to a particular question, but the relevance of the answer and its implications are assigned by the researcher.

Indeed, a challenge for all researchers is how they might substantiate their claims. In other words, what can researchers say that will enable readers to decide how much confidence they should place in the findings? The researchers’ claims linking the data with the conclusions derived from its analysis have often been called ‘warrants’ (e.g. Gorard, 2002; Plantinga, 1993; Pollard, 2007). For the sake of clarity, the phrase ‘substantiated claim’ will be employed here instead of ‘warrant’. In the next section, some examples of category construction using specialist software are discussed. The case is made that a more detailed explanation of the researchers’ analysis processes may better support their claims.

Two examples of category construction using specialist software

Consulting pupils on the assessment of their learning (CPAL) project

Consulting pupils on the assessment of their learning (CPAL) was a UK Economic and Social Research Council funded study that explored pupils’, parents’ and teachers’ perceptions of the increasing participation of students in their own assessment in Northern Ireland. It was one of over 100 research projects funded within the Teaching and Learning Research Programme, the largest education research programme to date in the UK. The CPAL project comprised three interrelated studies focusing on: (1) the development of Annual Pupil Profiles in Northern Ireland in the context of giving pupils ‘a voice’ (McEvoy & Lundy, 2007); (2) students’ perceptions of ‘Assessment for Learning classrooms’ (Leitch, Gardner et al., 2007; Leitch, Gardner et al., 2008; Leitch, Lundy et al., 2008; Leitch, Odena et al., 2007); and (3) teachers’ and parents’ perceptions of pupils’ increasing participation in assessment. The research team included seven members, who shared the different data-sets through a password-protected virtual research environment coordinated from the Centre for Applied Research in Educational Technologies in Cambridge.

For the second interrelated CPAL study, six post-primary schools were selected representing the variety of schools found in Northern Ireland (e.g. from both of its main communities, Protestant and Catholic, and co-educational/single-gender). Amongst other techniques, focus groups were employed to gather the students’ perceptions of assessment in the classes of 11 teachers who were engaged in an in-service course to help them embed ‘Assessment for Learning’ (Assessment Reform Group, 2002). Assessment for Learning is a pedagogical approach that emphasises the role of formative assessment in the learning process. It is based on a number of strategies including self-assessment and peer assessment, quality questioning, sharing learning intentions and success criteria, and providing effective formative feedback to students to make them aware of where they are in their learning, where they are expected to be, and what they have to do to get there (Gardner, 2006; Leitch, Gardner et al., 2008).

Over 70 students aged 11–14 participated in the CPAL focus groups, which started by asking them to reflect through a drawing: ‘How I feel about learning in this class’. Students were then asked to explain their drawings and to co-interpret videotaped extracts of their lessons (Leitch, Gardner et al., 2007). Focus group discussions centred on issues regarding pupils’ learning and participation, which were explored through open-ended questioning or ‘conversations with a purpose’ (Burgess, 1988). These were transcribed verbatim, resulting in 506 double-spaced pages that were shared through the virtual research environment. Computer-assisted (NVivo) thematic analysis was used, repeating the reading–categorising–testing–refining process until all categories were compared against all responses, and the categorisation was discussed on an ongoing basis at team meetings. Over 60% of the text contained in the transcripts was coded into categories, which included issues surrounding ‘classroom climate’, ‘teachers’ style’, ‘assessment’ and ‘participation and practical learning’.

The software made it possible to code text under two different categories where required, such as when the students’ views were not presented as separate from one another in their conversations. This would have been more difficult with manual analysis of printed transcripts, because once a quotation has been cut out and stored in a box or folder, it is not feasible to link it with the surrounding text. The programme also allowed for the re-organisation and re-labelling of categories and subcategories throughout the research process, which supported the process of shaping the interim analyses with all team members. Having a thorough analysis of the focus groups allowed the researchers to move back and forth between these and the other data-sets and, ultimately, make sense of the whole project when writing the final report. For example, as part of the CPAL’s third interrelated study, teachers’ perceptions were gathered through in-depth interviews. When comparing the analysis of the focus groups with the teachers’ interviews, it was found that ‘teachers who espoused the spirit of Assessment for Learning provided greater opportunities for genuine participation in learning and assessment’ (Leitch, Gardner et al., 2008, p. 3). Gaps between the teachers’ values and their practices were linked with personal factors (e.g. having been afforded a say in their own childhood) and institutional factors (e.g. school culture).

Another instance of how the software use affected the CPAL research process was the development of a questionnaire, incorporating selected students’ quotations, to gather the parents’ views on their children’s perceptions of Assessment for Learning. Table 1 shows two examples of questions from the parental questionnaire:

Table 1. Examples of items from the parental questionnaire (CPAL Project).

The questionnaire in Table 1 was prepared after several team meetings in which the interim categories’ lists of quotations, compiled with the assistance of the software, were considered, allowing for discussion of a re-developed interim categorisation at each meeting. This aided in reaching a shared understanding amongst researchers of the main issues highlighted by the pupils.

Music education as a tool for inclusion project

The second example of using software for qualitative data analysis is from an exploratory study of practitioners’ views on the potential of music education as a tool for inclusion in cross-community activities in Northern Ireland (Odena, 2009a, 2010). The main aim of the study was to explore how to develop music skills while bringing children from both main communities together. Fourteen interviewees were purposefully selected following a ‘maximum variation sampling’ approach, taking into account their potential as ‘key informants’ as determined by their extended experience with this type of activity (Cohen, Manion & Morrison, 2011). Interviewees were working or had worked in a wide variety of contexts including teacher education colleges, the inspectorate, nursery, primary, secondary and specialist music schools, and out-of-school music projects. The interviews were semi-structured and attempted to explore the participants’ backgrounds, their views on music education in Northern Ireland, and advice on how to increase the effectiveness of cross-community projects (see examples of questions in the Appendix). Verbatim transcriptions were analysed using thematic analysis with the assistance of specialist software (NVivo). Over 216 double-spaced pages (93.32%) of text were coded into categories – 253,742 characters out of 271,905. As in the CPAL enquiry, this process consisted of repeated readings of all transcripts, looking for commonalities and themes, which were tested with each new reading and evolved into the final categories. A sample of the categorised text was discussed with two colleagues, giving further reliability to the analysis. The emerging categories, listed in Table 2, focused on a number of issues, including project processes and effectiveness, and music as a sign of identity.
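The coverage figure reported above can be reproduced in a few lines. The character counts are those stated in the text; note that how a given package counts characters (e.g. whether whitespace is included) may differ slightly:

```python
# Percentage of transcript text coded into categories,
# using the character counts reported for this study.
coded_chars = 253_742    # characters coded into categories
total_chars = 271_905    # total characters in the transcripts
pct_coded = 100 * coded_chars / total_chars
print(f"{pct_coded:.2f}% of the transcripts coded")  # → 93.32%
```

Disclosing this kind of arithmetic is what later allows readers to check that reported quotations are drawn from a near-exhaustive categorisation rather than a selective one.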

Table 2. Music education as a tool for inclusion project – data analysis categories (in bold) and subcategories (in italics).

The analysis showed how the activities and aims explained by interviewees varied depending on a number of factors, one of the most important being the level of acknowledgement of integration of the educational setting, which appeared to be influenced by the socio-economic environment. It was apparent that cross-community music education projects had been, and continued to be, an effective means of addressing prejudice amongst young people, although addressing prejudice may not always have been the aim of all projects but rather a welcome side effect. The analysis also highlighted barriers to cross-community education (e.g. funding, parents’ and school leaders’ lack of support) and some negative musical stereotypes linked with each community. For instance, a number of interviewees gave the following descriptions of flute bands, which in Northern Ireland are seen as representative of the Protestant community:

Flute bands petrify me because to me they signify the Twelfth of July and marching … for many it’s a very appropriate way of being part of the community, but it still frightens me because it’s an alien culture.

Like any stereotype, once you start to dig into it, you see that that’s not the case, but music has been used as a weapon to sort of define communities … it’s like gang mentality.

The historical background of brass bands is in the British military system … [and] tends to attract more Protestants; similarly Irish traditional music is part of the folk culture of the Catholics.

Even though musical stereotypes linked with styles and perceived musical traditions were still apparent and could be an obstacle for the implementation of cross-community activities, the potential of using music for such activities was highlighted by all interviewees. Successful activities described included school visits with a musical element, shared after-school music education activities in neutral settings (i.e. city centre), children’s musical theatre projects and collaborative performances between schools across the community divide:

[Music] is a superb tool for encouraging children to work together … they throw themselves into it wholeheartedly and are quite prepared to work with other people in doing that.

[Children] can inspire people like no other group of people can.

The specialist software aided in the process of disclosing the most relevant categories – not just those addressing the research aim, but strong categories emerging across conversations with interviewees in all their different contexts. Table 3 shows the number of appearances of the four main categories within the transcripts and the number of interviewees who had quotations coded within these categories (with percentages in brackets):

Table 3. Transcript appearances of the four main categories in the music education as a tool for inclusion study (adapted from Odena, 2010, p. 91).

Disclosing the relative weight of the emerging categories was used to substantiate the conclusions of the study; for instance, by showing the degree to which quotations used in written outputs were representative of the participants’ views, and by evidencing that particular categories appeared across interviews that had been carried out in a wide variety of contexts due to the maximum variation sampling approach. It should be clarified that the above is sample description, and as such it is a form of representativeness, but not in the quantitative sense of generalisation to the population sampled. Disclosing frequency data in qualitative data analysis allows for the informed assessment of any emerging patterns across data-sets and for a consideration of alternative explanations.
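The kind of frequency disclosure shown in Table 3 can be computed directly from the coding data. The sketch below uses invented coding rows and category names, not the study’s actual data:

```python
from collections import Counter, defaultdict

# One row per coded quotation: (interviewee id, category).
codings = [
    (1, "effectiveness"), (1, "identity"),
    (2, "effectiveness"), (2, "effectiveness"),
    (3, "identity"),
]
n_interviewees = 3

appearances = Counter(cat for _, cat in codings)   # total appearances per category
coded_by = defaultdict(set)                        # which interviewees touch each category
for person, cat in codings:
    coded_by[cat].add(person)

for cat, count in sorted(appearances.items()):
    pct = 100 * len(coded_by[cat]) / n_interviewees
    print(f"{cat}: {count} appearances, {len(coded_by[cat])} interviewees ({pct:.0f}%)")
```

Separating the two counts matters: a category may appear often yet be driven by one talkative interviewee, and only the interviewee count reveals whether it genuinely cuts across the sample.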

A few months after completing the first analysis, a second one was carried out, aimed at developing theory–practice links between the interview transcripts and social psychology theories. In particular, instances of ‘optimal conditions’ for cross-community contact and developing stages of intergroup relations as described in the literature (e.g. Allport, 1954; Brown, 2001; Hughes, 2007; Hughes & Donnelly, 2006; Pettigrew, 1998; Pettigrew & Tropp, 2006) were mapped onto the participants’ explanations. At the core of organising cross-community activities amongst confronting groups lies the idea that intergroup contact, under certain conditions, can be effective in reducing prejudice and hostility between groups. The optimal conditions for this to happen include: (1) equal status of both groups in the contact situation; (2) ongoing personal interaction between individuals from both groups; (3) working towards a common goal and (4) official social sanction for contact between groups (Hughes, 2007). This theory, also known as the contact hypothesis, was first proposed by Allport (1954, p. 489), who observed that to maximise programme effectiveness, contact activities would need to ‘occur in ordinary purposeful pursuits’. In a subsequent reformulation of the contact theory, Pettigrew (1998) outlined a sequential model for reducing conflict between groups containing three stages:

The first stage or initial contact, where anxiety is likely to be more pronounced and where personal identity and interpersonal interaction are emphasised in an effort to ‘de-categorise’ the individuals.

A second stage when contact is well established, which affords a situation with less anxiety in which the old salient categorisation of belonging to a group is disclosed, resulting in weakened prejudices that are generalised beyond the activity.

And a third and final stage in which, after extended contact, individuals begin to think of themselves as part of a re-defined new larger group that comprises all communities.

The original analysis of the interviews showed that cross-community music activities were perceived positively by both communities and that such activities worked as a tool for inclusion in a variety of contexts. Projects seemed to be well received and were described as benefiting the children, but the first analysis did not illustrate the nature of the activities in relation to any diminishing prejudice. Conversely, after the second analysis aimed at outlining links with social psychology theories, it was apparent that the majority of activities described would fall within Pettigrew’s first stage or ‘initial contact’, as weakened prejudices were not generalised beyond the often one-off cross-community element. A number of educational implications followed, such as that after decades of violent conflict, projects in polarised areas where tensions were still high would benefit from focusing on the quality of the children’s musical experience, ‘leaving positive attitudes towards the other community to develop naturally’ (Odena, 2010, p. 99). To move beyond Pettigrew’s first stage, projects needed to be sustained and had to offer something that enticed all those involved: fun for participating children, good educational aims to attract parents and a degree of status to attract school leaders. In affluent contexts with less obvious polarisation, cross-community activities such as youth orchestra rehearsals were already happening regularly (some interviewees had had the privilege of participating in them while growing up). This supported the suggestion that negative anxiety produced by the anticipation of contact with the other community was likely to be ameliorated by early participation in contact activities under optimal conditions (Kenworthy, Turner, & Hewstone, 2005; Tausch, Kenworthy, & Hewstone, 2005).

It may be argued that similar conclusions could have been arrived at without using software. Nevertheless, the software afforded a second in-depth and rigorous analysis, including comparisons within a layered structure of categories and subcategories (see Table 2), and facilitated the search for particular expressions across all interviews, making sure that no stone was left unturned. As reported by Webb and Vulliamy (2007) in a study of primary teachers’ perceptions of New Labour’s strategies for schools, an important advantage when using software is that retrieving all coded text within a category reveals any unconscious bias by the researcher as to the relative importance of the category. The ready access to all quotations also eased the retrieval of representative instances to prepare research outputs with different foci for different audiences. Ultimately, the study’s educational implications reached beyond the original context of the enquiry, as teachers in Cyprus adapted them for their particular post-conflict environment (Odena, 2009b).

Substantiating claims when analysing qualitative data with computers

As suggested before, due to word limitations in written outputs researchers often mention employing a particular software package or the type of analysis used, while leaving the analysis processes undisclosed. In order to be able to assess the researchers’ claims, the processes employed to analyse each data-set would need to be explained, as well as the provenance of the emerging themes (and their relative weight if relevant to the research questions being asked). There is an increasing number of organisations commissioning studies whose research aims are often tailored to the organisation’s interests. For example, in a relatively small country such as the Republic of Ireland, the number of Non-Governmental Organisations, Think Tanks and other new actors in the social research arena quadrupled from fewer than 70 to over 300 in recent decades (Sugrue, 2007). This is a trend that can be seen across developed democracies: market forces bring increased funding sources and a variety of actors that produce social research, often blurring the boundaries between enquiry, politics, advocacy and business. Reports are sometimes published without going through academic peer review and there seems to be a tendency to value expert commentary over original research (Rich, 2005; Thomas, 2011). Beyond the unresolved debates about research ‘quality’ that draw on what counts as knowledge, which fall outside the scope of this paper, some general suggestions on ways to substantiate the researchers’ claims are discussed in this section.

Before the advent of computers, when manually coding text with markers, the option of disclosing the percentage of data categorised was not readily available to researchers. Current software packages can count the number of characters and words coded, so percentages can be easily calculated. In order to rule out explanations based on what in qualitative analysis might be labelled ‘unquoted evidence’, the percentage of text categorised may be stated. The number of pages and line spacing of the text analysed, and the number of pages for each distinctive data-set, could also be disclosed to give readers an overall sense of the relative size of the data-sets available. Moreover, the type of data analysis, whether it follows an inductive or deductive approach or a degree of both, would need to be explained, as there is a variety of interpretations even for established approaches – compare, for example, Glaser and Strauss’ (1967) and Birks and Mills’ (2010) interpretations of grounded theory.

If coding is undertaken by a single researcher, reliability procedures would need to be put in place. Sharing the analysis with others is useful to ensure the interpretation is not biased towards the views of a single researcher, and software can assist in sharing interim analyses and developing them collaboratively. Some packages now incorporate inter-rater reliability features that make it easier to compare coding in multicoding teams, e.g. the Coding Analysis Toolkit (CAT). Validity issues when describing, interpreting and explaining social phenomena are likely to be considered more comprehensively by a research team than by a single individual (Maxwell, Citation1992). Consequently, if Computer-Assisted Qualitative Data Analysis Software (CAQDAS) is used there is greater transparency, but there is also more procedural complexity to be described.
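One widely used inter-rater agreement statistic of the kind such features report is Cohen’s kappa, which corrects raw agreement for the agreement expected by chance. The sketch below is self-contained and the codings are invented; it is not a reproduction of CAT’s own procedure:

```python
# Minimal sketch of Cohen's kappa for two coders assigning one
# category to each of the same transcript segments.
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # chance agreement: probability both coders pick the same category
    chance = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - chance) / (1 - chance)

# Invented example: two coders categorising ten interview segments
a = ["fairness", "feedback", "feedback", "fairness", "anxiety",
     "feedback", "fairness", "anxiety", "feedback", "fairness"]
b = ["fairness", "feedback", "anxiety", "fairness", "anxiety",
     "feedback", "fairness", "feedback", "feedback", "fairness"]
print(round(cohens_kappa(a, b), 2))  # prints 0.69
```

Reporting such a statistic alongside the coding scheme would let readers judge how far the categories were applied consistently across the team.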

Examples of how analytic processes may be aided by software have been considered. For instance, the frequency of categories can be examined across a number of data-sets to minimise any unconscious bias when reporting emerging themes (e.g. Table ). Although the meaning and relevance of categories are attributed by the researchers, such disclosure would aid researchers in ruling out counter-explanations in a systematic way. Gorard (Citation2002) argues that research claims would need to persuade the sceptical reader rather than play to a gallery of existing converts. Specialist software can assist in eliciting new viewpoints by offering different perspectives into the data-sets, which is particularly useful when seeking alternative explanations and when trying to reduce any bias (Webb & Vulliamy, Citation2007).
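The cross-data-set frequency check just described can be sketched in a few lines. The data-set names and codings below are invented for illustration; real packages expose equivalent matrix or query functions:

```python
# Hedged sketch: tallying how often each category was applied in each
# data-set, as one way to check that reported themes reflect all
# sources rather than only the most memorable one.
from collections import Counter

codings = {
    "pupil_interviews": ["feedback", "fairness", "feedback", "anxiety"],
    "teacher_interviews": ["feedback", "workload", "feedback"],
    "focus_groups": ["fairness", "feedback", "anxiety", "anxiety"],
}

# Frequency of each category within each data-set
per_dataset = {name: Counter(codes) for name, codes in codings.items()}

# Categories present in every data-set: candidate cross-cutting themes
shared = set.intersection(*(set(codes) for codes in codings.values()))
print(sorted(shared))  # prints ['feedback']
```

A table built from such counts makes it visible when a theme reported as pervasive in fact appears in only one data-set.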

The above procedures would provide readers with an indication of the robustness of the analysis. In studies with a small number of participants, they would help ensure that representative quotations provided in reports are truly representative of the data collected. In a seminal paper, Adelman, Jenkins and Kemmis (Citation1976, p. 143) observed that in case studies, when attention is given to the particular, the meanings presented in the report are likely to become apparent by ‘the shock of recognition’. Misrepresenting the case might weaken the researchers’ claims as much as over-claiming beyond the scope of the sample. In this type of enquiry, the applicability of the claims is often judged by the readers, who are able to recognise similar realities which may help them make sense of their own. Ultimately, the development and disclosure of research analysis processes facilitated by software programs can afford readers the opportunity to judge whether the researchers’ claims are sound, rather than merely whether they agree with them from the perspective of their own particular experience. In coming years, access to additional material attached to articles might be an expected feature in e-journals, and may include coded text, giving readers a more detailed insight into the analysis processes.

Conclusions: outlining a generative model of social knowledge development

Although computer-aided qualitative data analysis is ‘no substitute for theoretically inspired reasoning’ (Silverman, Citation2006, p. 387), the software possibilities outlined with examples from two enquiries appear to deserve some consideration. Several options have been suggested on how to enhance the analysis of data and its reporting in written research outputs. Using computers may assist in the systematisation of the analysis of empirical data, and describing the analysis processes in greater detail would support the claims arising from them. Different types of claims may be obtained in social research depending on the evidence and the degree of systematisation of the data analysis: from isolated observations leading to hunches, to corroborated evidence supported by multiple layers, leading to substantiated claims. Figure 1 outlines a generative model of social knowledge development arising from recent experience of employing software to tell a convincing and useful story. This model incorporates some of the stages of warranted beliefs suggested by Thomas and Macnab (Citation2007) and applies them to the use of software for qualitative data analyses, with a design inspired by an earlier model of teachers’ thinking (Odena & Welch, Citation2009, Citation2012). The arrow on the left represents the level of systematisation, which increases the researchers’ possibilities to corroborate their evidence with multiple layers. Software would assist the researcher’s task of creating multiple layers and testing alternative hypotheses:

Figure 1 A generative model of social knowledge development.

Systematic analyses aided by software would afford increased possibilities to substantiate research claims. Nevertheless, producing conclusive evidence will depend on the issues under scrutiny and how the research questions are framed (practical implications developed from social enquiry are normally context and time bound). Popper (Citation1959, Citation1963) maintained that researchers should not try to predict things by merely amassing instances of evidence. Instead, the way forward to advance knowledge would be to attempt to falsify accepted theories. This is often achieved when using a methodological approach focused on the in-depth study of a small number of participants. With this approach, however, different data analysis processes applied to the same data-sets may yield different claims, which is why these processes would need to be disclosed to end-users.

A number of authors have discussed whether it can ever be assumed that the reasonableness of a claim derived from evidence is beyond doubt (e.g. Hammersley, Citation2008; Silverman, Citation2006). Ultimately, the impact of research will be determined not just by quality and relevance but also by luck and rhetorical power (Black & Wiliam, Citation2003; Gardner, Citation2011). There is a need to be persuasive and credible to facilitate research impact, and disclosing the analyses rigorously may improve research reports and their perception by end-users. Social research is a field made up of overlapping communities of practice, and it would be useful to see research ‘as contributing to better understanding, in ways that owe more to the quality of interpretations of the data’ than to the methods used (Hodkinson, Citation2004, p. 9). It is hoped that the suggestions outlined in this paper will contribute ideas for the consideration of researchers engaged in interpreting qualitative data from any setting, regardless of their particular research practices.

Notes on contributor

Oscar Odena is reader in Education at the University of Hertfordshire, UK, where he is the director of the Doctorate in Education (EdD) programme. He is a member of the editorial boards of the International Journal of Music Education, Research Studies in Music Education, British Journal of Music Education and Revista Electrónica Complutense de Investigación en Educación Musical. He has worked in higher education institutions in Spain, Northern Ireland and England, and serves as the UK and Africa Research Commissioner of the International Society for Music Education (2008–2014). He has co-edited with Gary Spruce the section on ‘Music learning and teaching during adolescence: ages 12–18’ in the forthcoming Oxford handbook of music education (Oxford University Press). His latest work is a book entitled Musical creativity: Insights from music education research published by Ashgate.

Acknowledgements

The above examples of using computers for qualitative data analysis are taken from the ESRC-TLRP CPAL Project, directed by Professor Ruth Leitch, and from the Music Education as a Tool for Inclusion study, funded by The Bernard van Leer Foundation and directed by the author. Thanks to the CPAL team members: professors John Gardner, Ruth Leitch, Laura Lundy and Peter Clough, and to Dr Despina Galanouli and Dr Stephanie Mitchell. Thanks also to the participants in both studies for generously giving their time, to the manuscript referees for their very useful suggestions, and special thanks to Lucy Young for her continuing support.

Notes

1. For example, a paper for the Research Commission of the International Society for Music Education focused on music education matters whereas a report for the funders focused on inclusion and respect for diversity issues (Odena, Citation2008, Citation2009a). A further discussion on a number of ethical issues involved in disseminating research to multiple audiences can be found elsewhere (Odena, Citation2004).

2. The issue of quality in what is generally called ‘qualitative’ research has been addressed by academics and government officials alike (e.g. Cabinet Office, Citation2003; Hammersley, Citation2007; Oancea, Citation2005; Silverman, Citation2006; Slavin, Citation2008; Whitty, Citation2006). There is a sharp conflict between demands for having explicit quality criteria, such as the criteria used in systematic reviews, and arguments that such criteria are not desirable. This conflict reflects different value assumptions about the main goal of social enquiry and about what is worth researching – see for instance Delamont’s (Citation2007) arguments against auto-ethnography.

References

  • Adelman , C. , Jenkins , D. and Kemmis , S. 1976 . Re-thinking case study: Notes from the Second Cambridge Conference . Cambridge Journal of Education , 6 ( 3 ) : 139 – 150 .
  • Alkin , M. C. 2010 . Evaluation essentials , New York , NY : Guilford .
  • Allport , G.W. 1954 . The nature of prejudice , Reading , MA : Addison-Wesley .
  • Assessment Reform Group . 2002 . Assessment for learning: 10 principles: Research-based principles to guide classroom practice , London : Assessment Reform Group with support from the Nuffield Foundation .
  • Bazeley , P. 2007 . Qualitative data analysis with NVivo , London : Sage .
  • Birks , M. and Mills , J. 2010 . Grounded theory: A practical guide , London : Sage .
  • Black , P. and Wiliam , D. 2003 . ‘In praise of educational research’: Formative assessment . British Educational Research Journal , 29 ( 5 ) : 623 – 637 .
  • Bond , R. and Paterson , L. 2005 . Coming down from the ivory tower? Academics’ civic and economic engagement with the community . Oxford Review of Education , 31 ( 3 ) : 331 – 351 .
  • Brown , R. 2001 . “ Intergroup relations ” . In Introduction to social psychology , 3rd ed. , Edited by: Hewstone , M. and Stroebe , W. 479 – 515 . Oxford : Blackwell .
  • Burgess , R.G. 1988 . “ Conversations with a purpose: The ethnographic interview in educational research ” . In Studies in qualitative methodology: A research annual , Edited by: Burgess , R.G. Vol. 1 , 137 – 155 . London : JAI Press .
  • Cabinet Office . 2003 . Quality in qualitative evaluation: A framework for assessing research evidence , London : National Centre for Social Research on behalf of the Strategy Unit in the Cabinet Office .
  • Cisneros-Puebla, C.A., Faux, R., Moran-Ellis, J., García-Álvarez, E., & López-Sintas, J. (Eds.) (2009). Advances in qualitative research in Ibero America. Thematic issue of Forum: Qualitative Social Research, 10(2). Retrieved June 6, 2012, from www.qualitative-research.net/index.php/fqs/issue/view/31
  • Cohen , L. , Manion , L. and Morrison , K. 2011 . Research methods in education , 7th ed. , London : Routledge .
  • Cooper , P. and McIntyre , D. 1993 . Commonality in teachers’ and pupils’ perceptions of effective classroom learning . British Journal of Educational Psychology , 63 ( 3 ) : 381 – 399 .
  • Crowley , C. , Harre , R. and Tagg , C. 2002 . Qualitative research and computing: Methodological issues and practices in using QSR NVivo and Nud∗ist . International Journal of Social Research Methodology , 5 ( 3 ) : 193 – 197 .
  • Darmody , M. and Byrne , D. 2006 . An introduction to computerised analysis of qualitative data . Irish Educational Studies , 25 ( 1 ) : 121 – 133 .
  • Davidson , J. and di Gregorio , S. 2011 . “ Qualitative research and technology: In the midst of a revolution ” . In The sage handbook of qualitative research , 4th ed. , Edited by: Denzin , N.K. and Lincoln , Y.S. 627 – 644 . London : Sage .
  • Davidson , J. and Jacobs , C. 2008 . The implications of qualitative research software for doctoral work: Considering the individual and institutional contexts . Qualitative Research Journal , 8 ( 2 ) : 72 – 80 .
  • Delamont, S. (2007). Arguments against auto-ethnography. Education-line. Retrieved June 6, 2012, from www.leeds.ac.uk/educol/documents/168227.htm
  • Erasmus , Z. and de Wet , J. 2005 . Towards rigour in qualitative analysis . Qualitative Research Journal , 5 ( 1 ) : 27 – 40 .
  • Evers, J. (2011). From the past into the future: How technological developments change our ways of data collection, transcription and analysis. Forum: Qualitative Social Research, 12(1). Retrieved June 6, 2012, from www.qualitative-research.net/index.php/fqs/article/view/1636/3161
  • Evers, J., Mruck, K., Silver, CH., & Peeters, B. (Eds.) (2011). The KWALON experiment: Discussions on qualitative data analysis software by developers and users. Thematic issue of Forum: Qualitative Social Research, 12(1). Retrieved June 6, 2012, from www.qualitative-research.net/index.php/fqs/issue/view/36
  • Fielding , N.G. and Lee , R.M. , eds. 1991 . Using computers in qualitative research , London : Sage .
  • Fielding , N.G. and Lee , R.M. 1998 . Computer analysis and qualitative research , London : Sage .
  • Friese, S. (2011). Assessing a quarter century of qualitative computing. Qualitative Research & Consulting. Retrieved July 11, 2011, from www.quarc.de/qda-software/caqdas.html
  • Gahan , C. and Hannibal , M. 1998 . Doing qualitative research using QSR NUD∗IST , London : Sage .
  • Gardner , J. , ed. 2006 . Assessment and learning , London : Sage .
  • Gardner , J. 2011 . Educational research: What (a) to do about impact? . British Educational Research Journal , 37 ( 4 ) : 543 – 561 .
  • Gibbs, G.R., Friese, S., & Mangabeira, W.C. (Eds.) (2002). Using technology in the qualitative research process: Thematic issue of Forum: Qualitative Social Research, 3(2). Retrieved June 6, 2012, from www.qualitative-research.net/index.php/fqs/issue/view/22
  • Glaser , B.G. and Strauss , A.L. 1967 . The discovery of grounded theory: Strategies for qualitative research , New York , NY : Aldine de Gruyter .
  • Gorard , S. 2002 . Fostering scepticism: The importance of warranting claims . Evaluation and Research in Education , 16 ( 3 ) : 136 – 149 .
  • Hammersley , M. 2007 . The issue of quality in qualitative research . International Journal of Research and Method in Education , 30 ( 3 ) : 287 – 305 .
  • Hammersley , M. 2008 . Questioning qualitative inquiry. Critical essays , London : Sage .
  • Hayden, J., & Odena, O. (2007). Social inclusion and respect for diversity: A framework for early childhood programs. Education-line. Retrieved June 20, 2012, from www.leeds.ac.uk/educol/documents/170682.htm
  • Hodkinson , P. 2004 . Research as a form of work: Expertise, community and methodological objectivity . British Educational Research Journal , 30 ( 1 ) : 9 – 26 .
  • Hughes , J. 2007 . Mediating and moderating effects of inter-group contact: Case studies from bilingual/bi-national schools in Israel . Journal of Ethnic and Migration Studies , 33 ( 3 ) : 419 – 437 .
  • Hughes , J. and Donnelly , C. 2006 . Contact as a policy mechanism for promoting better relations in integrated schools in Northern Ireland and bilingual/bi-national schools in Israel . Journal of Peace Education , 3 ( 1 ) : 79 – 97 .
  • Hutchison , A.J. , Johnston , L.H. and Breckon , J.D. 2010 . Using QSR-NVivo to facilitate the development of a grounded theory project: An account of a worked example . International Journal of Social Research Methodology , 13 ( 4 ) : 283 – 302 .
  • Kenworthy , J. , Turner , R.N. and Hewstone , M. 2005 . “ Intergroup contact: When does it work, and why? ” . In On the nature of prejudice , Edited by: Dovidio , J.F. , Glick , P. and Rudman , L.A. 278 – 292 . Oxford : Blackwell .
  • Knoblauch, H., Flick, U., & Maeder, CH. (Eds.) (2005). The state of the art of qualitative research in Europe. Thematic issue of Forum: Qualitative Social Research, 6(3). Retrieved June 6, 2012, from www.qualitative-research.net/index.php/fqs/issue/view/1
  • Konopásek, Z. (2008). Making thinking visible with Atlas.ti: Computer assisted qualitative analysis as textual practices. Forum: Qualitative Social Research, 9(2). Retrieved June 6, 2012, from www.qualitative-research.net/index.php/fqs/article/view/420
  • Kvale , S. 1996 . Interviews: An introduction to qualitative research interviewing , London : Sage .
  • Leitch, R., Gardner, J., Lundy, L., Clough, P., Odena, O., Mitchell, S., & Galanouli, D. (2008). Consulting pupils on the assessment of their learning - TLRP research briefing 36. London: ESRC TLRP. Retrieved June 20, 2012, from https://uhra.herts.ac.uk/dspace/handle/2299/6153
  • Leitch , R. , Gardner , J. , Mitchell , S. , Lundy , L. , Odena , O. , Galanouli , D. and Clough , P. 2007 . Consulting pupils in assessment for learning classrooms: The twists and turns of working with students as co-researchers . Educational Action Research , 15 ( 3 ) : 459 – 478 .
  • Leitch, R., Lundy, L., Clough, P., Gardner, J., & Odena, O. (2008). Putting pupils at the heart of assessment: Children’s rights in practice. CPAL Project Outcomes. London: ESRC TLRP. Retrieved June 20, 2012, from https://uhra.herts.ac.uk/dspace/handle/2299/6226
  • Leitch, R., Odena, O., Gardner, J., Lundy, L., Mitchell, S., Galanouli, D., & Clough, P. (2007). Consulting secondary school students on increasing participation in their own assessment in Northern Ireland. Education-line. Retrieved June 20, 2012, from www.leeds.ac.uk/educol/documents/176139.doc
  • Lewins , A. and Silver , CH. 2007 . Using software in qualitative research: A step-by-step guide , London : Sage .
  • Lincoln , Y. and Guba , E. 1985 . Naturalistic enquiry , Beverly Hills , CA : Sage .
  • Lu , CH.-J. and Shulman , S.W. 2008 . Rigor and flexibility in computer-based qualitative research: Introducing the coding analysis toolkit . International Journal of Multiple Research Approaches , 2 ( 1 ) : 105 – 117 .
  • Maxwell , J.A. 1992 . Understanding and validity in qualitative research . Harvard Educational Review , 62 ( 3 ) : 279 – 301 .
  • McEvoy , L. and Lundy , L. 2007 . E-consultation with pupils: A rights-based approach to the integration of citizenship education and ICT . Technology, Pedagogy and Education- Special Issue , 16 ( 3 ) : 305 – 319 .
  • Miller , S. , Connolly , P. , Odena , O. and Styles , B. 2009 . A randomised controlled trial evaluation of business in the community’s time to read pupil mentoring programme , Belfast : Centre for Effective Education, School of Education, Queen’s University Belfast .
  • Myers , M.D. 2009 . Qualitative research in business & management , London : Sage .
  • Oancea , A. 2005 . Criticisms of educational research: Key topics and levels of analysis . British Educational Research Journal , 31 ( 2 ) : 157 – 183 .
  • Odena , O. 2001 . The construction of creativity: Using video to explore secondary school music teachers’ views . Educate , 1 ( 1 ) : 104 – 122 .
  • Odena , O. 2004 . Some considerations on research dissemination with particular reference to the audience and the authorship of papers . Music Education Research , 6 ( 1 ) : 101 – 110 .
  • Odena, O. (2007). Using specialist software for qualitative data analysis. Invited closing seminar at the educational studies association of Ireland (ESAI) Annual Conference, Cavan, Republic of Ireland, March 29–31. The General Teaching Council for Northern Ireland’s Access to Research Resources for Teachers (ARRT) Space. Retrieved June 20, 2012, from http://gtcni.openrepository.com
  • Odena , O. 2008 . “ A qualitative investigation of practitioners’ views on cross-community music education in Northern Ireland: Reality and potential in a post-conflict society ” . In Proceedings of the 22nd International Seminar on Research in Music Education , Edited by: Malbrán , S. and Mota , G. 165 – 176 . Porto , , Portugal : International Society for Music Education (ISME) Research Commission .
  • Odena, O. (2009a). Early music education as a tool for inclusion and respect for diversity: Study paper for the Bernard van Leer Foundation. Brighton: University of Brighton and Bernard van Leer Foundation. Retrieved June 20, 2012, from the University of Hertfordshire Research Archive (UHRA) https://uhra.herts.ac.uk/dspace/handle/2299/6227
  • Odena, O. (2009b). Exploring the potential of music education to facilitate children’s cross-community activities in Northern Ireland. Keynote at the International Conference ‘Intercultural education and peaceful co-existence: The role of the school’, Pedagogical Institute of Cyprus, Larnaca, November 19. Retrieved June 20, 2012, from UHRA https://uhra.herts.ac.uk/dspace/handle/2299/6179
  • Odena , O. 2010 . Practitioners’ views on cross-community music education projects in Northern Ireland: Alienation, socio-economic factors and educational potential . British Educational Research Journal , 36 ( 1 ) : 83 – 105 .
  • Odena, O., Miller, S., & Kehoe, S. (2009). A qualitative evaluation of a mentoring reading programme for 8–9 year olds in Northern Ireland. Education-line. Retrieved June 20, 2012, from www.leeds.ac.uk/educol/documents/184822.pdf
  • Odena , O. , Plummeridge , CH. and Welch , G. 2005 . Towards an understanding of creativity in music education: A qualitative exploration of data from English secondary schools . Bulletin of the Council for Research in Music Education , 163 ( Winter ) : 9 – 18 .
  • Odena , O. and Welch , G. 2007 . The influence of teachers’ backgrounds on their perceptions of musical creativity: A qualitative study with secondary school music teachers . Research Studies in Music Education , 28 ( 1 ) : 71 – 81 .
  • Odena , O. and Welch , G. 2009 . A generative model of teachers’ thinking on musical creativity . Psychology of Music , 37 ( 4 ) : 416 – 442 .
  • Odena , O. and Welch , G. 2012 . “ Teachers’ perceptions of creativity ” . In Musical creativity: Insights from music education research , Edited by: Odena , O. 29 – 48 . Farnham : Ashgate .
  • Pettigrew , T.F. 1998 . Intergroup contact theory . Annual Review of Psychology , 49 : 65 – 85 .
  • Pettigrew , T.F. and Tropp , L.R. 2006 . A meta-analytic test of intergroup contact theory . Journal of Personality and Social Psychology , 90 ( 5 ) : 751 – 783 .
  • Pinson , H. 2007 . At the boundaries of citizenship: Palestinian Israeli citizens and the civic education curriculum . Oxford Review of Education , 33 ( 3 ) : 331 – 348 .
  • Plantinga , A. 1993 . Warrant and proper function , Oxford : Oxford University Press .
  • Pollard, A. (2007). Challenges facing educational research Educational Review Guest Lecture 2005 (TLRP Cardiff conference 2007 version). Keynote at the ESRC Teaching and Learning Research Programme Annual Conference, Cardiff, November 26–27.
  • Popper , K.R. 1959 . The logic of scientific discovery , London : Hutchinson .
  • Popper , K.R. 1963 . “ Prediction and prophecy in the social sciences ” . In Conjectures and refutations: The growth of scientific knowledge , Edited by: Popper , K.R. 336 – 346 . London : Routledge and Kegan Paul .
  • Ragin , Ch. C. 1987 . The comparative method: Moving beyond qualitative and quantitative strategies , London : University of California Press .
  • Rich , A. 2005 . Think tanks, public policy and the politics of expertise , Cambridge : Cambridge University Press .
  • Richards , L. 1997 . “ Computers and qualitative analysis ” . In Educational research, methodology and measurement: An international handbook , 2nd ed. , Edited by: Keeves , J.P. 286 – 290 . Oxford : Pergamon .
  • Richards , L. 2002 . Qualitative computing – a methods revolution? . International Journal of Social Research Methodology , 5 ( 3 ) : 263 – 276 .
  • Richards , L. 2009 . Handling qualitative data: A practical guide , 2nd ed. , London : Sage .
  • Richards , T.J. and Richards , L. 1994 . “ Using computers in qualitative research ” . In Handbook of qualitative research , Edited by: Denzin , N.K. and Lincoln , Y.S. 445 – 462 . Thousand Oaks , CA : Sage .
  • Rihoux , B. 2006 . Qualitative comparative analysis (QCA) and related systematic comparative methods: Recent advances and remaining challenges for social science research . International Sociology , 21 ( 5 ) : 679 – 706 .
  • Seddon , F.A. 2005 . Modes of communication during jazz improvisation . British Journal of Music Education , 22 ( 1 ) : 47 – 61 .
  • Silverman , D. 2006 . Interpreting qualitative data , 3rd ed. , London : Sage .
  • Silverman , D. 2009 . Doing qualitative research , 3rd ed. , London : Sage .
  • Sinkovics , R.R. , Penz , E. and Gahuri , P.N. 2005 . Analysing textual data in international marketing research . Qualitative Market Research: An International Journal , 8 ( 1 ) : 9 – 38 .
  • Slavin , R.E. 2008 . What works? Issues in synthesizing educational programme evaluations . Educational Researcher , 37 ( 1 ) : 5 – 14 .
  • Sugrue, C. (2007). ‘Back to the future: Perspectives on current realities and future quality of educational research through the prism of Irish Educational Studies’. Opening Keynote at the Educational Studies Association of Ireland (ESAI) Annual Conference, Cavan, Republic of Ireland, March 29–31.
  • Tausch , N. , Kenworthy , J.B. and Hewstone , M. 2005 . “ The role of intergroup contact in prejudice reduction and the improvement of intergroup relations ” . In Psychological approaches to dealing with conflict and war , Group and social factors Edited by: Fitzduff , M. and Stout , C.E. Vol. 2 , 67 – 108 . Westport , CT : Praeger .
  • Thomas, P. (2011). Why advocacy and market forces fail education reform. Truthout. Retrieved June 6, 2012, from www.truthout.org/why-advocacy-and-market-forces-fail-education-reform
  • Thomas, G., & Macnab, N. (2007). Can we offer appropriate warrants for our findings? Paper presented at the ESRC teaching and learning research programme Annual Conference, Cardiff, November 26–27.
  • Webb , R. and Vulliamy , G. 2007 . Changing classroom practice at key stage 2: The impact of new labour’s national strategies . Oxford Review of Education , 33 ( 5 ) : 561 – 580 .
  • Whitty , G. 2006 . Education(al) research and education policy making: Is conflict inevitable? . British Journal of Educational Research , 32 ( 2 ) : 159 – 176 .

Appendix. Examples of interview questions (Music education study)

WORK

Could you explain what education activities you provide? (Age level, type of students, etc.)

Do you work on the cross-curricular theme of ‘Education for Mutual Understanding’ in Music? (If yes, how?) [only for school teachers].

MUSIC AND MUSIC EDUCATION IN NI

In the past, did you feel that the two main school communities were using music as a sign of identity? (How? Has it diminished?)

PROJECT ADVICE

Could you provide some advice for successful music education activities where children from both communities participate?

When preparing activities do you try to include music from both traditions or do you try to avoid anything to do with them?