Full Research Papers

A design science research (DSR) case study: building an evaluation framework for social media enabled collaborative learning environments (SMECLEs)


Abstract

To conduct design science research (DSR), it is expected that some form of process model will be used, with each stage explicitly outlined, and clearly explained, in the presentation of the research. Since very few papers, if any, actually produce and present DSR in such a manner, this provides an excellent opportunity to do so. Thus, this paper introduces a case study in which a DSR process model is used to produce and present DSR, with the focus on building an evaluation framework for social media enabled collaborative learning environments (SMECLEs). This approach can be imitated by other researchers who wish to produce high-quality DSR and present it in a fashion that is easy to read and understand, which helps to raise the standard of DSR being produced and presented.

Introduction

Kane and Fichman (Citation2009) called for IS educators, who are often also IS researchers, to start adopting social media platforms (SMPs) in the classroom in order to remain relevant in a world being changed by information technology. While they state it might take some trial and error on the part of faculty to develop effective teaching processes for these platforms, this research looks to create relevant knowledge that educators can leverage if they wish to adopt such platforms, helping to reduce that trial and error. To achieve this, a design science research (DSR) approach is adopted to design, build, and evaluate a framework capable of evaluating the effectiveness of SMPs at enabling collaborative learning. The following sections therefore present a case study of DSR, which uses the DSR process model of Peffers, Tuunanen, Rothenberger, and Chatterjee (Citation2007) to produce and present the research. This model consists of five process elements: 1. Identify Problem; 2. Define Objective(s) of a Solution; 3. Design and Build; 4. Evaluation; 5. Communication. These process element titles are used to structure this paper, with a clear explanation of what each entails at its start. Thus, this paper is organised as follows: the problem that was identified is introduced, and from this the objective of a solution is inferred. The design cycles are then introduced, explaining the design, build, and evaluate sections of each of the six cycles. Finally, the contributions are communicated in the discussion section. Problem identification is introduced first.

Identify problem

Identifying a problem relevant to practice involves recognising a deficiency in a current system and then justifying the value of finding a solution to it (Hevner, Citation2007; Hevner, March, Park, & Ram, Citation2004). Ideally, the research problem should be new and creative, and its solution should be important to the field (Hevner, Citation2007; Hevner et al., Citation2004). Once the problem has been identified, a thorough search of previous research on the topic should be performed (Hevner, Citation2007; Hevner et al., Citation2004). By clearly defining the research problem, a focus for the research is created (Hevner, Citation2007; Hevner et al., Citation2004; Peffers et al., Citation2007). To achieve this, IS researchers should look to practice to identify a topic to research, and then to the available academic literature to understand it (Benbasat & Zmud, Citation1999). It was observed that the topic of social media was receiving constant attention in the practitioner literature (Armano, Citation2009a, Citation2009b; Baker, Citation2009; Deragon, Citation2009; Reid, Citation2009; Soat, Citation2010), and it was therefore deemed a relevant topic. On its own, however, social media is too broad a topic, so, as is necessary with DSR, a relevant problem was identified, focusing the research on an area from which practitioners can benefit.

Collaborative technologies as enabling learning environments

Collaborative technologies such as group decision support systems (GDSS) were proclaimed to be able to impact the learning environments of educational institutions twenty years ago, when the IS discipline was interested in determining whether these new collaborative technologies were capable of transforming traditional methods of teaching (Alavi, Citation1994; Alavi, Wheeler, & Valacich, Citation1995; Leidner & Jarvenpaa, Citation1993, Citation1995). Reasons for this interest included educational institutions’ lack of change in their learning environments, especially in comparison to organisations’ adoption of such technologies (Alavi, Citation1994; Leidner & Jarvenpaa, Citation1995); a failure to engage students in the learning process (Alavi, Citation1994); educators, students, and employers feeling that technology could enhance learning (Alavi, Citation1994); and the fact that, despite IS researchers highlighting ‘the merits of information technology to improve communication, efficiency, and decision-making in organisations’ (Leidner & Jarvenpaa, Citation1995, p. 265), they were not applying this knowledge to their own learning environments. However, Leidner and Jarvenpaa (Citation1995, p. 265) found that when technology was being used in educational learning environments, it was in an automating fashion as opposed to a transforming one, where in ‘the absence of fundamental changes to the teaching and learning process, such classrooms may do little but speed up ineffective processes and methods of teaching.’

Social media as enabling learning environments

New generations of collaborative technologies often emerge (Bajwa et al., Citation2008), and the platforms of social media are one such technology. In a similar fashion to previous collaborative technologies, social media have been proclaimed to impact the learning environments of educational institutions by facilitating better communication and collaboration in new and exciting ways (Ajjan & Hartshorne, Citation2008; Kane & Fichman, Citation2009; Zhang, Citation2012). However, just as before, the same issues can be observed: the learning environments of educational institutions have seen little change in the past twenty years, especially in comparison to organisations’ adoption of such technologies, and there is still a failure to engage students in the learning process, with a continued reliance on the traditional method of teaching (Hustad & Olsen, Citation2014; Kane & Fichman, Citation2009; Zhang, Citation2012); educators, students, and employers believe that technology enabled learning environments will enhance learning (Chen, Balijepally, & Sutanto, Citation2008; Tan, Street, Hawthorn, & Stockdale, Citation2011); and the IS discipline has again focused much research on social media in terms of their impact on organisations, but has failed to discuss how this knowledge could influence its own practice, especially in terms of teaching (Kane & Fichman, Citation2009).

Problem statement

However, while there are calls for social media to be introduced into learning environments, introducing them is not such a simple task, and should not be done just for the sake of it (Kane & Fichman, Citation2009); rather, educators need to consider the learning models that best suit the platforms if learning is to occur (Alavi, Citation1994; Chen et al., Citation2008; Leidner & Jarvenpaa, Citation1995). Alavi (Citation1994) suggests that actively engaging learners in the learning process is preferable to the traditional method of teaching, as it generates more critical thinking, creative responses, and high-level reasoning strategies amongst the learners (Hustad & Olsen, Citation2014; Leidner & Jarvenpaa, Citation1995; Zhang, Citation2012). It is therefore argued that it is necessary to reengineer the current traditional approach to learning into a collaborative learning approach (Kirschner, Citation2001), as a collaborative technology may be better suited to enabling such a learning environment (Alavi, Citation1994; Hustad & Olsen, Citation2014; Kane & Fichman, Citation2009; Leidner & Jarvenpaa, Citation1995; Zhang, Citation2012).

We are therefore seeing the same occurrence today as twenty years ago: a collaborative technology is being proclaimed to be able to impact the learning environments of educational institutions by changing, and possibly improving, the pedagogical approach, with the impact again coming in the form of a change from a traditional learning approach to a collaborative learning approach. However, the problem that has been identified is:

There is a lack of understanding on whether the platforms that are enabled by social media are effective at enabling collaborative learning.

This provides an opportunity for research that delivers such an understanding, which will benefit practice, in particular educational institutions and educators, by helping them understand how to utilise social media in a manner that benefits their learners. Otherwise, there is the potential to fail to learn from the past, where technology was used merely to aid traditional learning environments rather than to impact and change them, which resulted in little improvement beyond speeding up ineffective processes and methods of teaching (Leidner & Jarvenpaa, Citation1995). Further, by being able to evaluate their own collaborative learning environments, educators would also be able to identify where they can improve aspects of them, increasing the benefit to learners.

Numerous studies have focused on different platforms of social media and their impact on learning (Chen, Siau, & Nah, Citation2010; Franceschi, Lee, Zanakis, & Hinds, Citation2009; Kumar, Citation2012; Lattemann & Stieglitz, Citation2012; Phang & Kankanhalli, Citation2009; Schultze, Nardi, Rennecker, Stucky, & Hiltz, Citation2007; Zhang, Citation2012); however, none of them focused on the impact on collaborative learning. An issue with each of these studies is that, while they do provide important findings for educators in relation to adopting social media into different types of learning environments, they are each specific to the study that was set up. That is to say, no framework was built in these studies to allow educators to evaluate the effectiveness of the learning environments that they themselves build; the findings instead reflect only the environments in the studies. This provides an opportunity to develop a framework that allows educators to evaluate the effectiveness of the collaborative learning environments they build. The next section introduces the objective of the solution, which has been inferred from the problem stated above, and provides a focus for the research.

Define objective(s) of a solution

Stating the objective(s) of the research is necessary to provide focus (Peffers et al., Citation2007). The objective(s) should be inferred from the problem definition, while also stating what is possible and feasible. These objective(s) will eventually act as the metrics at the evaluation stage, when the artefact is judged on whether it has achieved its intended goal of solving the identified problem. The objective(s) can be stated in quantitative terms (where a desirable solution would be better than current ones) or qualitative terms (a description of how a new artefact is expected to support solutions to problems not hitherto addressed) (Peffers et al., Citation2007). While a relevant problem that needs to be addressed has been identified above, the objective inferred from it is:

Evaluate the effectiveness of social media enabled collaborative learning environments.

This is a quantitative measure, which will be used as the metric at the evaluation stages of the design process, to see if the artefact that is designed and built has achieved its intended goal. The next section explains the design, build, and evaluate cycles that occurred, which answers the first research question.

Designing, building and evaluating the SMECLE evaluation framework

The next stage in the IS design science process model was followed, which initially consists of designing and building an artefact. This involves moving from the research objective to actually demonstrating that it is feasible to build the identified artefact. The design involves understanding the studied domain and applying relevant scientific and technical knowledge, while the build refers to the construction of the artefact based on this knowledge, demonstrating that such an artefact can be constructed.

Once an artefact has been built, the researcher must evaluate its utility by comparing the objective(s) of the solution to actual observed results from the use of the artefact in its intended environment. These objectives therefore act as the metrics that define whether or not the artefact has achieved its intended goal of solving the identified problem. Evaluation is an iterative step: the researcher can take the lessons learned in the evaluation activity back into the design and build activity to improve the artefact, or alternatively move on to the next activity in the DSR process model and leave further improvements for future research. Together, these three elements make up the design cycle, and the design cycles followed in this research are presented in Table 1.
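The design, build, and evaluate loop described above can be sketched as a simple iteration. This is an illustrative toy, not code from the study: the rule set, data sets, and the `amend` and `evaluate` functions are hypothetical stand-ins for the manual redesign and evaluation activities.

```python
# Illustrative sketch (not the study's actual procedure) of the iterative
# design cycle: build a version, evaluate it against each data set, and
# redesign until all data sets indicate the artefact is useful.

def run_design_cycles(rules, data_sets, amend, evaluate):
    """Iterate design -> build -> evaluate until `evaluate` passes
    for every data set; return the final rules and version number."""
    version = 1
    while True:
        failures = [ds for ds in data_sets if not evaluate(rules, ds)]
        if not failures:
            return rules, version  # objective met for all data sets
        rules = amend(rules, failures)  # fold evaluation lessons back in
        version += 1

# Toy stand-ins: a "rule set" is just a strictness level, and a data set
# "passes" once the rules are strict enough for it.
data_sets = [2, 1, 3]  # required strictness per (hypothetical) data set
final, version = run_design_cycles(
    rules=1,
    data_sets=data_sets,
    amend=lambda rules, failures: rules + 1,
    evaluate=lambda rules, ds: rules >= ds,
)
print(version)  # three versions needed for the toy data above
```

The key property the sketch captures is that evaluation failures, not a fixed schedule, drive the number of cycles, which is why the study required six versions.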

Table 1. Design cycles for this study.

Over the following sections, each of the design cycles for creating the SMECLE evaluation framework is explained. First, however, to be able to evaluate the SMECLE evaluation framework, data needed to be collected from social media enabled collaborative learning environments (SMECLEs). The six SMECLEs used to do this are introduced next.

SMECLE case studies

Two types of social media enabled collaborative learning environments (SMECLEs) were created and run for this study: microblog enabled collaborative learning environments (CLEs), and blog enabled CLEs. Microblog enabled CLEs allow live discussions to occur between learners, with a limit of 140 characters per message. Blog enabled CLEs allow learners to write longer posts and allow others to comment on them, but do not allow live discussions; instead, discussions occur over time. Three of each were created with different classes, following design principles (DPs) that were identified in the IS literature on creating collaborative learning environments, but adapted to fit social media platforms (SMPs). These steps are presented in Table 2.

Table 2. Steps for creating the SMECLEs.

An overview of the three microblog enabled CLEs is presented in Table 3.

Table 3. Overview of the three microblog enabled CLEs.

An overview of the three blog enabled CLEs is presented in Table 4.

Table 4. Overview of the three blog enabled CLEs.

Table 5 illustrates each version of the evaluation framework and the case study used to evaluate it. For example, for SMECLE evaluation framework V1.0, only a single data-set was used to evaluate it, as the IS6119 data-set indicated it was not useful at evaluating SMECLEs. For V2.0, after another design and build phase in which the framework was amended, the IS3101 data-set was used to evaluate it (it was not necessary to evaluate IS6119 again, as the amendments were made based on that data-set and thus it would not indicate any issues; this is represented by the red Y). However, the new data-set, IS3101, indicated that V2.0 was not useful at evaluating SMECLEs. This process continued until all of the data-sets indicated that the evaluation framework was useful at evaluating SMECLEs.

Table 5. The design cycles for the research, with the data sets used to evaluate each version of the SMECLE evaluation framework.

Each design cycle is introduced in the following sections, beginning with Phase 1.

Phase 1: designing, building, and evaluating the SMECLE evaluation framework V1.0

Designing and building SMECLE evaluation framework V1.0

Three building blocks were identified as necessary for building the evaluation framework: 1. social media platforms; 2. social media characteristics; and 3. collaborative learning characteristics. To understand each of these building blocks, a review of the IS literature was undertaken, in which six social media platforms were identified and explained, as well as five social media characteristics and five collaborative learning characteristics. When this was completed, SMECLE evaluation framework V1.0 was built by putting these building blocks together: a matrix that juxtaposes the five characteristics of social media against the five characteristics of collaborative learning created, on a single page, an evaluation framework for analysing whether a social media platform is effective at enabling collaborative learning to occur. The matrix created twenty-five relationships, each requiring a different rule to act as an indicator of whether an instance of an intersection between two characteristics had occurred; these rules were created based on the understanding of how a social media characteristic may enable a collaborative learning characteristic. SMECLE evaluation framework V1.0 is presented in Figure 1, where the building blocks and cell rules can be seen. The evaluation of SMECLE evaluation framework V1.0 is presented next.

Figure 1. SMECLE evaluation framework V1.0 with rules.


Evaluating the SMECLE evaluation framework V1.0

After the evaluation framework had been built, it was evaluated by using it to analyse the data from the first SMECLE case study that was run, IS6119. This involved reading the data created in that learning environment and determining whether it met any of the rules in the evaluation framework. It was observed that there were 13 cells with instances, from a possible 25, of which 10 were demonstrated to comply with the rules. However, the data demonstrated that there were 3 cells where the rules were ineffective at determining when a social media characteristic enabled a collaborative learning characteristic. This indicated that the objective had not been met, as the evaluation framework was not capable of evaluating the effectiveness of the SMECLE. The learning from this evaluation was then brought into the next design and build activity so the evaluation framework could be amended.
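The core of the framework can be pictured as a matrix of rule predicates, one per cell, with evaluation amounting to checking which cells have at least one instance in the data. The sketch below is a toy reconstruction, not the study's instrument: it uses only the characteristics this paper names (the fifth social media characteristic is not named in this section, so four appear), and a single stand-in rule replaces the study's twenty-five distinct rules, which were applied by reading the data manually.

```python
# Toy reconstruction of the framework's matrix structure: social media
# characteristics juxtaposed against collaborative learning
# characteristics, with a rule predicate per cell. Characteristic names
# are from the paper; the single stand-in rule is a hypothetical
# simplification of the study's per-cell rules.

SM_CHARACTERISTICS = ["Social Interaction", "Social Collaboration",
                      "Content Sharing", "User Generated Content"]
CL_CHARACTERISTICS = ["Active Learning", "Group Participation",
                      "Role of the Instructor", "Learner Diversity",
                      "Learner Relationships"]

def toy_rule(message):
    # Stand-in predicate; echoes the V2.0 amendment that every rule
    # must consider whether a contribution is 'in relation to the task'.
    return message.get("on_task", False)

rules = {(sm, cl): toy_rule
         for sm in SM_CHARACTERISTICS for cl in CL_CHARACTERISTICS}

def cells_with_instances(messages, rules):
    """Return the cells for which at least one message satisfies the
    cell's rule: the 'cells with instances' tallied in each evaluation."""
    return {cell for cell, rule in rules.items()
            if any(rule(message) for message in messages)}

messages = [{"on_task": True}, {"on_task": False}]
print(len(cells_with_instances(messages, rules)))  # all 20 toy cells fire
```

Because every cell shares one toy rule here, all cells fire together; in the actual framework each cell's distinct rule means different data lights up different subsets of the matrix, which is what each evaluation phase reports.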

Phase 2: designing, building, and evaluating the SMECLE evaluation framework V2.0

Designing and building SMECLE evaluation framework V2.0

The evaluation framework’s building blocks were demonstrated to be effective for building a SMECLE evaluation framework, but some rules were demonstrated to be ineffective. To redesign the evaluation framework, the learnings from the evaluation in Phase 1 were used, with the data informing amendments to the rules. The three cells are:

(1) ‘Social Interaction, Active Learning’
(2) ‘Social Interaction, Role of the Instructor’
(3) ‘Content Sharing, Active Learning’

The data from IS6119 indicated that all three of the rules for these cells were too broad. For example, it was observed that learners were making comments that had little to do with the task, but these were still classified as instances of ‘Social Interaction, Active Learning’. Further, every tweet that the instructor sent related to the task in some way, but had they sent a non-task-related tweet, it would still have been classified as an instance of ‘Social Interaction, Role of the Instructor’, even though the instructor would not have been fulfilling their role. The understanding from this is that learners need to be commenting on the task that has been set, trying to discuss and engage with each other about it, for Active Learning to be enabled; likewise, for the instructor to fulfil their role, their contributions need to relate to the task. In terms of content sharing, there were a number of tweets where learners shared some content, such as a link to a YouTube clip, but with no indication that it was consumed or understood. The understanding from this is that learners need to share content in relation to the task, and to indicate that they have consumed and understood that content, for Active Learning to occur. Each of these three rules was thus amended.

A retrospective review of the other rules was also carried out, based on the learning derived from these three amendments, to update cells where clear anomalies existed. This highlighted that all of the base rules failed to take into account that they need to focus on the task to be completed by the learners. This was a clear anomaly, so ‘in relation to the task’ was added to all of the rules. The evaluation of SMECLE evaluation framework V2.0 is presented next.

Evaluating the SMECLE evaluation framework V2.0

SMECLE evaluation framework V2.0 was evaluated by using it to analyse data from the second SMECLE case study that was run, IS3101. This involved reading the data created in that learning environment and determining whether it met any of the rules in the evaluation framework. It was observed that there were 13 cells with instances, from a possible 25, of which 11 were demonstrated to comply with the rules. However, the data demonstrated that there were 2 cells where the rules were ineffective at determining when a social media characteristic enabled a collaborative learning characteristic. This indicated that the objective had not been met, as the evaluation framework was not capable of evaluating the effectiveness of the SMECLE. The learning from this evaluation was then brought into the next design and build phase so the evaluation framework could be amended.

Phase 3: designing, building, and evaluating the SMECLE evaluation framework V3.0

Designing and building SMECLE evaluation framework V3.0

To redesign the evaluation framework, the learnings from the evaluation in Phase 2 were used, with the data informing amendments to the cells and their rules. The two cells are:

(1) ‘Content Sharing, Active Learning’
(2) ‘User Generated Content, Active Learning’

The data from IS3101 indicated that the structure of these two cells was too limiting: it was observed that when learners shared content, or generated and shared content, it did not always get noticed by other learners. The understanding from this is that Content Sharing can enable Active Learning to occur at different levels, namely at an individual level and a group level, and so too can User Generated Content. Therefore, the cells were restructured to accommodate these two levels, and the rules were amended to implement this understanding. With these two cells amended, a retrospective review of the other cells was undertaken, reviewing their rules with respect to the new learning that was acquired. However, no clear anomalies were identified, so no further rules needed to be amended. The evaluation of SMECLE evaluation framework V3.0 is presented next.

Evaluating the SMECLE evaluation framework V3.0

SMECLE evaluation framework V3.0 was evaluated by using it to analyse data from the third SMECLE case study that was run, IS4428, and from the first one, IS6119 (there was no need to analyse the second case study, as it was used to update the rules). For the IS4428 analysis, it was observed that there were 11 cells with instances, from a possible 25, all of which were demonstrated to comply with the rules. The same was observed for the IS6119 analysis: 11 cells with instances, from a possible 25, all complying with the rules. Since these were the second and third microblog enabled CLEs in which no ineffective cells were identified, it was necessary to evaluate the usefulness of SMECLE evaluation framework V3.0 with a different type of SMECLE.

IS2200 is a blog enabled CLE, and was used to evaluate SMECLE evaluation framework V3.0. This involved reading the data created in that learning environment and determining whether it met any of the rules in the evaluation framework. It was observed that there were 13 cells with instances, from a possible 25, of which 7 were demonstrated to comply with the rules. However, the data demonstrated that there were 6 cells where the rules were ineffective at determining when a social media characteristic enabled a collaborative learning characteristic. This indicated that the objective had not been met, as the evaluation framework was not capable of evaluating the effectiveness of the SMECLE. The learning from this evaluation was then brought into the next design and build activity so the evaluation framework could be amended.

Phase 4: designing, building, and evaluating the SMECLE evaluation framework V4.0

Designing and building SMECLE evaluation framework V4.0

To redesign the evaluation framework, the learnings from the evaluation in Phase 3 were used, with the data informing amendments to the cells and their rules. The six cells are:

(1) ‘Social Interaction, Active Learning’
(2) ‘Social Interaction, Group Participation’
(3) ‘Social Collaboration, Active Learning’
(4) ‘Social Collaboration, Group Participation’
(5) ‘Content Sharing, Active Learning’
(6) ‘User Generated Content, Active Learning’

The data from IS2200 indicated that the structure of these six cells was too limiting: it was observed that when learners made comments, asked questions of each other, agreed or disagreed with each other, shared content, or generated and shared content, it was beneficial to different levels of learners across the environment. These levels correspond with Bruffee (Citation1999, p. 8), who suggests that in CLEs there are different levels of groups at work: the assigned group, which consists of small groups of learners working together to learn the language, mores, and values of a particular community; the class group, a larger community consisting of the different assigned groups; and the discipline community group, a still larger community of which the learners are trying to become members, and in which the class group is nested. Finally, there is also the individual level, consisting of each learner in the environment (Bruffee, Citation1999, p. 8). The understanding from this is that the different collaborative learning groups that manifest in face-to-face CLEs also manifest in blog enabled CLEs, so each of the cells above was restructured for these levels. The Group Participation cells do not contain the individual level, as group participation requires at least two learners.

Further to this, a number of other new understandings emerged. For example, it was observed that for Group Participation to occur there needs to be at least three interactions between at least two group members, with a consensual answer being reached. It was also observed that when learners agreed or disagreed with others, they commonly left comments such as ‘Nice blog, I think you are right’ rather than providing a reason why. This led to the new understanding that learners need to provide a reason why they agree or disagree with other learners. Further, it was observed that learners who acknowledge content that has been shared also need to demonstrate an understanding of it, to indicate that Active Learning has occurred as a result of consuming it; the same applies to user generated content. Each of these new understandings was implemented in the rules for the restructured cells above.
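Purely as an illustration, the amended Group Participation rule (at least three interactions between at least two group members, with a consensual answer reached) can be expressed as a predicate over a comment thread. The thread representation, including the consensus flag, is a hypothetical assumption; in the study the rule was applied by reading the data manually.

```python
# Hypothetical encoding of the amended Group Participation rule: at
# least three interactions between at least two group members, with a
# consensual answer reached. The thread format (author and consensus
# flag per post) is an assumption made for illustration only.

def is_group_participation(thread):
    authors = {post["author"] for post in thread}
    reached_consensus = any(post.get("consensus") for post in thread)
    return len(thread) >= 3 and len(authors) >= 2 and reached_consensus

thread = [
    {"author": "learner_a", "text": "I think X, because ..."},
    {"author": "learner_b", "text": "I disagree; Y, because ..."},
    {"author": "learner_a", "text": "Good point, agreed.", "consensus": True},
]
print(is_group_participation(thread))  # True
```

Note that a two-post exchange, however substantive, would not qualify under this rule, which mirrors the threshold the evaluation data suggested.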

With these six cells amended, a retrospective review of the other cells was undertaken, reviewing their rules with respect to the new learning that was acquired. From the new understanding of what constitutes Social Collaboration, whereby a learner must explain why they agree or disagree with another learner, the remaining cells containing Social Collaboration were amended. Also, from the new understanding of what constitutes Group Participation, whereby there must be at least three interactions, the cells containing Group Participation were reviewed, and it was deemed that two further cells’ rules required amendment to incorporate this understanding: ‘Content Sharing, Group Participation’ and ‘User Generated Content, Group Participation’. The evaluation of SMECLE evaluation framework V4.0 is presented next.

Evaluating the SMECLE evaluation framework V4.0

SMECLE evaluation framework V4.0 was first evaluated by using it to analyse data from the three microblog enabled CLEs. For each of the IS6119, IS3101, and IS4428 analyses, it was observed that there were 13 cells with instances, from a possible 25, occurring at different levels, all of which were demonstrated to comply with the rules. Another blog enabled CLE, IS6118, was then used to evaluate it, where it was observed that there were 14 cells with instances, from a possible 25, occurring at different levels, of which 10 were demonstrated to comply with the rules. However, the data demonstrated that there were 4 cells where the rules were ineffective at determining when a social media characteristic enabled a collaborative learning characteristic. This indicated that the objective had not been met, as the evaluation framework was not capable of evaluating the effectiveness of the SMECLE.

The learning from this evaluation was then brought into the next design and build activity so the evaluation framework could be improved.

Phase 5: designing, building, and evaluating the SMECLE evaluation framework V5.0

Designing and building SMECLE evaluation framework V5.0

To redesign the evaluation framework, the learnings from the evaluation in Phase 4 were used, with the data informing amendments to the cells and their rules. The four cells are:

(1) ‘Social Interaction, Learner Diversity’
(2) ‘Content Sharing, Group Participation’
(3) ‘User Generated Content, Group Participation’
(4) ‘User Generated Content, Learner Diversity’

The data from IS6118 indicated that the structure of these four cells was too limiting: instances were observed at the different levels identified in Phase 4. For example, Social Interaction can enable Learner Diversity to occur at an individual, assigned group, class group, or discipline community group level, depending on who acknowledges the comment a learner makes. Therefore, the cells were restructured to accommodate these levels (again, the Group Participation cells do not contain an individual level), and the rules were amended to implement this understanding. With these four cells amended, a retrospective review of the other cells was undertaken, reviewing their rules with respect to the new learning that was acquired. However, no clear anomalies were identified, so no further rules needed to be amended. The evaluation of SMECLE evaluation framework V5.0 is presented next.

Evaluating the SMECLE evaluation framework V5.0

SMECLE evaluation framework V5.0 was evaluated by using it to analyse data from a new blog enabled CLE, IS1100, and a previously used one, IS2200 (there was no need to analyse IS6118 again, as its data was used to update the rules). For the IS1100 analysis, it was observed that there were 14 cells with instances, from a possible 25, occurring at different levels, all of which complied with the rules. For the IS2200 analysis, it was observed that there were 15 cells with instances, from a possible 25, occurring at different levels, all of which complied with the rules. These were the second and third blog enabled CLEs in which no ineffective cells were identified, and this was also the case with two of the microblog enabled CLEs, IS3101 and IS4428, which also complied with the rules. However, for IS6119, another microblog enabled CLE, it was observed that there were 13 cells with instances, from a possible 25, occurring at different levels, but the data demonstrated that there were 6 cells where the rules were ineffective at determining when a social media characteristic enabled a collaborative learning characteristic. This indicated that the objective had not been met, as the evaluation framework was not capable of evaluating the effectiveness of the SMECLE. The learning from this evaluation was then brought into the next design and build activity so the evaluation framework could be improved.

Phase 6: designing, building, and evaluating the SMECLE evaluation framework V6.0

Designing and building SMECLE evaluation framework V6.0

To redesign the evaluation framework, the learnings from the evaluation section in Phase 5 are used, where the data is used to amend the cells and their rules. The six cells are:

(1) ‘Social Interaction, Role of the Instructor’
(2) ‘User Generated Content, Role of the Instructor’
(3) ‘Social Interaction, Learner Relationships’
(4) ‘Social Collaboration, Learner Relationships’
(5) ‘Content Sharing, Learner Relationships’
(6) ‘User Generated Content, Learner Relationships’

The data from IS6119 indicated that the structure of these six cells was too limiting. Firstly, instances of ‘Social Interaction, Role of the Instructor’ and ‘User Generated Content, Role of the Instructor’ were observed occurring at the different levels identified in Phase 4, so these cells are restructured to accommodate these levels and the rules are amended to implement this understanding. Secondly, in a CLE, learning is shared amongst the learners and the instructor, where relationships are formed, and strengthened, when learning occurs from instructor-to-learner, learner-to-learner, and learner-to-instructor. These types of relationships were observed forming or strengthening across the Learner Relationships cells. This new learning was applied when building the evaluation framework: the cells were restructured to include these levels, and the rules were amended accordingly.
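The restructuring just described can be pictured as each amended cell carrying the Phase 4 participation levels, with the Learner Relationships cells additionally recording relationship direction. The dictionary below is a hypothetical sketch of that shape only; the labels stand in for the framework's actual cells, and no rule logic is represented.

```python
# Hypothetical sketch of the restructured cell layout after Phase 6.
# Role-of-the-Instructor cells gain the Phase 4 participation levels;
# Learner Relationships cells additionally record relationship direction.
LEVELS = ("individual", "assigned group", "class group", "discipline community group")
DIRECTIONS = ("instructor-to-learner", "learner-to-learner", "learner-to-instructor")

cells = {
    ("Social Interaction", "Role of the Instructor"): {"levels": LEVELS},
    ("User Generated Content", "Role of the Instructor"): {"levels": LEVELS},
    ("Social Interaction", "Learner Relationships"): {"levels": LEVELS, "directions": DIRECTIONS},
    ("Social Collaboration", "Learner Relationships"): {"levels": LEVELS, "directions": DIRECTIONS},
    ("Content Sharing", "Learner Relationships"): {"levels": LEVELS, "directions": DIRECTIONS},
    ("User Generated Content", "Learner Relationships"): {"levels": LEVELS, "directions": DIRECTIONS},
}

# Every Learner Relationships cell records all three relationship directions.
lr_cells = [c for c in cells if c[1] == "Learner Relationships"]
assert all(cells[c]["directions"] == DIRECTIONS for c in lr_cells)
```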

Evaluating the SMECLE evaluation framework V6.0

All six of the SMECLE cases were analysed with SMECLE evaluation framework V6.0, and no cell structure or rule changes were identified as necessary. This indicated that the SMECLE evaluation framework was now effective at evaluating SMECLEs. Thus, in the next section, the completed SMECLE evaluation framework is presented as the contribution from this DSR to both practice and the knowledge base.

Discussion

DSR needs to make contributions to both practice and the knowledge base for it to be considered DSR, separating it from the mere task of developing artefacts (Hevner, Citation2007; Winter, Citation2008). The primary contribution of this research, to both the knowledge base and to practice, is the SMECLE evaluation framework, presented in Figure 2. From the knowledge base perspective, there was previously a lack of understanding as to whether social media enabled collaborative learning to occur. The SMECLE evaluation framework provides a structure that increases our understanding of the sixteen relationships that exist between the characteristics of social media and the characteristics of collaborative learning in a SMECLE. Further to this, it is now not only evident that sixteen social media characteristics can enable collaborative learning characteristics in a SMECLE, but it is also understood how they do so, from the rules that were created through six design cycles. These rules are presented in Tables 6–10. From a practical perspective, the SMECLE evaluation framework provides knowledge that helps to reduce the trial and error required of educators in developing effective teaching processes when using social media platforms.

Figure 2. SMECLE evaluation framework.


Table 6. Active learning cell rules.

Table 7. Group participation cell rules.

Table 8. Role of the instructor cell rules.

Table 9. Learner diversity cell rules.

Table 10. Learner relationship cell rules.

Conclusion

This study presented a DSR approach to building a framework capable of evaluating the effectiveness of social media enabled collaborative learning environments (SMECLEs). The resulting evaluation framework was shown to be effective for this intended purpose. Further to this, the contributions that have been made help towards solving the identified problem, as the knowledge created gives a better understanding of the ability of social media platforms to enable collaborative learning.

Disclosure statement

No potential conflict of interest was reported by the authors.

References

  • Ajjan, H., & Hartshorne, R. (2008). Investigating faculty decisions to adopt Web 2.0 technologies: Theory and empirical tests. Internet and Higher Education, 11, 71–80. doi:10.1016/j.iheduc.2008.05.002
  • Alavi, M. (1994). Computer-mediated collaborative learning: An empirical evaluation. MIS Quarterly, 18, 159–174. doi:10.2307/249763
  • Alavi, M., Wheeler, B. C., & Valacich, J. S. (1995). Using IT to reengineer business education: An exploratory investigation of collaborative telelearning. MIS Quarterly, 19, 293–312. doi:10.2307/249597
  • Armano, D. (2009a). Do you live social? Harvard Business Review [online], 30 December. Retrieved December 30, 2009, from http://blogs.hbr.org/cs/2009/12/do_you_live_social.html?cm_mmc=npv-_-WEEKLY_HOTLIST-_-JAN_2010-_-HOTLIST0104
  • Armano, D. (2009b). Six social media trends for 2010. Harvard Business Review [online], 2 November. Retrieved November 03, 2009, from http://blogs.hbr.org/cs/2009/11/six_social_media_trends.html?cm_mmc=npv-_-WEEKLY_HOTLIST-_-NOV_2009-_-HOTLIST1109
  • Bajwa, D., Lewis, F., Pervan, G., Lai, V. S., Munkvold, B. E., & Schwabe, G. (2008). Factors in the global assimilation of collaborative information technologies: An exploratory investigation in five regions. Journal of Management Information Systems, 25, 131–166. doi:10.2753/MIS0742-1222250106
  • Baker, S. (2009). Beware social media snake oil. Business Week [online], 03 December. Retrieved December 05, 2009, from http://www.bloomberg.com/bw/magazine/content/09_50/b4159048693735.htm
  • Benbasat, I., & Zmud, R. W. (1999). Empirical research in information systems: The practice of relevance. MIS Quarterly, 3–16. doi:10.2307/249403
  • Bruffee, K. A. (1999). Collaborative learning: Higher education, interdependence, and the authority of knowledge. Baltimore and London: The Johns Hopkins University Press.
  • Chen, W., Balijepally, V., & Sutanto, P. (2008). Learning effectiveness and student satisfaction in mobile classrooms. AMCIS 2008 Proceedings, Toronto, Canada.
  • Chen, X., Siau, K., & Nah, F. (2010). 3-D virtual world education: An empirical comparison with face-to-face classroom. ICIS 2010 Proceedings, Saint Louis, Missouri, USA.
  • Deragon, J. (2009). 5 things you must ask about social media. The Relationship Economy [online], October 19. Retrieved October 23, 2009, from http://www.relationship-economy.com/?p=6947
  • Franceschi, K., Lee, R. M., Zanakis, S. H., & Hinds, D. (2009). Engaging group e-learning in virtual worlds. Journal of Management Information Systems, 26, 73–100. doi:10.2753/MIS0742-1222260104
  • Hevner, A. (2007). A three cycle view of design science research. Scandinavian Journal of Information Systems, 19, 87–92.
  • Hevner, A., March, S., Park, J., & Ram, S. (2004). Design science in information systems research. MIS Quarterly, 28, 75–105.
  • Hustad, E., & Olsen, D. H. (2014). Educating reflective enterprise systems practitioners: A design research study of the iterative building of a teaching framework. Information Systems Journal, 24, 445–473. doi:10.1111/isj.v24.5
  • Kane, G. C., & Fichman, R. G. (2009). The shoemaker’s children: Using Wikis for information systems teaching, research, and publication. MIS Quarterly, 33(1), 1–17.
  • Kirschner, P. A. (2001). Using integrated electronic environments for collaborative teaching/learning. Research Dialogue in Learning and Instruction, 10(1), 1–9. doi:10.1016/S0959-4752(00)00021-9
  • Kumar, P. (2012). Teaching and learning in a virtual world: A pedagogical experimentation using second life. AMCIS 2012 Proceedings, Seattle, Washington, USA.
  • Lattemann, C., & Stieglitz, S. (2012). Challenges for lecturers in virtual worlds. ECIS 2012 Proceedings, Barcelona, Spain.
  • Leidner, D., & Jarvenpaa, S. (1993). The information age confronts education: Case studies on electronic classrooms. Information Systems Research, 4, 24–54. doi:10.1287/isre.4.1.24
  • Leidner, D., & Jarvenpaa, S. (1995). The use of information technology to enhance management school education: A theoretical view. MIS Quarterly, 19, 265–291. doi:10.2307/249596
  • Peffers, K., Tuunanen, T., Rothenberger, M. A., & Chatterjee, S. (2007). A design science research methodology for information systems research. Journal of Management Information Systems, 24, 45–77. doi:10.2753/MIS0742-1222240302
  • Phang, C., & Kankanhalli, A. (2009). How do perceptions of virtual worlds lead to enhanced learning? An empirical investigation. ICIS 2009 Proceedings, Phoenix, Arizona, USA.
  • Reid, C. (2009). Should business embrace social networking? EContent [online], 15 June. Retrieved November 03, 2009, from http://www.econtentmag.com/Articles/Editorial/Feature/Should-Business-Embrace-Social-Networking-54518.htm
  • Schultze, U., Nardi, B., Rennecker, J., Stucky, S., & Hiltz, S. (2007). Using massively multi-member online worlds for work and education. ICIS 2007 Proceedings, Montreal, Quebec, Canada.
  • Soat, J. (2010). 7 questions key to social networking success. Information Week [online], 16 January. Retrieved January 17, 2010, from http://www.informationweek.com/news/internet/social_network/showArticle.jhtml?articleID=222301011
  • Tan, F. T. C., Street, J., Hawthorn, V., & Stockdale, R. (2011). Leveraging emerging web technologies for community engagement project success in higher education. ECIS 2011 Proceedings, Helsinki, Finland.
  • Winter, R. (2008). Design science research in Europe. European Journal of Information Systems, 17, 470–475. doi:10.1057/ejis.2008.44
  • Zhang, X. (2012). Design and evaluation of a socially enhanced classroom blog to promote student learning in higher education. AMCIS 2012 Proceedings, Seattle, Washington, USA.
