Research in Economic Education

Evaluating Twitter and its impact on student learning in principles of economics courses


ABSTRACT

Ever since Becker and Watts (1996) found that economic educators rely heavily on "chalk and talk" as a primary teaching method, economic educators have been seeking new ways to engage students and improve learning outcomes. Recently, the use of social media as a pedagogical tool in economics has received increasing interest. The authors assess students across three different institutions to see if the use of Twitter improves learning outcomes relative to a traditional Learning Management System. Using an experimental design, they find no evidence that the use of Twitter improves students' learning.


Notes

1. See Watts and Schaur (2011, 300) for more detailed survey results, including information about the alternatives to lecturing that instructors report using.

2. See Al-Bahrani, Holder, Patel, and Wooten (2015) for detailed citations and discussion on the use of music, movies, TV shows, blogs, and podcasts in economics classrooms.

3. See Al-Bahrani and Patel (2015) for a more comprehensive description of Twitter and how to use it in an economics course.

4. Based on their survey, Al-Bahrani, Patel, and Sheridan (2015) found that the six most popular social media platforms, based on the number of student users, were Facebook (88 percent), YouTube (83 percent), Instagram (69 percent), Twitter (67 percent), Google+ (36 percent), and Pinterest (35 percent). The percentage of students who access their social media accounts several times per day (78 percent) is much higher than the percentage who access Blackboard multiple times per day (48 percent).

5. In an effort to mitigate any potential bias from differences in teaching methods, considerable effort was made to maintain uniformity across sections at each school: each instructor used the same book(s) for all of his or her sections, gave the same exams, covered the same topics, and weighted assignments equally.

6. Students were made aware that an experiment was being conducted and that information about them could be collected as part of the experiment, but that any identifying information would be known only to their instructor. Students were given the opportunity to opt out of part or all of the study, in accordance with IRB standards.

7. A full list of control variables is available in .

8. The TUCE (Test of Understanding in College Economics) is a standardized test for introductory economics courses at U.S. undergraduate institutions, published by the Council for Economic Education. For this measure, we calculate the number of questions a student answered correctly on the posttest minus the number answered correctly on the pretest (correcting for the different number of questions on the Microeconomics and Macroeconomics TUCE exams). Note that a student who performs worse on the posttest will have a negative score on this measure.
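
As a rough illustration (not from the article), the improvement measure might be computed as below, assuming the correction for exam length is a rescaling of raw counts to percentages of the exam total; the note does not spell out the exact correction, and the function name and example scores are purely illustrative.

    def tuce_improvement(pre_correct, post_correct, n_questions):
        """Posttest minus pretest, with raw counts rescaled to percentages so the
        33-question Micro and 30-question Macro exams are comparable.
        Negative when a student does worse on the posttest than on the pretest."""
        pre_pct = 100 * pre_correct / n_questions
        post_pct = 100 * post_correct / n_questions
        return post_pct - pre_pct

    # Example: 12/33 correct on the Micro pretest and 20/33 on the posttest
    # yields an improvement of roughly 24 percentage points.
    print(tuce_improvement(12, 20, 33))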

9. Emerson and Taylor defined their gap-closing measure as actual improvement divided by the potential gain in score (the difference between the maximum possible score and the pretest score); i.e., (post-TUCE – pre-TUCE)/(33 – pre-TUCE). Because our experiment was conducted in both micro and macro principles courses, we scaled our gap-closing measure to percentages to correct for the different number of questions on each exam; the Microeconomics TUCE has 33 questions, and the Macroeconomics TUCE has 30. Our scaled gap-closing measure is defined as (post-TUCE – pre-TUCE)/(100 – pre-TUCE), where the post- and pre-TUCE scores are not raw scores but percentages.
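
A minimal sketch of this scaled gap-closing measure, following the definition in this note; the function name and example scores are illustrative, not the authors' code.

    def gap_closed(pre_correct, post_correct, n_questions):
        """Share of the remaining possible gain a student actually achieves,
        with pre- and post-TUCE scores expressed as percentages (0-100)."""
        pre_pct = 100 * pre_correct / n_questions
        post_pct = 100 * post_correct / n_questions
        return (post_pct - pre_pct) / (100 - pre_pct)

    # Example: 12/33 correct on the Micro pretest and 20/33 on the posttest
    # closes roughly 38 percent of the remaining gap.
    print(gap_closed(12, 20, 33))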

10. There were 246 students registered at the beginning of the semester: 96 from NKU, 89 from UTM, and 61 from NCC. Out of the initial sample, 17 students opted out of the surveys: 4 from NKU, 7 from UTM, and 6 from NCC. Out of the remaining sample, 6 students withdrew from the course during the semester (3 each from NKU and UTM), and there were 2 incomplete surveys (1 each from NKU and NCC). Lastly, 58 students had missing observations for one or more variables; 34 of these were missing either the pre- or post-TUCE score.

11. At the time of the study, Twitter did not collect engagement data. Twitter has since introduced Twitter Analytics, which tracks how many people engage with each tweet and the type of engagement, such as viewing the tweet or clicking on a link (if one is provided).
