Research Article

The Management of Uncivil and Hateful User Comments in Austrian News Media

Received 28 Jul 2022, Accepted 27 Feb 2023, Published online: 16 Mar 2023

ABSTRACT

Almost every news media outlet in Austria offers a comments section on its website and/or social media pages. A growing problem in these frequently used forums is the posting of uncivil and hateful speech by some users, which news media outlets and their community management must deal with. Based on 39 semi-structured interviews with community managers and journalist-moderators, this study indicates that Austrian news media outlets approach user comments differently on their websites than on their social media channels: providing space for user comments on websites pursues the normative objective of serving the community and promoting the formation and exchange of opinions, whereas comments sections on social media often follow commercial interests. Differences between news media are primarily due to the size of the community management teams and thus depend on human resources.

Introduction

Today, almost every news media outlet in Austria offers an online comments section alongside the articles on its website and/or social media pages. These spaces are “the most prominent form of user participation in journalism” (Wintterlin et al. 2020, 904). Online comments sections are open to everyone to present their opinions and views and to engage in interactive discussions with each other (e.g., Coe, Kenski, and Rains 2014; Wintterlin et al. 2020). In this way, the media provide a space for critical public discussion (Wolfgang 2016).

Most news media employ community managers, who are responsible for moderating the comments sections. However, particularly in news outlets with fewer resources, the journalists themselves can perform the role of community moderators—Wolfgang, McConnell, and Blackburn (2020) refer to them as journalist-moderators.

A growing problem facing news media outlets and their community managers is the use of uncivil, hostile, destructive, and/or hateful speech by some users (Coe, Kenski, and Rains 2014; ECRI 2016; Paasch-Colberg et al. 2018; Quandt 2018; Wintterlin et al. 2020; Ziegele, Naab, and Jost 2019). Uncivil and hateful user comments on the websites and social media pages of news media can be consequential because they can negatively influence readers’ perceptions and interpretations of the information that precedes them, leading to attitude polarization (Anderson et al. 2014), diminishing the credibility of the news article (Naab et al. 2020) and the media brand (Wintterlin et al. 2021), and eroding political trust (Mutz and Reeves 2005). Incivility and hate can even increase fear, spread violent thoughts, and promote behaviors targeting the groups singled out by commenters (Nemes 2002). Managing uncivil and hateful speech is therefore an important endeavor, and knowledge about it, such as the moderation practices of comments sections, can help to contain such speech.

To our knowledge, there are no studies on the management of uncivil and hateful user comments in Austrian news media. Based on interviews with 24 newspapers in 10 countries (not including Austria), Domingo (2011, 93) notes that “there are no universal audience participation practices, workflows or strategies.” With our study, we shed light on the Austrian situation and contribute to the international discussion on the management of uncivil and hateful user comments. Following previous work by Paasch-Colberg et al. (2018), this study examines the management of uncivil and hateful speech in Austrian news media with a focus on moderation practices and the factors influencing them. We conducted in-depth interviews with 36 community managers and 3 journalist-moderators. A key benefit of this study is that it includes a broad range of newspapers, television and radio programs, private and public media outlets, quality and tabloid media, as well as national, regional, and local media, covering a wide spectrum of the Austrian media landscape. This study thereby provides a comprehensive picture of different community management practices in Austria.

Background of the Study

There is no universal consensus on which speech is uncivil or even hateful (Nemes 2002), because it often lies in the eye of the beholder (Herbst 2010). Nevertheless, there are a few common themes when it comes to defining this sort of speech. Uncivil comments feature an unnecessarily disrespectful tone toward other participants or the topic under discussion and do not add anything of substance to the discussion (Coe, Kenski, and Rains 2014). Rossini (2022) differentiates between incivility, intolerance, and hate speech. Incivility refers to tone: people using foul and vulgar language or being rude and harsh. Intolerance involves attacking individuals and groups and lacks moral respect; it is dangerous and democratically harmful (see also Ziegele and Jost 2020), in contrast to online speech that may even encourage participation (i.e., incivility). Hate speech is a subtype of intolerance that aims to humiliate, abuse, or insult groups and their members (Rossini 2022; see also Sponholz 2021). It conveys more destructive and often dangerous notions. According to the European Commission against Racism and Intolerance (2016, 3), hate speech is:

the advocacy, promotion or incitement, in any form, of the denigration, hatred or vilification of a person or group of persons, as well as any harassment, insult, negative stereotyping, stigmatization or threat […] on the ground of ‘race’, colour, descent, national or ethnic origin, age, disability, language, religion or belief, sex, gender, gender identity, sexual orientation and other personal characteristics or status.

Sponholz (2021) emphasizes that hate speech is a highly contested concept: “As a consequence, one cannot assume that the term has always been applied with the same meaning or that authors investigating this kind of group libel are calling their research object by the same name” (54). For social media, for example, Sponholz (2021) differentiates between hateful speech, hate-fueled speech, hatred-inciting speech, and dangerous hate speech (the latter is not further specified). User comments with hateful speech insult or openly threaten people based on a collective characteristic, such as race, color, sex, disability, religion, or sexual orientation. In hate-fueled comments, people are labeled based on a collective characteristic, and hatred-inciting speech occurs when prominent personalities, such as politicians, stigmatize people based on a collective feature.

Uncivil and hateful speech in comments sections is not always related to the article or story to which it is attached. We define speech here as written comments, since discussion in user forums is text-based rather than verbal. For instance, a study of US newspapers showed that online reader comments sometimes include racial terms even when the article does not (Harlow 2015). Quandt (2018) highlighted that the motives and reasons for these forms of user behavior vary notably between actors and can range from apparently irrational random acts to planned and synchronized engagement between users.

Setting

To deepen our understanding of community management, this study focuses on news media in Austria. The Austrian media landscape is characterized by high newspaper circulation and strong public service broadcasting (Hallin and Mancini 2004; see also democratic corporatist media systems). Austria is a so-called newspaper-centric society (Norris 2000), meaning that newspapers are the main source of information in Austria. For instance, the tabloid Kronen Zeitung reaches about 25% of the Austrian population (Media-Analyse 2020) and has one of the highest readership market shares in any democratic country. Austria also has two free newspapers, Heute and Österreich. With a reach of 9.5% (Heute) and 7.5% (Österreich/oe24), they have the third and fourth highest readership levels in Austria after the Kronen Zeitung and the Kleine Zeitung (10.2%) (Media-Analyse 2020). Television remains the key electronic medium in Austria. The public service broadcaster ORF has the widest reach on TV (33.2% market share; providing three of the six Austrian TV news programs) and radio (74% market share; providing four of the 10 main Austrian radio programs). Competition in the print, radio, and television sectors has grown considerably in Austria over the past 20 years since the admission of private radio and television broadcasters in the late 1990s, through the ongoing process of digitalization, and given the proliferation of hybrid and free newspapers (Trappel 2019). Furthermore, Austrian media outlets face competition from the neighboring German-speaking countries Germany and Switzerland. Competition in the media market forces outlets to minimize their costs and maximize their revenues; in this sense, the media are heavily dependent on advertising revenues. Over the years, the internet has increased media concentration, competition, and commercial pressures (Trappel 2019).
However, the “online media market largely mirrors the world of analog media” (Trappel 2019, 220, translated by the authors) and the existing media constitute much of the online public sphere (Trappel 2019). The majority of Austrian online news users access online news via a news website or app (38.7%), followed by social media (29.0%) and by entering the name of a news media website in search engines (28.7%) (Gadringer et al. 2021). Der Standard, Austria's leading quality newspaper, was the first German-language newspaper to go online. Today, it is the newspaper with the most community managers (Brodnig 2016). According to the Reuters Digital News Report in 2019, 18% of Austrians regularly comment on news via social media or websites, slightly below the European average of 23% (Newman et al. 2019).

Austrian law guarantees freedom of expression, with limits; incitement to hatred is punishable. However, many distasteful user comments do not meet the threshold of the incitement-to-hatred clause, because the intent is often missing or ambiguous; hateful comments are frequently not premeditated but posted in anger (Brodnig 2016). In January 2021, after the interviews for this study had been conducted, the Austrian government responded to the increasing prevalence of uncivil and hateful speech online by passing the “Communication Platforms Act” (Kommunikationsplattformen-Gesetz—KoPl-G 2022) and the “Prohibiting Hate Speech on the Net” law (Hass-im-Netz-Bekämpfungsgesetz—HiNBG 2022). The “Communication Platforms Act” expands criminal offenses and claims, making it easier to enforce the law, and creates obligations for communication platforms such as Facebook to delete hateful comments within 24 hours. However, it excludes platforms of media companies in connection with their journalistic offerings (e.g., newspaper forums). The main argument is that media outlets already have guidelines as well as monitoring and reporting processes in place (i.e., self-regulation) and that general law applies. The “Prohibiting Hate Speech on the Net” law, however, does offer a new and simplified injunction procedure for hate postings, including the possibility of immediate enforceability.

Community Management: Its Role and Practices

Community management is part of institutional engagement “aimed at the organization or regulation of the process or content of online discussions” (Ziegele and Jost 2020, 894). Media outlets that offer user forums generally aim at a thriving online community to attract and bind users, i.e., consumers, which is key in times of increasing media competition (e.g., Humprecht, Hellmueller, and Lischka 2020; Kraut and Resnick 2011). For example, as discussed in the previous section, Austria has a strong newspaper market and competition is fierce. Hence, successful community management is part of a media outlet's greater strategic process: it is intentional and goal-oriented. Organizational and financial resources are prioritized along the outlet's fundamental aim of attracting and binding users. Research shows that potential users expect the organization that started the community to be committed to investing in it (Kraut and Resnick 2011). If users believe a media outlet is committed to investing in its community management, its credibility is strengthened (Kraut and Resnick 2011).

Community management calls for good planning and preparation and is (at least to some extent) influenced by factors such as financial and human resources as well as the policies and guidelines of the medium (Brodnig 2016; Ksiazek and Springer 2020; Meier, Kraus, and Michaeler 2018; Paasch-Colberg et al. 2020; Ruiz et al. 2011). A crucial element of a successful user forum culture is moderation (Ziegele and Jost 2020). From an institutional perspective, this involves the (coordination) work done behind the scenes to sustain the community as a whole (Kraut and Resnick 2011). Most news media employ community managers who are responsible for moderating the comments sections. Meier, Kraus, and Michaeler (2018) interviewed 20 people working in audience engagement or community management at media organizations or online platforms in Austria and found that most media outlets have specialists for managing the community. The community management process involves surveying comments sections and assessing individual contributions to decide which comments will be promoted and which will be hidden or deleted, as well as determining whether or not to block users. Community managers thereby serve a modified gatekeeping function (Paasch-Colberg et al. 2020), influencing the content and quality of online discussions. Bruns (2003) refers to community managers as “gatewatchers” who serve as guides and whose task is to evaluate what is reliable and appropriate information. Moreover, community managers sometimes answer questions, provide additional information, and actively encourage users to contribute their opinions and views (Ksiazek and Springer 2020). In doing so, community managers can facilitate activity and cooperation (Grimmelmann 2015). Research shows that activity and cooperation within a forum are predictors of success, because they are indicators of growth (Kraut and Resnick 2011).

In democracies such as Austria, online communities of news media are based on principles of free speech, including the free flow of information and open participation. The governance mechanisms employed by Austrian media increasingly follow deliberative standards (Seethaler 2015), according to which participants need to respect others, including actors featured in news articles, and interact constructively (Grimmelmann 2015; Meier, Kraus, and Michaeler 2018). The aim is to ensure a minimum of discourse quality in comments sections (Grimmelmann 2015; Wintterlin et al. 2020). Coe, Kenski, and Rains (2014, 662) highlight that user comments on news articles can stimulate public deliberation by “representing a wider range of opinions than are featured in the article itself and by providing the possibility of interaction between readers and journalists” (see also Ruiz et al. 2011). The decisions community managers take in their daily work influence “what is seen, what is valued, what is said” (Grimmelmann 2015, 42, 45).

Resources

Successful audience engagement needs sufficient financial and human resources (Brodnig 2016; Ksiazek and Springer 2020), which demands commitment from the media outlet's top management (Meier, Kraus, and Michaeler 2018). In media outlets with fewer resources, journalists are most often charged with taking care of audience engagement, while in resource-strong and often larger media outlets, this role is performed by specialists or a team of community managers (Meier, Kraus, and Michaeler 2018). Notwithstanding this, as shown by Paasch-Colberg et al. (2020), who conducted 20 interviews with community managers in Germany, community management teams often lack human resources, and community managers feel overwhelmed by the quantity and tone of comments (see also Domingo 2011). According to Meier, Kraus, and Michaeler (2018, 1059), community managers as well as journalist-moderators need “sufficient resources at their disposal – ranging from physical space to the usability of the applied tools and services, an atmosphere that encourages participation, up to the granting of a sufficient timeframe to establish and carry out the work.”

Policies and Guidelines

To prevent, de-escalate, and control uncivil and hateful speech, news media often establish commenting policies that dictate appropriate standards of conduct and rules for discourse and behavior (Wolfgang 2016). Policies and/or ethical guidelines regulate user participation in comments sections to guarantee that users’ contributions comply with democratic principles and to help facilitate fruitful discussions (Springer and Kümpel 2018). News media have established policies specifying what is considered acceptable and unacceptable conduct within their comments sections (e.g., Meier, Kraus, and Michaeler 2018; Ruiz et al. 2011). For example, after examining the commenting policies of 21 news corporations in the US, Wolfgang (2016) found that policy statements or netiquettes specifically prohibit sexual, hateful, abusive, threatening, and invasive statements or actions, because this sort of behavior or speech inhibits the creation of a more egalitarian space for all participants. News media try to promote respectful and sincere dialogue to create a space for higher-quality discussions, even though it is often a challenge to “develop a policy statement that expresses both an interest in promoting respectful dialogue, but also includes a defense of robust and passionate political confrontation” (Wolfgang 2016, 777). Transparency surrounding enacted policies and guidelines, as well as moderation procedures, can be a precondition for audience trust and credibility (Ksiazek and Springer 2020; Meier, Kraus, and Michaeler 2018).

Established policies and guidelines as well as resource allocation are among the contingencies affecting the application of various moderation practices. Hence, our first research question (RQ1) asks: Which factors influence how the community management of Austrian news media manages uncivil and hateful user comments?

Moderation Practices

Moderation practices regulate the workflows of the community management (Ruiz et al. 2011) and dictate the management of user comments (Paasch-Colberg et al. 2020; Wintterlin et al. 2020). Frameworks are essential for successful audience engagement processes (Meier, Kraus, and Michaeler 2018), and Paasch-Colberg et al. (2020) note that German news media are eager to establish common approaches and measures for their community management. However, Ksiazek and Springer (2020, 76) note that there is a “lack of consensus of the most effective moderation strategy”, as this is partly influenced by organizational factors such as the editorial line, target groups, brand, and financing (Paasch-Colberg et al. 2020). For example, Paasch-Colberg et al. (2020) show that in Germany it is easier for private media outlets to delete a user comment than it is for fee-financed public service broadcasters. This example shows that national laws as well as societal norms and values may influence moderation practices.

Common moderation practices include closing comments sections for sensitive topics to avoid hateful comments, pre-moderation, in which comments are assessed for their appropriateness prior to publication, and post-moderation, the screening of comments after publication (e.g., Domingo 2011; Frischlich, Boberg, and Quandt 2019; Ksiazek and Springer 2020; Paasch-Colberg et al. 2020; Springer and Kümpel 2018; Wintterlin et al. 2020). Some media even outsource their community management, for example, to keep control “over the legal validity of contributions” (Domingo 2011, 81; see also Springer and Kümpel 2018).

Pre-moderation is often used in countries in which most hate speech is illegal (Singer 2011; Springer and Kümpel 2018). Studies for the US, on the other hand, show that user comments tend to be post-moderated to avoid accusations of joint responsibility (Domingo 2011; Springer and Kümpel 2018). Ruiz et al. (2011) indicate that pre-moderation helps to eradicate insults in comments, yet the application of different moderation practices does not seem to influence the quality of comments in a clear pattern.

The various levels of engagement between community managers and users can be distinguished along a spectrum from non-interactive to interactive moderation. Non-interactive moderation, described by Ksiazek and Springer (2020) as “policing”, happens largely “behind the scenes” and therefore remains reactive and hidden (Wolfgang, McConnell, and Blackburn 2020). In this uni-directional moderation practice, uncivil and hateful comments are either never published (pre-moderation) or later removed (post-moderation). According to Frischlich, Boberg, and Quandt (2019) and Wintterlin et al. (2020), a non-interactive moderation style is comparatively authoritative and establishes a hierarchy between moderators and users. Blocking comments as well as admonishing, sanctioning, and excluding users is much more common in this approach than in interactive moderation practices (see also Neuberger, Langenohl, and Nuernbergk 2014). However, this limits users’ freedom of expression (Ksiazek and Springer 2020). Previous studies report that non-interactive, authoritative moderation routines are more widespread than the alternative (Frischlich, Boberg, and Quandt 2019; Neuberger, Langenohl, and Nuernbergk 2014).

Interactive moderation involves community managers engaging in bi-directional communication with users (Frischlich, Boberg, and Quandt 2019). Moderators follow a deliberative perspective and “enter an eye-level dialogue” (Wintterlin et al. 2020, 4) with users, while giving them room to exchange their opinions and stances. Moderators are present and sometimes even answer or ask questions, introduce new topics, or provide additional information (Ksiazek and Springer 2020). This is particularly the case when journalist-moderators engage with users. Journalist-moderators foster a deliberative discussion atmosphere, stimulating users’ willingness to engage and thereby decreasing incivility, because users trust individuals more than organizational actors (Ksiazek and Springer 2020). However, moderating leaves journalists less time for their main job. Interactive moderation styles are not standard and, when present, are more often found in left-leaning than right-leaning media (Frischlich, Boberg, and Quandt 2019; Wintterlin et al. 2020).

To manage the growing volume of user comments, media outlets increasingly make use of automated solutions, i.e., introducing “new algorithmic gatekeepers in the moderation process” that support human moderators (Ksiazek and Springer 2020, 9; Meier, Kraus, and Michaeler 2018). Following Bruns's (2003) gatewatcher concept, we can refer to this as “automated gatewatching”. Comments are filtered based on pre-defined rules, and comments identified as hateful are flagged or hidden. However, the final say about flagging or hiding a particular comment resides exclusively with human moderators.
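Rule-based filtering of this kind can be illustrated with a minimal sketch. The blocklist, threshold, and function name below are purely hypothetical; production systems rely on curated word lists, machine-learning classifiers, and human review rather than simple heuristics like these.

```python
# Illustrative sketch of rule-based comment filtering ("automated gatewatching").
# BLOCKLIST, CAPS_THRESHOLD, and flag_for_review are hypothetical names.

BLOCKLIST = {"idiot", "scum"}   # illustrative flag words only
CAPS_THRESHOLD = 0.7            # share of upper-case letters suggesting shouting


def flag_for_review(comment: str) -> bool:
    """Return True if a comment should be queued for a human moderator."""
    # Rule 1: match against a pre-defined word blocklist.
    words = [w.strip(".,!?;:") for w in comment.lower().split()]
    if any(w in BLOCKLIST for w in words):
        return True
    # Rule 2: mostly upper-case text, a common "shouting" heuristic.
    letters = [c for c in comment if c.isalpha()]
    if len(letters) > 20:
        caps_ratio = sum(c.isupper() for c in letters) / len(letters)
        if caps_ratio > CAPS_THRESHOLD:
            return True
    return False
```

Note that such a filter only queues comments for review; consistent with the practice described above, the final decision to hide or delete a comment remains with a human moderator.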

Following the discussion above and informed by previous work by Paasch-Colberg et al. (2018), our second and third research questions ask:

RQ2: What moderation practices are used by Austrian media outlets to prevent, control, and de-escalate uncivil and hateful speech?

RQ3: What are the (greatest) challenges for community managers in their daily work of dealing with uncivil and hateful comments?

Uncivil and hateful speech is a challenge for media outlets because its presence may lower the perceived credibility of the media brand among users (Wintterlin et al. 2021). Some authors have recognized that comments sections, in the absence of incivility, may even build brand value and strengthen users' brand loyalty (Brodnig 2016; Meier, Kraus, and Michaeler 2018; Vujnovic 2011). However, an often-overlooked consideration is that media outlets sometimes understand audience engagement merely as a way to generate more website traffic (Meier, Kraus, and Michaeler 2018; Vujnovic 2011). Brodnig (2016) highlights that this is particularly valid for the Austrian media landscape. In such cases, user involvement is understood as a way to beat the competition, a dynamic that is more pronounced in smaller markets, as shown by a study of 24 online newspapers in 10 countries (Vujnovic 2011). Media outlets in general, and this applies to both private and public media in Austria, are highly dependent on advertising revenues and thus on reach and user traffic (e.g., page impressions) (Brodnig 2016; Kaltenbrunner et al. 2020; Meier, Kraus, and Michaeler 2018). Based on a recent survey among Austrian journalists, Kaltenbrunner et al. (2020) note that intensive contact with users and systematic community building are both editorial-journalistic and business-economic goals.

Research Design

Sample

The selection of interview partners took the current situation of the Austrian media landscape into account; however, we were not able to accommodate all potentially relevant aspects, as we were dependent on the willingness of community managers to take part in our study. Nevertheless, the 39 interviews conducted in spring 2019, including 36 community managers and 3 journalist-moderators of Austrian news media, represent a broad range of the Austrian news media landscape.

Among the sample were 31 community managers and journalist-moderators from print media. These represent two quality newspapers (Die Presse, Der Standard), two mid-market newspapers (Kurier, Kleine Zeitung), a tabloid paper (Kronen Zeitung), and two free tabloid media (Heute, Österreich/oe24). We included local and regional print news media (bz-Wiener Bezirkszeitung, Niederösterreichische Nachrichten (NÖN)), two investigative online news magazines (Addendum, Dossier), and weekly and monthly news media outlets (Falter, Profil, Das Biber, Forbes) with various political leanings. Furthermore, we included six community managers from TV programs, from both the public service broadcaster ORF and the private Servus TV, as well as two community managers from radio stations, namely ORF Ö3, the station with the widest reach, and the niche channel FM4. Table 1 provides an overview.

Table 1. Overview of the sample (N = 39).

The number of interviews per news media outlet varies according to the size of the community management teams. For instance, we interviewed eight of the fifteen community managers of the nationally oriented quality paper Der Standard and seven community managers working for the TV and radio channels of the public service broadcaster ORF in order to include different ORF user forums. In contrast, we interviewed only one journalist-moderator from the local bz-Wiener Bezirkszeitung, because this individual is solely in charge of the paper's community management.

Semi-structured Interviews

To gain an in-depth understanding of uncivil and hateful speech and its management in Austrian news media, we conducted personal semi-structured qualitative interviews. Qualitative interviewing is about encouraging the interviewed participants to reveal detailed information about their environment, behavior and activities, beliefs and views, symbolic constructions, and/or relevance systems (Brinkmann 2014).

The semi-structured interview guide developed by the authors includes questions used in previous interview guidelines (Paasch-Colberg et al. 2018) and new questions developed on the basis of many of the studies cited above. It was guided by our three research questions. The pilot-tested guide included questions about the organization of the community management (e.g., size of the team); strategies, practices, and operative processes to de-escalate and control uncivil and hateful speech (e.g., moderation processes, tools, netiquettes, blocking of forums for sensitive topics, changes over time); criteria applied to distinguish uncivil and hateful speech within comments; topics and targets of hate speech; as well as challenges (e.g., the extent to which interviewees were professionally or privately exposed to hate speech, trolls). The full interview guide and the data are available from the authors on request.

All interviews were conducted in person and lasted between 20 and approximately 45 minutes. The interviews were audio-recorded, transcribed in full, and qualitatively analyzed using MAXQDA software. Qualitative content analysis allows categories to emerge inductively; these categories are the centerpiece of qualitative content analysis (Schreier 2014). To identify and conceptualize the relevant content of our interviews, we followed Mayring’s (1991) approach, a structured, systematic, category-oriented, and intersubjective process to reduce, summarize, and organize texts. The coding unit represents the smallest text unit, which can be individual words, parts of sentences, or entire sentences. The context unit defines how much context may be taken into consideration, which in our study is the entire interview with a person (Mayring 1991). In this process of analysis, both authors first analyzed the same set of 13 interviews. Ambiguities and different interpretations were discussed after each interview, followed by a joint discussion of the categories that emerged from the material. By working through the transcribed interviews, the categories were expanded step by step. Of course, this qualitative process of category formation, with the assignment of text parts to categories, represents to a certain degree an act of interpretation. This led to an initial categorization system. Second, we divided the remaining interviews for coding. The coding process was accompanied by ongoing discussions to resolve ambiguities, discuss new categories, and add these to our categorization system. Lastly, having finalized the categorization system, we went through all of the interviews again and looked for similarities and differences between the media outlets. The authors translated all quotes presented in the following sections from German into English.

Managing the Community

Austrian news media report that they have been challenged by increasing levels of uncivil and hateful speech in their communities over the past years. A few community managers describe Europe's refugee crisis in 2015 as a critical turning point, as well as the moment in 2017 when the former coalition government of the Austrian People's Party (ÖVP) and the populist right-wing Freedom Party of Austria (FPÖ) came to power. A community manager of the mid-market newspaper Kurier points out: “Ordinary language had begun to radicalize itself. (…) Discussions very quickly reached a high emotional level (…) and issues were often presented as just black or white, good or bad, dead or alive, and in or out.” Community managers did not experience this level of radicalization and polarization ten or even 25 years ago. For example, the interview partner of the Kleine Zeitung explained that about 10% of the comments in the forums of the regional newspaper had to be deleted in the past and that, over time, this number has risen to 15% and sometimes even 20%.

In response to our first research question, which investigates factors that influence the management of uncivil and hateful user comments, interviewees indicated multiple times that financial and human resources as well as policies and guidelines—i.e., strategic institutional decisions—have a fundamental impact on their daily work.

Resources

The assignment of financial and human resources is a precondition for successful audience engagement. With around 15 team members, the national-oriented newspaper Der Standard has the largest community management team in Austria. The community manager of Der Standard reported that through “their successful community management”, the newspaper has been able to decrease the amount of uncivil and hateful comments in the user forums. In 2010, about 10% of user comments had to be deleted, but the rate has since decreased to about 4%. According to Der Standard, the average rate for similar media outlets is around 12%.

The interview data revealed that, generally, national-oriented newspapers with a wider reach provide more resources for their community management. For instance, Der Standard currently has around 60,000 registered users and attracts about 10 million postings per year. Austria's newspaper with the widest reach, the tabloid paper Kronen Zeitung, as well as oe24, whose news media outlet publishes a national-oriented tabloid paper (Österreich) and runs a news channel, also have relatively large teams of about 10–12 people to run their comments sections on their websites and on social media. News media with a wider reach generally provide forums on their websites and on social media. The exception is the public broadcaster ORF, which has no comments sections on its website. However, ORF still employs more than ten people to manage the social media forums of its various radio and TV channels. Smaller news outlets such as the private TV station ServusTV, the magazine Forbes, and the local newspaper bz-Wiener Bezirkszeitung focus instead on external social media platforms. Their number of community managers ranges from one to fewer than five people. Especially in cases where community management is in the hands of a single person, it is often the responsibility of a journalist-moderator who is supported by the media's digital editorial staff. Frequently, journalists are charged with keeping an eye on the comments attached to their own articles.

Policies and Guidelines: “Netiquette”

Policies and guidelines, most often called netiquette in Austria, are also fundamental for the successful management of a community. Besides requirements not to insult or threaten others, these can forbid engagement in party propaganda or calls for demonstrations and demand respect for copyright and personal rights. Netiquette guidelines are available to users on the news media's websites and social media pages, and users have to accept them. For example, Der Standard always links to its community rules “whenever one wants to post.” However, our interviewees note that users seldom read the netiquette.

The findings show that existing resources and established policies and guidelines influence moderation practices, which are the focus of our second research question: what moderation practices do Austrian media outlets use to prevent, control, and de-escalate uncivil and hateful speech?

Moderation Practices

“You can't just let it run,” as a community manager of Austria's widest-reaching newspaper, the Kronen Zeitung, put it. All interviewees emphasized that moderation is essential to ensure and support a fruitful discussion. The interviews reveal several moderation practices, which often go hand in hand. Practices by Austrian news media to prevent, control, and de-escalate uncivil and hateful speech include the full range of in-house moderation, from closing comments sections for sensitive topics to pre- and post-moderation. None of the interviewees reported outsourcing their community management. Differences in moderation practices across media outlets are primarily due to the differing numbers of community managers.

At the beginning of the interviews, we asked interviewees what constitutes incivility and hate speech from their perspective. Austrian news media indeed have a similar understanding of the limits of what sort of speech is acceptable and what is not. Their descriptions of hate speech are much in line with the definition given by the European Commission (see above). The community manager of Das Biber put into words the sentiments of many: “I think it's common sense” what is acceptable and what is not. Unacceptable comments include those that are “offensive”, “insulting”, “defamatory”, and “attacking”, “sexist” and “racist” comments, comments against “ethnic groups” and “religion”, as well as those “full of rage” and “violating criminal or other legal regulations”. However, as these examples show, multiple interviewees indicated that the exact demarcation between uncivil and hateful speech is often blurred and a gray area is present. For example, supporters of a right-wing party might perceive a comment in which the party is accused of violating human dignity as hate speech. However, a community manager of the mid-market newspaper Kurier, for example, mentioned that a comment would have to call the party a “criminal terrorist party” to actually count as hate speech. In cases where community managers hold different views on what is appropriate and what is inappropriate, and hence what meets the news media's standards and what does not, those working in teams seek resolution through discussion. They work according to the “dual control principle”, as one interviewee pointed out.

Closing Comments Sections

There are certain sensitive topics which tend to draw the majority of uncivil and hateful comments in Austria. The interviewees repeatedly mentioned immigration, religion (in particular Islam), and parties and politicians. Singer (Citation2011) shows similar results for Germany, the UK, and Israel. Multiple interviewees indicated that on their websites they simply deactivate the comments sections for such sensitive topics; this is in line with findings from Germany (e.g., Paasch-Colberg et al. Citation2020). Frequently, news media with fewer than a handful of community managers use the option of simply closing a comments section, but two of the larger national-oriented newspapers, the quality paper Die Presse and the mid-market paper Kurier, also mentioned this practice. In news media with larger teams, the editor often informs the community managers before sensitive content goes online.

On social media, where the content for posting is specifically selected, the decision on whether or not to close comments sections does not arise, even though user posts are often more problematic there than in the forums on media websites. As the Dossier reports, “nowhere else are people as poisonous as on Twitter and very often [users on Twitter] use their real identity including their profession.” A professional routine has been established for social media in which journalists and community managers work together, as explained by a community manager of the public broadcaster ORF: “The newsroom flags a news article [posting for the website] with ‘not for Facebook', because they know that our capacities cannot cope with it at the moment.”

Pre-moderation

The interviewed community managers indicated that, as far as possible and mostly dependent on human resource availability, they endeavor to read all comments. They do not fully rely on automated community management systems to prevent uncivil and hateful speech.

Comments determined to be uncivil and hateful are deleted and users are blocked in cases of massive violations. Blocked users can comment, but their comments remain hidden until released. Surprisingly, some blocked users do not even notice this fact, as described by a community manager of Der Standard: “They continue writing and do not notice that their comments do not appear.” Usually, however, users who violate the standards of conduct first receive a warning including a reminder of the netiquette guidelines before being blocked. If users ask, community managers explain and discuss the reasons for prohibiting certain comments and/or blocking users.

Users are becoming ever more conscious of where the line is drawn. Phrases such as “I am not allowed to write what I think now, otherwise I will be blocked” (ServusTV) have become more frequent over time. Community managers indicated that blocked users are sometimes a great challenge, because “ten minutes later, you find the same phrases or the same monologues just under a new username or a newly created account” (Kurier). Blocked users often argue for their right to freedom of expression.

Community Management Systems

One component of pre-moderation practices is the use of community management systems such as swat.io and Forumat. The majority of Austrian news media outlets use these algorithmic filtering tools, which search comments for predefined words and phrases and stop flagged comments from being published. Comments identified as problematic before becoming publicly available are reviewed by a community manager and must be manually released should they be deemed acceptable. In addition, users who stand out can be flagged without their knowledge, which helps to locate and control their comments. A community manager of the Kleine Zeitung said: “In the case of massive forum violations, there is the stop status. It means that the user can post, but the comment only appears after it has been manually activated.”
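
To make the mechanism concrete, the workflow the interviewees describe (keyword filtering, a hold queue for manual release, and a per-user “stop status”) can be sketched as follows. This is an illustrative toy in Python, not the actual swat.io or Forumat implementation; the class name, blocklist terms, and usernames are hypothetical.

```python
import re
from dataclasses import dataclass, field

# Hypothetical placeholder terms; real teams maintain curated lists of
# words and phrases (often in German) that trigger a hold.
BLOCKLIST = {"insult", "slur"}

@dataclass
class ModerationQueue:
    published: list = field(default_factory=list)
    held: list = field(default_factory=list)         # hidden until manually released
    stopped_users: set = field(default_factory=set)  # users under "stop status"

    def submit(self, user: str, text: str) -> str:
        """Pre-moderate a comment: publish it, or hold it for review."""
        words = set(re.findall(r"\w+", text.lower()))
        if user in self.stopped_users or words & BLOCKLIST:
            self.held.append((user, text))  # flagged: awaits a community manager
            return "held"
        self.published.append((user, text))
        return "published"

    def release(self, index: int) -> None:
        """A community manager reviews a held comment and manually publishes it."""
        self.published.append(self.held.pop(index))

q = ModerationQueue(stopped_users={"troll42"})
print(q.submit("alice", "Great article!"))    # → published
print(q.submit("troll42", "Great article!"))  # → held (user has stop status)
```

The key design point mirrors the interviews: the filter never deletes on its own; it only withholds publication until a human decides, which is why the practice still depends heavily on staff resources.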

A great challenge mentioned by community managers is that users who spread uncivil and hateful speech are becoming ever more adept at bypassing filters to prevent their comments from being blocked (and hence, probably deleted). A community manager of the newspaper Die Presse suggested that “sometimes comments have to be read two or three times to discern what is behind a statement.” In such cases, the “dual control principle” is frequently applied.

Post-moderation

The post-moderation practices of Austrian news media outlets can be summarized under two main styles: “interactive moderation” and “staying-on-the-sideline”.

Interactive moderation is about engaging in bi-directional communication with users (Frischlich, Boberg, and Quandt Citation2019). The majority of news media applying interactive moderation practices described it as “entering into a dialogue with the readers” (Kleine Zeitung) or as a way to “enter into a discourse with users” (Der Standard). By showing presence and actively engaging, community managers aim to foster a rational and civilized discussion culture. This interactive post-moderation practice seems to influence the direction of discussions and their quality, at least to some extent. Interviewees report from experience that users are more “reserved with hatred” (e.g., Krone) when community managers actively engage with the audience.

Additionally, community managers of regional and local newspapers mentioned that users sometimes directly address the community management via email when they are looking for additional information and, on rare occasions, community managers talk with users on the phone or even meet them face-to-face. Der Standard, the media outlet with the highest number of staff, sometimes invites users in person to discuss their inappropriate behavior.

When news media follow “staying-on-the-sideline” moderation practices, community managers keep out of public discussions. As the community manager of a radio channel of the public service broadcaster ORF put it, “our task is to be present in the background and ideally not be noticed” by the community. The practice is to intervene surreptitiously when uncivil and hateful comments are posted, although this applies to pre-moderation as well. Moderation happens rather reactively and remains hidden, as described by Wolfgang, McConnell, and Blackburn (Citation2020) for the US as well as by Frischlich, Boberg, and Quandt (Citation2019) and Wintterlin et al. (Citation2020) for Germany. Thus, such non-interactive moderation practices can be described as more authoritative than interactive moderation practices. Studies in Germany report that non-interactive, authoritative moderation routines are more widespread (Frischlich, Boberg, and Quandt Citation2019; Neuberger, Langenohl, and Nuernbergk Citation2014). This echoes our findings in the sense that the interviews reveal that news media with a single community manager, or fewer than a handful of community managers, most often follow the “staying-on-the-sideline” moderation routine, and such smaller community management teams are more widespread in Austria. This might even be the case for some of the “larger” news media such as the national-oriented newspaper Die Presse. Hence, moderation practices are largely determined by human resources.

Challenges

The third research question addressed the challenges community managers face in their daily work when dealing with uncivil and hateful comments. The interviewed community managers and journalists felt that the prevalence of trolls and paid users posting for a specific political party has increased over time. Both are often detected when several comments are posted on a single topic within a very short timeframe. Das Biber mentioned that “some long-established” trolls and haters are known by their writing style, their choice of words, and their punctuation. Here, highly polarizing comments referring to Russia are striking, regardless of whether they adopt a pro or con stance (e.g., Das Biber, Die Presse, Kleine Zeitung). However, it is not only Russian troll factories that try to exert political influence. Both right-wing and left-wing groups also “hijack articles”, as a community manager of the Kleine Zeitung put it: “It's all there.” Most often, trolls and paid users are detected during pre-moderation. Failing this, their posting behavior generally catches the eye of community managers in the post-moderation phase.

Another challenge interviewees mentioned was personal attacks. Personal attacks against journalists in particular have increased over time; this is, however, less prevalent against community managers. Wintterlin et al. (Citation2021) show similar findings for Germany. A community manager of the (former) Austrian journalistic research platform Addendum put into words the sentiments of many: “When a topic is presented that the readers do not find ok, then the journalists are relatively quickly and very personally attacked.” This went as far as readers editing photos of a journalist-moderator of the local newspaper bz-Wiener Bezirkszeitung and distributing them in various photo groups, which led to a police report. Hence, this goes beyond post-moderation. The interviewee from Das Biber mentioned that she had to file a police report after insults and calls for violence against her person on the media's social media page as well as on her private social media account and through emails. The interviews reveal that women journalists seem to be targeted by uncivil and hateful speech more often than their male counterparts. “You would not write any rape threats to a man,” mentioned the interviewee from Das Biber.

The findings show that, in general, the interviewed news media primarily make the normative claim of serving the community and promoting the formation and exchange of opinions by providing online comments sections to their articles on their websites. On their social media pages, however, commercial interests often predominate. Social media is used for marketing and PR purposes by driving website traffic as well as promoting the brand and new products. Multiple interviewees indicated that social media is used to “drive traffic to the website” (e.g., Der Standard, Kleine Zeitung) and to the “provided TV content” (ServusTV), to “reach new readers” (e.g., oe24), to “promote the brand” or “the product”, and “advertise new products” (e.g., ORF, Falter). It is still important, however, to build a community and strengthen the bond with users to achieve each of these commercial goals. Forums on both websites and on social media therefore need to “define a benefit for the target group in order for a forum to be successful” (ServusTV).

Discussion and Conclusions

A key benefit of this study is that it examines the community management of a broad range of news media outlets in Austria. By not only focusing on newspapers, as the majority of previous studies in the field have done (Domingo Citation2011; Frischlich, Boberg, and Quandt Citation2019; Wintterlin et al. Citation2020, Citation2021), this study provides a more comprehensive picture of the news media's management of uncivil and hateful speech. With a focus on the Austrian media landscape, it widens the focus of systematic academic research in this field of digital journalism. Analyzing comments on Facebook pages of news organizations in Germany and the US, Humprecht, Hellmueller, and Lischka (Citation2020) show that there are country-specific online discussion cultures; reacting to these partly leads to different approaches in community management. In this context, it is important to mention that in practice (at least in Austria) an exact demarcation between uncivil and hateful speech is often difficult. Contrary to the theoretical discussion (e.g., Sponholz Citation2021) and empirical findings based on content analysis of user forums (e.g., Rossini Citation2022 for Brazil; see also the background section), practitioners emphasized that it is rather a gray area. Rossini (Citation2022) highlights that incivility “should not be treated as inherently toxic” (417) as it “may help citizens express their views and stand out among the crowd” (415). However, from an institutional strategic perspective, user forums are run to attract and bind users; Austrian media outlets have therefore employed deliberative standards (Seethaler Citation2015) to foster activity and cooperation, which are indicators of growth (Kraut and Resnick Citation2011). Too many negative comments can affect the credibility of a media outlet (Humprecht, Hellmueller, and Lischka Citation2020). Moreover, following Humprecht, Hellmueller, and Lischka's (Citation2020) argument that hostile comments trigger more hostile comments, a media outlet aims to counteract incivility in its beginnings, before hate speech spreads across a user forum. In practice, the line between incivility and hate speech is often very thin.

In Austria, differences between media outlets largely result from human, and thus financial, resource constraints. It matters whether or not community management is part of the larger institutional strategy. Human resources are a vital factor in community management processes: they influence whether a news media outlet provides spaces for audience engagement on its website and on social media, or only on the latter. Media with smaller community management teams or journalist-moderators most often offer public spaces for discussion only on social media. This highlights yet another difference between media types, as TV programs only offer comments sections on social media.

The interviews reveal that this applies to both the private TV channel and the public service broadcaster, which has the widest reach across both TV and radio in Austria (an online search confirms this pattern also applies to the other two Austrian TV stations, Puls4 and ATV). Notably, after news websites, social media is the second most important access point to news for Austrians (Gadringer et al. Citation2021).

Considering the different media formats, newspapers have the largest community management teams. Newspapers are the main source of information in Austria, and their frequent use likely explains why newspapers place greater focus on user forums. News media with a wider geographical reach generally have larger community management teams, such as the national-oriented Der Standard (about 15 team members), Krone (about 12 team members), and oe24 and the ORF (about 10 team members each). However, this pattern is not universal for national-oriented newspapers, as shown by Kurier and Die Presse. Thus, top management's willingness to invest in, and interest in, professional community management is an important factor in the provision of human resources for community management.

As shown by previous studies for countries such as Germany (Paasch-Colberg et al. Citation2020; Wintterlin et al. Citation2020) and the US (Wolfgang Citation2016), policies and guidelines are applied as standard across Austrian news media, showing a willingness among outlets to take responsibility for their comments sections (Ruiz et al. Citation2011). Automatic filtering tools are an essential part of pre-moderation, helping to detect the growing number of trolls and paid users. Nevertheless, human moderation remains key for managing audience engagement (see also Wolfgang, McConnell, and Blackburn Citation2020). This high demand for human resources is particularly relevant for interactive post-moderation when, for example, moderators enter into discussions to de-escalate the situation instead of just deleting inappropriate content. However, even low-resourced media outlets have moderation practices in place. Structural factors such as type of financing or orientation, including tabloid versus quality media, left-leaning versus right-leaning media, or national, regional, and local orientation, are less decisive for moderation practices. Practices applied by Austrian news media to prevent, control, and de-escalate uncivil and hateful speech include the full range of in-house moderation, from closing comments sections for sensitive topics to pre- and post-moderation.

Pre-moderation practices in Austria are similar to those described in studies of other countries (Paasch-Colberg et al. Citation2020; Ruiz et al. Citation2011; Wintterlin et al. Citation2020). There is also a trend towards more “automated gatewatching” (Bruns Citation2003). A few community managers particularly emphasized that final decisions about automatically detected comments require reading them again and, in specific cases, discussing them with others—the “dual control principle”. This indicates that community managers take their gatewatching function (Bruns Citation2003) seriously.

Post-moderation practices follow two main styles and, again, are similar to those described in studies of other countries (e.g., Ksiazek and Springer Citation2020) such as Germany (Frischlich, Boberg, and Quandt Citation2019; Wintterlin et al. Citation2020) and the US (Wolfgang, McConnell, and Blackburn Citation2020). News media with smaller community management teams in particular tend to follow “staying-on-the-sideline” moderation practices, intervening only when dealing with uncivil and hateful comments. Other news media have their community managers actively engage with users. “Interactive moderation” can positively stimulate a fruitful exchange of opinions and views. Active community management is necessary to ensure rational and civilized discussion, and it fosters a media outlet's credibility, helping to attract and bind users. But, just like in other countries, uncivil and hateful speech has been growing in Austria recently, and users are becoming ever more adept at bypassing automated filtering. Singer (Citation2011, 136) believes that “the quality of contributions will increase only when users feel that they are part of a community that is not just a trendy label but a real (even if virtual) entity.” Studies have shown that when news media directly appeal to users’ sense of personal responsibility, users are more motivated and willing to engage against uncivil comments (Ziegele, Naab, and Jost Citation2019). Some Austrian news media involve users by giving them the possibility to report (flag) other user comments (see, for example, Burger Citation2020).

However, this study reveals that some user forums, particularly those on social media, serve largely business-economic goals (Kaltenbrunner et al. Citation2020). This is in line with previous studies (Brodnig Citation2016; Kaltenbrunner et al. Citation2020; Meier, Kraus, and Michaeler Citation2018; Vujnovic Citation2011). The competition in the print, radio, and television sectors has grown considerably in Austria over the past 20 years (Trappel Citation2019) and forces media outlets to maximize their revenues. Thus, website traffic is important for generating advertising revenues. Future research should explore the influence of business-economic goals on community management processes further, in particular by conducting cross-country comparisons.

This study has some limitations that need to be addressed. Only Austrian community managers were interviewed. Therefore, the findings are based on self-reflections and the results may not be generalizable to other countries with different media systems, policy regulations and laws, or cultures. For instance, most hate speech is legal in the US (Singer Citation2011). Meanwhile, a recent study from Germany found differences in moderation practices between left-leaning and right-leaning media, whereby the former was more engaging (interactive) than the latter (Wintterlin et al. Citation2020), yet the interviews in this study did not detect such a pattern for the news media in Austria. This could be due to the fact that the Austrian media market is too small for such a comparison. Hence, further research in this field is needed, particularly as policy regulations and laws relating to comments sections on the websites of media outlets remain absent in many countries. Furthermore, this study did not include the user perspective. By analyzing comments sections, future research could investigate the execution of deliberative practices and whether these are successful from a normative perspective. By interviewing users of comments sections, future studies could explore the views of users who post hateful comments and of those who just come across them, with regard to their experiences with community managers and the relevance of their work.

Despite the problems outlined, forums of news media are important because they can help to “accomplish the democratic normative goal of creating spaces for public discourse” (Wolfgang, McConnell, and Blackburn Citation2020, 2). A democratic debate is based on a minimum of respect, and targets of hate speech deserve protection (Brodnig Citation2016), which means that uncivil and hateful speech must be taken seriously. For community managers, this often implies walking a tightrope between freedom of speech and censorship. Nemes (Citation2002) states that the freedom of speech principle precludes censorship of opinion and that more speech is often the best antidote to uncivil, hostile, destructive, and hateful speech. However, when this sort of speech disrupts and destroys critical public discussion, news media are duty bound to ensure a minimum of quality and, particularly, to protect those targeted. With the new “Prohibiting Hate Speech on the Net” law, the Austrian government has taken a major step toward rigor and responsiveness in tackling hate speech, but it is still the responsibility of news media to hide incivility and hate in the first place. Community managers should always be aware of the fact that their communities have a great reach and can influence public opinion (e.g., Wintterlin et al. Citation2020).

Disclosure Statement

No potential conflict of interest was reported by the author(s).

Notes

1 We thank the students of the course "Methods and Practice" at the FHWien der WKW University of Applied Sciences of Management & Communication for their help with conducting the interviews.

References

  • Anderson, A. A., D. Brossard, D. A. Scheufele, M. A. Xenos, and P. Ladwig. 2014. “The ‘Nasty Effect:’ Online Incivility and Risk Perceptions of Emerging Technologies.” Journal of Computer-Mediated Communication 19 (3): 373–387. doi:10.1111/jcc4.12009.
  • Brinkmann, S. 2014. “Unstructured and Semi-Structured Interviewing.” In The Oxford Handbook of Qualitative Research, edited by P. Leavy, 277–299. Oxford: Oxford University Press.
  • Brodnig, I. 2016. Hass im Netz: Was wir gegen Hetze, Mobbing und Lügen tun können [Hate on the Web: What We Can Do Against Incitement, Bullying and Lies]. Brandstätter.
  • Bruns, A. 2003. “Gatewatching, not Gatekeeping: Collaborative Online News.” Media International Australia 107 (1): 31–44. doi:10.1177/1329878X0310700106.
  • Burger, C. 2020. “Mehr Transparenz bei eigenen Postings [More Transparency for Own Postings].” https://www.derstandard.at/story/2000122057409/mehr-transparenz-bei-eigenen-postings.
  • Coe, K., K. Kenski, and S. Rains. 2014. “Online and Uncivil? Patterns and Determinants of Incivility in Newspaper Website Comments.” Journal of Communication 64 (4): 658–679. doi:10.1111/jcom.12104.
  • Domingo, D. 2011. “Managing Audience Participation.” In Participatory Journalism: Guarding Open Gates at Online Newspapers, edited by J. B. Singer, A. Hermida, D. Domingo, A. Heinonen, S. Paulussen, T. Quandt, Z. Reich, and M. Vujnovic, 76–95. Malden: Wiley-Blackwell.
  • European Commission Against Racism and Intolerance. 2016. “General Policy Recommendation No. 15 on Combating Hate Speech.” https://book.coe.int/eur/en/humanrights-and-democracy/7180-pdf-ecri-general-policy-recommendations-no-15-oncombating-hate-speech.html.
  • Frischlich, L., S. Boberg, and T. Quandt. 2019. “Comment Sections as Targets of Dark Participation? Journalists’ Evaluation and Moderation of Deviant User Comments.” Journalism Studies 20 (14): 2014–2033. doi:10.1080/1461670X.2018.1556320.
  • Gadringer, S., R. Holzinger, S. Sparviero, J. Trappel, and C. Schwarz. 2021. “Reuters Digital News Report 2021.” Network Austria. http://www.digitalnewsreport.at/.
  • Grimmelmann, J. 2015. “The Virtues of Moderation.” The Yale Journal of Law & Technology 17 (1): 42–109.
  • Hallin, D. C., and P. Mancini. 2004. Comparing Media Systems: Three Models of Media and Politics. Cambridge: Cambridge University Press.
  • Harlow, S. 2015. “Story-chatterers Stirring up Hate: Racist Discourse in Reader Comments on U.S. Newspaper Websites.” Howard Journal of Communications 26 (1): 21–42. doi:10.1080/10646175.2014.984795.
  • Hass-im-Netz-Bekämpfungsgesetz—HiNBG. 2022. “Hass-im-Netz-Bekämpfungs-Gesetz – HiNBG [Hate on the Net Law].” https://www.parlament.gv.at/PAKT/VHG/XXVII/I/I_00481/index.shtml.
  • Herbst, S. 2010. Rude Democracy: Civility and Incivility in American Politics. Philadelphia: Temple University Press.
  • Humprecht, E., L. Hellmueller, and J. A. Lischka. 2020. “Hostile Emotions in News Comments: A Cross-National Analysis of Facebook Discussions.” Social Media + Society 6 (1), doi:10.1177/2056305120912481.
  • Kaltenbrunner, A., R. Lugschitz, M. Karmasin, S. Luef, and D. Kraus. 2020. “Der österreichische Journalismus-Report [The Austrian Journalism Report].” Facultas.
  • Kommunikationsplattformen-Gesetz—KoPl-G. 2022. “Bundesrecht konsolidiert: Gesamte Rechtsvorschrift für Kommunikationsplattformen-Gesetz [Federal Law Consolidated: Entire Body of Law for the Communications Platforms Act].” https://www.ris.bka.gv.at/GeltendeFassung.wxe?Abfrage=Bundesnormen&Gesetzesnummer=20011415.
  • Kraut, R. E., and P. Resnick. 2011. Building Successful Online Communities. Cambridge, MA: The MIT Press.
  • Ksiazek, T. B., and N. Springer. 2020. User Comments and Moderation in Digital Journalism: Disruptive Engagement. London: Routledge.
  • Mayring, P. 1991. “Qualitative Inhaltsanalyse [Qualitative Content Analysis].” In Handbuch Qualitative Forschung: Grundlagen, Konzepte, Methoden und Anwendungen, edited by U. Flick, E. von Kardoff, H. Keupp, L. von Rosenstiel, and S. Wolff, 209–212. Munich: Beltz, Psychologie-Verlag-Union.
  • Media-Analyse. 2020. “Media-Analyse 2020 [Media Analysis 2020].” https://www.media-analyse.at/.
  • Meier, K., D. Kraus, and E. Michaeler. 2018. “Audience Engagement in a Post-Truth Age.” Digital Journalism 6 (8): 1052–1063. doi:10.1080/21670811.2018.1498295.
  • Mutz, D. C., and B. Reeves. 2005. “The New Videomalaise: Effects of Televised Incivility on Political Trust.” American Political Science Review 99 (1): 1–15. doi:10.1017/S0003055405051452.
  • Naab, T. K., D. Heinbach, M. Ziegele, and M.-T. Grasberger. 2020. “Comments and Credibility: How Critical User Comments Decrease Perceived News Article Credibility.” Journalism Studies 21 (6): 783–801. doi:10.1080/1461670X.2020.1724181.
  • Nemes, I. 2002. “Regulating Hate Speech in Cyberspace: Issues of Desirability and Efficacy.” Information & Communication Technology Law 11 (3): 193–220. doi:10.1080/1360083022000031902.
  • Neuberger, C., S. Langenohl, and C. Nuernbergk. 2014. “Social Media und Journalismus [Social Media and Journalism].” LfM-Dokumentation. https://www.medienanstalt-nrw.de/fileadmin/lfm-nrw/Publikationen-Download/Social-Media-und-Journalismus-LfM-Doku-Bd-50-web.pdf.
  • Newman, N., R. Fletcher, A. Kalogeropoulos, and R. K. Nielsen. 2019. “Reuters Institute Digital News Report 2019.” https://reutersinstitute.politics.ox.ac.uk/sites/default/files/inline-files/DNR_2019_FINAL.pdf.
  • Norris, P. 2000. A Virtuous Circle: Political Communications in Postindustrial Societies. Cambridge: Cambridge University Press.
  • Paasch-Colberg, S., C. Strippel, M. Emmer, and J. Trebbe. 2018, May 24–28. “Using Tools Against Hate? Moderation Strategies and Online Technologies to Prevent and Counter Hate Speech.” Paper presented at the 68th Annual Conference of the International Communication Association, Prague, Czech Republic.
  • Paasch-Colberg, S., C. Strippel, L. Laugwitz, M. Emmer, and J. Trebbe. 2020. “Moderationsfaktoren: Ein Ansatz zur Analyse von Selektionsentscheidungen im Community Management [Moderation Factors: An Approach to Analyzing Selection Decisions in Community Management].” In Integration Durch Kommunikation: Jahrbuch der Publizistik- und Kommunikationswissenschaft 2019, edited by V. Gehrau, A. Waldherr, and A. Scholl, 109–119. Münster: DGPuK. doi:10.21241/ssoar.67858.
  • Quandt, T. 2018. “Dark Participation.” Media and Communication 6 (4): 36–48. doi:10.17645/mac.v6i4.1519.
  • Rossini, P. 2022. “Beyond Incivility: Understanding Patterns of Uncivil and Intolerant Discourse in Online Political Talk.” Communication Research 49 (3): 399–425. doi:10.1177/0093650220921314.
  • Ruiz, C., D. Domingo, J. L. Micó, J. Díaz-Noci, K. Meso, and P. Masip. 2011. “Public Sphere 2.0? The Democratic Qualities of Citizen Debates in Online Newspapers.” International Journal of Press/Politics 16 (4): 463–487. doi:10.1177/1940161211415849.
  • Schreier, M. 2014. “Varianten qualitativer Inhaltsanalyse: Ein Wegweiser im Dickicht der Begrifflichkeiten [Variants of Qualitative Content Analysis: A Guide Through the Thicket of Terminology].” Forum: Qualitative Social Research 15 (1): Article 18. https://nbn-resolving.org/urn:nbn:de:0114-fqs1401185.
  • Seethaler, J. 2015. “Qualität des tagesaktuellen Informationsangebots in den österreichischen Medien: Eine crossmediale Untersuchung [Quality of Daily News in the Austrian Media: A Cross-Media Study].” RTR. https://www.rtr.at/de/inf/SchriftenreiheNr12015.
  • Singer, J. B. 2011. “Taking Responsibility: Legal and Ethical Issues in Participatory Journalism.” In Participatory Journalism: Guarding Open Gates at Online Newspapers, edited by J. B. Singer, D. Domingo, A. Heinonen, A. Hermida, S. Paulussen, T. Quandt, Z. Reich, and M. Vujnovic, 121–138. Malden: Wiley-Blackwell.
  • Sponholz, L. 2021. “Hate Speech and Deliberation: Overcoming the ‘Words-That-Wound’ Trap.” In Hate Speech and Polarization in Participatory Society, edited by M. Pérez-Escolar, and J. Noguera-Vivo, 49–64. Milton: Routledge.
  • Springer, N., and A. S. Kümpel. 2018. “User-Generated (Dis)Content.” In Journalismus im Internet: Profession – Partizipation – Technisierung [Journalism on the Internet: Profession – Participation – Technologization], edited by C. Nuernbergk, and C. Neuberger, 241–271. Wiesbaden: Springer VS.
  • Trappel, J. 2019. “Medienkonzentration – trotz Internet kein Ende in Sicht [Media Concentration – Despite the Internet, No End in Sight].” In Österreichische Mediengeschichte 2, edited by M. Karmasin, and C. Oggolder, 199–226. Wiesbaden: Springer VS.
  • Vujnovic, M. 2011. “Participatory Journalism in the Marketplace.” In Participatory Journalism: Guarding Open Gates at Online Newspapers, edited by J. B. Singer, A. Hermida, D. Domingo, A. Heinonen, S. Paulussen, T. Quandt, Z. Reich, and M. Vujnovic, 139–154. Malden: Wiley-Blackwell.
  • Wintterlin, F., K. Langmann, L. Frischlich, T. Schatto-Eckrodt, and T. Quandt. 2021. “Lost in the Stream? Professional Efficacy Perceptions of Journalists in the Context of Dark Participation.” Journalism 23 (9): 1846–1863. doi:10.1177/14648849211016984.
  • Wintterlin, F., T. Schatto-Eckrodt, L. Frischlich, S. Boberg, and T. Quandt. 2020. “How to Cope with Dark Participation: Moderation Practices in German Newsrooms.” Digital Journalism 8 (7): 904–924. doi:10.1080/21670811.2020.1797519.
  • Wolfgang, D. J. 2016. “Pursuing the Ideal.” Digital Journalism 4 (6): 764–783. doi:10.1080/21670811.2015.1090882.
  • Wolfgang, D. J., S. McConnell, and H. Blackburn. 2020. “Commenters as a Threat to Journalism? How Comment Moderators Perceive the Role of the Audience.” Digital Journalism 8 (7): 925–944. doi:10.1080/21670811.2020.1802319.
  • Ziegele, M., and P. B. Jost. 2020. “Not Funny? The Effects of Factual Versus Sarcastic Journalistic Responses to Uncivil User Comments.” Communication Research 47 (6): 891–920. doi:10.1177/0093650216671854.
  • Ziegele, M., T. K. Naab, and P. Jost. 2019. “Lonely Together? Identifying the Determinants of Collective Corrective Action Against Uncivil Comments.” New Media & Society 22 (5): 731–751. doi:10.1177/1461444819870130.