Fallacies about communities that lead to failed community relations

Pages 156-167 | Received 02 Sep 2021, Accepted 11 Nov 2021, Published online: 09 Dec 2021

ABSTRACT

The assessment and implementation of development projects has been disconnected from relevant concepts in social psychology, especially those relating to understanding the interactions between projects and local communities. This disconnection has given rise to the prevalence of several fallacies about human behaviour amongst project staff, decision makers, and environmental and social impact assessment practitioners. The playing-out of these fallacies influences the implementation of projects and reduces the likelihood of gaining a social licence to operate. Because they lead to distorted perceptions about communities, these fallacies are deleterious to desirable social relations with communities, to good impact assessment practice, and to effective project decision-making. We describe eight of these fallacies: subjectivity; naiveté; unpredictability; irrationality; greediness; self-serving; aggressiveness; and rigidity. We discuss these fallacies by drawing on the social psychology constructs of attitudes, risk perception, social identity, and social justice. We conclude by considering how these fallacies can be addressed in practice and how development projects and impact assessment can be improved.

Introduction: how fallacies about human behaviour affect project development

Owners and staff of development projects as well as local communities often have negative experiences with project implementation (Vanclay and Hanna 2019). Conflicting interests, opposing views, and limited knowledge (local knowledge on the part of the proponents and technical knowledge on the part of communities) are largely responsible for this (Hanna et al. 2016a; Vanclay 2017). Social Impact Assessment (SIA) comprises the processes of assessing and managing the social issues associated with projects (Vanclay 2003, 2020; Esteves et al. 2012). SIA has provided many tools to help improve project and community interactions (Vanclay et al. 2015). However, although there are some exceptions (Meissen and Cipriani 1984; Edelstein 2003, 2004), a full understanding of the psychological processes relevant to the management of social issues in projects is still largely missing. This disconnect between project development, SIA practice, and psychological constructs is surprising because many people in the SIA community of practice argue that psychological constructs and psychosocial impacts should be considered (Schweinsberg 2007; Esteves et al. 2012, 2017; Prenzel and Vanclay 2014). Especially relevant concepts from the field of social psychology include project acceptability, risk perception, risk communication, social identity, local identity, and social justice. These are the topics we address in this paper. We argue that the overlooking of social psychology in project development and SIA practice has led to the creation, operation and perpetuation of various fallacies about human behaviour, which have been deleterious to the interactions between projects and communities, to SIA practice, and to project decision-making.

Our aim with this paper is to bridge SIA thinking and contemporary understandings in social psychology. The paper was written by a team of authors covering a range of social sciences: the primary author is a social psychologist, the others each have an interdisciplinary background, and all authors are skilled in social impact assessment. Over a number of years, in our discussions about the relationships between projects and communities, we identified several issues from social psychology that have a bearing on the management of social issues in projects. We have given presentations at various fora (seminars and conferences), and used the feedback received to further develop our arguments and to validate the selection of concepts included in this paper.

Although there are many fallacies or myths that could be discussed (Imperiale and Vanclay 2021), we ultimately describe eight key fallacies about human behavior that we apply in the context of projects (see below). In our conception, a fallacy about human behavior is a false belief about how individuals think, feel, or behave. We argue that these fallacies are often accepted (whether tacitly or deliberately) by impact assessment practitioners, project owners, project staff, and even by community stakeholders, and that they arise from common experiences with project development, especially in relation to: the lack of project acceptability; poor communication; limited assessment of risks; poor conflict management; poor negotiation processes; and inadequate compensation. We draw on two subfields of social psychology – human judgment and intergroup relations – to explain these fallacies. The important concepts we utilize include: attitude (attitude formation, content, change, and attitude-behaviour consistency); risk perception (environmental risks and heuristic and systematic judgments); social identity (formation, threats, and dynamism); and social justice (distributive and procedural justice). We conclude the paper by discussing what must be changed in impact assessment practice and project decision making to overcome these fallacies.

The fallacies are:

  1. Subjectivity: people’s opinions are not objective and can be ignored;

  2. Naiveté: people are ignorant and need to be taught the facts;

  3. Unpredictability: people’s behaviors are erratic and cannot be predicted;

  4. Irrationality: people’s judgements do not follow reason and should be ignored;

  5. Greediness: people always want more than they have or deserve;

  6. Self-serving: people favor themselves to the detriment of others;

  7. Aggressiveness: people are disproportionately aggressive when their opinions are not taken into account or when their needs are not met;

  8. Rigidity: once committed to a position, people won’t change their minds.

Fallacy 1: subjectivity (people’s opinions are not objective and can be ignored)

In discussions about communities and projects, a key concept is ‘social license to operate’ (Dare et al. 2014; Jijelava and Vanclay 2017, 2018; Santiago et al. 2021). Social license is, in effect, a popularized version of the psychological notion of acceptability. To determine the extent of acceptability, various questions can be asked, for example: How much does the community accept the project? Are communities in favor of or against it? Do communities think the project is harmful or beneficial, something good or bad? These questions succinctly assess the individual and community acceptability of a project. Despite the importance and prevalence of social license and acceptability in the SIA discourse (Joyce et al. 2018; Veenker and Vanclay 2021), the ways they are interpreted and used are subject to the subjectivity fallacy – the view that, because acceptability judgements are only perceptions, they are not objective and consequently can be ignored.

Why subjective information is a natural part of acceptability judgments

The first important point about acceptability is that it is based on complex knowledge structures (Smith and Queller 2004). Acceptability is primarily based on the information we hold in memory from our personal histories in relation to the topic of the judgment. This information can take different forms, like specific experiences (e.g. memories from visiting specific places), information provided by trusted sources (e.g. conversations with friends, newspaper articles, news on the television, social media announcements), or our personal reflections on the topic. An important corollary here is that the human mind needs a way to navigate through complex knowledge structures. When a member of a community is asked about the acceptability of a project, only the information that is relevant to that person will be considered (assembled). In psychological terms, relevance is equivalent to the ‘accessibility’ of the information to the thinking process, especially in relation to how a person comes to make a judgment. Relevance is an important cognitive tool in the navigation of complexity.

Two characteristics of relevance can result in non-factual information having an important role in attitude formation and structure (Schwarz 2000; Ajzen 2002; Smith and Queller 2004). First, relevance can take many forms: it can mean emotional valence (i.e. the things we particularly like or dislike), recency (i.e. the things we have experienced more recently), or personal relevance (i.e. the things with which we have a high level of involvement). Second, relevance is dynamic – it can change according to context (e.g. taking an individual versus a group perspective) or motivation (e.g. taking a long-term versus short-term perspective). Complexity and relevance are critical to interpreting what a specific value judgment means. People might know many different things about a project; however, only the most relevant information determines their final judgment.
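
The role of accessible, relevant beliefs can be made concrete with the classical expectancy-value model of attitude associated with the literature cited above (e.g. Ajzen 2002). As an illustrative sketch in our own notation (the sources discuss the idea without committing to this exact formula):

$$A = \sum_{i \in \text{accessible}} b_i e_i$$

where $A$ is the overall attitude (e.g. the acceptability of a project), $b_i$ is the strength of belief $i$ (how likely the person thinks the attribute is), and $e_i$ is the evaluation of that attribute (how good or bad it is judged to be). Crucially, the sum runs only over the beliefs that are accessible at the moment of judgment, which is why factual but non-salient information may contribute nothing to the final judgment.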

Project acceptability relies on complex knowledge structures comprised of memories, conversations, news in the media, etc. These complex knowledge structures ensure that we have diverse and relevant information about the project. However, complex representations are not synonymous with factual information, and even when they contain factual information, that information might not be relevant enough to become part of our attitudes.

Acceptability judgments are psychologically meaningful

It can be disconcerting for project technical staff to interact with communities who have high hopes and/or troubling concerns that are based on non-factual information. This experience often leads to the subjectivity fallacy, whereby judgments that are not based on objective information are not taken into account. For example, people may exhibit low levels of acceptability towards a project due to concerns about noise levels in situations where, according to objective (scientific) assessments, the project is not expected to produce the feared noise levels. In these situations, the community perception will likely be deemed irrelevant by project staff and decision makers. There is a significant problem with this objectivity imperative – people’s perceptions might be subjective, but they represent each person’s history of relating to the project; consequently, these perceptions will be used to interpret and interact with the world, and will ultimately affect behavior. In a classic study, Staples et al. (1999) showed that, despite no objective evidence of noise reaching unacceptable levels, people who perceived the noise intensity to be unacceptable remained opposed to the project, were more agitated and stressed, and attributed other issues and ambient noises to the project. In other words, despite their subjectivity, acceptability judgments are psychologically meaningful, and have a strong influence on a project’s social license to operate.

Fallacy 2: naiveté (people are ignorant and need to be taught the facts)

A corollary to the subjectivity fallacy is the naiveté fallacy – the view that people are ignorant and need to be provided with factual information. A common practice in new projects is to have proponents inform communities about the project details and its objective impacts (Lima et al. 2012; Rau et al. 2012). Unfortunately, a lack of understanding about how knowledge structures work results in many proponents becoming frustrated with communities; instead of understanding why communities think the way they do, the proponents simply restate the objective facts about the project, failing completely to engage with community concerns (Joyce et al. 2018).

Attitude change requires specific conditions that are rarely met

For new information to be incorporated into existing knowledge structures, a person needs to have the capacity to understand the information and the motivation to retain it (Bohner and Dickel 2011). Evidence has shown that individuals who are highly motivated and capable are more likely to be persuaded by information that requires higher levels of processing effort (e.g. messages that are lengthy, complex, or new), whereas individuals who are not motivated or capable are more likely to be persuaded by evidence that is easy to process (e.g. messages that are short, simple, repeated often, and/or aligned with prior understandings). Even if new information becomes part of a person’s knowledge structure, it is its relevance (described above) that dictates whether this information will influence their perceptions and behaviors. Knowledge about a project is not necessarily more relevant because it is objective. Research into attitude change has shown that, besides processing difficulty and motivation, the relationship between the arguments in the message and the recipient’s metacognitions (i.e. thoughts about thoughts) also has a major influence on the likelihood of a person changing their attitude (Clarkson et al. 2011).

From this quick overview of attitude change, it can be concluded that the communication practices used for most new projects are unlikely to produce any attitude change at all. Messages about projects are often complex and technical. These messages are often conveyed in ways that require considerable motivation and effort (e.g. lengthy text in project brochures, overly-packed information meetings, or consultation documents that are inaccessibly displayed in municipal buildings; Gulakov and Vanclay 2018, 2019). These messages are also often conveyed in contexts that undermine integration with previous knowledge. For instance, holding information meetings in a context where there is strong opposition (or acceptance) towards a project is more likely to lead to re-interpretation of the information and re-affirmation of the opposition (or acceptance) arguments than to a change in attitudes.

Fallacy 3: unpredictability (people’s behaviors are erratic and cannot be predicted)

While acceptability judgements can be used to understand resistance or support towards a project, acceptance (or non-acceptance) does not mean that people will behave accordingly. People can express that they are in favor of (or against) a project at one point in time and, for many reasons, later be opposed to it; or informally agree with a local intervention but later take part in a protest (Hanna et al. 2016a). We argue that this disconcerting experience has resulted in another misconception about human judgment, the unpredictability fallacy – the view that people’s behaviors are erratic and cannot be predicted with accuracy.

Behaviors are determined by multiple causes

Having a strong unfavorable (or favorable) attitude about a project would generally mean a higher likelihood of behaving accordingly, ranging, for example, from a person making negative (or positive) comments about the project to their friends through to more extreme manifestations of disagreement (or agreement) (Hanna et al. 2016a). However, although our judgments generally drive our behaviors, the relationship is not straightforward. We have all had the experience of people (including ourselves) saying one thing and then doing another (e.g. affirming that French fries are bad for our health and then ordering French fries with dinner; or arguing in favor of being environmentally-sustainable and then going on a long flight for a vacation). Indeed, the low correlation between attitude and behavior has been at the heart of social psychology since the 1970s (Wallace et al. 2005), and much is now known about the conditions promoting and disrupting attitude-behavior consistency (Wallace et al. 2005; Glasman and Albarracín 2006; Ajzen and Gilbert Cote 2008). The main takeaway is that human behavior is complex and determined by multiple variables (e.g. Clayton 2009; Swim et al. 2011). Unpredictability is simply a label that is used to hide a lack of understanding of those multiple determinants.

Not all behaviors are equal

Some behaviors are easy to execute, while others are demanding. For example, compare a discussion you might have with your friends and family about a new project in your neighborhood with a discussion about the project at a public forum or town hall meeting. They involve different levels of effort, and various individuals will feel more or less comfortable in these environments and in making the effort required to fully comprehend all the issues and to engage. Acceptability judgments that are strongly held can lead to both easy and difficult behaviors, whereas weaker attitudes can only result in easy behaviors (Wallace et al. 2005). Someone who is highly interested in a new project (e.g. someone who reads about it, discusses it, or seeks elaboration on it) will be more likely to become actively involved in public debate about the project and/or to take on other difficult behaviors, such as writing an opinion piece, posting on social media, mobilizing the community, or participating in a demonstration. Conversely, someone with less interest is unlikely to be motivated to do these things. The key point here is that acceptability alone does not determine behavior because not all attitudes have the same strength.

Attitude strength is related to another variable that influences attitude-behavior consistency: perceived behavioral control, the extent to which the behavior in question is perceived to be within the control (ability, capacity) of the individual to perform well (e.g. Ajzen 1985). To explain, let’s consider a protest action like signing a petition against a project. If someone strongly believes in the cause, they will be more likely to sign. However, signing also depends on perceived behavioral control, for example in relation to the perceived impediments to performing the act of signing. For instance, if signing requires the person to drive somewhere, someone who does not know how to drive will have lower control over signing the petition, and will therefore be less likely to sign.

Behaviors are socially determined

Another factor that overrides attitude-behavior consistency is social norms. A person’s social norms are based on their perceptions of what people in their social groups deem to be appropriate. People can be influenced to perform a vast array of behaviors. For example, research shows that the best predictor of energy conservation behavior is the extent to which someone thinks people in their neighborhood also engage in energy conservation behavior (Goldstein et al. 2008; Nolan et al. 2008). The key point here is that behaviors do not occur in a vacuum; they have a social meaning and are socially constructed.
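
The joint influence of attitudes, social norms, and perceived behavioral control (discussed in the preceding subsections) is often summarized using the theory of planned behavior (Ajzen 1985). As an illustrative sketch in our own notation (the weights are estimated empirically and are not given in the sources cited here):

$$BI = w_1 A_B + w_2 SN + w_3 PBC$$

where $BI$ is the intention to perform a behavior, $A_B$ is the attitude towards that behavior, $SN$ is the subjective norm, $PBC$ is perceived behavioral control, and $w_1, w_2, w_3$ are weights that vary across behaviors and populations. The model makes the point of this section explicit: even a strong attitude may not translate into action if norms or perceived control pull in the other direction.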

Behaviors are context-dependent

Context can also have an important influence on attitude-behavior consistency. Context includes the external setting, the social groups in which one participates, self-motives (goals, objectives), and the current mood of the person. Projects that create significant social or environmental change are likely to lead to people holding complex representations about the project, often with mixed feelings. In such cases, should the context change, e.g. from a work conversation focused on job opportunities to a social discussion focused on the social impacts, the opinions that are expressed are also likely to change (Schwarz and Bohner 2001; Schwarz 2007). Unpredictability is more likely to be a result of differences in context, with people behaving consistently within the same context and differently across different contexts.

Fallacy 4: irrationality (people’s judgements do not follow reason and should be ignored)

Every project needs to identify and assess the risks it poses to local communities and to society in general. However, proponents and government agency staff often consider that this exercise should be done solely by experts and based only on the direct, objective physical impacts. The contrast between objective and perceived risk is particularly important to understanding the irrationality fallacy, which holds that human behavior is illogical, does not follow reason, and therefore should be ignored. The strength and prevalence of this fallacy has substantially limited acceptance of the use of people’s perceptions in risk assessment (Mahmoudi et al. 2013).

Risks are socially constructed

There are many examples of situations where humans underestimate or overestimate risks. For example, many people perceive the risk of flying to be greater than that of driving, whereas, objectively, driving carries a much greater risk (Evans et al. 1990). Our risk assumptions do not always align with the actuality of danger (Gigerenzer 2002). Although human risk judgements tend to be irrational from a probabilistic point of view, these judgments are psychologically meaningful. A risk assessment that is based solely on the likely direct physical consequences from a scientific perspective seriously understates the full range of project-induced social impacts. For instance, an objectively low risk as assessed by experts (e.g. electromagnetic radiation from high voltage powerlines or windfarms), if perceived as a high risk by communities, still has the potential to create considerable socio-economic impacts (e.g. decreased property values) and psychosocial impacts (e.g. increased anxiety) (Furby et al. 1988; Chapman 2018). Conversely, an objectively-high risk (e.g. heavy metal contamination) may be overlooked by communities because the consequences only occur over a long time period and because of the immediate effect of any compensation that may be provided (Gattig and Hendrickx 2007). The point here is that, as important as expert risk assessments can be, risks are socially constructed, and consequently the social perception of a risk is just as important as any danger captured by an objective risk assessment.

Environmental risks are especially complex

Environmental risks are particularly complex, and this complexity greatly influences risk judgements (Bohm and Pfister 2008). Environmental risks generally involve many variables with intricate causal relationships resulting in multiple consequences. The building of a dam, for example, has many environmental, economic, social and health risks (Hanna et al. 2016b).

Environmental risks emerge from complex decision-making processes and from the aggregate behavior of many individuals. This means that understanding the causes and mitigating the impacts of environmental risks requires dealing with the actions of many stakeholders. The consequences of environmental risks are often temporally delayed and geographically dispersed (Vanclay 2004). The complex, cross-domain, distributed nature of the consequences greatly influences how individuals perceive environmental risks.

Thinking systematically about risks is complicated

Risk perceptions can be understood as judgements about events or activities with uncertain adverse outcomes that potentially affect something that individuals or communities value. Because the objective physical consequences of a risk are measurable, the study of risk perception examines how human judgments are made and how they deviate from the probabilities of the objective consequences. A key aspect in understanding risk judgments is the concept of dual processing (Evans 2008). In psychology, it is argued that judgments of risk are based on two types of thinking – systematic and heuristic (Slovic et al. 2004). The systematic process involves high-effort, conscious, analytical and logical actions to come to a reasoned judgment, and is close to what most people consider to be ‘thinking hard’ about something. Conversely, the heuristic process is intuitive, automatic, default, non-conscious, and close to what people consider to be ‘gut feeling’ (Evans 2008).
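
As a point of reference for what risk judgments deviate from, conventional (textbook) risk analysis summarizes the objective risk of an activity as an expected loss; this formula is a standard illustration rather than one used in the sources cited here:

$$E[L] = \sum_i p_i c_i$$

where $p_i$ is the probability of adverse outcome $i$ and $c_i$ is the magnitude of its consequence. Heuristic processing can then be understood as people implicitly replacing the $p_i$ and $c_i$ with subjective weights, for example overweighting outcomes that are easy to recall or that evoke dread.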

The systematic processing of risks requires specific conditions to be met: cognitive resources (i.e. brain energy), individual motivation, involvement with the topic, and analytical tools to organize relevant knowledge. However, even when the conditions for systematic processing are met, there are still limits to the amount of information a human can consider at any given time. Therefore, systematic processing may be difficult to achieve in the real world, so most judgements are the interplay of systematic and heuristic processing.

For SIA, there are two main implications of the dual-processing nature of human cognition. First, the conditions needed to think systematically about risks are rarely anticipated or promoted. Second, purely systematic risk assessment is an overrated illusion – systematic processing has its limits, and heuristics will always intervene in the reaching of a conclusion (even amongst experts!).

Heuristic reasoning does not mean irrationality

As mentioned above, risk perceptions often result from the heuristic consideration of information. Heuristics are intuitive rules-of-thumb that are constructed around basic information that is readily available to the individual. These rules allow us to operate in a world that is characterized by the existence of too much information, the lack of meaning of much of that information, and the need to think and act fast (Gigerenzer and Goldstein 1996). However, simplicity comes at a cost – heuristics often produce biased reasoning. Bias has many forms. For example, the ease with which a risk comes to mind often serves as a heuristic for the likelihood of the risk. Accordingly, although radiation from medical x-rays is objectively more dangerous, radiation from nuclear energy is generally perceived to be more dangerous, mostly because of the disproportionate exposure and memorability of nuclear accidents in the news (Slovic 2012).

Risk attenuation is the tendency to undervalue risks. While there are many forms of attenuation, environmental risks are particularly subject to temporal discounting, i.e. the long-term consequences are weighted less than the short-term consequences (Gattig and Hendrickx 2007). The relevance of risk attenuation is reflected in Hardin’s (1968) notion of the ‘tragedy of the commons’ (Dietz et al. 2003; Vlek and Steg 2007), in which decision makers and communities have difficulty in balancing immediate individual benefits against losses that are uncertain, delayed in time, or occurring elsewhere or to others. Climate change is an example of this (Swim et al. 2009).
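
Temporal discounting has a standard formalization in the judgmental discounting literature (our notation; the sources cited above discuss the phenomenon without committing to a single functional form). Under hyperbolic discounting, a consequence of magnitude $A$ occurring after delay $D$ receives a present weight of approximately

$$V(A, D) = \frac{A}{1 + kD}$$

where $k > 0$ is an individual discount rate. The larger the $k$, the less long-term consequences count in today’s judgment – one way of expressing why delayed environmental losses are so easily traded away for immediate benefits.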

In contrast to risk attenuation is risk amplification, the process that leads to the perceived intensification of the danger associated with an event (Slovic 1987; Kasperson et al. 2003). Research shows that environmental changes judged as dreadful (i.e. evoking fear, and having catastrophic potential and fatal consequences) and unknown (i.e. unobservable and about which there is limited knowledge) will be judged as more risky (e.g. hazards like radioactive waste, nuclear reactor accidents, and electromagnetic fields), regardless of their actual danger.

It is beyond the scope of this paper to provide a complete review of the conditions promoting biased reasoning. However, the important point is that, although heuristic reasoning means that the judgements that are made will not be based on full information or be fully thought through, heuristic reasoning is not the same as irrationality. First, heuristics operate in a way that is highly predictable if we use the appropriate theoretical tools. Second, heuristics often produce reasonable judgments, particularly if we consider the complexity of the information and the time constraints under which people have to make judgements.

Fallacy 5: greediness (people always want more than they have or deserve)

Managing the impacts of a project often involves providing compensation for losses that people in local communities experience (e.g. for property expropriation, the demolition of houses and other assets, loss of crops and common property resources, etc.). This is an area of recurring conflict between communities and projects, with communities often arguing that compensation is inadequate, while proponents usually rely on objective or legal criteria to make determinations about payments. This conflict reveals another important fallacy about human behavior, the greediness fallacy, which holds that humans always want more than they have or deserve. The difference between objective and perceived fairness is similar to the difference between objective and perceived risks discussed above. The reduction of fairness to objective criteria undervalues the significance of fairness perceptions and leads to conflict between proponents and communities.

Objective estimation of compensation does not lead to acceptance

A common assumption of project staff is that, if a compensation procedure is rigorously applied, the outcome will be fair, will be considered fair by local people, and no dissatisfaction will occur. However, research has provided very limited support for this claim (Tyler 2000; Jost and Kay 2010). Compensation arrangements can be deemed unfair both because the amount is considered inadequate and because people place great weight on the way they were treated. The take-home message here is that limiting the process of allocating compensation to objective legal assessment alone is not likely to lead to communities judging the compensation as being fair.

Voice, trust and treatment are critical to getting acceptance

A reason why fixed procedures do not create acceptance is that people value how they are treated in a decision-making process. This is called procedural fairness. It is well established that communities are more likely to support the continuing operation of a project when they perceive there is fairness and respect in the development and implementation of the project (Moffat and Zhang 2014; Jijelava and Vanclay 2017). Procedural fairness has three core elements: having voice; having trust in the other parties; and being treated with respect. Having voice refers to the opportunities for communities to express their opinions about what should be done. Trust is the belief that the other party has the integrity and competence to genuinely deal with the community’s concerns. Treatment can be divided into informational fairness (i.e. truthfulness and the provision of timely and adequate justifications) and interpersonal fairness (i.e. respectful and sensitive treatment; Bies 2005). Disagreements over compensation are more likely to be a matter of communities lacking voice, trust and/or proper treatment than of communities being greedy or wanting more than they have or deserve.

Fallacy 6: self-serving (people favor themselves to the detriment of others)

A common occurrence is that project staff become progressively more frustrated because they think that communities focus only on the negative impacts of the project and refuse to see the positive benefits or to consider an alternative perspective. This reveals the existence of the self-serving fallacy – the view that people are partial and instrumental, constructing their reasoning in ways that favor themselves (and their significant others) to the detriment of other people. The problem with this fallacy is that it is an overly-simplified, individual-level account that completely ignores the significance of the social context.

The role of social identity

Social identity is the part of a person’s self-concept that can only be understood through the person’s relations with others. Social identity is determined by the various groups to which a person belongs. For example, the part of an individual’s self-image that stems from belonging to a group of ‘Lisbon residents’ is different to the part that comes from belonging to a group of ‘environmental scientists’. Group membership is dynamic: it changes over time (e.g. across the life cycle) and across the activities with which a person engages (e.g. work, parenting, hobbies). It is non-exclusive, meaning that a person can belong to different groups simultaneously, and it varies in strength, meaning that membership can be stronger and more relevant for some groups (and for some people) than for others.

An important aspect of being part of a group is that people strive to maintain a positive social identity and a positive image of the groups they belong to. This can be expressed in the form of favorable perceptions of one’s in-groups (i.e. the groups to which the person belongs) when compared to one’s out-groups (i.e. the groups to which the person does not belong). However, these comparisons are systematically biased by the ‘ultimate attribution error’ (Hewstone 1990; Malle 2006), i.e. the tendency to attribute the successes of in-groups to internal causes and their failures to external causes, while simultaneously attributing the successes of out-groups to external causes and their failures to internal causes.

Another type of bias is the ‘outgroup homogeneity effect’ (Haslam et al. 1996), the perception that out-group members are more similar to one another than in-group members are. This leads to out-group members being depersonalized, treated collectively, and prone to oversimplified stereotypes – something that is also called othering.

Social identity biases provide a more likely explanation of partiality than the self-serving fallacy does. People can indeed be partial in their reasoning, but this is likely the result of how social interactions are structured in the context of project development. We consider that a major contributor to self-serving behaviours is the ‘us versus them’ discourse that is prevalent in project developments. Putting project proponents and local communities into two different social groups creates a social context in which both proponents and communities are more likely to engage in self-serving reasoning.

Fallacy 7: aggressiveness (people are disproportionately aggressive when their opinions are not taken into account or when their needs are not met)

Project proponents often find that a lack of social license escalates into conflict. People can become fixated on their own (self-serving) perspectives and can become vocal about their disagreement, which can build up from individuals, to groups, to entire communities. In the context of project development, this escalation of conflict tends to be explained by the aggressiveness fallacy – the view that people behave in disproportionately aggressive ways towards others when their opinions are not taken into account or their needs are not met. As with the self-serving fallacy, conflict and disproportionate responses are better understood when the social context is taken into consideration.

Disagreements as threats to social identity

Social identity threat is experienced when an individual is confronted with a situation in which one or more of their social identities is attacked, with potential harm to the value, meaning, or enactment of that identity. Identity threat is an important part of the social identification dynamic because, in response to the threat, meaning is redefined and group mobility potentially occurs. However, social identity threat also: affects the achievement of group and personal objectives; reduces self-esteem; increases in-group denigration; increases antisocial behavior; makes conflicts between friends and colleagues more frequent; promotes non-conformance with rules; and actively blocks change (e.g. Aquino and Douglas 2003; Nag et al. 2007; Fiol et al. 2009).

Social identity threats are at the heart of aggressive and disproportionate responses to disagreements. In conflict situations, people are not responding as mere individuals, they often respond as members of social groups in defense of their social identities. As mentioned above, the ‘us versus them’ discourse feeds into the mechanisms of identity threat, with proponents and communities being positioned as distinct social groups with competing views that become harmful to the value, meaning, and enactment of each group. The pervasiveness of this discourse can be a major contributor to polarized positions and conflict escalation in project development.

The loss of physical places also threatens social identity

Local identity (which is similar to place attachment or sense of place) refers to the importance that specific physical places have for people’s self-image. Local identity has much to do with the emotional and evaluative significance of belonging to social groups that stem from a physical place (Marques et al. 2015; Bernardo and Palma-Oliveira 2016). For instance, consider the loss of a favorite fishing site due to the construction of a hydroelectric dam. The users of this fishing site will be losing more than the mere opportunity to fish; they will be losing part of their social identity. This reveals an important issue about the treatment of public places in impact assessments. Project staff tend to look upon public places only as places with economic value, rarely considering the emotional value associated with such places. However, many public places are also sources of meaning and social belonging. This explains why what might be perceived as disproportionately aggressive responses to the loss of public places are common and arguably legitimate – people are not only reacting to losing something they value, they are also reacting to the fact that their social identity is under threat.

Fallacy 8: rigidity (once committed to a position, people won’t change their minds)

The rigidity fallacy is the view that people are rigid in their views and once committed to a position they won’t budge. As with the self-serving and aggressiveness fallacies, rigidity is greatly influenced by the social context and consequently it can be the result of the prevalent ‘us and them’ discourse. Social contexts are dynamic, and this dynamism produces changes in people’s judgments and behaviours.

Reframing the discourse

An important characteristic of social identity is that it is dynamic and context dependent. Individuals balance their need to preserve group belonging and to maintain a sense of consistency with their need to adapt to changing circumstances (Turner et al. 1994). For instance, when parents collect their children from school, the identity of ‘parent’ and its corresponding content are relevant and salient. However, when they are at work, these same people are likely to display their professional social identities and hide their parent role. An important consequence of this social identity dynamism is that social identities vary across contexts, but tend to remain the same over time within the same context.

Social identity dynamism is particularly important in the context of projects and people. The representation of the stakeholders involved in a project as ‘us and them’ makes the same social identities accessible over time, and this creates an impression of rigidity. We consider the resulting stability of social identities to be responsible for the rigidity fallacy. Indeed, communities and project staff belong to many different social groups, more than just those represented in the ‘us and them’ distinction. The important message here is that being aware of the dynamism of social identity would allow new perspectives and discourses to be considered.

Social identity dynamism means that the social contexts in project development could be reframed to improve the quality of project decision making. Under the right conditions, communities and proponents could be encouraged to belong to the same macro social group (Gaertner et al. 2000). Also, breaking down the barriers between project staff and communities would assist in diluting the us-and-them mentality. This could be done by community members participating in project-related activities with project staff (e.g. measurements, organization of events), and/or project staff participating in community-related activities along with community members (e.g. sports events, local festivities).

What is required to address these fallacies?

We have described eight fallacies about human behavior and how they apply to the interactions between projects and communities. By drawing on social psychology constructs, we discussed how these fallacies result from frequent experiences in project development and SIA practice, and how they can be understood. Applying constructs from social psychology to solve important real-world problems is not new. Successful applications have been obtained in a vast range of areas, including health, education, work, finance, consumer behavior, immigration, and environment (for a review, see Steg et al. 2008). However, there is still resistance by some decision-makers, managers and project staff to these theories, leading to underperformance (Lévy-Leboyer 1988; Giner-Sorolla 2019). Below, we outline three interrelated steps that can assist in countering the fallacies: acknowledge the fallacies; build capacity to address them; and change community engagement practice.

Acknowledge the fallacies

The first step in addressing the fallacies described in this paper is to acknowledge the possibility that project staff and social practitioners may be consciously or subconsciously endorsing some of the fallacies. For example, consider a team taking a true-or-false quiz with the following statements:

  1. The subjective opinions of communities about a project should be discarded;

  2. Information provision is the best means of educating communities about the project;

  3. The behaviour of people in the community cannot be predicted with any accuracy;

  4. Project risk assessments should exclude the personal judgments of community members and should only be done by experts;

  5. In any negotiation, people representing communities will only advocate for their individual interests;

  6. Community members react aggressively when their opinions are not taken into account or their needs are not met;

  7. Once an individual takes a position, it will be extremely difficult to change their mind;

  8. Fairness in compensation for losses will only be achieved by making rigorous and transparent compensation decisions based on strict criteria.

A ‘true’ response to any of these statements implies the existence of a false belief about how people think, feel, or behave in the context of a project.

The above exercise is a way of making the beliefs held by project staff and social practitioners visible. Other clues may be found in the narratives and discourses about communities that are written in project reports or stated in meetings, or that arise in brainstorming sessions that project staff might have, for example about the challenges in project-community relationships (Parsons 2008; Ijabadeniyi and Vanclay 2020).

Regardless of how acknowledgement of the fallacies is achieved, it is important to note that the fallacies (which are a form of belief) are nuanced. Some people are aware of their beliefs and can argue why they hold a specific belief. For example, in our practice, we frequently hear the claim that communities’ subjective opinions about a project should be discarded because the experts are technical staff who are committed to objectively studying the options and coming up with the best possible solution. Others may be aware of their beliefs but not be able to explain why they hold them. And others may not even be aware of some of their beliefs.

In addition to awareness, there is also ambivalence. Some people have univalent beliefs, in that they can easily agree or disagree with each of the sentences given above. Others find it difficult to take a position because they agree and disagree at the same time, and can come up with good arguments for both claims.

Another dimension that adds nuance is consistency, or rather the lack of it. Some people have consistent beliefs over time, in that they are more likely to express the same belief across different moments in time and across different contexts. Others have more fluid beliefs, they are likely to change their beliefs at different moments and across different contexts.

Build the capacity of project staff and social practitioners

People outside the social sciences often consider that psychology is merely common sense and does not use scientific methods, yield replicable results, or make precise predictions (Lilienfeld 2012). To address the fallacies described in this paper, it is crucial to also address this skepticism about psychology and the social sciences generally. In this regard, building the capacity of project staff and social practitioners can make a big difference.

Capacity-building has many forms: training project staff and social practitioners; giving them the resources and mandate needed to fulfil their roles; including staff with social psychology training in their teams; and implementing change strategies aimed at adapting the organizational culture and systems to the new learning and perspectives. Competency measurement and executive coaching are other forms. Competency assessment for social performance professionals is essential to help them carry out their role, to improve their influence within their organizations, and to improve corporate social performance overall. Competencies can be measured using a validated competency assessment tool, and the development of competencies can be supported by executive coaching (Esteves and Moreira 2021).

Change community engagement practice

The fallacies have a major influence on how community engagement practices are designed and implemented. Let’s consider the practices of information provision, consultation, and dialogue (or active participation). Information is often presented in a patronizing way, without adequate consideration of the specific conditions needed to change a person’s perceptions about the project. In contrast, using targeted information methods that address the specific needs of individuals will activate their motivation and capacity to produce change in their knowledge structures (Bourne 2016).

Consultation processes often fail to consider subjective, complex, or ambivalent opinions about project acceptability and risks. Traditional approaches to consultation tend to focus on tangible physical themes (such as water or air quality), generally treat communities as being homogeneous, and regard real participation as a threat to the project (Lima et al. 2012). Using constructs like acceptability (e.g. Moffat and Zhang 2014), perceived fairness (e.g. Tyler and Lind 1992), perceived risk (e.g. Lima 2006), and local identity (Marques et al. 2015) would go a long way towards developing a better understanding of local community perceptions.

Dialogue or active participation has become increasingly important in project development and SIA practice (Esteves et al. 2012; Lima et al. 2012). However, too often project staff only utilize engagement methods such as public meetings that become Q&A sessions rather than actual dialogues, generating frustration and polarization (Gulakov and Vanclay 2019). Instead, having smaller meetings based around specific issues or impacts and using more deliberative techniques would be more effective.

Conclusion: making the invisible visible

In projects, there tends to be more emphasis on issues that are highly visible and have well-established methods of measurement than on how communities actually experience impacts and what they think about the project. This has detrimental consequences for communities and, ultimately, for projects as well. With new insights about community relations, including awareness of the fallacies presented in this paper and how they can be addressed, projects can become more attuned to people’s concerns. As a result, projects would become less of a threat to communities and more of an opportunity for shared value creation.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Ajzen I. 1985. From intentions to actions: a theory of planned behavior. In: Kuhl J, Beckman J, editors. Action control. Heidelberg: Springer; p. 11–39.
  • Ajzen I. 2002. Nature and operation of attitudes. Annu Rev Psychol. 52:27–58.
  • Ajzen I, Gilbert Cote N. 2008. Attitudes and the prediction of behavior. In: Crano WD, Prislin R, editors. Attitudes and attitude change. New York: Psychology Press; p. 289–311.
  • Aquino K, Douglas S. 2003. Identity threat and antisocial behavior in organizations: the moderating effects of individual differences, aggressive modeling, and hierarchical status. Organ Behav Hum Decis Process. 90(1):195–208.
  • Bernardo F, Palma-Oliveira JM. 2016. Urban neighbourhoods and intergroup relations: the importance of place identity. J Environ Psychol. 45:239–251.
  • Bies R. 2005. Are procedural justice and interactional justice conceptually different. In: Greenberg J, Colquitt JA, editors. Handbook of organizational justice. Mahwah (NJ): Lawrence Erlbaum Associates; p. 85–112.
  • Bohm G, Pfister H-R. 2008. Antinomies of environmental risk perception: cognitive structure and evaluation. In: Casimir M, editor. Culture and the changing environment: uncertainty, cognition and risk management in cross-cultural perspective. Oxford: Berghahn Books; p. 61–77.
  • Bohner G, Dickel N. 2011. Attitudes and attitude change. Annu Rev Psychol. 62(1):391–417.
  • Bourne L. 2016. Targeted communication: the key to effective stakeholder engagement. Procedia Soc Behav Sci. 226:431–438.
  • Chapman S. 2018. Wind turbine syndrome: a communicated disease. J Proc R Soc NSW. 151(1):39–44.
  • Clarkson J, Tormala Z, Leone C. 2011. A self-validation perspective on the mere thought effect. J Exp Soc Psychol. 47(2):449–454.
  • Clayton S. 2009. Can psychology help save the world? Minding Nat. 2(3):13–15.
  • Dare M, Schirmer J, Vanclay F. 2014. Community engagement and social licence to operate. Impact Assess Proj Apprais. 32(3):188–197.
  • Dietz T, Ostrom E, Stern PC. 2003. The struggle to govern the commons. Science. 302(5652):1907–1912.
  • Edelstein M. 2003. Weight and weightlessness: administrative court efforts to weigh psycho-social impacts of proposed environmentally hazardous facilities. Impact Assess Proj Apprais. 21(3):195–203.
  • Edelstein M. 2004. Contaminated communities: coping with residential toxic exposure. 2nd ed. Boulder: Westview Press.
  • Esteves AM, Factor G, Vanclay F, Götzmann N, Moreiro S. 2017. Adapting social impact assessment to address a project’s human rights impacts and risks. Environ Impact Assess Rev. 67:73–87.
  • Esteves AM, Franks D, Vanclay F. 2012. Social impact assessment: the state of the art. Impact Assess Proj Apprais. 30(1):34–42.
  • Esteves AM, Moreira S. 2021. Developing social performance professionals in the extractive industries. Extr Ind Soc. doi:10.1016/j.exis.2021.100964
  • Evans J. 2008. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 59:255–278.
  • Evans L, Frick M, Schwing R. 1990. Is it safer to fly or drive? Risk Anal. 10(2):239–246.
  • Fiol C, Pratt M, O’Connor E. 2009. Managing intractable identity conflicts. Acad Manag Rev. 34(1):32–55.
  • Furby L, Slovic P, Fischhoff B, Gregory R. 1988. Public perceptions of electric power transmission lines. J Environ Psychol. 8:19–43.
  • Gaertner S, Dovidio J, Banker B, Houlette M, Johnson K, McGlynn E. 2000. Reducing intergroup conflict: from superordinate goals to decategorization, recategorization, and mutual differentiation. Gr Dyn. 4(1):98–114.
  • Gattig A, Hendrickx L. 2007. Judgmental discounting and environmental risk perception: dimensional similarities, domain differences, and implications for sustainability. J Soc Issues. 63(1):21–39.
  • Gigerenzer G. 2002. Calculated Risks: how to know when numbers deceive you. New York: Simon & Schuster.
  • Gigerenzer G, Goldstein D. 1996. Reasoning the fast and frugal way: models of bounded rationality. Psychol Rev. 103(4):650–669.
  • Giner-Sorolla R. 2019. From crisis of evidence to a ‘crisis’ of relevance? Incentive-based answers for social psychology’s perennial relevance worries. Eur Rev Soc Psychol. 30(1):1–38.
  • Glasman L, Albarracín D. 2006. Forming attitudes that predict future behavior: a meta-analysis of the attitude-behavior relation. Psychol Bull. 132(5):778.
  • Goldstein N, Cialdini R, Griskevicius V. 2008. A room with a viewpoint: using social norms to motivate environmental conservation in hotels. J Consum Res. 35(3):472–482.
  • Gulakov I, Vanclay F. 2018. Social impact assessment in the Russian Federation: does it meet the key values of democracy and civil society? Impact Assess Proj Apprais. 36(6):494–505.
  • Gulakov I, Vanclay F. 2019. Social impact assessment and stakeholder engagement in the Russian Federation: representativeness, deliberativeness and influence. Environ Impact Assess Rev. 75:37–46.
  • Hanna P, Vanclay F, Langdon EJ, Arts J. 2016a. Conceptualizing social protest and the significance of protest actions to large projects. Extr Ind Soc. 3(1):217–239.
  • Hanna P, Vanclay F, Langdon EJ, Arts J. 2016b. The importance of cultural aspects in impact assessment and project development: reflections from a case study of a hydroelectric dam in Brazil. Impact Assess Proj Apprais. 34(4):306–318.
  • Hardin G. 1968. The tragedy of the commons. Science. 162(3859):1243–1248.
  • Haslam S, Oakes P, Turner J. 1996. Social identity, self-categorization, and the perceived homogeneity of ingroups and outgroups: the interaction between social motivation and cognition. In: Sorrentino R, Higgins E, editors. Handbook of motivation and cognition: foundations of social behavior. New York: Guilford Press; p. 182–222.
  • Hewstone M. 1990. The ‘ultimate attribution error’? A review of the literature on intergroup causal attribution. Eur J Soc Psychol. 20(4):311–335.
  • Ijabadeniyi A, Vanclay F. 2020. Socially-tolerated practices in environmental and social impact assessment reporting: discourses, displacement, and impoverishment. Land. 9(2):33.
  • Imperiale A, Vanclay F. 2021. Conceptualizing community resilience and the social dimensions of risk to overcome barriers to disaster risk reduction and sustainable development. Sustainable Dev. 29(5):891–905.
  • Jijelava D, Vanclay F. 2017. Legitimacy, credibility and trust as the key components of a social licence to operate: an analysis of BP’s projects in Georgia. J Clean Prod. 140:1077–1086.
  • Jijelava D, Vanclay F. 2018. How a large project was halted by the lack of a social Licence to operate: testing the applicability of the Thomson and Boutilier model. Environ Impact Assess Rev. 73:31–40.
  • Jost J, Kay A. 2010. Social justice: history, theory, and research. In: Fiske S, Gilbert D, Lindzey G, editors. Handbook of social psychology. 5th ed. Hoboken (New Jersey): Wiley; p. 1122–1165.
  • Joyce S, Sairinen R, Vanclay F. 2018. Using social impact assessment to achieve better outcomes for communities and mining companies. In: Lodhia S, editor. Mining and sustainable development: current issues. London: Routledge; p. 65–86.
  • Kasperson JX, Kasperson RE, Pidgeon N, Slovic P. 2003. The social amplification of risk: assessing fifteen years of research and theory. In: Pidgeon N, Kasperson RE, Slovic P, editors. The social amplification of risk. Cambridge: Cambridge University Press; p. 13–46.
  • Lévy-Leboyer C. 1988. Success and failure in applying psychology. Am Psychol. 43(10):779–785.
  • Lilienfeld SO. 2012. Public skepticism of psychology: why many people perceive the study of human behavior as unscientific. Am Psychol. 67(2):111–129.
  • Lima ML. 2006. Predictors of attitudes towards the construction of a waste incinerator: two case studies. J Appl Soc Psychol. 36(2):441–466.
  • Lima ML, Moreira S, Marques S. 2012. Participatory community involvement in the planning processes of building projects: a social psychological approach. Umweltpsychologie. 16(1):68–87.
  • Mahmoudi H, Renn O, Vanclay F, Hoffmann V, Karami E. 2013. A framework for combining social impact assessment and risk assessment. Environ Impact Assess Rev. 43:1–8.
  • Malle BF. 2006. The actor-observer asymmetry in attribution: a (surprising) meta-analysis. Psychol Bull. 132(6):895–919.
  • Marques S, Lima ML, Moreira S, Reis J. 2015. Local identity as an amplifier: procedural justice, local identity and attitudes towards new dam projects. J Environ Psychol. 44:63–73.
  • Meissen G, Cipriani J. 1984. Community psychology and social impact assessment: an action model. Am J Comm Psych. 12(3):369–386.
  • Moffat K, Zhang A. 2014. The paths to social licence to operate: an integrative model explaining community acceptance of mining. Resour Policy. 39(1):61–70.
  • Nag R, Corley KG, Gioia DA. 2007. The intersection of organizational identity, knowledge, and practice: attempting strategic change via knowledge grafting. Acad Manag J. 50(4):821–847.
  • Nolan JM, Schultz PW, Cialdini RB, Goldstein NJ, Griskevicius V. 2008. Normative social influence is underdetected. Pers Soc Psychol Bull. 34(7):913–923.
  • Parsons R. 2008. We are all stakeholders now: the influence of western discourses of ‘community engagement’ in an Australian Aboriginal community. Crit Perspect Int Bus. 4(2–3):99–126.
  • Prenzel P, Vanclay F. 2014. How social impact assessment can contribute to conflict management. Environ Impact Assess Rev. 45:30–37.
  • Rau I, Schweizer-Ries P, Hildebrand J. 2012. Participation: the silver bullet for the acceptance of renewable energies? In: Kabisch S, Kunath A, Schweizer-Ries P, Steinführer A, editors. Vulnerability, risks, and complexity: impacts of global change on human habitats. Boston MA: Hogrefe; p. 177–191.
  • Santiago AL, Demajorovic J, Rossetto DE, Luke H. 2021. Understanding the fundamentals of the social licence to operate: its evolution, current state of development and future avenues for research. Resour Policy. 70:101941.
  • Schwarz N. 2000. Agenda 2000: social judgment and attitudes: warmer, more social, and less conscious. Eur J Soc Psychol. 30:149–176.
  • Schwarz N. 2007. Attitude construction: evaluation in context. Soc Cogn. 25(5):638–656.
  • Schwarz N, Bohner G. 2001. The construction of attitudes. In: Tesser A, Schwarz N, editors. Blackwell handbook of social psychology: intraindividual processes. Oxford (UK): Blackwell; p. 436–457.
  • Schweinsberg S. 2007. Contributions to economic social impact assessment methods from psychological values analysis: a review. Aust Planner. 44(4):44–51.
  • Slovic P. 2012. The perception gap: radiation and risk. Bull At Sci. 68(3):67–75.
  • Slovic P, Finucane M, Peters E, MacGregor D. 2004. Risk as analysis and risk as feelings: some thoughts about affect, reason, risk, and rationality. Risk Anal. 24(2):311–322.
  • Slovic P. 1987. Perception of risk. Science. 236(4799):280–285.
  • Smith E, Queller S. 2004. Mental representations. In: Brewer MB, Hewstone M, editors. Social Cognition. Malden: Blackwell; p. 5–27.
  • Staples S, Cornelius R, Gibbs M. 1999. Noise disturbance from a developing airport perceived risk or general annoyance? Environ Behav. 31(5):692–710.
  • Steg L, Buunk AP, Rothengatter T, editors. 2008. Applied social psychology: understanding and managing social problems. Cambridge (UK): Cambridge University Press.
  • Swim JK, Stern PC, Doherty TJ, Clayton S, Reser JP, Weber EU, Gifford R, Howard GS. 2011. Psychology’s contributions to understanding and addressing global climate change. Am Psychol. 66(4):241–250.
  • Swim J, Clayton S, Doherty T, Gifford R, Howard G, Reser J, Stern P, Weber E. 2009. Psychology and global climate change: addressing a multi-faceted phenomenon and set of challenges. Washington (DC): American Psychological Association.
  • Turner JC, Oakes PJ, Haslam SA, McGarty C. 1994. Self and collective: cognition and social context. Pers Soc Psychol Bull. 20(5):454–463.
  • Tyler T. 2000. Social justice: outcome and procedure. Int J Psychol. 35(2):117–125.
  • Tyler T, Lind E. 1992. A relational model of authority in groups. Adv Exp Soc Psychol. 25(C):115–191.
  • Vanclay F. 2003. International principles for social impact assessment. Impact Assess Proj Apprais. 21(1):5–11.
  • Vanclay F. 2004. Social principles for agricultural extension to assist in the promotion of natural resource management. Aust J Exp Agric. 44(3):213–222.
  • Vanclay F. 2017. Principles to gain a social licence to operate for green initiatives and biodiversity projects. Curr Opin Environ Sustainability. 29:48–56.
  • Vanclay F. 2020. Reflections on social impact assessment in the 21st century. Impact Assess Proj Apprais. 38(2):126–131.
  • Vanclay F, Esteves AM, Aucamp I, Franks D. 2015. Social impact assessment: guidance for assessing and managing the social impacts of projects. Fargo ND: International Association for Impact Assessment. https://www.iaia.org/uploads/pdf/SIA_Guidance_Document_IAIA.pdf [accessed 2021 Nov 25].
  • Vanclay F, Hanna P. 2019. Conceptualizing company response to community protest: principles to achieve a social license to operate. Land. 8(6):101.
  • Veenker R, Vanclay F. 2021. What did NAM do to get a social licence to operate?: the social impact history of the Schoonebeek oilfield in The Netherlands. Extr Ind Soc. 8(2):100888.
  • Vlek C, Steg L. 2007. Human behavior and environmental sustainability: problems, driving forces, and research topics. J Soc Issues. 63(1):1–19.
  • Wallace D, Paulson R, Lord C, Bond C. 2005. Which behaviors do attitudes predict? Meta-analyzing the effects of social pressure and perceived difficulty. Rev Gen Psychol. 9(3):214–227.