
PART II: HUMAN RIGHTS IMPACT ASSESSMENTS AND PUBLIC LAW

Chapter 2: Human Rights Impact Assessments: Types and Background Norms

5.4 Causes of Uncertainty and their Relevance for Impact Assessments

5.4.4 Risk Biases and the Quest for Legal Responses

Ex-ante impact assessments are unavoidably confronted with uncertainty and risk.991 An advantage of a structured risk and impact assessment is that it can bring order into a multitude of facts and values and is therefore a “disciplined form of practical reasoning”.992 At the same time, risk and impact assessments face the challenge of complexity. Deciding which impacts are likely to be significant requires, in theory, the evaluation of potentially unlimited information, even though individuals and institutions have only limited capacities to process different pieces of information at the same time. Parties involved in HRIAs – both those in charge of conducting the impact analysis and those involved as participants during consultation – must therefore take decisions under (remaining) uncertainty. “Decision” in this sense is not limited to the authority to enact a law or conclude a contract. It also includes the decision of individuals or organizations to informally object to and resist a project or policy during the consultation period. How humans perceive risks and how they make decisions under uncertainty can therefore be one factor in evaluating the prospects of success of HRIAs. As will be outlined below, psychological research has found that cognitive biases often distort the quality of our perception of risks and opportunities. This is important for the institutionalization of impact assessments: the design of impact assessment law can help to cushion the effect of these biases, as will be illustrated in the following.

Many of these cognitive biases are heuristic biases. Heuristics are generally defined as rules or procedures that enable more efficient decision-making by converting complex problems into

991 On the differing use of the terms “uncertainty” and “risk” in different disciplines see section 5.2.

992 Fischhoff and Kadvany, Risk (above, n. 825), p. 149.

simpler ones.993 Heuristics are tools used in science,994 in everyday decision-making – and in law:

as argued above, structural principles have a heuristic function995 and thus help to make legal decisions. The importance of heuristics can therefore hardly be overestimated. Arguably, people conducting HRIAs – both as “analysts” in charge and as consulted participants – use heuristic tools to assess the consequences of an initiative.

However, as is the case with all tools, the effect they produce depends on how they are used. One of the most relevant heuristic biases for risk decision-making is the so-called availability heuristic. People often rely on an event’s availability when assessing its probability: availability means the “ease with which relevant instances come to mind”996. While availability and frequency often correlate, other factors also determine how available certain information is, such as how often a topic is discussed in the media. Consequently, the availability heuristic can lead to a structural bias.997 As a result, certain “risks” might be overestimated, while other, less “available” risks might be underestimated – and consequently be over- or under-regulated. People tend to rate the probability of an event higher if they have experienced or learned about such an event before. A related bias concerns the simulation heuristic: people tend to judge an event as likely if it is easy to imagine it happening.998 This would help to explain why it is so difficult to adequately assess many risks in decision-making with potentially transnational effects, namely if those who assess the potential risks of an initiative do not know the social and political context of the places where the harm might materialize. Many potential impacts may therefore be hard to imagine for people not familiar with the culture of an affected community: these risks might not easily “come to mind”. Conducting impact assessments in an inclusive manner, involving different departments of an institution as well as different external actors, would allow different viewpoints and different types of experience to be introduced. In consequence, more “relevant instances” might come to a decision-maker’s mind, which would, in turn, reduce the misleading effect of the availability and simulation heuristic biases. Rules on participation and transparency, as well as broad-based consultation requirements, can, against this background, make sense.

In particular, Kahneman and Tversky analyzed how people actually manage and deal with risk and found different explanations for why people make apparently irrational choices. The rational choice axiom often fails to describe human behavior because it ignores basic psychological principles, in particular the fact that people often make decisions based on how gains and losses are perceived and evaluated.999 A first observation is what has been called the default rule. Often, people fear change more than they value its potential results. Whether an initiative will be successful therefore often depends on the context and how it is framed. For example, studies suggest that

993 Spencer Phillips Hey, ‘Heuristics and Meta-heuristics in Scientific Judgement’, British Journal for the Philosophy of Science, 62 (2016), pp. 471–495; Beutin, Die Rationalität der Risikoentscheidung (above, n. 345), p. 210.

994 Hey, ‘Heuristics and Meta-heuristics in Scientific Judgement’ (above, n. 993); Peter Winker, Optimization Heuristics in Econometrics (Chichester, New York: Wiley, 2001).

995 See section 3.2.2.3.

996 Amos Tversky and Daniel Kahneman, ‘Availability: A Heuristic for Judging Frequency and Probability’, Cognitive Psychology, 5 (1973), pp. 207–232, p. 207.

997 Ibid.; Sunstein and Kuran, ‘Availability Cascades and Risk Regulation’ (above, n. 856).

998 Fischhoff and Kadvany, Risk (above, n. 825), p. 101.

999 Daniel Kahneman and Amos Tversky, ‘Prospect Theory: An Analysis of Decision under Risk’, Econometrica, 47 (1979), pp. 263–291, p. 289.

fewer people would “forbid” an activity than they would “not allow” it.1000 Consequently, how questions are phrased during the consultation period of an HRIA can influence the answers.
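The intuition behind these findings can be stated more formally. The following sketch is added for illustration only; the functional forms and parameter values are those commonly associated with Kahneman and Tversky’s (cumulative) prospect theory and are not taken from the text or the sources cited above:

v(x) = x^α                  for gains (x ≥ 0)
v(x) = −λ · (−x)^β          for losses (x < 0), with λ > 1 (loss aversion, often estimated at roughly 2)
V = Σ_i w(p_i) · v(x_i)     where w(p) is a probability weighting function that gives extra weight to the step into certainty

On this reading, the “certainty effect” noted in n. 1000 reflects the shape of w(p): moving from a 90% to a 100% chance crosses into certainty and therefore carries more weight than moving from 45% to 50%, although both raise the probability by the same amount; and because λ > 1, a loss looms larger than a gain of equal size, which helps to explain why people often fear change more than they value its potential results.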

Another factor is the optimism or overconfidence bias: people often overestimate their own abilities. This is frequently the case when experts rely only on their own discipline even where a broader perspective would be required.1001 Applied to HRIAs, this means that those in charge of conducting the IA may tend to overestimate their ability to calculate and predict potential impacts. At the same time, they may tend to believe that they understand the initiative and its (human rights) impacts better than they actually do. The overconfidence bias often results in the so-called planning fallacy.

When people – as individuals or as part of an organization – plan a project, they estimate how much time they need to complete it. Psychological research suggests that many of these predictions are unrealistic because planners tend to be optimistic and rely on their best-case scenario even though similar projects in the past have normally run late.1002 Similarly, those in charge of conducting an HRIA may overestimate their ability to design a policy or install mitigation measures that would, in a timely manner, effectively avoid or mitigate negative effects. This has consequences for individual planners and for the institutional design of impact assessments: if planners are aware of these biases, they might be better able to adjust their predictions accordingly. This would require adequate training. However, the law could also react to this bias and require, for example, that an IA report explicitly identify a worst-case scenario, no matter how unlikely it appears at that time.

Another particularity is the observation that people are generally good at explaining away inconvenient evidence.1003 This is naturally true for all sides of a disputed initiative, and it is a potential obstacle to finding a compromise. While participatory impact assessments allow different types of evidence to be produced, making it harder for decision-makers to withhold inconvenient information, the problem persists insofar as decision-makers will, in the end, have to evaluate the respective evidence. It is at this stage that the tendency to explain away inconvenient evidence can become problematic.

An additional challenge is what has been called the outcome bias: people often confuse the quality of a risk evaluation with the quality of the outcome. This means that people tend to judge the quality of a risk decision ex-post in light of the actual outcome and not based on the information that was available at the time when the risk decision was made. The outcome bias is therefore a cognitive error made when evaluating the risk decision as such once the result of the decision is already known (it is thus closely related to the hindsight bias).1004 This can be counterproductive

1000 Fischhoff and Kadvany, Risk (above, n. 825), p. 74. Also, people often place extra value on certain outcomes, e.g. going from a chance of 90% to 100% means more than going from 45% to 50%: Kahneman and Tversky, ‘Prospect Theory: An Analysis of Decision under Risk’ (above, n. 999), p. 265.

1001 Fischhoff and Kadvany, Risk (above, n. 825), p. 62.

1002 Roger Buehler, Dale Griffin, and Michael Ross, ‘Inside the Planning Fallacy: The Causes and Consequences of Optimistic Time Predictions’, in: Thomas Gilovich, Dale W. Griffin, and Daniel Kahneman (eds.), Heuristics and Biases, pp. 250–270.

1003 Fischhoff and Kadvany, Risk (above, n. 825), p. 16.

1004 Jonathan Baron and John Hershey, ‘Outcome Bias in Decision Evaluation’, Journal of Personality and Social Psychology, 54 (1988), pp. 569–579; Francesca Gino, Don A. Moore, and Max H. Bazerman, ‘No Harm, No Foul: The Outcome Bias in Ethical Judgments’, Harvard Business School Working Paper 08-080. The hindsight bias was identified in experimental work and means that “people consistently exaggerate what could have been anticipated in foresight”: Fischhoff, Slovic and Lichtenstein, ‘Lay Foibles and Expert Fables in Judgment about Risk’, in: Baruch Fischhoff (ed.), Risk Analysis and Human Behavior (Abingdon: Earthscan, 2012), p. 199.

insofar as the fact that an event did occur does not mean that a prognosis of low probability was wrong, or the other way round. It is therefore a bias that can make it difficult to learn lessons during ex-post impact assessments. In order to avoid this bias, it is necessary to evaluate a risk decision without being influenced by a favorable or disadvantageous outcome as such.

Another aspect, problematic in particular for policy-making, is the sunk-cost bias.1005 People often “throw good money after bad” so as to avoid acknowledging losses or bad decisions.

This is extremely relevant for international development law insofar as it indicates that once a decision has been taken, it is less likely to be corrected: a dam construction rarely stops after it has begun, regardless of the number of problems that are encountered. This appears particularly important when evaluating the chances of ex-post impact assessments correcting an initiative with negative human rights consequences. Even if such an assessment identifies significant human rights impacts, the sunk-cost bias reduces the chances that the project or policy will be changed. While the sunk-cost bias might intuitively seem to apply only to projects, at least a similar logic also applies to abstract policies. Legislation, regulation or, even more so, international agreements are often enacted in long and time-consuming procedures. Unless the respective legal document provides for fast-track flexibility mechanisms,1006 most actors have an incentive not to restart a complex and demanding negotiation process.

An “emotional” heuristic tool is the so-called affect heuristic,1007 which relates to the role of emotions in decision-making and risk evaluation. Like other types of heuristics, it can help or hinder.

In the context of international economic law, emotions can play a major role. One example is that individuals often blame their problems on other persons or institutions (such as the World Bank) rather than on situations (corruption, mismanagement). However, the affect heuristic does not only influence how laypersons make decisions; it can also explain expert decisions, at least insofar as they go beyond the analysis of hard data: scientific research is not only value-laden, but at times also guided by emotions. Other – often emotional – challenges concern differing risk perceptions, in particular if distributive aspects are involved. This is to a certain extent guided by individual interests and the unwillingness to accept impacts because one is affected oneself. Often derogatorily described as a NIMBY attitude (“not in my backyard”),1008 the reaction is not always selfish. Rather, it touches upon the fundamental conflict between individual and community interests. At the same time, it is a question of distributional justice: risks in society are often unequally shared, and people who are financially better off can usually evade negative impacts (e.g. by moving to more expensive districts less affected by industrial emissions) or can adapt to change and gain benefits from legislative and regulatory reform (e.g. from the liberalization of markets). In consequence, it is an important challenge for the assessment of impacts and risks to determine which objections are based on pure self-interest and which ones can be normatively justified.


1005 Hal R. Arkes and Catherine Blumer, ‘The Psychology of Sunk Cost’, Organizational Behavior and Human Decision Processes, 35 (1985), pp. 124–140; Fischhoff and Kadvany, Risk (above, n. 825), p. 77.

1006 On the link between flexibility mechanisms and the effectiveness of HRIAs see section 9.1.2.1.5.

1007 Paul Slovic, Melissa Finucane, Ellen Peters et al., ‘The Affect Heuristic’, in: Thomas Gilovich, Dale W. Griffin, and Daniel Kahneman (eds.), Heuristics and Biases, pp. 397–420.

1008 Peter D. Kinder, ‘Not in My Backyard Phenomenon’, in: Encyclopædia Britannica (9 Oct. 2019), available at: https://www.britannica.com/topic/Not-in-My-Backyard-Phenomenon.

It is unrealistic to expect the aforementioned biases to be avoided completely. Social risk scholars have pointed out that decision-makers can take rational decisions only to a certain extent. In other words, decisions in real life are generally taken within “bounded rationality”.1009 A realistic goal of rational behavior under uncertainty would therefore not be optimization in the strict sense of the word – which would require that an investigation continue until the best option is found – but “approximate optimization”,1010 which means that enough elements of a decision can and must be ignored in order to think systematically about those that remain.1011 In the context of decisions under uncertainty and risk, it may still be rational – in the sense of “bounded rationality” – to apply “the same rule to many hazards, while ignoring differences among them”.1012 Similarly, institutionalized ex-ante impact assessments may not be able to identify optimal, but only satisficing, options. This inherent limit to rational decision-making should be borne in mind when evaluating the quality of an HRIA. Expectations should not be unrealistically high.
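The contrast between strict optimization and “approximate optimization” can also be put schematically; the notation is added here for illustration and is not taken from the sources cited above:

Optimization (strict sense):  choose x* with u(x*) = max { u(x) : x ∈ X },
                              which presupposes that the entire option set X can be surveyed and evaluated.
Satisficing:                  fix an aspiration level A in advance and accept the first option x with u(x) ≥ A,
                              ignoring the remaining alternatives.

On this reading, an institutionalized impact assessment defines an aspiration level – criteria of an acceptable, sufficiently mitigated outcome – rather than guaranteeing that the least harmful of all conceivable alternatives has been identified.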
