
5.4 Causes of Uncertainty and their Relevance for Impact Assessments

5.4.1 Phenomenological and epistemological causes of uncertainty

Broadly speaking, uncertainty can have phenomenological, epistemological936 or practical causes, the latter consisting mainly in the distribution of knowledge. Many causes of uncertainty are phenomenological. This means that the very nature of the phenomena causes the uncertainty, and it is this nature that determines whether or not the uncertainty is (for the time being) reducible.937 This is, first, of major relevance in the case of novelty and innovation,938 be it technological innovation (“regulation of risk”) or regulatory innovation (“regulation as risk”). One legal technique to deal with this type of uncertainty would be the application of flexible instruments that can be adapted to new insights and experience gained over time. This would mean that ex-post HRIAs are conducted after a program or policy is implemented. If negative human rights impacts are identified, the policy can be adapted accordingly. From this perspective, the flexibility of legal agreements and acts is desirable, even though it comes at the expense of legal determinacy.

931 Ibid., p. 12.

932 Ibid., p. 13.

933 Allan Gibbard, Wise Choices, Apt Feelings (Oxford: Clarendon, 2002), p. 88.

934 Sigel, Klauer and Pahl-Wostl, ‘Conceptualising uncertainty in environmental decision-making: The example of the EU Water Framework Directive’ (above, n. 930), p. 12.

935 On the ability of decision-makers to agree on abstractions without agreeing on the particular meaning of those abstractions: Cass R. Sunstein, ‘Practical Reason and Incompletely Theorized Agreements’, Current Legal Problems, 51 (1998), pp. 267–298.

936 Sigel, Klauer and Pahl-Wostl, ‘Conceptualising uncertainty in environmental decision-making: The example of the EU Water Framework Directive’ (above, n. 930), p. 14; Faber, Manstetten and Proops, ‘Humankind and Environment: An Anatomy of Surprise and Ignorance’ (above, n. 835), p. 228.

937 Beutin, Die Rationalität der Risikoentscheidung (above, n. 345), 205 ff.; Faber, Manstetten and Proops, ‘Humankind and Environment: An Anatomy of Surprise and Ignorance’ (above, n. 835), p. 228.

938 Ibid., p. 288.

On a more general level, deterministic theories of science have increasingly been challenged over the past century or so. This means that the ideal of objective predictability is challenged not only by practical limitations but also by limits inherent in scientific research itself. The most illustrative example is the evolution of chaos theory and the often-quoted “butterfly effect”, an irritant to deterministic theories in the natural sciences – and potentially also to command-and-control approaches to regulation. In chaotic systems, two identical activities can have different impacts at different times.

Chaos does not mean randomness, but rather a lack of predictability that arises because systems are highly sensitive to initial conditions.939 The consequences are apparent: if calculation were believed to be the instrument for predicting future events, then such sensitivity to initial conditions would make the calculation difficult or impossible due to the infinite number of potential combinations. Consequently, even increased accuracy of arithmetic operations would not lead to greater predictability.940 Arguably, not only natural systems such as the global climate or biodiversity, but also social systems can be so complex that predictability is not only practically but even theoretically impossible.941 In this sense, committee meetings have been described through the lens of chaos theory.942 Although it would require further analysis, it does not seem impossible that the assessment of human rights impacts of economic policies likewise concerns impacts in a chaotic system, so that predictions are, to that extent, not possible. However, this does not mean that impact assessments would not make sense, as they can serve different purposes beyond the prediction of real-life consequences – namely, to accumulate preferences or to transform decision-making.943
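To make the sensitivity to initial conditions concrete, the following minimal sketch (an illustration added for this point, not drawn from the cited sources) iterates the logistic map, a textbook example of a chaotic system. Two starting values differing by one part in ten billion produce trajectories that become entirely unrelated within a few dozen steps; the values chosen are arbitrary.

```python
# Illustrative sketch (hypothetical values): the logistic map
# x_{n+1} = r * x_n * (1 - x_n) with r = 4 behaves chaotically.
# Two initial conditions differing by only 1e-10 diverge completely.

def logistic_map(x0: float, r: float = 4.0, steps: int = 50) -> list:
    """Iterate the logistic map, returning the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_map(0.3)
b = logistic_map(0.3 + 1e-10)  # an almost identical starting point

for n in (0, 10, 30, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  (diff {abs(a[n] - b[n]):.2e})")
```

On this reading, measuring the initial state to a few more decimal places merely postpones the divergence by a handful of steps; it does not restore predictability, which is precisely the point about the limits of increased arithmetic accuracy made above.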

However, even outside chaos theory, inherent limits to scientific predictability exist. Natural and social scientists themselves often disagree about methods and findings. Uncertainties in data collection or about the reliability of a method are among the reasons why this is the case. While natural scientists may argue about whether animal studies or epidemiology is the better method to assess cancer risks,944 social scientists may disagree about whether quantitative or qualitative methods produce more reliable results. Similar challenges exist with regard to modelling: it is hard to tell whether a model is a constructive or a misleading simplification. What are the implications for impact assessment law? While deterministic theories would strongly support scientific and expert-based IAs as a means of informing decision-makers (“information model”), the phenomenological causes of uncertainty in particular seem to indicate that such an exercise would be fruitless. In line with the more critical approaches, it would become necessary to focus on other functions of impact assessments, for example to foster deliberation, involve civil society or increase transparency.
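To illustrate the point about modelling made above, the following sketch (invented numbers, added purely for illustration) fits a linear model to data generated by a quadratic process. The fit looks statistically respectable, yet the simplification misleads as soon as one extrapolates.

```python
# Hypothetical example: a linear model summarizing nonlinear data.
# A high goodness of fit can mask the fact that the simplification
# is misleading outside the observed range. All data are invented.
import numpy as np

rng = np.random.default_rng(seed=1)
x = np.linspace(0, 10, 100)
y = 0.5 * x**2 + rng.normal(0.0, 2.0, size=x.size)  # truly quadratic process

slope, intercept = np.polyfit(x, y, deg=1)  # misspecified linear model
y_hat = slope * x + intercept
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

print(f"linear fit: y = {slope:.2f}x + {intercept:.2f}, R^2 = {r2:.3f}")
# R^2 is high (roughly 0.93), yet extrapolating the straight line
# beyond x = 10 would badly mispredict the quadratic process.
```

Whether such a linear summary is a constructive simplification (it tracks the data within the observed range) or a misleading one (it fails beyond that range) cannot be read off the fit statistics alone, which mirrors the disagreement among modellers described above.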

939 This lack of predictability is what, in reference to the mathematician and meteorologist Edward N. Lorenz, became commonly known as the “butterfly effect”: Edward Lorenz, ‘Predictability: Does the Flap of a Butterfly's Wings in Brazil Set Off a Tornado in Texas?’, American Association for the Advancement of Science, 139th Meeting, 1972; Friedrich Cramer, Chaos and Order (Weinheim: VCH, 1993), p. 117; Faber, Manstetten and Proops, ‘Humankind and Environment: An Anatomy of Surprise and Ignorance’ (above, n. 835), p. 288.

940 Ibid., p. 230; Edward Lorenz, ‘Deterministic Nonperiodic Flow’, Journal of the Atmospheric Sciences, 20 (1963), pp. 130–141; James Gleick, Chaos (New York, NY: Penguin Books, 2008), 20th anniversary ed.

941 On the complexity and irrationality of history: Cramer, Chaos and Order (above, n. 939), 219 ff.

942 James Arthur Anderson, Communication Theory (New York: Guilford Press, 1996), 35 ff.

943 See section 5.3.

944 Fisher, Risk Regulation and Administrative Constitutionalism (above, n. 68), 7 ff.; National Research Council, ‘Science and Judgment in Risk Assessment’, National Academy Press, 1994, 58 ff.

Epistemological causes of uncertainty refer to the way we perceive certain phenomena.945 Kant formulated an early epistemological critique that challenges the basic assumptions of scientific determinism. He stated that “the things that we intuit are not in themselves what we intuit them to be, nor are their relations so constituted in themselves as they appear to us; and that if we remove our own subject or even only the subjective constitution of the senses in general, then all the constitution, all relations of objects in space and time, indeed space and time themselves would disappear, and as appearances they cannot exist in themselves, but only in us”.946 In the Preface to the second edition of the Critique of Pure Reason, Kant reflects on the method of “those who study nature” and, based on examples from mathematics and science, argues that the “altered method of our way of thinking” means “that we can cognize of things a priori only what we ourselves have put into them”.947 The fundamental role of (subjective) perception calls into question the objectivity of statements or predictions about the future course of events. If cognition is an active reconstruction of reality that depends on an individual’s perspective, however, one way to deal with this cognitive uncertainty would be to overcome the (limited) subjective perspectives and to utilize or combine other perspectives in a process of structured reflection.948 In the area of institutionalized impact assessments, such structured processes could help to increase decision-makers’ awareness of human rights impacts. In line with Justice Stewart’s statement in reverse, “I see it when I know it”,949 participatory and inclusive IAs can make decision-makers see potential impacts and turn at least closed ignorance into open ignorance.950

Even with regard to empirical research, the theory of science has not left the idea of objective knowledge unscathed. For Popper especially, falsifiability is a constitutive element of empirical research.951 In other words: falsifiability – and thus uncertainty about whether or not what is regarded as true today will still be valid tomorrow – is not an indicator of imperfect science, but an inherent element of (at least) empirical science.952

These insights justify the shift towards more inclusive and deliberative impact assessments. Politics has recently given more “institutional form” to the “dialogical engagement” between laypersons and scientists,953 but also among scientists holding different opinions. Involving different perceptions through deliberative procedures would be one rather practical response to this form of epistemological uncertainty. This can therefore have important consequences

945 Faber, Manstetten and Proops, ‘Humankind and Environment: An Anatomy of Surprise and Ignorance’ (above, n. 835), p. 230.

946 Immanuel Kant, Critique of Pure Reason, ed. and trans. Paul Guyer and Allen W. Wood (Cambridge: Cambridge Univ. Press, 1998), p. 168.

947 Ibid., p. 111.

948 Arno Scherzberg, ‘Wissen, Nichtwissen und Ungewißheit im Recht’, in: Christoph Engel, Jost Halfmann, and Martin Schulte (eds.), Wissen - Nichtwissen - Unsicheres Wissen, pp. 113–144, p. 115; James Surowiecki, Die Weisheit der Vielen (Kulmbach: Plassen Verlag, 2017).

949 Wolfgang Schulz, ‘Beurteilungsspielräume als Wissensproblem – am Beispiel Regulierungsverwaltung’, RW Rechtswissenschaft, 3 (2012), pp. 330–350, borrowing from Justice Stewart’s phrase “I know it when I see it” in his concurring opinion in: U.S. Supreme Court, Jacobellis v. Ohio.

950 On different types of ignorance see section 5.2.

951 Karl R. Popper, The Logic of Scientific Discovery (London: Hutchinson, 1959).

952 There are arguably also other inherent epistemic limits of science. For scientific assessments using mathematical models, the epistemic limits of science can, on a theoretical level, be traced back to Gödel’s incompleteness theorems, which demonstrate the “limits of provability in formal axiomatic theories”, Faber, Manstetten and Proops, ‘Humankind and Environment: An Anatomy of Surprise and Ignorance’ (above, n. 835), 231 f.

953 Giddens, ‘Risk and Responsibility’ (above, n. 824), p. 6.

for the design of impact assessment law as a deliberative and participatory process (see chapter 7 on different modalities of participation).
