Academic year: 2022

The crux of reducing emissions in the long-term: The underestimated “now” versus the overestimated “then”

JONAS, Matthias (International Institute for Applied Systems Analysis, IIASA); ŻEBROWSKI, Piotr (International Institute for Applied Systems Analysis, IIASA); JARNICKA, Jolanta (Jagiellonian University, Cracow)

Topic area: Impacts of climate change on environment and society / mitigation and adaptation
Contribution type: Scientific contribution

Funding programme: other

Scientific discipline(s): Earth System Sciences

Topic

This presentation is a perspective piece on research still to be conducted. Here we aim to elaborate on the usefulness of greenhouse gas (GHG) emission inventories (cf. Lieberman et al. 2007; White et al. 2011; Ometto et al. 2015).

Emissions experts are trying to understand how helpful GHG emission inventories are in reconciling short- and long-term emission estimates. Given that emission paths are sensitive to historical conditions, as well as uncertain with respect to their mandated reduction levels, experts are grappling with the probability that long-term targets will be missed (Jonas et al. 2010a: 4.3; 2010b; 2014).

We deal with this question from a data-processing perspective. To obtain promising outcomes, we lower our ambitions by posing two conditions. First, we restrict ourselves to systems with memory. Memory allows us to reference how strongly a system’s past can influence its near-term future. This brings us to our second condition: we take only a “small” step into the future. This step is sufficient to show that the ways a data analyst and a prognostic modeler understand uncertainty when transgressing “today” (the interface between past and future) fundamentally differ.

Method

When we speak of memory, we refer to the momentum of a system over a historical time range. Different approaches can capture memory. By way of example, we capture memory with the help of three of its characteristics: its temporal extent, its weight over time, and its quality over time. The extent of memory quantifies how many historical data points directly influence the current system, while the weight of memory describes the strength of that influence. The quality of memory reflects how well we know that weight.
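The three characteristics above can be made concrete with a minimal lagged-sum sketch; the model form, parameter names, and values here are our own assumptions, not the authors' formulation:

```python
import numpy as np

# Hedged sketch of a system with memory (our construction):
#   extent  -> the number of past steps, n_lags, entering the update
#   weight  -> the lag coefficients w (strength of past influence over time)
#   quality -> how precisely w is known (not varied in this sketch)

n_lags = 5                                   # extent of memory
w = np.array([0.4, 0.25, 0.15, 0.1, 0.05])  # weight: decaying influence

def step(history, forcing):
    """Next state = weighted sum of the last n_lags states + forcing."""
    recent_first = history[-n_lags:][::-1]   # most recent state first
    return float(w @ recent_first) + forcing

x = [1.0] * n_lags                           # initial history
for _ in range(20):
    x.append(step(np.array(x), forcing=0.1))
```

In this sketch the extent is the five lags, the weight is the decaying coefficient vector, and degrading the quality of memory would amount to perturbing `w` before forecasting.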

One important question is how well we need to know these (and/or possibly other) characteristics of memory in order to delineate a system’s near-term future, which we seek to do by means of what we call the system’s explainable outreach [EO]. In our perspective piece, we argue that we have reasons for optimism that the system’s EO can be derived under both incomplete knowledge of memory and imperfect understanding of how the system is forced.

Our focus is on forced systems. In many cases, we know that a system possesses memory, for example, because it does not respond instantaneously to the forcing it experiences. But we find it difficult to quantify memory, let alone visualize it, in a way that is easy to understand, particularly for practitioners and decision-makers.

Figure 1 serves as a major example of a forced system. Here, the forcing is due to anthropogenic activities (e.g., fossil-fuel burning, cement production, and land use). The figure informs us of the emission reduction paths we would have to follow almost instantaneously at the global scale to keep global warming at or below 2 °C and thus prevent the most dangerous impacts of climate change. However, the figure does not inform us about the “degree and extent of persistence” with which GHG emissions will continue along their historical path into the future, for example, due to old, carbon-intensive furnaces remaining in operation. Such knowledge is crucial for the design, implementation, and effectiveness of realistic emission-reduction policies and for overcoming the path dependencies caused by memory.

Is it then possible to distinguish between the various characteristics of memory and specify them through diagnostic data-processing alone? Or, to put it another way, how much systems understanding do we need to possess, and inject into the data-analysis process, to enable those distinctions to be made? This question, which is directly related to our previous question, is of considerable interest. As yet, we do not see any possibility of it being answered, except theoretically. However, we do see value in exploring it in order to identify approximate, yet sufficiently robust, modi operandi that make EO concepts easy to apply in practice.

Results

The main objective of this perspective piece is to explore and reflect on an approach that allows a system’s EO to be derived and to demonstrate the usefulness of doing so. To that end, we start from a simple synthetic data (time) series example (our control), which we equip, step by step, with realistic physical features, such as memory and noise, while exploring the system’s persistence (the counterpart to memory) and deriving its EO (forward mode). The main reason for providing the example is to better understand memory and persistence and to consolidate our systems thinking.
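A minimal sketch of such a step-by-step construction, under our own assumptions (an AR(1)-style memory term; all parameter values invented), might look like:

```python
import numpy as np

rng = np.random.default_rng(42)

# Step 1: the control, a memoryless linear trend.
n = 60
t = np.arange(n)
control = 100.0 + 0.5 * t

# Steps 2 and 3: memory (past deviations carried over at rate phi)
# and noise (random shocks feeding those deviations).
phi, sigma = 0.7, 1.0                        # memory weight, noise level
dev = np.zeros(n)
for k in range(1, n):
    dev[k] = phi * dev[k - 1] + rng.normal(0.0, sigma)

series = control + dev

# Persistence, the forward counterpart of memory: the last remembered
# deviation decays only gradually (by factor phi per step), so a naive
# trend extrapolation and a memory-aware outlook differ near "today".
horizon = 10
h = np.arange(1, horizon + 1)
outlook = 100.0 + 0.5 * (n - 1 + h) + phi ** h * dev[-1]
```

The sketch is not the authors' derivation of an EO; it only illustrates how memory and noise enter a synthetic series and how persistence shapes its near-term outlook.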

Nonetheless, our perspective piece should be considered explorative only: here, systemic insight is more valuable to us than mathematical rigor. The example is geared toward making the concept of EOs applicable. We discuss how consequential the example is, where it underperforms, and the questions it raises.

Overall, our insights suggest that our conjecture is likely to prove true: being ignorant of memory and persistence, we underestimate, probably considerably, the momentum with which GHG emissions will continue on their historical path beyond today, and we thus overestimate the emission reductions that we might achieve in the future.

Figure 1. Illustrating the effect of implementing pledges and the so-called intended nationally determined contributions (INDCs) of 146 governments.

