
Toolbox and some Basic Results

1.1 On Lévy Processes

This section is devoted to an introduction to Lévy processes. We summarize several results that we will apply in the sequel. For the reader's convenience, we collect here the material on Lévy processes needed below, which is widely spread over many books and articles in the recent literature; the most important results are quoted below.

We start with a short motivation of why Lévy processes present an intuitive extension of deterministic relationships. The idea is taken from a manuscript by Jan Kallsen, "Lévy-Prozesse anschaulich" ("Lévy processes illustrated") [73]. He provides an intuitive and illustrative way to introduce Lévy processes. We close the motivation at the point where mere intuition becomes difficult and continue with a rigorous treatment. Finally, we present some models often applied in pricing. In the following chapters, we will then discuss if and when our assumptions are satisfied in these models.

1.1.1 Motivation

The aim of this thesis is to optimize the expected utility of a portfolio of stock returns with respect to a certain utility function. So one of the first steps is to model these stocks. As we go beyond a single time step model, we need to model the whole dynamics of the stocks, i.e. a function depending


on time and randomness. In Kohlmann and Niethammer [76], dynamics are described by continuous semimartingales. In the simplest case, the process evolves like a diffusion - a Brownian motion - which was first observed by a biologist. In finance one considers the exponential of a Brownian motion (geometric Brownian motion) to guarantee positivity of the process. The problem with such models in finance is that sudden and unpredictable events cannot be modeled adequately. The variance of the process would have to be increased tremendously for this extreme behavior to have a realistic probability of appearance. However, such a high variance would not match the normal evolution of a stock. It is therefore natural to introduce jumps. We will remain with Lévy processes, as they are quite flexible while possessing analytical tractability. On the one hand, their characteristic function can easily be described (Lévy-Khinchin, see Theorem 1.1.14), and on the other hand, one obtains an explicit representation of the process. Every Lévy process can be represented as a sum of a drift term, a Brownian motion times a volatility factor, and an integral with respect to a (compensated) Poisson random measure (Lévy-Itô decomposition, see Theorem 1.1.11).

We start by supposing that we are able to describe the world in a deterministic setting and that dynamics A depend on dynamics B. If we change dynamics B, dynamics A are supposed to change as well. In economics, we would say we would like to derive the sensitivity of A with respect to B. If the function describing the relation between A and B is differentiable, we ask for a derivative. Hence, it makes sense to assume an approximately linear dependency for small changes. However, many dynamics in reality cannot be determined by deterministic dynamics. We would thus like to have a stochastic analogue of linear behavior. So what are the characteristics of a linear function $\check X_t = \check X_0 + bt$? It is determined by its starting value and the constant increase of the function over any time interval of a given length. The increase from 0 to 1 coincides with the increase from 1 to 2 and is equal to $b$. So how shall we translate this constant increase to the stochastic world? One way is to assume that the increments (the increase in the observed process, $\check X_{s+\Delta t} - \check X_s$) have the same ("stationary") distribution and that they are independent of each other on disjoint time intervals. In a discrete time setting, this describes a random walk. For a generalization to continuous time, we first choose an infinitely small time interval $dt$, assume that $\check X_0 = 0$, and add stationary and independent increments over these infinitely small intervals. Then we have roughly defined a Lévy process (of course, except for some technical conditions also on the distribution of the process).

We next ask why this process can be interpreted as linear and how we can describe its distribution. We can answer these two questions together. Notice that $\check X_t$ can be described by the sum of its increments over the infinitely small time intervals $(0, dt)$ for $n = 1$ and $[(n-1)dt, n\,dt)$ for $n > 1$:

$$\check X_t = \sum_{n=1}^{t/dt} \big(\check X_{n\,dt} - \check X_{(n-1)\,dt}\big). \qquad (1.1)$$

We start to determine its distribution. As the increments are independent, the characteristic function of $\check X_t$ is the $(t/dt)$-th power of the characteristic function of $\check X_{dt}$. From the characteristic function, we can then derive the distribution. If $\check X_{dt}$ possesses a characteristic function of log-linear form $e^{\psi(\cdot)dt}$, we obtain $e^{\psi(\cdot)t}$ for $\check X_t$. Thanks to (1.1), we get $E(\check X_1) = E(\check X_{dt})/dt$ and so $E(\check X_{dt}) = b\,dt$ for a $b \in \mathbb{R}$. If $\check X_{dt}$ is already deterministic, we have a linear function and its characteristic function is equal to $\exp(iub\,dt)$. If not, then the variance of $\check X_{dt}$ is of order $dt$, as $V(\check X_1) = V(\check X_{dt})/dt$. To induce such a property in the two simplest cases, one can firstly assume that the process increases with probability $\gamma\,dt$ by a random variable with distribution $F_J$ and stays constant with probability $1 - \gamma\,dt$. Secondly, we might assume that $\check X_{dt}/dt$ changes with probability distribution $F_B$ with constant variance. In both cases the variance of $\check X_{dt}$ is of order $dt$. The latter describes a diffusion, leading to a characteristic function that can be approximated by $\exp(-\frac{1}{2}\sigma^2 u^2\,dt)$. The first example leads to a compound Poisson process. Its characteristic function is approximately $\exp(\int (e^{iux}-1)\,\gamma F_J(dx)\,dt)$, as $e^x \approx 1 + x$ for small $x$. Assuming both processes are independent and adding a drift (the linear deterministic part), one gets the following first form of a characteristic function:

$$\exp\Big(\big(iub - \tfrac{1}{2}\sigma^2 u^2 + \int (e^{iux}-1)\,\gamma F_J(dx)\big)\,dt\Big).$$

$\nu = \gamma F_J$ will describe the well-known Lévy measure. This measure, together with the drift $b$ and the volatility $\sigma$ of the Brownian motion - the characteristic triplet - uniquely determines the distribution of the Lévy process.
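As a sanity check on this heuristic formula, one can compare it with a Monte Carlo estimate of $E(e^{iu\check X_t})$ for a simple jump-diffusion. The following Python sketch assumes, purely for illustration, normal jump sizes $N(m, \delta^2)$ and the parameter values set at the top; none of these values come from the text.

```python
import cmath
import math
import random

random.seed(0)

# Hypothetical parameters, chosen only for this illustration:
b, sigma = 0.1, 0.2      # drift and diffusion volatility
gamma = 0.5              # jump intensity
m, delta = 0.0, 0.3      # jump sizes ~ N(m, delta^2)
t, u = 1.0, 1.3          # time horizon and Fourier argument

# Heuristic characteristic function
#   exp((iub - sigma^2 u^2 / 2 + gamma (E[e^{iuX}] - 1)) t),
# with E[e^{iuX}] known in closed form for normal jumps.
jump_cf = cmath.exp(1j * u * m - 0.5 * delta ** 2 * u ** 2)
theoretical = cmath.exp((1j * u * b - 0.5 * sigma ** 2 * u ** 2
                         + gamma * (jump_cf - 1)) * t)

# Monte Carlo: X_t = b t + sigma W_t + sum of the jumps arriving
# before t, with arrivals counted via exponential inter-arrival times.
n_paths = 100_000
acc = 0j
for _ in range(n_paths):
    x = b * t + sigma * math.sqrt(t) * random.gauss(0.0, 1.0)
    s = random.expovariate(gamma)
    while s <= t:
        x += random.gauss(m, delta)
        s += random.expovariate(gamma)
    acc += cmath.exp(1j * u * x)
empirical = acc / n_paths
print(abs(empirical - theoretical))  # small Monte Carlo error
```

The agreement of the two quantities is exactly the statement that the drift, the diffusion, and the compound Poisson part contribute additively to the characteristic exponent.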

We proceed and ask what happens if jumps are very large or appear too often, perhaps infinitely often. Large jumps produce a problem only if $\check X_t$ has no finite first moment, see Theorem 1.1.17 below. Later we will exclude this case in our considerations, as mathematical finance with infinite returns is rather unrealistic. If there is an infinite number of jumps around zero, $\int (e^{iux}-1)\,\gamma F_J(dx)$ might not exist, as $F_J((-\epsilon, \epsilon))$ is infinite for an arbitrary $\epsilon > 0$. Fortunately, it is known that $\int_{|x|\le 1} x^2\,\gamma F_J(dx) < \infty$, which implies that $\int_{|x|\le 1} (e^{iux} - 1 - iux)\,\gamma F_J(dx)$ is finite. As also $F_J(\mathbb{R}\setminus[-1,1]) < \infty$ (large jumps can be included) and the characteristic function of $\check X_t$ is the $(t/dt)$-th power of the characteristic function of $\check X_{dt}$, we get an intuitively derived version of the so-called Lévy-Khinchin representation for the characteristic function of a limiting Lévy process $\check X_t$:

$$\exp\Big(\big(iu\tilde b - \tfrac{1}{2}\sigma^2 u^2 + \int (e^{iux} - 1 - iux\,\mathbb{1}_{|x|\le 1})\,\gamma F_J(dx)\big)\,t\Big),$$

where $\tilde b = b + \int x\,\mathbb{1}_{|x|\le 1}\,\gamma F_J(dx)$. We could now examine all the different variants and test their flexibility. But sooner or later these heuristics become inconvenient. We therefore continue with a rigorous treatment.

We closely follow Cont and Tankov [32].

1.1.2 Definition and some Relevant Facts on Lévy Processes

We next continue with a rigorous treatment of Lévy processes. Lévy processes make perfect sense as an extension of their discrete time equivalent - random walks: a random walk is determined by its initial starting point and the evolution of the following independent steps - increments. These new steps are always drawn from the same "stationary" distribution. A natural generalization in continuous time is thus given by Lévy processes. We consider these processes on a finite time interval $[0, T]$, $T < \infty$, throughout the thesis:

Definition 1.1.1. A RCLL (right-continuous with left limits/cadlag) stochastic process $(\check X_t)_{t\in[0,T]}$ on $(\Omega, \mathcal F, P)$ with values in $\mathbb{R}^N$ such that $\check X_0 = 0$ is called a Lévy process if it possesses the following properties:

1. Independent increments: for every increasing sequence of times $t_0, \dots, t_n$, the random variables $\check X_{t_0}, \check X_{t_1} - \check X_{t_0}, \dots, \check X_{t_n} - \check X_{t_{n-1}}$ are independent.

2. Stationary increments: the law of $\check X_{t+h} - \check X_t$ does not depend on $t$.

3. Stochastic continuity: $\forall \epsilon > 0$, $\lim_{h\to 0} P(|\check X_{t+h} - \check X_t| > \epsilon) = 0$.

Note that the third condition does not imply that the sample paths are continuous. We start with the simplest counterexample - a Poisson process:

Example 1.1.2 (Poisson process). Let $(v_i)_i$ be a sequence of independent exponential random variables with parameter $\gamma > 0$ and $\check v_n = \sum_{i=1}^n v_i$. The process

$$U_t = \sum_{n\ge 1} \mathbb{1}_{t \ge \check v_n}$$

is called a Poisson process with intensity $\gamma$.

Clearly the process is almost surely finite, piecewise constant, right continuous with left limits (cadlag), and Markovian. It has stationary and independent increments and satisfies stochastic continuity. For a full list of properties see [32, Proposition 2.12]. We however wish to recall two important facts. Firstly, for every $t > 0$, $U_t$ follows a Poisson distribution with parameter $\gamma t$, i.e. for all $n \in \mathbb{N}$, $P(U_t = n) = e^{-\gamma t}\frac{(\gamma t)^n}{n!}$. This further implies the special form of the characteristic function of $U_t$:

$$E(e^{iuU_t}) = \exp\{\gamma t (e^{iu} - 1)\}. \qquad (1.2)$$
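The construction of Example 1.1.2 and formula (1.2) are easy to check by simulation. The sketch below draws $U_t$ by summing exponential waiting times and compares the empirical characteristic function with (1.2); the parameters $\gamma = 2$, $t = 3$, and the Fourier argument $u = 0.7$ are arbitrary choices made for this illustration.

```python
import cmath
import random

random.seed(1)

gamma, t = 2.0, 3.0      # assumed intensity and horizon
n_samples = 100_000

# Sample U_t as in Example 1.1.2: add Exp(gamma) waiting times v_i
# until their running sum exceeds t; the number of arrivals is U_t.
counts = []
for _ in range(n_samples):
    n, s = 0, random.expovariate(gamma)
    while s <= t:
        n += 1
        s += random.expovariate(gamma)
    counts.append(n)

mean = sum(counts) / n_samples                   # should be near gamma*t
u = 0.7
emp_cf = sum(cmath.exp(1j * u * n) for n in counts) / n_samples
theo_cf = cmath.exp(gamma * t * (cmath.exp(1j * u) - 1))  # formula (1.2)
print(mean, abs(emp_cf - theo_cf))
```

The empirical mean recovers $E(U_t) = \gamma t$, and the empirical characteristic function matches (1.2) up to Monte Carlo error.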

Further, it is easily seen that the sum of two independent Poisson processes is again a Poisson process whose intensity is the sum of the intensities (see [32, Proposition 2.13]). Moreover, a Poisson process counts the number of jump times in $[0, t]$:

$$U_t = \#\{n \ge 1,\ \check v_n \in [0, t]\}, \qquad U_t - U_s = \#\{n \ge 1,\ \check v_n \in (s, t]\}.$$

So for any measurable set $D \subset [0, T]$, a random measure $G$ is defined by setting

$$G(\omega, D) = \#\{n \ge 1,\ \check v_n(\omega) \in D\}.$$

For fixed $\omega$, the measure $G(\omega, \cdot)$ is positive and integer valued, and $G(\omega, D)$ is finite for any bounded set $D$. The process is therefore of the following form:

$$U_t(\omega) = G(\omega, [0, t]) =: \int_{[0,t]} G(\omega, ds).$$

Moreover, the average value of the random measure at $D$ is then $E(G(\cdot, D)) = \gamma|D|$, where $|D|$ denotes the Lebesgue measure of $D$; $\gamma|D|$ will be equal to the Lévy measure of $D$. Finally, notice that a Poisson process $U$ is not a martingale. However, we obtain a martingale by compensating with the Lévy measure on $[0, t]$:

$$\tilde U_t = U_t - \langle U\rangle_t = U_t - \gamma t,$$

where $\langle U\rangle$ is the predictable quadratic variation of $U$, see e.g. [67]. The martingale $\tilde U$ is called the compensated Poisson process. A new random measure can be defined by compensating with $\gamma|D|$, i.e. $\tilde G(\omega, D) = G(\omega, D) - \gamma|D|$. Note that $\tilde G$ is signed.
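A quick numerical illustration of the compensation: the sample mean of $U_t - \gamma t$ should vanish for every $t$, which is a necessary condition for the martingale property. The intensity and the time points below are arbitrary choices for this sketch.

```python
import random

random.seed(2)

gamma, n_samples = 1.5, 50_000   # assumed intensity and sample size

def poisson_value(t):
    """One draw of U_t via Exp(gamma) inter-arrival times."""
    n, s = 0, random.expovariate(gamma)
    while s <= t:
        n += 1
        s += random.expovariate(gamma)
    return n

# Empirical mean of the compensated value U_t - gamma*t at several t:
results = {}
for t in (0.5, 1.0, 2.0):
    comp = [poisson_value(t) - gamma * t for _ in range(n_samples)]
    results[t] = sum(comp) / n_samples
print(results)   # each entry should be close to 0
```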

Note that the defining conditions of a Lévy process also do not exclude continuous sample paths. Every Brownian motion has a continuous modification and independent and stationary increments, and is therefore a Lévy process. In fact, we will later see that we can split a Lévy process into two parts, a Brownian part and a jump part - the so-called Lévy-Itô decomposition. This implies that every Gaussian Lévy process is continuous and can be represented as a Brownian motion plus a drift term. Before we explain this decomposition in detail, we would like to summarize some facts on the characteristic function of a Lévy process, Poisson random measures, and the so-called Lévy measure, and illustrate these findings by considering a compound Poisson process. This further leads to a first version of the Lévy-Itô decomposition. We start with a proposition on the characteristic function of a Lévy process, see e.g. Cont and Tankov [32, Proposition 3.2]:

Proposition 1.1.3. Let $(\check X_t)_{t\ge 0}$ be a Lévy process on $\mathbb{R}^N$. Then there exists a continuous function $\phi : \mathbb{R}^N \to \mathbb{C}$, called the characteristic exponent of $\check X$, such that

$$E(e^{iu\check X_t}) = e^{t\phi(u)}, \qquad u \in \mathbb{R}^N.$$

The special form of $\phi$ has already been mentioned in the beginning; it is given by the Lévy-Khinchin representation. It will be introduced below, after explaining the Lévy-Itô decomposition. The latter decomposition leads to a natural characterization of the distribution of a Lévy process, the characteristic triplet. Given this triplet, the Lévy-Khinchin representation is uniquely determined.

We proceed by defining a compound Poisson process. Roughly speaking, a Poisson process schedules the jumps of a compound Poisson process, and an additional experiment in case of a jump determines the jump size. This additional experiment is distributed according to a jump size distribution:

Definition 1.1.4. A compound Poisson process with intensity $\gamma > 0$ and jump size distribution $f$ is a stochastic process $J_t$ defined as

$$J_t = \sum_{i=1}^{U_t} X_i,$$

where the jump sizes $X_i$ are i.i.d. with distribution $f$ and $U_t$ is a Poisson process with intensity $\gamma$, independent of $(X_i)_i$.
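Definition 1.1.4 translates directly into a simulation recipe: draw exponential inter-arrival times and add an independent jump size at each arrival before $t$. The sketch below assumes normal jump sizes with hypothetical parameters; the mean $E(J_t) = \gamma t\,E(X_1)$ (Wald's identity) serves as a check.

```python
import random

random.seed(3)

gamma, t = 1.0, 2.0        # assumed intensity and horizon
mu_j, sd_j = 0.5, 1.0      # hypothetical jump size law N(0.5, 1)

def compound_poisson(t):
    """One draw of J_t = sum_{i=1}^{U_t} X_i (Definition 1.1.4)."""
    j, s = 0.0, random.expovariate(gamma)
    while s <= t:                       # every arrival before t adds a jump
        j += random.gauss(mu_j, sd_j)   # independent jump size draw
        s += random.expovariate(gamma)
    return j

n_samples = 100_000
mean = sum(compound_poisson(t) for _ in range(n_samples)) / n_samples
print(mean)   # Wald's identity: E(J_t) = gamma * t * mu_j
```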

In principle, by integrating out the jump size distribution, in analogy to a Poisson process, the characteristic function of a compound Poisson process can be obtained, see also Cont and Tankov [32, Proposition 3.4]:

Proposition 1.1.5. Let $J$ be a compound Poisson process on $\mathbb{R}^N$ with intensity $\gamma$ and jump size distribution $f$. Its characteristic function then has the following representation:

$$E(e^{iuJ_t}) = \exp\Big\{\gamma t \int_{\mathbb{R}^N} (e^{iux} - 1)\,f(dx)\Big\}. \qquad (1.3)$$

$\nu(dx) = \gamma f(dx)$ is the so-called Lévy measure of the compound Poisson process. To explain the Lévy measure in general, we need to define a Poisson random measure and present an additional lemma. Before we proceed, note that random measures are defined in Jacod and Shiryaev [68, Section II.1] in a rather general setting. We focus on Poisson random measures, as those are sufficient to describe Lévy processes. Apart from being integer-valued, these random measures follow a Poisson distribution for every fixed measurable set.

We have already become acquainted with a Poisson random measure in Example 1.1.2. Poisson random measures generalize this notion. The Lebesgue measure is replaced by a Radon measure, i.e. a measure $\mu$ on $(I, \mathcal B(\mathbb{R}^N))$, $I \subset \mathbb{R}^N$, such that for every bounded, closed, and measurable set $D \in \mathcal B(\mathbb{R}^N)$ we have $\mu(D) < \infty$. On the basis of Radon measures, one defines:

Definition 1.1.6. Let $(\Omega, \mathcal F, P)$ be a probability space, $I \subset \mathbb{R}^N$, and $\mu$ a given (positive) Radon measure on $(I, \mathcal B(\mathbb{R}^N))$. A Poisson random measure on $I$ with intensity measure $\mu$ is an integer-valued random measure

$$G : \Omega \times \mathcal B(\mathbb{R}^N) \to \mathbb{N}, \qquad (\omega, D) \mapsto G(\omega, D),$$

such that the following three assertions hold:

1. For almost all $\omega \in \Omega$, $G(\omega, \cdot)$ is an integer-valued Radon measure on $I$, i.e. for any bounded measurable set $D \subset I$, $G(\cdot, D) < \infty$ is an integer-valued random variable.

2. For each measurable set $D \subset I$, $G(\cdot, D) =: G(D)$ is a Poisson random variable with parameter $\mu(D)$, i.e. for all $k \in \mathbb{N}$:

$$P(G(D) = k) = e^{-\mu(D)}\frac{(\mu(D))^k}{k!}.$$

3. For disjoint measurable sets $D_1, \dots, D_n \in \mathcal B(\mathbb{R}^N)$, the variables $G(D_1), \dots, G(D_n)$ are independent.

The compensated Poisson random measure is then defined by $\tilde G(D) = G(D) - \mu(D)$.
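For a finite intensity measure one can sample a Poisson random measure directly: draw the total number of atoms from a Poisson distribution and scatter them i.i.d. according to the normalized intensity. The sketch below takes $\mu = \lambda \cdot$ Lebesgue on $[0,1]^2$ with an assumed $\lambda$ and checks properties 2 and 3 of Definition 1.1.6 on two disjoint halves of the square.

```python
import random

random.seed(4)

# Assumed intensity: mu = lam * Lebesgue measure on I = [0,1]^2.
lam = 4.0

def poisson_draw(mean):
    """Poisson sample via exponential inter-arrival times."""
    n, s = 0, random.expovariate(1.0)
    while s <= mean:
        n += 1
        s += random.expovariate(1.0)
    return n

def counts_in_halves():
    """G(D1), G(D2) for D1 = [0,.5]x[0,1] and D2 = (.5,1]x[0,1]."""
    atoms = [(random.random(), random.random())
             for _ in range(poisson_draw(lam))]
    left = sum(1 for (x, _) in atoms if x <= 0.5)
    return left, len(atoms) - left

n_samples = 50_000
pairs = [counts_in_halves() for _ in range(n_samples)]
mean_left = sum(l for l, _ in pairs) / n_samples    # ~ mu(D1) = lam/2
mean_right = sum(r for _, r in pairs) / n_samples
# Property 3: counts on disjoint sets are uncorrelated (independent).
cov = sum(l * r for l, r in pairs) / n_samples - mean_left * mean_right
print(mean_left, cov)
```

The empirical mean on each half matches $\mu(D) = \lambda/2$, and the sample covariance of the two disjoint counts is near zero, in line with properties 2 and 3.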

We have $E(\tilde G(D)) = 0$ and $V(\tilde G(D)) = \mu(D)$, and we get the following connection to the Lévy measure $\nu$ of a compound Poisson process, see [32, Proposition 3.5]:

Proposition 1.1.7. Let $J$ be a compound Poisson process with intensity $\gamma$ and jump size distribution $f$. Then its jump measure $N_J$ on $\mathbb{R}^N \times [0, T]$ defined by

$$N_J(D) = \#\{(J_t - J_{t-}, t) \in D\}, \qquad D \subset \mathbb{R}^N \times [0, T],$$

is a Poisson random measure with intensity measure $\mu(dx \times dt) = \gamma f(dx)\,dt := \nu(dx)\,dt$.

Finally, we state the general definition of a L´evy measure:

Definition 1.1.8. Let $\check X$ be a Lévy process on $\mathbb{R}^N$. The measure $\nu$ on $\mathbb{R}^N$ defined by

$$\nu(B) = E(\#\{t \in [0, 1] : \Delta\check X_t \ne 0,\ \Delta\check X_t \in B\}), \qquad B \in \mathcal B(\mathbb{R}^N),$$

is the so-called Lévy measure of $\check X$: $\nu(B)$ is the expected number, per unit time, of jumps whose size belongs to $B$.
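Definition 1.1.8 can be checked empirically for a compound Poisson process, whose Lévy measure is $\gamma f$: counting the jumps on $[0,1]$ with size in a set $B$ and averaging over many paths should recover $\gamma f(B)$. The intensity and the choice $B = (1,\infty)$ with standard normal jump sizes are assumptions of this sketch.

```python
import math
import random

random.seed(5)

gamma = 3.0   # assumed intensity; jump sizes ~ N(0,1), B = (1, infinity)
f_B = 0.5 * (1.0 - math.erf(1.0 / math.sqrt(2.0)))   # P(N(0,1) > 1)

def jumps_in_B_per_unit_time():
    """Number of jumps with size in B of one path on [0, 1]."""
    count, s = 0, random.expovariate(gamma)
    while s <= 1.0:
        if random.gauss(0.0, 1.0) > 1.0:   # jump size lands in B
            count += 1
        s += random.expovariate(gamma)
    return count

n_samples = 100_000
est = sum(jumps_in_B_per_unit_time() for _ in range(n_samples)) / n_samples
print(est, gamma * f_B)   # Definition 1.1.8: nu(B) = gamma * f(B)
```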

The question arises whether for any Radon measure $\mu$ there exists a Poisson random measure with intensity $\mu$. The answer is yes, see e.g. [32, Proposition 2.14]. So does there exist, for every Lévy measure, a corresponding Poisson random measure such that the process can be represented by (an integral with respect to) this measure? The answer is almost yes and is given by the Lévy-Itô decomposition. The aim is to represent the process by an integral with respect to its Poisson random measure or its compensated version, respectively. We start by defining these integrals and continue with the simple case when $\nu$ is a finite measure. In this case the answer to the last question is definitely yes.

At first, set $I = \mathbb{R}^N\setminus\{0\} \times [0, T]$ and notice that every Poisson random measure on $I$ can be associated to a random sequence $(\bar J_n)_n = ((\bar X_n, \bar v_n))_n$ in $I$. Roughly speaking, a Poisson random measure counts the number of jumps with a certain size in the evaluated set, where $\bar v_n$ indicates if there is a jump (jump time) and $\bar X_n$ determines the corresponding size. In detail, for every Poisson random measure $G$ on $I$ there exists a sequence of pairs $(\bar X_n(\omega), \bar v_n(\omega)) \in I$ such that

$$G(\omega, D) = \sum_{n \ge 1} \delta_{(\bar X_n(\omega),\, \bar v_n(\omega))}(D), \qquad D \subset I,$$

where $\delta$ denotes the Dirac measure, i.e. $\delta_{(x,y)}(D) = 1$ if $(x, y) \in D$ and $0$ otherwise. The assertion is contained in the proof of [32, Proposition 2.14]. In fact, any right-continuous process with left limits (RCLL) $\check X$ in $\mathbb{R}^N$ induces an integer-valued random measure $N_{\check X}$ on $\mathbb{R}^N \times [0, T]$:

$$N_{\check X}(\omega, dx \times dt) = \sum_s \mathbb{1}_{\Delta\check X_s \ne 0}\,\delta_{(\Delta\check X_s(\omega),\, s)}(dx, dt). \qquad (1.4)$$

See Jacod and Shiryaev [68, Proposition 1.14/1.16] for further details on general jump measures. We rather proceed with our integration theory in a nutshell:

A Poisson random measure (induced by the sequence $(\bar X_n(\omega), \bar v_n(\omega))$) is called nonanticipating if the $(\bar v_n)$ are stopping times and each $\bar X_n$ is $\mathcal F_{\bar v_n}$-measurable. As $G(\omega, \cdot)$ is a measure for every $\omega$, we can define integrals as usual. For a simple function $h(s, x) = \sum_i c_i \mathbb{1}_{D_i}(x, s)$, where $(D_i)_i$ are disjoint subsets of $I$ and $(c_i)_i$ non-negative constants, we define the new integral as

$$G(\omega, h) = \sum_i c_i\,G(\omega, D_i).$$

Its expectation is $\mu(h)$. So for every positive and measurable function $h$, which can be approximated by an increasing sequence of simple functions $(h_n)_n$, the integral of $h$ is defined as the limit of the integrals of $h_n$, $G(h) = \lim_{n\to\infty} G(h_n)$. As usual, for an arbitrary measurable function $h : [0, T] \times \mathbb{R}^N \to \mathbb{R}$ satisfying

$$\mu(|h|) = \int_{[0,T]} \int_{\mathbb{R}^N\setminus\{0\}} |h(s, x)|\,\mu(dx \times ds) < \infty, \qquad (1.5)$$

we decompose $h$ into a negative part $h^-$ and a positive part $h^+$ and define the integral as $G(h) = G(h^+) - G(h^-)$. This is possible as $G(h^+)$ and $G(h^-)$ have finite expectation, implying that both are almost surely finite. Moreover,

$$\int_{[0,t]} \int_{\mathbb{R}^N\setminus\{0\}} h(s, x)\,G(dx \times ds)$$

is an adapted process. Furthermore, the integral for the compensated Poisson random measure is implied, see e.g. Cont and Tankov [32, Proposition 2.16]:

Proposition 1.1.9. Let $G$ be a nonanticipating Poisson random measure on $\mathbb{R}^N\setminus\{0\} \times [0, T]$ with intensity $\mu$, implying a compensated measure $\tilde G = G - \mu$,

Remember, we set $\mu(dx \times ds) = \nu(dx)\,ds$. So if the $\mu$-integral exists, $\int_{[0,t]}\int_{\mathbb{R}^N\setminus\{0\}} h(s, x)\,\mu(dx \times ds)$ is continuous. Hence, the above decomposition describes the unique Doob-Meyer decomposition of $\tilde X$ into a local martingale and a predictable process of bounded variation. See [67; 68] for the general jump case. Clearly, if the measure $\mu$ is not finite and $h(s, x) \equiv s$, then the integral cannot exist. But also if $h(s, x) = xs$, we might not get (1.5). If the process satisfies $\int_{\|x\|\le 1} \|x\|\,\nu(dx) < \infty$, then there is some hope: the jump part of the process has finite variation and the constructed integral is defined. This is definitely the case when the Lévy measure $\nu$ is finite, i.e. when the process is of so-called finite activity.
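For a compound Poisson path the integral against its jump measure is literally a finite sum of $h$ evaluated at the atoms (jump time, jump size), and its expectation is $\mu(h)$. The following sketch illustrates this with assumed parameters (intensity $\gamma = 2$, standard normal jumps) and the example integrand $h(s,x) = x^2$.

```python
import random

random.seed(6)

gamma, T = 2.0, 1.0    # assumed compound Poisson parameters, jumps ~ N(0,1)

def integral_against_G(h):
    """Integrate h against the jump measure of one compound Poisson
    path on [0, T]: simply sum h over the atoms (jump time, size)."""
    total, s = 0.0, random.expovariate(gamma)
    while s <= T:
        total += h(s, random.gauss(0.0, 1.0))   # h evaluated at an atom
        s += random.expovariate(gamma)
    return total

h = lambda s, x: x * x     # example integrand; mu(|h|) < infinity here
n_samples = 100_000
est = sum(integral_against_G(h) for _ in range(n_samples)) / n_samples
print(est)   # E(G(h)) = mu(h) = T * gamma * E(X^2) for standard normal X
```

Here $\mu(h) = T\gamma\,E(X^2) = 2$, and the Monte Carlo average recovers it.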

Proposition 1.1.10. $(J_t)_t$ is a compound Poisson process if and only if it is a Lévy process and its sample paths are piecewise constant.

So adding a Brownian motion times a constant volatility to a compound Poisson process already yields a huge class of jump processes, the so-called jump-diffusions. A first version of the Lévy-Itô decomposition is derived easily: Proposition 1.1.7 implies that every compound Poisson process $J$ with jump measure $N_J$ can be represented as

$$J_t = \sum_{s\in[0,t]} \Delta J_s = \int_{[0,t]} \int_{\mathbb{R}^N} x\,N_J(dx \times ds).$$

The Poisson random measure $N_J$ has intensity measure $\nu(dx)\,dt$, where $\nu$ is a finite measure with

$$\nu(D) = E(\#\{t \in [0, 1] : \Delta J_t \ne 0,\ \Delta J_t \in D\}).$$

Adding a Brownian motion with a drift term, $\sigma W_t + bt$, independent of $J_t$ gives another Lévy process: $\check X_t = J_t + bt + \sigma W_t$. We thus obtain a first version of the Lévy-Itô decomposition in the finite-activity case.

However, a problem appears when the Lévy measure is not finite: the sum $\sum_{s\in[0,t]} \Delta J_s$ may become infinite. Thus an extension of the integral with respect to the corresponding Poisson random measure has to look a bit different.

The integrals have to be defined with respect to the compensated measure to obtain the Lévy-Itô decomposition. The decomposition was originally invented by Lévy via a pathwise analysis and completed by Itô. The version we cite here can be found in Cont and Tankov [32] and Sato [115]:

Theorem 1.1.11. Let $(\check X_t)_{t\in[0,T]}$ be a Lévy process on $\mathbb{R}^N$ with Lévy measure $\nu$. Then its jump measure $N_{\check X}$ is a Poisson random measure on $\mathbb{R}^N \times [0, T]$ with intensity measure $\nu(dx)\,dt$. Moreover, there exist an $N$-dimensional vector $b$, an $N$-dimensional Brownian motion $(W_t)_t$, and a symmetric non-negative definite matrix $\Sigma$ with Cholesky decomposition $\sigma\sigma'$ such that

$$\check X_t = bt + \sigma W_t + \check X^{(1)}_t + \lim_{\epsilon\downarrow 0} \tilde{\check X}^{(\epsilon)}_t, \qquad (1.6)$$

where $\check X^{(1)}_t$ collects the jumps of size larger than $1$ and $\tilde{\check X}^{(\epsilon)}_t$ the compensated jumps of size in $[\epsilon, 1]$, see Remark 1.1.13. Convergence in (1.6) is almost sure and uniform in $t$ on $[0, T]$. Moreover, the terms in (1.6) are independent.

The jump measure $N_{\check X}$ is a Poisson random measure, so $N_{\check X}(D)$ follows a Poisson distribution. Its compensated version $\tilde N_{\check X}$ is obtained by subtracting the intensity measure $\mu := \nu \otimes \lambda$ evaluated at $D$. Hence, the above triplet $(\Sigma, \nu, b)$ uniquely determines the distribution of the above Lévy process.

Definition 1.1.12. The triplet (Σ, ν, b) obtained in (1.6) is called the characteristic triplet or L´evy triplet.

Before we proceed, some remarks on the Lévy-Itô decomposition are in order:

Remark 1.1.13. (i) The threshold 1 is arbitrarily chosen and can be changed as long as it is strictly positive and finite.

(ii) $\int_{\|x\|>1} \nu(dx) < \infty$ implies that every Lévy process $\check X$ jumps only finitely often with a jump size larger than 1. Hence $\check X^{(1)}_t = \sum_{|\Delta\check X_s|>1,\ s\in[0,t]} \Delta\check X_s$ is finite and a compound Poisson process.

(iii) $\check X^{(\epsilon)}_t = \sum_{\epsilon\le|\Delta\check X_s|\le 1,\ s\in[0,t]} \Delta\check X_s$ is again a compound Poisson process. However, around zero there might be an infinite number of jumps, so the limit of $\check X^{(\epsilon)}_t$ as $\epsilon \downarrow 0$ might not exist; but the limit of the compensated version $\tilde{\check X}^{(\epsilon)}_t$ does. This follows from a "central-limit-type" argument using that, for an arbitrary sequence $(\epsilon_n)_n$ with $\epsilon_n \downarrow 0$, $Y_n = \tilde{\check X}^{(\epsilon_n)}_t - \tilde{\check X}^{(\epsilon_{n+1})}_t$ has mean zero and $\sum_n V(Y_n)$ is finite, as $\int_{\|x\|\le 1} \|x\|^2\,\nu(dx) < \infty$; see [32, Proof of Proposition 3.7] to get an idea. $\lim_{\epsilon\downarrow 0}\tilde{\check X}^{(\epsilon)}_t$ can be interpreted as an infinite sum of independent compensated Poisson processes. Thus every Lévy process can be approximated arbitrarily well by a jump-diffusion.

(iv) In the sequel, we write as usual:

$$\lim_{\epsilon\downarrow 0} \tilde{\check X}^{(\epsilon)}_t = \int_{\|x\|\le 1,\ s\in[0,t]} x\,\tilde N_{\check X}(dx \times ds).$$

(v) A Lévy process is of finite variation if and only if $\Sigma = 0$ and $\int_{\|x\|\le 1} \|x\|\,\nu(dx) < \infty$.
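The truncate-and-compensate scheme behind (1.6) can be illustrated on an assumed infinite-activity toy measure $\nu(dx) = x^{-2}\,dx$ on $(0,1]$: jumps of size in $[\epsilon, 1]$ form a compound Poisson process with intensity $\nu([\epsilon,1])$, and subtracting the compensator $t\int_\epsilon^1 x\,\nu(dx)$ yields a mean-zero variable whose variance $t\int_\epsilon^1 x^2\,\nu(dx)$ stays bounded as $\epsilon \downarrow 0$. All parameters below are choices for this sketch.

```python
import math
import random

random.seed(7)

# Toy measure nu(dx) = x^{-2} dx on (0, 1]: nu((0,1]) = infinity,
# but int x^2 nu(dx) < infinity, as required of a Levy measure.
t, eps = 1.0, 0.01
gamma_eps = 1.0 / eps - 1.0          # nu([eps, 1]), truncated intensity
compensator = t * math.log(1.0 / eps)  # t * int_eps^1 x nu(dx)

def jump_size():
    """Inverse-CDF draw from nu restricted to [eps, 1], normalized."""
    u = random.random()
    return 1.0 / (1.0 / eps - gamma_eps * u)

def compensated_small_jumps():
    total, s = 0.0, random.expovariate(gamma_eps)
    while s <= t:
        total += jump_size()
        s += random.expovariate(gamma_eps)
    return total - compensator       # compensated small-jump sum

n_samples = 10_000
vals = [compensated_small_jumps() for _ in range(n_samples)]
mean = sum(vals) / n_samples
var = sum(v * v for v in vals) / n_samples
print(mean, var)   # mean ~ 0, var ~ t * int_eps^1 x^2 nu(dx) = t*(1 - eps)
```

Without the compensator the sample mean would be $t\log(1/\epsilon)$, which explodes as $\epsilon \downarrow 0$; compensation keeps the limit finite, exactly as in item (iii).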

If the characteristic triplet of a L´evy process is known, we immediately get a representation for the characteristic function, see [32, Theorem 3.1]:

Theorem 1.1.14 (Lévy-Khinchin representation). Let $(\check X_t)_t$ be a Lévy process on $\mathbb{R}^N$ with characteristic triplet $(\Sigma, \nu, b)$. Then

$$E(e^{iu\check X_t}) = e^{t\phi(u)}, \qquad u \in \mathbb{R}^N,$$

with

$$\phi(u) = -\tfrac{1}{2}u'\Sigma u + ib'u + \int_{\mathbb{R}^N\setminus\{0\}} (e^{iux} - 1 - iux\,\mathbb{1}_{\|x\|\le 1})\,\nu(dx).$$
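The role of the truncation term $iux\,\mathbb{1}_{\|x\|\le 1}$ is pure bookkeeping: it is absorbed by shifting the drift, as in the heuristic derivation of $\tilde b$ above. The sketch below verifies this numerically in one dimension for an assumed finite Lévy measure with Gaussian density, comparing $\phi(u)$ computed with and without the compensating term.

```python
import cmath
import math

# Assumed example: nu(dx) = gamma * N(0, delta^2)-density dx in one
# dimension (Sigma = sigma^2). Replacing e^{iux} - 1 by
# e^{iux} - 1 - iux 1_{|x|<=1} while shifting the drift
# b -> b + int_{|x|<=1} x nu(dx) must leave phi(u) unchanged.
gamma, sigma, b, delta, u = 2.0, 0.3, 0.1, 0.5, 1.7

def nu_density(x):
    return gamma * math.exp(-x * x / (2 * delta ** 2)) \
        / (delta * math.sqrt(2 * math.pi))

n, lo, hi = 4000, -8.0, 8.0
dx = (hi - lo) / n
xs = [lo + k * dx for k in range(n + 1)]

def trap(f):
    """Trapezoidal quadrature of f on the fixed grid xs."""
    vals = [f(x) for x in xs]
    return dx * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

phi_plain = (1j * u * b - 0.5 * sigma ** 2 * u ** 2
             + trap(lambda x: (cmath.exp(1j * u * x) - 1) * nu_density(x)))
b_tilde = b + trap(lambda x: x * nu_density(x) if abs(x) <= 1 else 0.0)
phi_trunc = (1j * u * b_tilde - 0.5 * sigma ** 2 * u ** 2
             + trap(lambda x: (cmath.exp(1j * u * x) - 1
                               - (1j * u * x if abs(x) <= 1 else 0.0))
                    * nu_density(x)))
print(abs(phi_plain - phi_trunc))   # agree up to floating-point error
```

The two exponents coincide because the compensating term contributes $-iu\int_{|x|\le 1} x\,\nu(dx)$, which the drift shift adds back.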

Up to now, we only know compound Poisson processes, Brownian motions, and combinations thereof. The Lévy-Khinchin characterization by means of the Fourier transform yields a general way to determine Lévy processes.

To construct further L´evy processes, we can just fix the triplet and get the
