On sequential parameter estimation for some linear stochastic differential equations with time delay

Uwe Küchler

Humboldt University Berlin, Institute of Mathematics, Unter den Linden 6, D-10099 Berlin, Germany

Vjatscheslav A. Vasil'iev

Tomsk State University, Dept. of Applied Mathematics and Cybernetics, Lenina 36, 634050 Tomsk, Russia

November 3, 1998

Abstract

We consider the parameter estimation problem for the scalar diffusion-type process described by the stochastic differential equation with time delay

$$
dX(t) = \sum_{i=0}^{m} \vartheta_i X(t - r_i)\,dt + dW(t).
$$

The asymptotic behaviour of the classical maximum likelihood estimator (MLE) depends heavily on the true value of the parameter $\vartheta = (\vartheta_0, \vartheta_1, \ldots, \vartheta_m)'$. Here we construct a sequential MLE with preassigned least squares accuracy for the so-called stationary and periodic cases of the solution $X(\cdot)$. The limit behaviour of the duration of the procedure with given accuracy is obtained.

Keywords: stochastic differential equations, time delay, maximum likelihood estimator, sequential analysis, least squares accuracy.

This work was supported by the Deutsche Forschungsgemeinschaft, Sonderforschungsbereich 373 "Quantifikation und Simulation ökonomischer Prozesse", Berlin, Germany.


1 Introduction

Assume $(W(t), \mathcal{F}_t, t \ge 0)$ is a real-valued Wiener process on a filtered probability space $(\Omega, \mathcal{F}, (\mathcal{F}_t, t \ge 0), P)$ and $(X(t), t \ge -r)$ satisfies the following differential equation with time delay


$$
dX(t) = \sum_{i=0}^{m} \vartheta_i X(t - r_i)\,dt + dW(t), \quad t \ge 0, \qquad X(s) = X_0(s), \quad s \in [-r, 0]. \tag{1}
$$

The parameters $r_i$, $\vartheta_i$, $i = 0, \ldots, m$, are real numbers with $0 = r_0 < r_1 < \ldots < r_m =: r$ if $m \ge 1$, and $r_0 = r = 0$ if $m = 0$. The initial process $(X_0(s), s \in [-r, 0])$ is supposed to be cadlag, and all $X_0(s)$, $s \in [-r, 0]$, are assumed to be $\mathcal{F}_0$-measurable. Moreover, assume that

$$
E \int_{-r}^{0} X_0^2(s)\,ds < \infty.
$$
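For readers who want to experiment with the model numerically, a path of (1) can be approximated by an Euler-Maruyama scheme on an equidistant grid, with the delayed values taken from the already generated part of the trajectory. The following Python sketch is only an illustration of equation (1); the function name, step size, constant initial segment and parameter values are our own choices and are not prescribed by the paper.

```python
import numpy as np

def simulate_delay_sde(theta, delays, x0_func, T, dt, seed=0):
    """Euler-Maruyama approximation of dX(t) = sum_i theta_i X(t - r_i) dt + dW(t).

    theta   : coefficients (theta_0, ..., theta_m)
    delays  : delays (r_0, ..., r_m) with r_0 = 0
    x0_func : initial function X_0(s), s in [-r, 0]
    Returns the time grid on [-r, T] and the simulated values of X on it."""
    rng = np.random.default_rng(seed)
    r = max(delays)
    n_hist = int(round(r / dt))                 # grid points in [-r, 0)
    n = int(round(T / dt))                      # grid points in (0, T]
    t = np.arange(-n_hist, n + 1) * dt
    x = np.empty_like(t)
    x[: n_hist + 1] = [x0_func(s) for s in t[: n_hist + 1]]   # initial segment on [-r, 0]
    lag = [int(round(ri / dt)) for ri in delays]              # delays measured in grid steps
    for k in range(n_hist, n_hist + n):
        drift = sum(th * x[k - L] for th, L in zip(theta, lag))
        x[k + 1] = x[k] + drift * dt + np.sqrt(dt) * rng.standard_normal()
    return t, x

# Illustrative call (hypothetical parameter values): m = 1, r_1 = 1
t, x = simulate_delay_sde(theta=[-0.2, -0.5], delays=[0.0, 1.0],
                          x0_func=lambda s: 1.0, T=50.0, dt=0.01)
```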

Equation (1) is a special case of the so-called affine stochastic differential equations studied in detail, e.g., in [Mo/Sch] and [Mo]. In particular, it holds that (1) has a uniquely determined solution $(X(t), t \ge -r)$ having the representation

$$
X(t) = \sum_{j=0}^{m} \vartheta_j \int_{-r_j}^{0} x_0(t - s - r_j)\, X_0(s)\,ds + x_0(t)\, X_0(0) + \int_0^t x_0(t - s)\,dW(s), \quad t > 0,
$$
$$
X(t) = X_0(t), \quad t \in [-r, 0], \tag{2}
$$

and satisfying $E \int_0^T X^2(s)\,ds < \infty$ for every $T$ with $0 < T < \infty$. Here the function $x_0(\cdot)$ denotes the fundamental solution of the linear deterministic equation corresponding to (1),

$$
x_0(t) = 1 + \sum_{j=0}^{m} \vartheta_j \int_0^t x_0(s - r_j)\,ds, \quad t \ge 0, \tag{3}
$$
$$
x_0(s) = 0, \quad s \in [-r, 0), \qquad x_0(0) = 1.
$$

(see [Ha/Ve] for details on (3)).
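Since $x_0$ enters both the representation (2) and the limit matrices used below, it is useful to be able to evaluate it numerically. Differentiating (3) gives the delay equation $\dot x_0(t) = \sum_{j=0}^m \vartheta_j x_0(t - r_j)$ with $x_0 \equiv 0$ on $[-r, 0)$ and $x_0(0) = 1$, which can be integrated by a simple forward Euler step. The following Python sketch does exactly this; the function name, grid size and parameter values are illustrative assumptions, not part of the paper.

```python
import numpy as np

def fundamental_solution(theta, delays, T, dt):
    """Forward-Euler approximation of the fundamental solution x_0 of (3):
    x_0'(t) = sum_j theta_j x_0(t - r_j), x_0(s) = 0 on [-r, 0), x_0(0) = 1.
    Returns the grid on [0, T] and the values of x_0 on it."""
    r = max(delays)
    n_hist = int(round(r / dt))
    n = int(round(T / dt))
    x = np.zeros(n_hist + n + 1)        # indices 0..n_hist-1 cover [-r, 0); index n_hist is t = 0
    x[n_hist] = 1.0
    lag = [int(round(ri / dt)) for ri in delays]
    for k in range(n_hist, n_hist + n):
        x[k + 1] = x[k] + dt * sum(th * x[k - L] for th, L in zip(theta, lag))
    return np.arange(n + 1) * dt, x[n_hist:]

# Illustrative call with the delays of the example in Section 3 (hypothetical parameters):
grid, x0_vals = fundamental_solution(theta=[-0.2, -0.5], delays=[0.0, 1.0], T=30.0, dt=0.001)
```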

Fix a subset $\Theta$ of $\mathbb{R}^{m+1}$ and assume that the vector $\vartheta = (\vartheta_0, \vartheta_1, \ldots, \vartheta_m)' \in \Theta$ is unknown and has to be estimated based on the observation of $(X(t))$. The delay times $r_i$ are supposed to be known.

The measures $P_\vartheta$, $\vartheta \in \mathbb{R}^{m+1}$, generated by the solutions of (1) form an exponential family in the sense of [Ku/So]. Thus, one possibility to estimate $\vartheta$ is to use the maximum likelihood method. The corresponding log-likelihood function is given by

$$
\ell_t(\vartheta) = \vartheta'\, \Phi(t) - \frac{1}{2}\, \vartheta'\, G(t)\, \vartheta, \quad \vartheta \in \Theta, \ t > 0, \tag{4}
$$


where
$$
\Phi(t) = \Big( \int_0^t X(s - r_i)\,dX(s),\ i = 0, \ldots, m \Big)' \quad \text{and} \quad
G(t) = \Big( \int_0^t X(s - r_i)\, X(s - r_j)\,ds,\ i, j = 0, \ldots, m \Big)
$$

denotes the Fisher information matrix (for details see [Gu/Ku] and [Ku/So]). Another method is provided by sequential estimation. Sequential estimation of one-dimensional parameters in exponential families of processes has been studied, e.g., in [Li/Sh] and [Nov]; see also [Ku/So] (1997), Chapter 10. The multidimensional parameter case cannot be treated in the same way. Indeed, the construction of the stopping time for the observation in these papers relies essentially on the one-dimensionality of the Fisher information. For processes arising from linear stochastic differential equations without time delay but with multidimensional parameters, sequential methods have been developed in [Ko/Pe] (1985), (1987), (1992).

Here we shall extend these results to equations of the type (1). We shall construct, for every $\varepsilon > 0$, a sequential procedure $\vartheta_\varepsilon$ to estimate $\vartheta$ with $\varepsilon$-accuracy in the mean square sense, i.e. with $E_\vartheta \|\vartheta_\varepsilon - \vartheta\|^2 \le \varepsilon$.

The method used below is a two-step construction of a random time, where the first step uses the trace of the Fisher information matrix and follows the lines of the one-dimensional case mentioned above.

A generalization of the sequential estimators constructed in the sequel to differential equations of the type (1), but based on noisy observations, will be presented in a subsequent paper.

2 Results

Consider the process $(X(t), t \ge -r)$ described by equation (1) above.

Throughout this paper we suppose that the following assumption holds.

Assumption (A): For every $\vartheta \in \Theta$ there exist a (deterministic scalar) positive increasing function $\varphi(\cdot)$ on $[0, \infty)$ with $\lim_{T \to \infty} \varphi(T) = \infty$ and a possibly random $(m+1) \times (m+1)$-matrix function $I_\infty(T)$, $T \in [0, \infty)$, which is continuous, periodic with some period $\Delta \ge 0$ ($\Delta = 0$ means $I_\infty(T) \equiv I_\infty(0)$) and positive definite for every $T$.

Moreover, it holds that
$$
\lim_{T \to \infty} \Big\| \frac{G(T)}{\varphi(T)} - I_\infty(T) \Big\| = 0 \quad \text{a.s.} \tag{5}
$$

Assumption (A) is satisfied only under further restrictions on $\Theta$. For example, if $m = 1$ then it holds exactly in the following two cases.

Consider the set $\Lambda$ of all complex roots $\lambda$ of the so-called characteristic equation
$$
\lambda - \vartheta_0 - \vartheta_1 e^{-\lambda r_1} = 0
$$

and put $v_0 = v_0(\vartheta) = \max\{\operatorname{Re}\lambda \mid \lambda \in \Lambda\}$. It can easily be shown that $v_0 < \infty$. Then (A) holds for
$$
\Theta = \{\vartheta \in \mathbb{R}^2 \mid v_0(\vartheta) < 0, \ \text{or} \ v_0(\vartheta) > 0 \ \text{and} \ v_0(\vartheta) \notin \Lambda\},
$$
see [Gu/Ku] (1998) for details. If $v_0 < 0$ then equation (1) admits a stationary solution and every solution tends to it in distribution; moreover, we then have $\Delta = 0$, and we call this the "stationary case". If $v_0 > 0$ and $v_0 \notin \Lambda$, the equality (5) is valid with some $\Delta > 0$; we denote this case as the "periodic" one.
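For $m = 1$ the sign of $v_0$ can be checked numerically. Rewriting the characteristic equation as $(\lambda - \vartheta_0) r_1\, e^{(\lambda - \vartheta_0) r_1} = \vartheta_1 r_1\, e^{-\vartheta_0 r_1}$ shows that its roots are $\lambda_k = \vartheta_0 + W_k(\vartheta_1 r_1 e^{-\vartheta_0 r_1})/r_1$, where $W_k$ are the branches of the Lambert W function; for real coefficients the principal branch $k = 0$ is commonly used to obtain the rightmost root. The Python sketch below relies on this fact (the helper name and the parameter values are our own illustrative choices):

```python
import numpy as np
from scipy.special import lambertw

def v0(theta0, theta1, r1=1.0):
    """Real part of the rightmost root of lambda - theta0 - theta1*exp(-lambda*r1) = 0,
    obtained from the principal branch of the Lambert W function."""
    lam = theta0 + lambertw(theta1 * r1 * np.exp(-theta0 * r1), k=0) / r1
    return lam.real

# Illustrative checks (hypothetical parameter values):
print(v0(-0.2, -0.5))   # negative: stationary case
print(v0(0.5, -2.0))    # positive; here the rightmost roots are a complex pair (periodic case)
```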

A similar picture appears in the classical multidimensional linear equation
$$
dX(t) = A X(t)\,dt + dW(t), \quad t \ge 0, \qquad X(0) = X_0,
$$
with the Fisher information matrix
$$
\Gamma(T) = \int_0^T X(t)\, X'(t)\,dt.
$$
Here $W(\cdot)$ is a $d$-dimensional standard Wiener process and $A$ a given $d \times d$ matrix. Let $\lambda_{\max}$ and $\lambda_{\min}$ be the eigenvalues of $A$ with maximal and minimal real part, respectively. It is well known that the limiting matrix $\lim_{T \to \infty} T^{-1} \Gamma(T)$ exists and is a positive definite deterministic matrix in the stable case ($\operatorname{Re}\lambda_{\max} < 0$), and that $\Gamma(T)$ increases exponentially in the unstable case ($\operatorname{Re}\lambda_{\min} > 0$). Note that for the stable case the sequential estimation problem for the matrix $A$ was considered in [Ko/Pe] (1985), for the scalar model in [Nov] and [Li/Sh], for the unstable case in [Ko/Pe] (1987), and in the mixed case ($\operatorname{Re}\lambda_{\max} > 0$, $\operatorname{Re}\lambda_{\min} < 0$, and $\lambda + \mu \neq 0$ for all eigenvalues $\lambda, \mu$ of $A$) in [Ko/Pe] (1992).

The sequential estimation problem for the matrix $A$ in the stable case, based on noisy observations, was studied in [Va/Ko] (1987) and [Va/Ko] (1990).

Let us return to the study of (1) and let Assumption (A) be true.

To estimate $\vartheta$ with preassigned accuracy $\varepsilon > 0$ we shall start with the maximum likelihood estimator of $\vartheta$ for a given length $T$ of observation, defined by the equality
$$
\hat\vartheta(T) = G^{-1}(T)\, \Phi(T), \quad T > 0. \tag{6}
$$

From (1) and (6) we find the deviation of the estimator $\hat\vartheta(T)$ from $\vartheta$:
$$
\hat\vartheta(T) - \vartheta = G^{-1}(T)\, \zeta(T) \tag{7}
$$
with
$$
\zeta(T) = \int_0^T Z(t)\,dW(t), \qquad Z(t) = (X(t), X(t - r_1), \ldots, X(t - r_m))'.
$$
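On discrete data the integrals entering $\Phi(T)$ and $G(T)$ can be approximated by sums, the Itô integrals $\int_0^T X(s - r_i)\,dX(s)$ being replaced by $\sum_k X(t_k - r_i)\,(X(t_{k+1}) - X(t_k))$. The following Python sketch computes the resulting approximation of the MLE (6) from a path sampled on an equidistant grid covering $[-r, T]$ (as produced, e.g., by the simulation sketch in Section 1); the function name and the discretization scheme are our own choices, not the paper's.

```python
import numpy as np

def mle_from_path(t, x, delays):
    """Approximate maximum likelihood estimator (6) from a discretized path.

    t, x   : equidistant time grid covering [-r, T] and the observed values X on it
    delays : delays (r_0, ..., r_m) with r_0 = 0
    Returns (theta_hat, G, Phi) with Riemann/Ito-sum approximations of G(T) and Phi(T)."""
    dt = t[1] - t[0]
    lag = [int(round(ri / dt)) for ri in delays]
    k0 = max(lag)                               # index of t = 0: all delayed values are available
    ks = np.arange(k0, len(x) - 1)              # grid points in [0, T)
    Z = np.column_stack([x[ks - L] for L in lag])   # rows Z(t_k)' = (X(t_k - r_0), ..., X(t_k - r_m))
    dX = x[ks + 1] - x[ks]                      # increments of X
    Phi = Z.T @ dX                              # ~ components of Phi(T)
    G = (Z.T @ Z) * dt                          # ~ Fisher information matrix G(T)
    theta_hat = np.linalg.solve(G, Phi)         # MLE (6)
    return theta_hat, G, Phi

# Illustrative use together with simulate_delay_sde from the sketch above (hypothetical values):
# t, x = simulate_delay_sde(theta=[-0.2, -0.5], delays=[0.0, 1.0], x0_func=lambda s: 1.0, T=200.0, dt=0.01)
# print(mle_from_path(t, x, delays=[0.0, 1.0])[0])
```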

Now we make a time substitution which enables us to control the second moments of the noise.

Fix an arbitrary increasing sequence $(c_n)_{n \ge 1}$ of reals tending to infinity. Let us define the sequence of $(\mathcal{F}_t)$-stopping times $(\tau_\varepsilon(n), n \ge 1)$ as follows:
$$
\tau_\varepsilon(n) = \inf\{T > 0 : \operatorname{tr} G(T) = \varepsilon^{-1} c_n\}. \tag{8}
$$


These moments are finite a.s. due to condition (5).

One can easily verify that for any $\varepsilon > 0$ the sequence $(\tau_\varepsilon(n))_{n \ge 1}$ satisfies the equalities
$$
E_\vartheta \|\zeta(\tau_\varepsilon(n))\|^2 = \varepsilon^{-1} c_n, \quad n \ge 1. \tag{9}
$$
(Throughout this paper $\|\cdot\|$ denotes the Euclidean norm.)

The equalities (9) suggest that the estimation of the parameter $\vartheta$ should be performed at the moments $\tau_\varepsilon(n)$:
$$
\vartheta_n(\varepsilon) = \hat\vartheta(\tau_\varepsilon(n)), \quad n \ge 1. \tag{10}
$$
According to (7), in order to obtain estimates with fixed least square deviation one now has to control the behaviour of the sequence of random matrices $(G^{-1}(\tau_\varepsilon(n)), n \ge 1)$. This can be achieved by conducting the observations up to the moment $\tau_\varepsilon(n)$ with a specially chosen number $n$. Let
$$
\sigma_\varepsilon = \inf\{N \ge 1 : S_N(\varepsilon) \ge \rho\}, \tag{11}
$$
where
$$
S_N(\varepsilon) = \sum_{n=1}^{N} \beta_n^2(\varepsilon), \qquad
\beta_n^2(\varepsilon) = (\varepsilon\, c_n^{-1})^2\, \|G^{-1}(\tau_\varepsilon(n))\|^{-2}, \qquad
\rho = \sum_{n \ge 1} 1/c_n.
$$
The sequential plan $(T(\varepsilon), \vartheta_\varepsilon)$ of estimation of the vector $\vartheta$ will be defined by
$$
T(\varepsilon) = \tau_\varepsilon(\sigma_\varepsilon), \qquad
\vartheta_\varepsilon = S_{\sigma_\varepsilon}^{-1}(\varepsilon) \sum_{n=1}^{\sigma_\varepsilon} \beta_n^2(\varepsilon)\, \vartheta_n(\varepsilon). \tag{12}
$$
Obviously, $\sigma_\varepsilon$ is an $(\mathcal{F}_{\tau_\varepsilon(n)})$-stopping time, and therefore, by construction, $T(\varepsilon)$ turns out to be an $(\mathcal{F}_t)$-stopping time.

In this way the sequential estimate $\vartheta_\varepsilon$ is a random weighted mean of the maximum likelihood estimates calculated at the stopping times $\tau_\varepsilon(n)$, $n \ge 1$.
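To make the construction concrete, the following Python sketch evaluates the plan (8)-(12) along one discretized trajectory: it accumulates $G$ and $\Phi$, records the first times at which $\operatorname{tr} G$ crosses the levels $\varepsilon^{-1} c_n$, computes the MLEs $\vartheta_n(\varepsilon)$ and weights $\beta_n^2(\varepsilon)$ there, and stops once $S_N(\varepsilon) \ge \rho$. It reuses the grid conventions of the earlier sketches; the choice $c_n = n^2$ (which keeps $\rho = \sum_n 1/c_n$ finite), the Frobenius norm for $\|G^{-1}\|$ and all numerical values are our own illustrative assumptions rather than prescriptions of the paper.

```python
import numpy as np

def sequential_plan(t, x, delays, eps, c=lambda n: float(n ** 2)):
    """Sequential plan (T(eps), theta_eps) of (8)-(12) evaluated along one discretized path.
    Grid conventions as in the simulation sketch above; c(n) is the threshold sequence
    (c(n) = n^2 keeps rho = sum_n 1/c(n) finite)."""
    dt = t[1] - t[0]
    lag = [int(round(ri / dt)) for ri in delays]
    k0 = max(lag)
    rho = sum(1.0 / c(n) for n in range(1, 10_000))     # truncated series for rho
    d = len(delays)
    G, Phi = np.zeros((d, d)), np.zeros(d)
    S, num, n = 0.0, np.zeros(d), 1
    for k in range(k0, len(x) - 1):
        Z = x[[k - L for L in lag]]                     # Z(t_k)
        G += np.outer(Z, Z) * dt
        Phi += Z * (x[k + 1] - x[k])
        while np.trace(G) >= c(n) / eps:                # stopping time tau_eps(n) reached
            theta_n = np.linalg.solve(G, Phi)           # MLE at tau_eps(n), cf. (10)
            beta2 = (eps / c(n)) ** 2 / np.linalg.norm(np.linalg.inv(G)) ** 2
            S += beta2
            num += beta2 * theta_n
            if S >= rho:                                # sigma_eps attained, cf. (11)
                return t[k + 1], num / S                # (T(eps), theta_eps), cf. (12)
            n += 1
    raise RuntimeError("trajectory too short for the prescribed accuracy eps")

# Illustrative use (hypothetical values), with simulate_delay_sde from the sketch in Section 1:
# t, x = simulate_delay_sde(theta=[-0.2, -0.5], delays=[0.0, 1.0], x0_func=lambda s: 1.0, T=5000.0, dt=0.01)
# T_eps, theta_eps = sequential_plan(t, x, delays=[0.0, 1.0], eps=0.05)
```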

The following theorem summarizes the main result.

Theorem 1. Assume that Assumption (A) holds. Then for any $\varepsilon > 0$ and any $\vartheta \in \Theta$ the sequential estimation plan (12) of $\vartheta$ possesses the properties:

1. $T(\varepsilon) < \infty$ $P_\vartheta$-a.s.;

2. $E_\vartheta \|\vartheta_\varepsilon - \vartheta\|^2 \le \varepsilon$;

and the following inequalities hold $P_\vartheta$-a.s.:

3. $0 < \liminf_{\varepsilon \to 0} \varepsilon\, \varphi(T(\varepsilon)) \le \limsup_{\varepsilon \to 0} \varepsilon\, \varphi(T(\varepsilon)) < \infty$.

Proof. 1. Let us verify the finiteness of $T(\varepsilon) = \tau_\varepsilon(\sigma_\varepsilon)$. Since the moments $\tau_\varepsilon(n)$ are finite for all $n \ge 1$, it suffices to establish the finiteness of the moment $\sigma_\varepsilon$. Making


use of the definition (8) of $\tau_\varepsilon(n)$ and condition (5), we have
$$
\lim_{n \to \infty} \Big| \frac{\varepsilon^{-1} c_n}{\varphi(\tau_\varepsilon(n))} - \operatorname{tr} I_\infty(\tau_\varepsilon(n)) \Big| = 0 \quad \text{a.s.} \tag{13}
$$

and, as follows from the definition of $\beta_n^2(\varepsilon)$,
$$
\lim_{n \to \infty} \big| \beta_n^2(\varepsilon) - \beta^2(\tau_\varepsilon(n)) \big| = 0 \quad \text{a.s.,} \tag{14}
$$
where
$$
\beta^2(u) = \big[ \operatorname{tr} I_\infty(u)\, \| I_\infty^{-1}(u) \| \big]^{-2}. \tag{15}
$$
Note that by the conditions on the matrix function $I_\infty(u)$ we have
$$
\inf_{u \in \mathbb{R}^1} \beta^2(u) > 0.
$$
Then $\sum_{n=1}^{\infty} \beta_n^2(\varepsilon) = \infty$ a.s., and for all $\varepsilon > 0$ the moments $\sigma_\varepsilon$ and $T(\varepsilon)$ are finite a.s.

2. Now we estimate the mean square deviation of $\vartheta_\varepsilon$. From (7), (9), (12) and by the definitions of $\sigma_\varepsilon$, $\beta_n^2(\varepsilon)$ and $\rho$ it follows that
$$
\begin{aligned}
E_\vartheta \|\vartheta_\varepsilon - \vartheta\|^2
&= E_\vartheta\, S_{\sigma_\varepsilon}^{-2}(\varepsilon) \Big\| \sum_{n=1}^{\sigma_\varepsilon} \beta_n^2(\varepsilon)\,\big(\vartheta_n(\varepsilon) - \vartheta\big) \Big\|^2
\le E_\vartheta\, S_{\sigma_\varepsilon}^{-1}(\varepsilon) \sum_{n \ge 1} \beta_n^2(\varepsilon)\, \|\vartheta_n(\varepsilon) - \vartheta\|^2 \\
&\le \rho^{-1} \sum_{n \ge 1} E_\vartheta\, \beta_n^2(\varepsilon)\, \|G^{-1}(\tau_\varepsilon(n))\|^2\, \|\zeta(\tau_\varepsilon(n))\|^2
= \varepsilon^2 \rho^{-1} \sum_{n \ge 1} \frac{1}{c_n^2}\, E_\vartheta \|\zeta(\tau_\varepsilon(n))\|^2
= \varepsilon\, \rho^{-1} \sum_{n \ge 1} \frac{1}{c_n} = \varepsilon.
\end{aligned}
$$

For the first inequality we used the Cauchy-Bunjakovsky inequality.

3. In order to establish the limiting relationships for $T(\varepsilon)$ we note that, as in (14), for all $n \ge 1$ it holds that
$$
\lim_{\varepsilon \to 0} \big| \beta_n^2(\varepsilon) - \beta^2(\tau_\varepsilon(n)) \big| = 0 \quad \text{a.s.} \tag{16}
$$
According to (16) and by the definition of the moment $\sigma_\varepsilon$, for small but positive $\varepsilon$ we have the inequalities
$$
\sigma' \le \sigma_\varepsilon \le \sigma'' \quad \text{a.s.} \tag{17}
$$
with
$$
\sigma' = \inf\Big\{N \ge 1 : N > \rho \Big[ \sup_{u \in [0, \Delta)} \beta^2(u) \Big]^{-1} \Big\} - 1, \qquad
\sigma'' = \inf\Big\{N \ge 1 : N > \rho \Big[ \inf_{u \in [0, \Delta)} \beta^2(u) \Big]^{-1} \Big\}.
$$


Similarly to (13) we can obtain
$$
\lim_{\varepsilon \to 0} \Big| \frac{\varepsilon^{-1} c_{\sigma_\varepsilon}}{\varphi(T(\varepsilon))} - \operatorname{tr} I_\infty(T(\varepsilon)) \Big| = 0 \quad \text{a.s.} \tag{18}
$$
From (17) and (18), assertion 3 of Theorem 1 follows:
$$
0 < \delta' \le \liminf_{\varepsilon \to 0} \varepsilon\, \varphi(T(\varepsilon)) \le \limsup_{\varepsilon \to 0} \varepsilon\, \varphi(T(\varepsilon)) \le \delta'' < \infty,
$$
where
$$
\delta' = c_{\sigma'} \Big[ \sup_{u \in [0, \Delta)} \operatorname{tr} I_\infty(u) \Big]^{-1}, \qquad
\delta'' = c_{\sigma''} \Big[ \inf_{u \in [0, \Delta)} \operatorname{tr} I_\infty(u) \Big]^{-1}. \tag{19}
$$
Theorem 1 is proved.

3 Example

Consider system (1) with $m = 1$, $r_1 = 1$:
$$
dX(t) = \vartheta_0 X(t)\,dt + \vartheta_1 X(t - 1)\,dt + dW(t), \quad t \ge 0, \qquad X(s) = X_0(s), \quad s \in [-1, 0]. \tag{20}
$$
Assume, for simplicity, that $X_0$ is continuous.

The sequential plan (T(")#") of estimation # = (#0#1)0 will be dened as (12) with the Fisher information matrix

G(T) =

0

B

B

B

@

T

R

0 X2(t)dt R0TX(t)X(t;1)dt

T

R

0 X(t)X(t;1)dt R0TX2(t;1)dt

1

C

C

C

A (21)

Ku/So].

We can reformulate Theorem 1 for these cases as follows.

Theorem 2. Let the parameters $\vartheta_0$ and $\vartheta_1$ in (20) be such that we have the stationary or the periodic case (for the notation see Section 1). Then the sequential plan (12) of estimation of $\vartheta = (\vartheta_0, \vartheta_1)'$ possesses the properties:

$1^o$. $T(\varepsilon) < \infty$ $P_\vartheta$-a.s.

$2^o$. $E_\vartheta \|\vartheta_\varepsilon - \vartheta\|^2 \le \varepsilon$.

$3^o$. Besides, the following limit inequalities
$$
0 < \liminf_{\varepsilon \to 0} \alpha(\varepsilon)\, T(\varepsilon) \le \limsup_{\varepsilon \to 0} \alpha(\varepsilon)\, T(\varepsilon) < \infty \quad P_\vartheta\text{-a.s.} \tag{22}
$$


are fulfilled, where $\alpha(\varepsilon) = \varepsilon$ in the stationary case and $\alpha(\varepsilon) = (\ln \varepsilon^{-1})^{-1}$ in the periodic case. Moreover, in the periodic case the limiting inequality
$$
\limsup_{\varepsilon \to 0} \Big| T(\varepsilon) - \frac{1}{2 v_0} \ln \varepsilon^{-1} \Big| < \infty \quad \text{a.s.} \tag{23}
$$
holds.

Proof of $1^o$-$2^o$. According to Theorem 1, the assertions $1^o$ and $2^o$ of Theorem 2 will be proved if the matrix $G(T)$ in (21) satisfies condition (5).

Now we establish the auxiliary equalities

$$
\lim_{T \to \infty} T^{-1} G(T) = I_\infty \quad \text{a.s.} \tag{24}
$$

for the stationary case and

$$
\lim_{T \to \infty} \big\| e^{-2 v_0 T} G(T) - I_\infty(T) \big\| = 0 \quad \text{a.s.} \tag{25}
$$
for the periodic case, $v_0 > 0$.

Here
$$
I_\infty = \begin{pmatrix}
\int_0^\infty x_0^2(t)\,dt & \int_0^\infty x_0(t)\, x_0(t+1)\,dt \\[4pt]
\int_0^\infty x_0(t)\, x_0(t+1)\,dt & \int_0^\infty x_0^2(t)\,dt
\end{pmatrix}
$$

and $I_\infty(T)$ is a periodic matrix,
$$
I_\infty(T) = \begin{pmatrix} g_{11}(T) & g_{12}(T) \\ g_{12}(T) & g_{22}(T) \end{pmatrix},
\qquad
g_{ij}(T) = \int_0^\infty e^{-2 v_0 t}\, U_i(T - t)\, U_j(T - t)\,dt, \quad i, j = 1, 2,
$$
$$
U_i(t) = \psi_i(t)\, X_0(0) + \vartheta_1 \int_{-1}^{0} \psi_i(t - s - 1)\, e^{-v_0 (s + 1)}\, X_0(s)\,ds + \int_0^\infty \psi_i(t - s)\, e^{-v_0 s}\,dW(s),
$$
$$
\psi_i(t) = A_i \cos(\omega_0 t) + B_i \sin(\omega_0 t), \quad i = 1, 2,
$$
$$
A_1 = \frac{2 (v_0 - \vartheta_0 + 1)}{(v_0 - \vartheta_0 + 1)^2 + \omega_0^2}, \qquad
B_1 = \frac{2 \omega_0}{(v_0 - \vartheta_0 + 1)^2 + \omega_0^2},
$$
$$
\begin{pmatrix} A_2 \\ B_2 \end{pmatrix}
= e^{-v_0} \begin{pmatrix} \cos\omega_0 & -\sin\omega_0 \\ \sin\omega_0 & \cos\omega_0 \end{pmatrix}
\begin{pmatrix} A_1 \\ B_1 \end{pmatrix},
$$
and $\omega_0 = \operatorname{Im}\lambda$ for the root $\lambda \in \Lambda$ with $\operatorname{Re}\lambda = v_0$ and $\operatorname{Im}\lambda > 0$. Taking into account the representation

$$
X(t) = x_0(t)\, X_0(0) + \vartheta_1 \int_{-1}^{0} x_0(t - s - 1)\, X_0(s)\,ds + \int_0^t x_0(t - s)\,dW(s) \tag{26}
$$
for the solution $(X(t), t \ge -1)$ of (20) ([Gu/Ku], [Ku/So]) and the fact that in the stationary case
$$
\int_0^\infty x_0^2(t)\,dt < \infty,
$$
we can see that
$$
\lim_{t \to \infty} |X(t) - Y(t)| = 0 \quad \text{a.s.,}
$$
where $Y(t) = \int_{-\infty}^{t} x_0(t - s)\,dW(s)$ is a stationary process with correlation matrix $I_\infty$ (for the pair $(Y(t), Y(t-1))'$), which is ergodic ([Gu/Ku], [Ku/So]). Then the equality (24) holds.

In the periodic case, according to [Gu/Ku],
$$
x_0(t) = \psi_1(t)\, e^{v_0 t} + o(e^{\gamma t}) \quad \text{and} \quad x_0(t - 1) = \psi_2(t)\, e^{v_0 t} + o(e^{\gamma t})
$$
for some $\gamma$ with $\gamma < v_0$. Similarly to Lemma 4.8 in [Gu/Ku] we can prove the equality
$$
\lim_{t \to \infty} \big| e^{-v_0 t} X(t) - U_1(t) \big| = 0 \quad \text{a.s.}
$$

From here we have

$$
\begin{aligned}
\lim_{T \to \infty} \Big| e^{-2 v_0 T} \int_0^T X^2(t)\,dt - \int_0^\infty e^{-2 v_0 t}\, U_1^2(T - t)\,dt \Big|
&= \lim_{T \to \infty} \Big| \int_0^T e^{-2 v_0 (T - t)} \big[ e^{-2 v_0 t} X^2(t) - U_1^2(t) \big]\,dt \\
&\qquad + \int_0^T e^{-2 v_0 (T - t)}\, U_1^2(t)\,dt - \int_0^\infty e^{-2 v_0 t}\, U_1^2(T - t)\,dt \Big| \\
&= \lim_{T \to \infty} \int_T^\infty e^{-2 v_0 t}\, U_1^2(T - t)\,dt = 0 \quad \text{a.s.}
\end{aligned}
$$

The remaining entries in (25) may be treated analogously. Note that, according to [Gu/Ku], $I_\infty(u) > 0$ for $u \in [0, \Delta)$ and the matrix function $I_\infty(u)$ is continuous on $\mathbb{R}^1$. It follows that $I_\infty(u) > 0$ for $u \in [0, \Delta]$. Then (24), (25) and condition (5) for the matrix $G(T)$ defined by (21) are established.


$3^o$. In order to obtain the exact limiting relationships for $T(\varepsilon)$ in the stationary case it suffices to note that, by the definition of the stopping times $\tau_\varepsilon(n)$ and by (24), we get for all $n \ge 1$
$$
\lim_{\varepsilon \to 0} \varepsilon\, \tau_\varepsilon(n) = c_n\, (\operatorname{tr} I_\infty)^{-1} > 0 \quad \text{a.s.,} \tag{27}
$$
$$
\lim_{\varepsilon \to 0} \varepsilon\, G(\tau_\varepsilon(n)) = c_n\, (\operatorname{tr} I_\infty)^{-1}\, I_\infty > 0 \quad \text{a.s.,}
$$
and, as a consequence,
$$
\lim_{\varepsilon \to 0} \beta_n^2(\varepsilon) = \big( \operatorname{tr} I_\infty\, \| I_\infty^{-1} \| \big)^{-2} > 0 \quad \text{a.s.} \tag{28}
$$
Taking into account that in this case $\varphi(T) = T$, from (8), (11), (24) and (28) we have
$$
\kappa_1 \le \liminf_{\varepsilon \to 0} \varepsilon\, T(\varepsilon) \le \limsup_{\varepsilon \to 0} \varepsilon\, T(\varepsilon) \le \kappa_2 \tag{29}
$$
with
$$
\kappa_1 = c_{\nu - 1}\, (\operatorname{tr} I_\infty)^{-1}, \qquad \kappa_2 = c_{\nu}\, (\operatorname{tr} I_\infty)^{-1}, \tag{30}
$$
$$
\nu = \inf\{N \ge 1 : N > \rho\, (\operatorname{tr} I_\infty\, \| I_\infty^{-1} \|)^2\}.
$$
Then the inequalities (22) for the stationary case hold.

Now we establish assertion $3^o$ of Theorem 2 for the periodic case.

By the definition (8) and according to (25) we have
$$
\lim_{\varepsilon \to 0} \big| \varepsilon^{-1} c_{\sigma_\varepsilon}\, e^{-2 v_0 T(\varepsilon)} - \operatorname{tr} I_\infty(T(\varepsilon)) \big| = 0 \quad \text{a.s.} \tag{31}
$$
Since $\inf_u \operatorname{tr} I_\infty(u) > 0$, we can rewrite (31) in the form
$$
\lim_{\varepsilon \to 0} \Big[ T(\varepsilon) - \frac{1}{2 v_0} \ln \varepsilon^{-1} - \frac{1}{2 v_0} \ln c_{\sigma_\varepsilon} + \frac{1}{2 v_0} \ln \operatorname{tr} I_\infty(T(\varepsilon)) \Big] = 0 \quad \text{a.s.}
$$

From here and (17) we can obtain the relationships
$$
\lim_{\varepsilon \to 0} (\ln \varepsilon^{-1})^{-1}\, T(\varepsilon) = \frac{1}{2 v_0} \quad \text{a.s.}
$$

and
$$
\tilde\kappa_1 \le \liminf_{\varepsilon \to 0} \Big[ T(\varepsilon) - \frac{1}{2 v_0} \ln \varepsilon^{-1} \Big]
\le \limsup_{\varepsilon \to 0} \Big[ T(\varepsilon) - \frac{1}{2 v_0} \ln \varepsilon^{-1} \Big] \le \tilde\kappa_2
$$
with
$$
\tilde\kappa_1 = \frac{1}{2 v_0} \ln \Big[ c_{\sigma'} \Big( \sup_{u \in [0, \Delta)} \operatorname{tr} I_\infty(u) \Big)^{-1} \Big], \qquad
\tilde\kappa_2 = \frac{1}{2 v_0} \ln \Big[ c_{\sigma''} \Big( \inf_{u \in [0, \Delta)} \operatorname{tr} I_\infty(u) \Big)^{-1} \Big].
$$


Assertion $3^o$ of Theorem 2 is established. Theorem 2 is proved.

From Theorem 2 it follows that the duration $T(\varepsilon)$ of the sequential estimation procedure asymptotically has nonrandom lower and upper bounds proportional to $\alpha^{-1}(\varepsilon)$; these bounds have the same rate of increase as $\varepsilon \to 0$. From assertions $2^o$ and $3^o$ of Theorem 2 it follows that the rate of convergence of the mean square deviation of the sequential estimator $\vartheta_\varepsilon$ corresponds to the rate of convergence of the MLE in the stationary and periodic cases [Gu/Ku].

According to the inequalities (29), the duration of observations $T(\varepsilon)$ in the stationary case is, for small $\varepsilon$, approximately not greater than $\varepsilon^{-1} \kappa_2$ with $\kappa_2$ defined by (30).

Note that in this case one can obtain the following limiting equalities:
$$
\lim_{\varepsilon \to 0} \sigma_\varepsilon = \nu \quad \text{a.s.} \tag{32}
$$
and
$$
\lim_{\varepsilon \to 0} \varepsilon\, T(\varepsilon) = \kappa_2 \quad \text{a.s.}
$$

Here $\nu$ is defined by (30). To obtain (32) we change the definition of $\sigma_\varepsilon$ a little bit.

Replace the magnitudes $\beta_n^{-2}(\varepsilon)$ in the definition of $\sigma_\varepsilon$ in (11) by the nearest integer from above and choose $(c_n)$ in such a way that the constant $\rho$ in (11) is irrational. In this case the limit $\lim_{\varepsilon \to 0} S_N(\varepsilon)$ is strictly greater than $\rho$, and this implies (32).

From (32) it follows that for small $\varepsilon$ the moments $\sigma_\varepsilon = \nu$ a.s., and by property (28) it is obvious that the sequential estimate $\vartheta_\varepsilon$ may be represented in the stationary case as the mean of a finite number of maximum likelihood estimates $\hat\vartheta$, calculated at the moments $\tau_\varepsilon(n)$:
$$
\vartheta_\varepsilon \approx \frac{1}{\nu} \sum_{n=1}^{\nu} \hat\vartheta(\tau_\varepsilon(n)). \tag{33}
$$

The number $\nu$ may be estimated asymptotically with the help of property (24) and by the definition (30) of the moment $\nu$.

It should also be pointed out that, given the known bound $\inf_{u \in [0, \Delta)} \beta^2(u) > 0$ with $\beta^2(u)$ defined by (15), according to (17) we obtain
$$
\sigma_\varepsilon \le \inf\Big\{N \ge 1 : N > \rho\, \Big[ \inf_{u \in [0, \Delta)} \beta^2(u) \Big]^{-1} \Big\} = 1
$$
for small $\varepsilon$, if the sequence $(c_n)$ is such that $\rho < \inf_{u \in [0, \Delta)} \beta^2(u)$. Then for the sequential estimate $\vartheta_\varepsilon$ defined by (12) we have, for small $\varepsilon$,
$$
\vartheta_\varepsilon = \hat\vartheta(\tau_\varepsilon(1)) \quad \text{a.s.}
$$

Remark. From Theorem 2 we can see that the sequential estimators $\vartheta_\varepsilon$ converge to the true value $\vartheta$ in mean square as $\varepsilon \to 0$ in the stationary and periodic cases. Moreover, for any sequence $(\varepsilon_n, n \ge 1)$ of positive numbers such that $\sum_{n \ge 1} \varepsilon_n < \infty$ we can define the sequence of estimators $(\tilde\vartheta_n, n \ge 1)$, $\tilde\vartheta_n = \vartheta_{\varepsilon_n}$, $n \ge 1$. Then the sequence $(\tilde\vartheta_n)$ of estimators for $\vartheta$ is strongly consistent. This follows from assertion $2^o$ of Theorem 2 and the Borel-Cantelli lemma.


References

[Gu/Ku] Gushchin, A.A. and Küchler, U. Asymptotic inference for a linear stochastic differential equation with time delay. To appear in Bernoulli.

[Ha/Ve] Hale, J.K. and Verduyn Lunel, S.M. (1993) Introduction to Functional Differential Equations. Springer-Verlag, New York.

[Ko/Pe] Konev, V.V. and Pergamenshchikov, S.M. (1985) Sequential estimation of the parameters of diffusion processes. Problems of Inform. Trans., 21, 1, 48-62.

[Ko/Pe] Konev, V.V. and Pergamenshchikov, S.M. (1987) Sequential estimation of the parameters of unstable dynamical systems in continuous time. Math. Stat. and Appl., Publishing House of Tomsk University, Tomsk, 11, 85-94.

[Ko/Pe] Konev, V.V. and Pergamenshchikov, S.M. (1992) Sequential estimation of the parameters of linear unstable stochastic systems with guaranteed accuracy. Problems of Inform. Trans., 28, 4, 35-48.

[Ku/Kut] Küchler, U. and Kutoyants, Yu.A. (1998) Delay estimation for stationary diffusion-type processes. Discussion Paper 47 of the SFB 373, Humboldt University of Berlin.

[Ku/Me] Küchler, U. and Mensch, B. (1991) Langevin's stochastic differential equations extended by a time-delayed term. Stochastics and Stochastic Reports, 40, 23-42.

[Ku/So] Küchler, U. and Sørensen, M. (1997) Exponential Families of Stochastic Processes. Springer-Verlag, New York, Heidelberg.

[Li/Sh] Liptzer, R.S. and Shiryaev, A.N. (1977) Statistics of Random Processes, Vol. 1, 2. Springer-Verlag, New York, Heidelberg.

[Mo] Mohammed, S.E-A. (1984) Stochastic Functional Differential Equations. Pitman, London.

[Mo/Sch] Mohammed, S.E-A. and Scheutzow, M.K.R. (1990) Lyapunov exponents and stationary solutions for affine stochastic delay equations. Stochastics and Stochastic Reports, 29, 259-283.

[Nov] Novikov, A.A. (1971) The sequential parameter estimation in the process of diffusion type. Probab. Theory and its Appl., 16, 2, 394-396.

[Va/Ko] Vasiliev, V.A. and Konev, V.V. (1987) On sequential identification of linear dynamic systems in continuous time by noisy observations. Probl. of Contr. and Inform. Theory, 16, 2, 101-112.


[Va/Ko] Vasiliev, V.A. and Konev, V.V. (1990) On sequential parameter estimation of continuous dynamic systems by discrete time observations. Probl. of Contr. and Inform. Theory, 19, 3, 197-207.
