
Universität Konstanz Sommersemester 2015 Fachbereich Mathematik und Statistik

Prof. Dr. Stefan Volkwein Sabrina Rogg

Optimierung

http://www.math.uni-konstanz.de/numerik/personen/rogg/de/teaching/

Sheet 5

Tutorial: 23 June

Exercise 14 (Classical Newton method)

Consider the functions f, g : R → R given by

f(x) = x^3 − 2x + 2 and g(x) = sin(x).

(1) Show that for the starting point x_0 = 0, the classical Newton iteration of f has two accumulation points, neither of which is a root of f.

(2) Find a starting point x_0 such that the Newton iteration for g is of the form x_k = x_0 + kπ, k ∈ N_{>0}.

(3) Generate suitable plots (Matlab) for (1) and (2).
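The sheet asks for Matlab; as a cross-check before plotting, the iteration behaviour in (1) and (2) can be verified numerically. The following is a Python sketch under that language substitution; the helper `newton` and the use of x_0 = arctan(−π) for part (2) are our choices, not prescribed by the sheet.

```python
import math

def newton(fun, dfun, x0, n):
    """Return [x0, x1, ..., xn] for the classical Newton iteration."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x - fun(x) / dfun(x))
    return xs

# Part (1): f(x) = x^3 - 2x + 2 from x0 = 0: the iterates cycle 0, 1, 0, 1, ...
# and f(0) = 2, f(1) = 1, so neither accumulation point is a root of f.
f = lambda x: x**3 - 2.0 * x + 2.0
df = lambda x: 3.0 * x**2 - 2.0
cycle = newton(f, df, 0.0, 6)
print(cycle)  # [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0]

# Part (2): for g(x) = sin(x), the Newton step is x - tan(x), so any x0 with
# tan(x0) = -pi yields x_{k+1} = x_k + pi, by the pi-periodicity of tan.
x0 = math.atan(-math.pi)
walk = newton(math.sin, math.cos, x0, 4)
for k, xk in enumerate(walk):
    assert abs(xk - (x0 + k * math.pi)) < 1e-9
```

Plotting the iterates against f and g (e.g. with tangent lines at each x_k) then makes the 2-cycle and the π-stepping visible.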

Exercise 15 (Damped Newton)

Consider using the classical Newton method to find the root of g(x) = arctan(x). Let x_k = 5. Compute the next iterate x_{k+1} according to the Newton algorithm

g'(x_k) Δx_k = −g(x_k);  x_{k+1} = x_k + Δx_k.

The damped Newton algorithm is defined as

g'(x_k) Δx_k = −g(x_k);  x_{k+1} = x_k + α_k Δx_k,

with α_k ∈ (0, 1) computed by a line search procedure such that

|g(x_k + α_k Δx_k)| ≤ (1 − δ α_k) |g(x_k)|,  δ ∈ (0, 1).

Let δ = 0.5. We start from α = 1 in the line search procedure and divide it by 2 every time the condition is not satisfied. Compute the point x_{k+1} using the damped Newton algorithm and compare it to the one returned by the classical Newton method. What do you conclude?
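The computation can be checked numerically. A minimal Python sketch, assuming the halving line search described above (variable names are ours):

```python
import math

g = lambda x: math.atan(x)
dg = lambda x: 1.0 / (1.0 + x**2)

xk = 5.0
dx = -g(xk) / dg(xk)           # Newton correction: g'(xk) dx = -g(xk)

# Classical Newton: the tangent at x = 5 is very flat, so the step overshoots
# far past the root at 0 and |x_{k+1}| >> |x_k|.
x_classical = xk + dx
print(x_classical)             # about -30.71

# Damped Newton: halve alpha until |g(xk + a*dx)| <= (1 - delta*a)|g(xk)|.
delta, alpha = 0.5, 1.0
while abs(g(xk + alpha * dx)) > (1 - delta * alpha) * abs(g(xk)):
    alpha /= 2.0
x_damped = xk + alpha * dx
print(alpha, x_damped)         # alpha = 0.125, x_damped about 0.536
```

The damped step lands much closer to the root x* = 0, while the classical step moves away from it, illustrating why damping is needed for arctan far from the root.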


Exercise 16

Let f ∈ C^1(R^n, R) be the quadratic function

f(x) = (1/2)⟨x, Qx⟩ + ⟨c, x⟩ + γ,

with Q ∈ R^{n×n} symmetric and positive definite, c ∈ R^n and γ ∈ R. Let x_0 ∈ R^n and H be a symmetric positive definite matrix.

Define f̃(x) := f(H^{−1/2} x) and x̃_0 := H^{1/2} x_0. Let (x̃_k)_{k∈N} be a sequence generated by the steepest descent method,

x̃_{k+1} = x̃_k + t̃_k d̃_k  with  d̃_k = −∇f̃(x̃_k),

and t̃_k = t(d̃_k) the optimal stepsize choice as determined in Exercise 5 (for f̃).

Let (x_k)_{k∈N} be generated by the gradient-like method with preconditioner H,

x_{k+1} = x_k + t_k d_k  with  d_k = H^{−1}(−∇f(x_k)),

and t_k = t(d_k) the optimal stepsize choice as determined in Exercise 5.

Show (by induction) that the two optimization methods are equivalent, i.e., for all k ∈ N it holds:

x_k = H^{−1/2} x̃_k.
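The claimed equivalence can be sanity-checked numerically before attempting the induction. A Python sketch on a random SPD test problem, assuming the optimal stepsize from Exercise 5 is the exact line-search stepsize for a quadratic, t(d) = −⟨∇f(x), d⟩ / ⟨d, Q d⟩ (all names below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random SPD data: f(x) = 0.5 <x, Qx> + <c, x> (gamma omitted; it does not
# affect the iterates).
n = 4
A = rng.standard_normal((n, n)); Q = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n)); H = B @ B.T + n * np.eye(n)
c = rng.standard_normal(n)
x0 = rng.standard_normal(n)

# H^{1/2} and H^{-1/2} via the eigendecomposition of H.
w, V = np.linalg.eigh(H)
H_half = V @ np.diag(np.sqrt(w)) @ V.T
H_mhalf = V @ np.diag(1.0 / np.sqrt(w)) @ V.T

grad = lambda x: Q @ x + c                  # gradient of f
Qt = H_mhalf @ Q @ H_mhalf                  # Hessian of f~(x) = f(H^{-1/2} x)
grad_t = lambda y: Qt @ y + H_mhalf @ c     # gradient of f~

x, y = x0.copy(), H_half @ x0               # x_0 and x~_0 = H^{1/2} x_0
for k in range(20):
    # Preconditioned gradient step on f with exact stepsize.
    d = -np.linalg.solve(H, grad(x))
    x = x + (-(grad(x) @ d) / (d @ (Q @ d))) * d
    # Steepest descent step on f~ with exact stepsize.
    dt = -grad_t(y)
    y = y + (-(grad_t(y) @ dt) / (dt @ (Qt @ dt))) * dt
    # Equivalence: x_k = H^{-1/2} x~_k at every iteration.
    assert np.allclose(x, H_mhalf @ y)
print("equivalence verified for 20 iterations")
```

The key identities behind the induction step are ∇f̃(x̃) = H^{−1/2} ∇f(H^{−1/2} x̃) and the equality of the two exact stepsizes, both of which the loop above confirms implicitly.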
