Universität Konstanz, Summer Semester 2015
Department of Mathematics and Statistics

Prof. Dr. Stefan Volkwein
Sabrina Rogg

Optimization

http://www.math.uni-konstanz.de/numerik/personen/rogg/de/teaching/

Sheet 6

Tutorial: 7th July

Exercise 17 (Scaled gradient method)

Consider the quadratic function $f : \mathbb{R}^2 \to \mathbb{R}$,

$$ f(x) = \frac{1}{2}\, x^\top \begin{pmatrix} 100 & -1 \\ -1 & 2 \end{pmatrix} x + \begin{pmatrix} 1 \\ 1 \end{pmatrix}^{\top} x + 3. $$

Implement a modified version of the gradient method where the update is $x_{k+1} = x_k + t_k d_k$ with $d_k = M^{-1}(-\nabla f(x_k))$ and with the exact stepsize $t_k$ (Exercise 5). $M$ is one of the following matrices:

$$ M = \mathrm{Id} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad M = \nabla^2 f = \begin{pmatrix} 100 & -1 \\ -1 & 2 \end{pmatrix}, \qquad M = \begin{pmatrix} f_{xx} & 0 \\ 0 & f_{yy} \end{pmatrix} = \begin{pmatrix} 100 & 0 \\ 0 & 2 \end{pmatrix}. $$

As a basis you can use the gradient method you implemented for the first program sheet.

Determine the number of gradient steps required to find the minimum of $f$ with the different matrices $M$ and the initial value $x_0 = [1.5;\, 0.6]$ (use the tolerance $\varepsilon = 10^{-9}$). How close is the computed point to the exact analytical minimum? Explain your observations.
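
For orientation, here is a minimal MATLAB sketch of such a scaled gradient loop for this quadratic; the function name scaled_gradient and the stopping test on the gradient norm are illustrative choices, not prescribed by the sheet. For a quadratic $f(x) = \frac{1}{2} x^\top A x + b^\top x + c$, the exact stepsize along $d_k$ works out to $t_k = -\nabla f(x_k)^\top d_k / (d_k^\top A d_k)$.

```matlab
% Scaled gradient method for f(x) = 0.5*x'*A*x + b'*x + 3
% (a sketch; the name and interface are illustrative).
function [x, k] = scaled_gradient(M, x0, eps)
    A = [100 -1; -1 2];           % Hessian of the quadratic f
    b = [1; 1];                   % linear term of f
    x = x0;
    k = 0;
    g = A*x + b;                  % gradient of f at x
    while norm(g) > eps
        d = M \ (-g);             % scaled direction d_k = M^{-1}(-grad f(x_k))
        t = -(g'*d) / (d'*A*d);   % exact stepsize for a quadratic
        x = x + t*d;
        g = A*x + b;
        k = k + 1;
    end
end
```

A call such as [x, k] = scaled_gradient([100 0; 0 2], [1.5; 0.6], 1e-9) then returns the computed minimizer and the number of gradient steps for the third choice of $M$.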

Exercise 18 (Cauchy-step property)

The Cauchy step is defined as $s_a^{CP} = -t_a \nabla f(x_a)$, where $t_a$ is given by (see the lecture notes)

$$ t_a = \begin{cases} \dfrac{\Delta_a}{\|\nabla f(x_a)\|} & \text{if } \nabla f(x_a)^\top H_a \nabla f(x_a) \le 0, \\[2ex] \min\!\left( \dfrac{\Delta_a}{\|\nabla f(x_a)\|},\; \dfrac{\|\nabla f(x_a)\|^2}{\nabla f(x_a)^\top H_a \nabla f(x_a)} \right) & \text{if } \nabla f(x_a)^\top H_a \nabla f(x_a) > 0. \end{cases} $$

Once the Cauchy point $x_a^{CP} = x_a + s_a^{CP}$ is computed, show that there is a sufficient decrease in the quadratic model, i.e., the Cauchy point satisfies

$$ m_a(x_a) - m_a(x_a^{CP}) \ge \frac{1}{2}\, \|\nabla f(x_a)\| \min\!\left( \Delta_a,\; \frac{\|\nabla f(x_a)\|}{1 + \|H_a\|} \right). $$
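
For experimenting with the case distinction, here is a small MATLAB helper that evaluates $t_a$ and the Cauchy step; the name cauchy_step and its interface are illustrative, not from the sheet.

```matlab
% Cauchy step s_a^CP = -t_a * grad f(x_a) (a sketch; the interface is
% illustrative). g = grad f(x_a), H = model Hessian H_a, Delta = radius.
function s = cauchy_step(g, H, Delta)
    gHg = g' * H * g;                       % curvature along -g
    if gHg <= 0
        t = Delta / norm(g);                % model not convex along -g: step to the boundary
    else
        t = min(Delta / norm(g), norm(g)^2 / gHg);
    end
    s = -t * g;
end
```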

Exercise 19 (Dogleg strategy)

Let us consider the quadratic model of the function $f$ in $x_a$,

$$ m_a(x) = f(x_a) + \nabla f(x_a)^\top (x - x_a) + \frac{1}{2}\, (x - x_a)^\top B_a (x - x_a), $$


with $B_a$ positive definite (the Hessian matrix in $x_a$ or an approximation of it). In the trust-region subproblem we approximately solve

$$ \min_{\|x - x_a\| < \Delta_a} m_a(x). \qquad (1) $$

If the trust region is big enough, i.e., as if there were no constraint $\|x - x_a\| < \Delta_a$, the exact (global) solution to (1) is the (quasi-)Newton point

$$ x_a^{QN} = x_a - B_a^{-1} \nabla f(x_a). $$

The idea of the dogleg method is as follows: minimize $m_a$ along a path consisting of two straight lines, one from the current point $x_a$ to the Cauchy point $x_a^{CP}$ and the other from $x_a^{CP}$ to the (quasi-)Newton point $x_a^{QN}$. This path is described as

$$ x(\tau) = \begin{cases} x_a + \tau\,(x_a^{CP} - x_a) & \tau \in [0, 1], \\ x_a^{CP} + (\tau - 1)(x_a^{QN} - x_a^{CP}) & \tau \in (1, 2]. \end{cases} $$

Show that $h(\tau) := m_a(x(\tau))$ is a decreasing function of $\tau$.

Hint: consider $m_a$ on the two straight lines separately.
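
To make the path concrete, here is a compact MATLAB sketch that evaluates $x(\tau)$; the name dogleg_path and its interface are illustrative, not from the sheet. It inlines the Cauchy stepsize from Exercise 18 with $H_a = B_a$, where positive definiteness of $B_a$ means only the second case of that formula applies.

```matlab
% Point x(tau) on the dogleg path, tau in [0, 2] (a sketch; the interface
% is illustrative). xa = current point, g = grad f(xa),
% B = positive definite model Hessian B_a, Delta = trust-region radius.
function x = dogleg_path(tau, xa, g, B, Delta)
    % Cauchy stepsize (Exercise 18 with H_a = B_a; g'*B*g > 0 here)
    t = min(Delta / norm(g), norm(g)^2 / (g' * B * g));
    xCP = xa - t * g;                       % Cauchy point x_a^CP
    xQN = xa - B \ g;                       % (quasi-)Newton point x_a^QN
    if tau <= 1
        x = xa + tau * (xCP - xa);          % first leg: x_a -> x_a^CP
    else
        x = xCP + (tau - 1) * (xQN - xCP);  % second leg: x_a^CP -> x_a^QN
    end
end
```

Plotting $h(\tau) = m_a(x(\tau))$ over $\tau \in [0, 2]$ with this helper is a quick numerical sanity check of the monotonicity claim before proving it.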
