
Wissenschaftliches Rechnen II/Scientific Computing II

Summer semester 2016
Prof. Dr. Jochen Garcke
Dipl.-Math. Sebastian Mayer

Exercise sheet 11
To be handed in on Thursday, 07.06.2016

Application of PCA and MDS

1 Group exercises

G 1. (MDS: embedding of out-of-sample data)

Assume you had training data in the form of a centered Gram matrix G_c = (⟨y_i, y_j⟩)_{i,j=1}^n = Y^T Y or in the form of a Euclidean distance matrix D = (‖y_i − y_j‖)_{i,j=1}^n, and learned a p-dimensional embedding of the training data using the CMDS algorithm.

Now assume there is a new test point x ∈ R^d which is different from the y_i but stems from the same data-generating source. You cannot observe x directly but only one of the following sets of features:

a) you either observe inner products x_S = Y^T x,

b) or you observe squared Euclidean distances x_E = (‖x − y_i‖_2^2)_{i=1}^n.

Use the components computed by the CMDS algorithm to construct a p-dimensional embedding x̂ of x from the given feature representation. Give a geometric interpretation of the constructed embedding x̂. Discuss what properties the training data y_1, …, y_n must have such that the obtained embedding x̂ is reasonable.
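The CMDS setup for case a) can be sketched numerically: if G_c = VΛV^T, the training embedding is Z = Λ_p^{1/2} V_p^T, and a point observed only through inner products with the training data can be mapped through Λ_p^{-1/2} V_p^T. The synthetic data Y, the dimensions, and all variable names below are illustrative assumptions, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, p = 5, 50, 2
Y = rng.normal(size=(d, n))
Y -= Y.mean(axis=1, keepdims=True)      # center the training data

G = Y.T @ Y                             # centered Gram matrix G_c
lam, V = np.linalg.eigh(G)
idx = np.argsort(lam)[::-1][:p]         # top-p eigenpairs
lam_p, V_p = lam[idx], V[:, idx]

Z = np.diag(np.sqrt(lam_p)) @ V_p.T     # p x n CMDS embedding of the y_i

# out-of-sample point, observed only via inner products x_S = Y^T x
x = rng.normal(size=d)
s = Y.T @ x
x_hat = np.diag(1.0 / np.sqrt(lam_p)) @ V_p.T @ s   # p-dim embedding of x
```

A quick sanity check of the formula: feeding in the inner products of a training point y_k reproduces the k-th column of Z exactly, since Λ_p^{-1/2} V_p^T G = Λ_p^{1/2} V_p^T.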

G 2. (Kernel-MDS)

Discuss how MDS could be generalized to distances and inner products which are induced by a reproducing kernel k : Ω × Ω → R. Concretely, assume that there are points x_1, …, x_n ∈ Ω of which you observe (k(x_i, x_j))_{i,j=1}^n and you want to construct embeddings x̂_1, …, x̂_n ∈ R^p such that

k(x_i, x_j) ≈ ⟨x̂_i, x̂_j⟩.

(2)
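One natural construction mirrors CMDS: eigendecompose the kernel matrix and scale the leading eigenvectors, so that the Euclidean inner products of the embeddings give the best rank-p approximation of K. A minimal sketch, where the synthetic points and the Gaussian kernel are illustrative assumptions:

```python
import numpy as np

def rbf(a, b, gamma=0.5):
    """Assumed Gaussian kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

rng = np.random.default_rng(1)
n, p = 30, 5
X = rng.normal(size=(n, 3))

# kernel matrix (k(x_i, x_j))_{i,j=1}^n
K = np.array([[rbf(X[i], X[j]) for j in range(n)] for i in range(n)])

lam, V = np.linalg.eigh(K)
idx = np.argsort(lam)[::-1][:p]          # leading p eigenpairs
lam_p = np.clip(lam[idx], 0.0, None)     # K is PSD up to round-off
emb = V[:, idx] * np.sqrt(lam_p)         # row i is the embedding x_hat_i

approx = emb @ emb.T                     # <x_hat_i, x_hat_j>, rank-p fit to K
```

By construction, `approx` equals the rank-p truncation V_p Λ_p V_p^T of K, which is the best rank-p approximation in Frobenius norm.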

2 Homework

H 1. (Optimal p-dimensional subspace in a RKHS)

Let k : Ω × Ω → R be a reproducing kernel, H its native Hilbert space, and X = {x_1, …, x_n} ⊂ Ω. Consider the kernel matrix K = (k(x_i, x_j))_{i,j=1}^n and the corresponding eigenvalue decomposition K = V Λ V^T with Λ = diag(λ_1, …, λ_m, 0, …, 0), where we assume m ≤ n. Consider for i = 1, …, m the functions

f_i := (1/√λ_i) Σ_{j=1}^n V_{ji} k(x_j, ·) ∈ H_X.

a) Show that (f_i)_{i=1}^m forms an orthonormal basis of H_X. Hint: Use that Λ = V^T K V.

b) Let p ∈ {1, …, m}. Show that (f_i)_{i=1}^p is the solution of

min_{(g_i)_{i=1}^n ONB of H_X} Σ_{i=1}^n ‖k(x_i, ·) − Σ_{j=1}^p g_j(x_i) g_j‖_k^2.

Hint: Argue analogously as in Sheet 10, H2 b).
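The orthonormality claim in a) can be checked numerically: writing f_i = Σ_j A_{ji} k(x_j, ·) with coefficient matrix A = V_m Λ_m^{-1/2}, the RKHS Gram matrix of the f_i is A^T K A, which the hint Λ = V^T K V reduces to the identity. A sketch with an assumed Gaussian kernel on random points (all data and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20
X = rng.normal(size=(n, 2))

# assumed Gaussian kernel matrix K = (k(x_i, x_j))
sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-0.5 * sq)

lam, V = np.linalg.eigh(K)               # K = V diag(lam) V^T
keep = lam > 1e-6                        # numerical rank m
lam_m, V_m = lam[keep], V[:, keep]

# coefficients of f_i in the span of the k(x_j, .):
# f_i = sum_j A[j, i] k(x_j, .),  A = V_m diag(1/sqrt(lam_m))
A = V_m / np.sqrt(lam_m)

# RKHS inner products: <f_i, f_l>_H = (A^T K A)_{il}
gram_f = A.T @ K @ A                     # should be the identity matrix
```

Since K V_m = V_m Λ_m and V_m has orthonormal columns, A^T K A = Λ_m^{-1/2} Λ_m Λ_m^{-1/2} = I, which is exactly the computation behind part a).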

(10 points)

H 2. (Programming exercise: pedestrian classification)

See accompanying notebook.

(10 points)
