(1)

Computer Vision I - Tracking

Carsten Rother

Computer Vision I: Tracking 17/02/2015

17/02/2015

[slide credits: Alex Krull]

(2)

What is Tracking?

• „Tracking an object in an image sequence means continuously identifying its location when either the object or the camera are moving“ [Lepetit and Fua 2005]

• This can mean estimating in each frame:

• 2D location or window

• 6D rigid body transformation

• More complex parametric models

Active Appearance Models

Skeleton for human pose etc.

17/02/2015 Computer Vision I: Tracking 2

(3)

What is Tracking?

• „Tracking an object in an image sequence means continuously identifying its location when either the object or the camera are moving“ [Lepetit and Fua 2005]

• This can mean estimating in each frame:

• 2D location or window

• 6D rigid body transformation

• More complex parametric models

Active Appearance Models

Skeleton for human pose etc.

17/02/2015 Computer Vision I: Tracking 3

(4)

What is Tracking?

• „Tracking an object in an image sequence means continuously identifying its location when either the object or the camera are moving“ [Lepetit and Fua 2005]

• This can mean estimating in each frame:

• 2D location or window

• 6D rigid body transformation

• More complex parametric models

Active Appearance Models

Skeleton for human pose etc.

17/02/2015 Computer Vision I: Tracking 4

(5)

Tracking vs Localization

• Tracking of objects is closely related to:

• camera pose estimation in a known environment

• localization of agents (e.g. robots) in a known environment

• Reminder: SLAM has unknown location (agent, camera) and unknown environment

17/02/2015 Computer Vision I: Tracking 5

(6)

Outline

• This lecture

• The Bayes Filter

Explained for localization

• Next lecture

• The Particle Filter

• The Kalman Filter

• Pros and Cons

• Beyond tracking and localization

• Case study:

6-DOF Model Based Tracking via Object Coordinate Regression

17/02/2015 Computer Vision I: Tracking 6

(7)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 7

• We have:

• Probabilistic model for movement

• Probabilistic model for measurement

Based on map of the environment

• Where is the ship?

• Using all previous and current observations

(8)

The Hidden Markov Model

17/02/2015 Computer Vision I: Tracking 8

[Figure: HMM graphical model over time, with hidden states x_{t-1}, x_t (positions) and observations z_{t-1}, z_t (depth measurements)]

(9)

The Hidden Markov Model

17/02/2015 Computer Vision I: Tracking 9

Observation Model:

What is the likelihood of an observation, given a state?

p(z_t | x_t) = (1/c) exp( -(z_t - d(x_t))^2 / (2σ^2) )

d(x_t) is the known true depth.

[Figure: HMM graphical model with states x_{t-1}, x_t and observations z_{t-1}, z_t over time]
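To make the observation model concrete, here is a minimal sketch (not from the slides) that evaluates this Gaussian likelihood on a 1D grid of positions; the grid, the depth map `true_depth`, and the noise level `sigma` are assumptions for illustration.

```python
import numpy as np

def observation_likelihood(z_t, true_depth, sigma=0.5):
    """Evaluate p(z_t | x_t) ∝ exp(-(z_t - d(x_t))^2 / (2 sigma^2)) for every state.

    true_depth[i] plays the role of the known map depth d(x_t = i); the
    normalization constant 1/c is dropped because the filter renormalizes anyway.
    """
    diff = z_t - true_depth
    return np.exp(-diff**2 / (2.0 * sigma**2))
```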

(10)

The Hidden Markov Model

17/02/2015 Computer Vision I: Tracking 10

Motion Model:

Probability for state transition

p(x_{t+1} | x_t)

[Figure: HMM graphical model with states x_{t-1}, x_t and observations z_{t-1}, z_t over time; the motion model is the transition from x_t to x_{t+1}]

(11)

Probability distribution for the state given all previous and current observations:

• This is what we are interested in

• E.g. use the maximum as the current estimate

The Posterior distribution

17/02/2015 Computer Vision I: Tracking 11

p(x_t | z_{0:t})

x̂_t = argmax_{x_t} p(x_t | z_{0:t})

[Figure: HMM graphical model x_0, …, x_{t-1}, x_t with observations z_0, …, z_{t-1}, z_t]

(12)

The Prior distribution

17/02/2015 Computer Vision I: Tracking 12

Probability distribution for the next state given only previous observations:

p(x_{t+1} | z_{0:t})

• Intermediate step for calculating the next posterior

[Figure: HMM graphical model x_0, …, x_t, x_{t+1} with observations z_0, …, z_t]

(13)

Important Distributions

17/02/2015 Computer Vision I: Tracking 13

• Observation Model p(z_t | x_t): likelihood of the observation given the state; continuous Gaussian around the real depth

• Motion Model p(x_{t+1} | x_t): probability of the new state given the old one; discrete Gaussian

• Posterior p(x_t | z_{0:t}): probability of the state given previous and current observations

• Prior p(x_{t+1} | z_{0:t}): probability of the state given only previous observations

(14)

Probabilities - Reminder

A random variable is denoted with 𝑥 ∈ {0, … , 𝐾}

Discrete probability distribution: p(x) satisfies Σ_x p(x) = 1

Joint distribution of two random variables: p(x, z)

Conditional distribution: p(x | z)

Sum rule (marginal distribution): p(z) = Σ_x p(x, z)

Independent probability distributions: p(x, z) = p(z) p(x)

Product rule: p(x, z) = p(z | x) p(x)

Bayes’ rule: p(x | z) = p(z | x) p(x) / p(z)

17/02/2015 Computer Vision I: Tracking 14
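As a quick sanity check (not on the slides), Bayes’ rule follows from writing the product rule in both orders and dividing; expanding p(z) with the sum rule gives the normalizer used throughout the lecture:

```latex
p(x, z) = p(z \mid x)\, p(x) = p(x \mid z)\, p(z)
\;\Rightarrow\;
p(x \mid z) = \frac{p(z \mid x)\, p(x)}{p(z)}
            = \frac{p(z \mid x)\, p(x)}{\sum_{x'} p(z \mid x')\, p(x')}
```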

(15)

Probabilities - Reminder

• Sum rule (marginal distribution): p(z) = Σ_x p(x, z)

• Product rule: p(x, z) = p(z | x) p(x)

• Bayes’ rule: p(x | z) = p(z | x) p(x) / p(z)

The same rules hold when conditioning on an additional variable A:

• Sum rule (marginal distribution): p(z | A) = Σ_x p(x, z | A)

• Product rule: p(x, z | A) = p(z | x, A) p(x | A)

• Bayes’ rule: p(x | z, A) = p(z | x, A) p(x | A) / p(z | A)

17/02/2015 Computer Vision I: Tracking 15

(16)

Probabilities - Reminder

17/02/2015 Computer Vision I: Tracking 16

Independence

• A and B are not connected in the graph
• A does not contain information about B
• p(A, B) = p(A) p(B)
• p(A | B) = p(A)
• p(B | A) = p(B)
• Example: two dice rolls

Conditional Independence

• A and B are connected only via C
• A does not contain information about B when C is known
• p(A, B | C) = p(A | C) p(B | C)
• p(A | B, C) = p(A | C)
• p(B | A, C) = p(B | C)
• Example: people having umbrellas and the tram being crowded are conditionally independent given that it rains

(17)

Step by Step

17/02/2015 Computer Vision I: Tracking 17

𝑡 = 0

• Assume prior for first frame: p(x_0)

• Make first measurement: z_0

(18)

Step by Step

17/02/2015 Computer Vision I: Tracking 18

𝑡 = 0

• Calculate likelihood 𝑝(𝑧0|𝑥0) for every possible state 𝑥0:

(19)

Step by Step

17/02/2015 Computer Vision I: Tracking 19

𝑡 = 0

• Calculate the posterior by

multiplying with prior

• Normalizing

• Reducing uncertainty

p(x_0 | z_0) = p(x_0) p(z_0 | x_0) / Σ_x p(x_0 = x) p(z_0 | x_0 = x)
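As an illustration (with the assumption, beyond the slides, that the state space is a finite 1D grid stored as NumPy arrays), the multiply-and-normalize step can be written as:

```python
import numpy as np

def posterior_update(prior, likelihood):
    """p(x_0 | z_0) = p(x_0) p(z_0 | x_0) / sum_x p(x_0 = x) p(z_0 | x_0 = x).

    prior and likelihood hold one value per discrete state.
    """
    unnormalized = prior * likelihood
    return unnormalized / unnormalized.sum()
```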

(20)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 20

𝑡 = 0

p(x_1 | z_0) = Σ_x p(x_0 = x | z_0) p(x_1 | x_0 = x)

• Calculate the prior by convolution with the motion model

• Adding uncertainty
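A minimal sketch of this prediction step, assuming (for illustration only) a translation-invariant motion model given as a discrete kernel over displacements, so that the sum becomes a convolution:

```python
import numpy as np

def predict_prior(posterior, motion_kernel):
    """p(x_1 | z_0) = sum_x p(x_0 = x | z_0) p(x_1 | x_0 = x) as a 1D convolution."""
    prior = np.convolve(posterior, motion_kernel, mode="same")
    return prior / prior.sum()  # renormalize against truncation at the grid borders
```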

(21)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 21

𝑡 = 1

• Make new measurement: 𝑧1

(22)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 22

𝑡 = 1

• Calculate likelihood 𝑝(𝑧1|𝑥1) for every possible state 𝑥1:

(23)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 23

𝑡 = 1

• Calculate the posterior by

multiplying with prior

• Normalizing

• Reducing uncertainty

p(x_1 | z_{0:1}) = p(x_1 | z_0) p(z_1 | x_1) / Σ_x p(x_1 = x | z_0) p(z_1 | x_1 = x)

(24)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 24

𝑡 = 1

• Calculate the prior by convolution with the motion model

• Adding uncertainty

p(x_2 | z_{0:1}) = Σ_x p(x_1 = x | z_{0:1}) p(x_2 | x_1 = x)

(25)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 25

𝑡 = 2

• Make new measurement: 𝑧2

(26)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 26

𝑡 = 2

• Calculate likelihood 𝑝(𝑧2|𝑥2) for every possible state 𝑥2:

(27)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 27

𝑡 = 2

• Calculate the posterior by

multiplying with prior

• Normalizing

• Reducing uncertainty

p(x_2 | z_{0:2}) = p(x_2 | z_{0:1}) p(z_2 | x_2) / Σ_x p(x_2 = x | z_{0:1}) p(z_2 | x_2 = x)

(28)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 28

𝑡 = 2

(29)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 29

Algorithm:

1. Make observation
2. Calculate likelihood for every position
3. Multiply with last prior and normalize (calculate the posterior)
4. Convolution with motion model (calculate the new prior)
5. Go to 1.

A minimal sketch of this loop is given below.
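The following end-to-end sketch runs the loop above on a 1D grid of positions; the grid, the depth map, the Gaussian noise level and the motion kernel are all assumptions for illustration, not part of the slides.

```python
import numpy as np

def bayes_filter(prior, measurements, true_depth, motion_kernel, sigma=0.5):
    """Discrete Bayes filter ("convolute and multiply") over a 1D state grid.

    prior         : p(x_0), one entry per grid position
    measurements  : sequence of depth measurements z_0, z_1, ...
    true_depth    : known map depth d(x) for every grid position
    motion_kernel : discrete motion model over displacements (assumed translation-invariant)
    Returns the posterior p(x_t | z_{0:t}) after each step.
    """
    posteriors = []
    belief = prior.copy()
    for z in measurements:
        # 1.-2. observe and compute the likelihood p(z_t | x_t) for every position
        lik = np.exp(-(z - true_depth) ** 2 / (2.0 * sigma ** 2))
        # 3. multiply with the last prior and normalize -> posterior
        belief = belief * lik
        belief /= belief.sum()
        posteriors.append(belief.copy())
        # 4. convolve with the motion model -> new prior
        belief = np.convolve(belief, motion_kernel, mode="same")
        belief /= belief.sum()
    return posteriors
```

For example, `bayes_filter(np.ones(100) / 100, [2.1, 2.0], depth_map, kernel)` would run two filter steps from a uniform prior over 100 grid cells (`depth_map` and `kernel` are hypothetical arrays of matching shape).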

(30)

Calculating the Posterior

17/02/2015 Computer Vision I: Tracking 30

p(x_0 | z_0) = p(x_0) p(z_0 | x_0) / Σ_x p(x_0 = x) p(z_0 | x_0 = x)

(Multiply and normalize; see proof 1)

[Figure: graphical model x_0, x_1, … with observations z_0, z_1, …; the arrow from x_0 to z_0 is the observation model]

(31)

Proof 1

17/02/2015 Computer Vision I: Tracking 31

(32)

Calculating the next Prior

17/02/2015 Computer Vision I: Tracking 32

p(x_1 | z_0) = Σ_x p(x_0 = x | z_0) p(x_1 | x_0 = x)

(Convolution; see proof 2)

[Figure: graphical model x_0, x_1, … with observations z_0, z_1, …; the prior p(x_0), the posterior p(x_0 | z_0) and the next prior p(x_1 | z_0) are indicated]

(33)

Proof 2

17/02/2015 Computer Vision I: Tracking 33

(34)

Calculating the Posterior

17/02/2015 Computer Vision I: Tracking 34

p(x_1 | z_{0:1}) = p(x_1 | z_0) p(z_1 | x_1) / Σ_x p(x_1 = x | z_0) p(z_1 | x_1 = x)

(Multiply and normalize)

[Figure: graphical model x_0, x_1 with observations z_0, z_1; the prior p(x_0), the posterior p(x_0 | z_0) and the prior p(x_1 | z_0) are indicated]

(35)

Calculating the Posterior (General Case)

17/02/2015 Computer Vision I: Tracking 35

p(x_t | z_{0:t}) = p(x_t | z_{0:t-1}) p(z_t | x_t) / Σ_x p(x_t = x | z_{0:t-1}) p(z_t | x_t = x)

(Multiply and normalize; see proof 3)

[Figure: graphical model …, x_{t-1}, x_t with observations …, z_{t-1}, z_t; p(x_t | z_{0:t-1}) is the current prior]

(36)

Proof 3

17/02/2015 Computer Vision I: Tracking 36

(37)

Calculating the next Prior (General Case)

17/02/2015 Computer Vision I: Tracking 37

p(x_{t+1} | z_{0:t}) = Σ_x p(x_t = x | z_{0:t}) p(x_{t+1} | x_t = x)

(Convolution; see proof 4)

[Figure: graphical model …, x_{t-1}, x_t with observations …, z_{t-1}, z_t; the prior p(x_t | z_{0:t-1}), the posterior p(x_t | z_{0:t}) and the next prior p(x_{t+1} | z_{0:t}) are indicated]

(38)

Proof 4

17/02/2015 Computer Vision I: Tracking 38

(39)

Overview Continuous Approaches

17/02/2015 Computer Vision I: Tracking 39

• How to apply it in continuous space?

• Two popular alternatives:

• Particle filter

Represent prior and posterior with samples

• Kalman Filter

Represent prior and posterior distributions as Gaussians

(40)

Important Distributions (Particle Filter)

17/02/2015 Computer Vision I: Tracking 40

• Observation Model p(z_t | x_t): likelihood of the observation given the state; continuous Gaussian around the real depth

• Motion Model p(x_{t+1} | x_t): probability of the new state given the old one; continuous Gaussian

• Posterior p(x_t | z_{0:t}): probability of the state given previous and current observations; continuous, represented as a set of samples (particles)

• Prior p(x_{t+1} | z_{0:t}): probability of the state given only previous observations; continuous, represented as a set of samples (particles)

(41)

Particle Filter

17/02/2015 Computer Vision I: Tracking 41

(42)

Discrete Bayes Filter vs. Particle Filter

17/02/2015 Computer Vision I: Tracking 42

• Discrete Bayes Filter:

1. Make observation
2. Calculate likelihood for every position
3. Multiply with last prior and normalize
4. Convolution with motion model
5. Go to 1.

• Particle Filter:

1. Make observation
2. Calculate likelihood for every sample -> weights
3. Resampling according to weights
4. Randomly move samples according to motion model (Sampling)
5. Go to 1.

(A minimal particle-filter step is sketched below.)
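The following is a minimal sketch of one particle-filter iteration for a 1D state, following the steps above; the Gaussian motion model, its standard deviation, and the `likelihood_fn` callback are assumptions for illustration.

```python
import numpy as np

def particle_filter_step(particles, z_t, likelihood_fn, motion_std=1.0, rng=None):
    """One particle-filter iteration for a 1D state (illustrative sketch).

    particles     : array of samples representing the current prior
    likelihood_fn : callback returning p(z_t | x_t) for an array of states
    motion_std    : standard deviation of an assumed Gaussian motion model
    """
    rng = rng or np.random.default_rng()
    # 2. likelihood for every sample -> weights
    weights = likelihood_fn(z_t, particles)
    weights /= weights.sum()
    # 3. resample according to the weights (samples from the posterior)
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    resampled = particles[idx]
    # 4. move each sample randomly according to the motion model (samples from the new prior)
    return resampled + rng.normal(0.0, motion_std, size=len(particles))
```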

(43)

The Bayes Filter / Convolute and Multiply

17/02/2015 Computer Vision I: Tracking 43

𝑡 = 0

• Represent continuous prior with particles x_t^1, …, x_t^n

• Make measurement: 𝑧𝑡

(44)

Particle Filter / Resampling

17/02/2015 Computer Vision I: Tracking 44

• Calculate weight w_t^i = p(z_t | x_t = x_t^i) for every particle x_t^i:

(45)

Particle Filter / Resampling

17/02/2015 Computer Vision I: Tracking 45

𝑡 = 0

• Draw samples x_t^1, …, x_t^n from the posterior by resampling from x_t^1, …, x_t^n using the weights w_t^i

• Reducing uncertainty

[Figure: particle weights w_t^i plotted per particle i]

(46)

Particle Filter / Resampling

17/02/2015 Computer Vision I: Tracking 46

𝑡 = 0

[Figure: particle weights w_t^i plotted per particle i, with the random resampling draws indicated]

• Draw samples x_t^1, …, x_t^n from the posterior by resampling from x_t^1, …, x_t^n using the weights w_t^i

• Reducing uncertainty

(47)

Particle Filter / Resampling

17/02/2015 Computer Vision I: Tracking 47

𝑡 = 0

[Figure: particle weights w_t^i plotted per particle i]

• Draw samples x_t^1, …, x_t^n from the posterior by resampling from x_t^1, …, x_t^n using the weights w_t^i

• Reducing uncertainty

(48)

Particle Filter / Resampling

17/02/2015 Computer Vision I: Tracking 48

Why is this allowed?

Resampling is like multiplication and normalization:

• The sample density of the posterior depends linearly on
  • the density of the prior samples x_t^1, …, x_t^n
  • the likelihood w_t^i

(A small numeric check is sketched below.)
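A small numeric check of this claim (all numbers are made up for illustration): resampling prior samples proportionally to their likelihood weights reproduces the multiply-and-normalize posterior up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(0)
states = np.arange(5)
prior = np.array([0.1, 0.2, 0.4, 0.2, 0.1])        # assumed discrete prior
likelihood = np.array([0.05, 0.1, 0.2, 0.9, 0.3])  # assumed p(z | x) per state

# Discrete Bayes filter: multiply and normalize
posterior = prior * likelihood
posterior /= posterior.sum()

# Particle filter: sample the prior, weight, resample
particles = rng.choice(states, size=100_000, p=prior)
weights = likelihood[particles]
resampled = rng.choice(particles, size=100_000, p=weights / weights.sum())
empirical = np.bincount(resampled, minlength=5) / resampled.size

print(np.round(posterior, 3), np.round(empirical, 3))  # the two should be close
```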

(49)

Discrete Bayes Filter vs. Particle Filter

17/02/2015 Computer Vision I: Tracking 49

• Discrete Bayes Filter:

1. Make observation
2. Calculate likelihood for every position
3. Multiply with last prior and normalize
4. Convolution with motion model
5. Go to 1.

• Particle Filter:

1. Make observation
2. Calculate likelihood for every sample -> weights
3. Resampling according to weights
4. Randomly move samples according to motion model (Sampling)
5. Go to 1.

(50)

Particle Filter / Sampling

17/02/2015 Computer Vision I: Tracking 50

𝑡 = 0

• Obtain samples x_{t+1}^1, …, x_{t+1}^n from the new prior by moving each particle according to the motion model

• Adding uncertainty

(51)

Particle Filter / Sampling

17/02/2015 Computer Vision I: Tracking 51

Why is this allowed?

• We want to sample B^i ∼ p(B)
• We don’t know p(B)
• We sample first: A^i ∼ p(A)
• And then: B^i ∼ p(B | A = A^i)

[Figure: example with p(A = 1) = p(A = 2) = 0.5 and the conditional distributions p(B | A = 1), p(B | A = 2) over the outcomes 1–6]

(52)

Particle Filter / Sampling

17/02/2015 Computer Vision I: Tracking 52

Why is this allowed?

• We want to sample x_{t+1}^i ∼ p(x_{t+1} | z_{0:t})
• We don’t know p(x_{t+1} | z_{0:t})

[Figure: HMM graphical model x_0, …, x_t with observations z_0, …, z_t]

(53)

Particle Filter / Sampling


17/02/2015 Computer Vision I: Tracking 53

Why is this allowed?

• We want to sample x_{t+1}^i ∼ p(x_{t+1} | z_{0:t})
• We don’t know p(x_{t+1} | z_{0:t})
• We use samples x_t^i ∼ p(x_t | z_{0:t})

[Figure: HMM graphical model x_0, …, x_t, x_{t+1} with observations z_0, …, z_t]

(54)

Particle Filter / Sampling

17/02/2015 Computer Vision I: Tracking 54

Why is this allowed?

• We want to sample x_{t+1}^i ∼ p(x_{t+1} | z_{0:t})
• We don’t know p(x_{t+1} | z_{0:t})
• We use samples x_t^i ∼ p(x_t | z_{0:t})
• We sample x_{t+1}^i ∼ p(x_{t+1} | x_t = x_t^i) = p(x_{t+1} | x_t = x_t^i, z_{0:t})

[Figure: HMM graphical model x_0, …, x_t, x_{t+1} with observations z_0, …, z_t]

(55)

Particle Filter (Tracking application)

• Tracking an object in a video sequence [Perez et al. 2002]

• States:

• 2D windows (x,y and size)

• Gaussian motion

• Observations:

• Color histograms

17/02/2015 Computer Vision I: Tracking 55

(56)

Particle Filter (Tracking application)

• Pose tracking of an object in a Kinect sequence [Krull et al. 2014]

• States:

• 6D Pose

3D position

3D rotation

• Observations:

• Depth images

• Predicted Object coordinates

17/02/2015 Computer Vision I: Tracking 56

(57)

Particle Filter (Summary)

• The Particle Filter implements the Bayes Filter

Prior and posterior are represented as sample sets (particles)

Likelihood is only evaluated at particles

• Multiplication -> weighted resampling

• Convolution -> random movement according to motion model

17/02/2015 Computer Vision I: Tracking 57

(58)

Overview Continuous Approaches

• How to apply it in continuous space?

• Two popular alternatives:

• Particle filter

Represent prior and posterior with samples

• Kalman Filter

Represent prior and posterior distributions as Gaussians

17/02/2015 Computer Vision I: Tracking 58

(59)

Kalman Filter

17/02/2015 Computer Vision I: Tracking 59

(60)

Important Distributions (Kalman Filter)

17/02/2015 Computer Vision I: Tracking 60

• Observation Model p(z_t | x_t): likelihood of the observation given the state; continuous Gaussian around the real position

• Motion Model p(x_{t+1} | x_t): probability of the new state given the old one; continuous Gaussian

• Posterior p(x_t | z_{0:t}): probability of the state given previous and current observations; continuous, represented as a Gaussian

• Prior p(x_{t+1} | z_{0:t}): probability of the state given only previous observations; continuous, represented as a Gaussian

(61)

Discrete Bayes Filter vs. Kalman Filter

17/02/2015 Computer Vision I: Tracking 61

• Discrete Bayes Filter:

1. Make observation
2. Calculate likelihood for every position
3. Multiply with last prior and normalize
4. Convolution with motion model
5. Go to 1.

• Kalman Filter:

1. Make observation
2. Calculate likelihood for every position in closed form
3. Multiply with last prior and normalize in closed form
4. Convolution with motion model in closed form
5. Go to 1.

(62)

Kalman Filter

17/02/2015 Computer Vision I: Tracking 62

• Calculate likelihood 𝑝(𝑧𝑡|𝑥𝑡) as Gaussian:

(63)

Kalman Filter

17/02/2015 Computer Vision I: Tracking 63

• Calculate the posterior by

Multiplying with prior

• Normalizing

• Reducing uncertainty

p(x_t | z_{0:t}) = p(x_t | z_{0:t-1}) p(z_t | x_t) / ∫ p(x_t = x | z_{0:t-1}) p(z_t | x_t = x) dx

• Closed form solution:

• Another Gaussian

(64)

Discrete Bayes Filter vs. Kalman Filter

17/02/2015 Computer Vision I: Tracking 64

• Discrete Bayes Filter:

1. Make observation
2. Calculate likelihood for every position
3. Multiply with last prior and normalize
4. Convolution with motion model
5. Go to 1.

• Kalman Filter:

1. Make observation
2. Calculate likelihood for every position in closed form
3. Multiply with last prior and normalize in closed form
4. Convolution with motion model in closed form
5. Go to 1.

(65)

Kalman Filter

17/02/2015 Computer Vision I: Tracking 65

• Calculate the prior by

Convolution with motion model

• Adding uncertainty

p(x_{t+1} | z_{0:t}) = ∫ p(x_t = x | z_{0:t}) p(x_{t+1} | x_t = x) dx

• Closed form solution:

• Another Gaussian
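A minimal scalar (1D) sketch of these two closed-form steps, assuming the state is observed directly with Gaussian noise and drifts by a constant (possibly zero) velocity; the noise variances are illustrative assumptions, not values from the slides.

```python
def kalman_1d_step(mu, var, z, motion_var=0.5, obs_var=1.0, velocity=0.0):
    """One Kalman-filter cycle for a scalar state (illustrative sketch).

    Posterior: multiplying the Gaussian prior N(mu, var) with the Gaussian
    likelihood N(z; x, obs_var) yields another Gaussian in closed form.
    New prior: convolving with the Gaussian motion model adds the variances.
    """
    # multiply and normalize (posterior), closed form
    k = var / (var + obs_var)          # Kalman gain
    mu_post = mu + k * (z - mu)
    var_post = (1.0 - k) * var
    # convolution with the motion model (new prior), closed form
    mu_prior = mu_post + velocity
    var_prior = var_post + motion_var
    return mu_post, var_post, mu_prior, var_prior
```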

(66)

Pros and Cons

17/02/2015 Computer Vision I: Tracking 66

Particle Filter:

• Observation model can be anything
• Motion model can be anything
• Multimodal
• Easy to implement and parallelize
• Likelihood calculation per particle can be expensive
• Problematic in high-dimensional state spaces (many particles required)

Kalman Filter:

• Observation model: linear transformation of state plus Gaussian noise
• Motion model: linear transformation of last state plus Gaussian noise
• Unimodal
• Closed form solution in every step

(67)

Pose Estimation vs. Pose Tracking

17/02/2015 Computer Vision I: Tracking 67

One Shot Pose Estimation [1]:

• Estimate 6D pose from a single RGB-D image
• Use Object Coordinate Regression

Pose Tracking:

• Stream of RGB-D images
• Use information from previous frames:
  • Realtime
  • Increased robustness and accuracy

[1] Brachmann, E., Krull, A., Michel, F., Shotton, J., Gumhold, S., Rother, C.: Learning 6D Object Pose Estimation Using 3D Object Coordinates, ECCV (2014)

(68)

Application Scenarios

17/02/2015 Computer Vision I: Tracking 68

• Augmented Reality
  • Alteration
  • Annotation
  • Substitution

• Robotics
  • Recognition/Tracking
  • Automatic Grasping

(69)

Object Coordinate Regression

17/02/2015 Computer Vision I: Tracking 69

Energy Optimization

Compare observed and rendered images

(70)

Energy Formulation

17/02/2015 Computer Vision I: Tracking (Part I) 70

• An arbitrary 6DoF pose hypothesis H_c is scored according to the channel energy:

E_c(H_c) = λ_depth E_c^depth(H_c) + λ_coord E_c^coord(H_c) + λ_obj E_c^obj(H_c)

where E_c^depth(H_c) compares the observed and rendered depth, and E_c^coord(H_c) and E_c^obj(H_c) compare against the forest prediction.

(A small sketch of this weighted sum follows below.)
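A minimal sketch of how such a weighted channel energy could be evaluated; the component energy functions and the weights λ are placeholders for illustration, not the actual implementation of [1].

```python
def channel_energy(H_c, e_depth, e_coord, e_obj,
                   lam_depth=1.0, lam_coord=1.0, lam_obj=1.0):
    """Score a 6DoF pose hypothesis H_c as a weighted sum of component energies."""
    return (lam_depth * e_depth(H_c)
            + lam_coord * e_coord(H_c)
            + lam_obj * e_obj(H_c))
```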

(71)

Efficient Search

17/02/2015 Computer Vision I: Tracking 71

Forest Prediction

Object Space Input

Camera Space Pose Hypothesis

(72)

How to Adapt it for Pose Tracking?

17/02/2015 Computer Vision I: Tracking 72

Ingredients:

• Particle Filter
• Object Coordinate Regression
• Efficient Search

• Observation Model: the energy
• Proposal Distribution: find a rough estimate using Efficient Search; concentrate samples around the rough estimate
• Motion Model: assume continuous motion

(73)

Filtering with Object Coordinates

17/02/2015 Computer Vision I: Tracking 73

Sampling from Proposal Distribution vs. Sampling from Motion Model

(74)

Filtering with Object Coordinates

17/02/2015 Computer Vision I: Tracking 74

• Find a rough estimate for the current pose (Efficient Search)

Sampling from Proposal Distribution vs. Sampling from Motion Model

(75)

Filtering with Object Coordinates

17/02/2015 Computer Vision I: Tracking 75

Sampling from Motion Model: p(x_{t+1} | x_t)

Sampling from Proposal Distribution: q(x_{t+1} | x_t)

(76)

Filtering with Object Coordinates

17/02/2015 Computer Vision I: Tracking 76

Sampling from Motion Model:
• Most samples have a very low weight
• w_t^i = p(z_t | x_t = x_t^i)

Sampling from Proposal Distribution:
• Few samples have a very low weight
• w_t^i = p(z_t | x_t = x_t^i) p(x_t = x_t^i | x_{t-1} = x_{t-1}^i) / q(x_t = x_t^i | x_{t-1} = x_{t-1}^i)

(A sketch of these importance weights is given below.)
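A sketch of how these importance weights could be computed for a set of particles; the density callbacks are assumed to be supplied by the caller and are not part of the slides.

```python
def importance_weights(particles, prev_particles, z_t, lik, motion_pdf, proposal_pdf):
    """Weights when sampling from a proposal q instead of the motion model p.

    w_t^i = p(z_t | x_t^i) * p(x_t^i | x_{t-1}^i) / q(x_t^i | x_{t-1}^i)
    """
    w = (lik(z_t, particles)
         * motion_pdf(particles, prev_particles)
         / proposal_pdf(particles, prev_particles))
    return w / w.sum()
```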

(77)

Filtering with Object Coordinates

17/02/2015 Computer Vision I: Tracking 77

Sampling from Motion Model:
• Most samples have a very low weight
• w_t^i = p(z_t | x_t = x_t^i)

Sampling from Proposal Distribution:
• Few samples have a very low weight
• w_t^i = p(z_t | x_t = x_t^i) p(x_{t+1} | x_t = x_t^i) / q(x_{t+1} | x_t = x_t^i)

(78)

Filtering with Object Coordinates

17/02/2015 Computer Vision I: Tracking 78

Sampling from the Proposal Distribution is more efficient:

• The number of particles can be reduced
• The frame rate can be increased

(79)

Evaluation: Choi and Christensen’s Dataset [2]

17/02/2015 Computer Vision I: Tracking 79

• A total of 4 synthetic sequences with 4 objects

• Objects placed in static environment

• Camera moving around object

• Partial occlusion

• Very exact ground truth

[2] Changhyun Choi, Henrik I. Christensen, RGB-D Object Tracking: A Particle Filter Approach on GPU, IROS, 2013

(80)

Evaluation: Our Dataset

17/02/2015 Computer Vision I: Tracking 80

• A total of 6 captured sequences with 3 objects

• Manually annotated ground truth

• Moving objects in front of dynamic background

• Fast erratic movement

• Strong occlusions

(81)

Evaluation: Our Dataset

17/02/2015 Computer Vision I: Tracking 81

• Compared to [1] applied to each frame separately

• Lower average error

• Almost no outliers

(82)

Conclusion

17/02/2015 Computer Vision I: Tracking 82

• We have adapted the system from [1] for real time pose tracking

• We have designed a proposal distribution making efficient use of object coordinates

• Our method is robust against:

Quick motion

Strong occlusion

Shadows and changing light conditions

Deformation

[1] Brachmann, E., Krull, A., Michel, F., Shotton, J., Gumhold, S., Rother, C.: Learning 6d Object Pose Estimation Using 3d Object Coordinates, ECCV '14 (2014)
