(1)

Computer Vision I - Tracking (Part I)

Carsten Rother

24/01/2015

(2)

What is Tracking?

• "Tracking an object in an image sequence means continuously identifying its location when either the object or the camera are moving" [Lepetit and Fua 2005]

• This can mean estimating in each frame:

• 2D location or window

• 6D rigid body transformation

• More complex parametric models: Active Appearance Models, a skeleton for human pose, etc.


(5)

Tracking vs Localization

• Tracking of objects is closely related to:

• camera pose estimation in a known environment

• localization of agents (e.g. robots) in a known environment

• Reminder: SLAM has unknown location (agent, camera) and unknown environment

(6)

Outline

• This lecture

• The Bayes Filter

Explained for localization

• Next lecture

• The Particle Filter

• The Kalman Filter

• Pros and Cons

• Beyond tracking and localization

• Case study:

6-DOF Model Based Tracking via Object Coordinate Regression

(7)

Probabilities - Reminder

A random variable is denoted x ∈ {0, …, K}

Discrete probability distribution: p(x) satisfies Σ_x p(x) = 1

Joint distribution of two random variables: p(x, z)

Conditional distribution: p(x | z)

Sum rule (marginal distribution): p(z) = Σ_x p(x, z)

Independence of two random variables: p(x, z) = p(x) p(z)

Product rule: p(x, z) = p(z | x) p(x)

Bayes' rule: p(x | z) = p(z | x) p(x) / p(z)
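As a sanity check, these rules can be verified numerically on a small made-up joint distribution (the 2×3 table below is hypothetical, not from the lecture):

```python
import numpy as np

# Hypothetical joint distribution p(x, z) over x in {0, 1}, z in {0, 1, 2}
p_xz = np.array([[0.10, 0.25, 0.15],
                 [0.20, 0.05, 0.25]])
assert np.isclose(p_xz.sum(), 1.0)

p_x = p_xz.sum(axis=1)                 # sum rule: p(x) = Σ_z p(x, z)
p_z = p_xz.sum(axis=0)                 # sum rule: p(z) = Σ_x p(x, z)
p_z_given_x = p_xz / p_x[:, None]      # product rule: p(z|x) = p(x, z) / p(x)

# Bayes' rule: p(x|z) = p(z|x) p(x) / p(z)
p_x_given_z = p_z_given_x * p_x[:, None] / p_z[None, :]

# Each column of p(x|z) is a normalized distribution over x
assert np.allclose(p_x_given_z.sum(axis=0), 1.0)
```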

(8)

The Bayes Filter / Convolute and Multiply

• We have:

• Probabilistic model for movement

• Probabilistic model for measurement

Based on map of the environment

• Where is the ship?

• Using all previous and current observations

(9)

The Hidden Markov Model

[Figure: graphical model with hidden states x_{t-1}, x_t along the time axis, each emitting an observation z_{t-1}, z_t]

Observations: depth measurements

States: positions

(10)

The Hidden Markov Model

Observation Model:

What is the likelihood of an observation, given a state?

p(z_t | x_t) = (1/c) exp(−(z_t − d(x_t))² / (2σ²))

d(x) is the known true depth; c is a normalization constant.
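A minimal sketch of this observation model, assuming a hypothetical discrete depth map d(x) over grid positions. The 1/c factor is omitted, since the constant cancels when the filter later normalizes the posterior:

```python
import numpy as np

def observation_likelihood(z_t, depth_map, sigma=0.5):
    """Unnormalized p(z_t | x_t) = exp(-(z_t - d(x_t))^2 / (2 sigma^2)),
    evaluated for every discrete state x_t at once.
    depth_map[x] holds d(x), the known true depth at position x."""
    return np.exp(-(z_t - depth_map) ** 2 / (2.0 * sigma ** 2))

# Hypothetical depth profile over 5 positions; measurement z_t = 2.1
depth_map = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
lik = observation_likelihood(2.1, depth_map)
# Positions whose true depth is close to 2.1 get the highest likelihood
assert lik.argmax() in (1, 3)
```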

(11)

The Hidden Markov Model

Motion Model:

Probability for a state transition: p(x_{t+1} | x_t)
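A shift-invariant discrete Gaussian is one common choice for this motion model; the sketch below (grid size and kernel width are made up for illustration) shows how propagating a belief through it amounts to a convolution:

```python
import numpy as np

# Discrete Gaussian motion kernel: p(x_{t+1} | x_t) depends only on the
# displacement x_{t+1} - x_t (assumed shift-invariant motion)
offsets = np.arange(-2, 3)
kernel = np.exp(-offsets ** 2 / (2 * 1.0 ** 2))
kernel /= kernel.sum()

# Propagating a belief one step = convolution with the motion kernel
belief = np.zeros(11)
belief[5] = 1.0                        # certain: ship at position 5
pred = np.convolve(belief, kernel, mode="same")
assert np.isclose(pred.sum(), 1.0)     # still a distribution
assert pred.argmax() == 5              # peak preserved, uncertainty added
```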

(12)

The Posterior distribution: p(x_t | z_{0:t})

Probability distribution for the state given all previous and current observations:

• This is what we are interested in

• E.g. use the maximum as the current estimate: x̂_t = argmax_{x_t} p(x_t | z_{0:t})

(13)

The Prior distribution: p(x_{t+1} | z_{0:t})

Probability distribution for the next state given only previous observations:

• Intermediate step for calculating the next posterior

(14)

Important Distributions

• Observation Model p(z_t | x_t): likelihood of an observation given a state (a continuous Gaussian around the real depth)

• Motion Model p(x_{t+1} | x_t): probability of the new state given the old one (a discrete Gaussian)

• Posterior p(x_t | z_{0:t}): probability of the state given previous and current observations

• Prior p(x_{t+1} | z_{0:t}): probability of the state given only previous observations

(15)

Step by Step

t = 0

• Assume a prior p(x_0) for the first frame

• Make the first measurement: z_0

(16)

Step by Step

𝑡 = 0

• Calculate likelihood 𝑝(𝑧0|𝑥0) for every possible state 𝑥0:

(17)

Step by Step

𝑡 = 0

• Calculate the posterior by multiplying with the prior and normalizing

• This reduces uncertainty

p(x_0 | z_0) = p(x_0) p(z_0 | x_0) / Σ_x p(x_0 = x) p(z_0 | x_0 = x)

(18)

The Bayes Filter / Convolute and Multiply

t = 0

• Calculate the prior by convolution with the motion model

• This adds uncertainty

p(x_1 | z_0) = Σ_x p(x_0 = x | z_0) p(x_1 | x_0 = x)

(19)

The Bayes Filter / Convolute and Multiply

𝑡 = 1

• Make new measurement: 𝑧1

(20)

The Bayes Filter / Convolute and Multiply

𝑡 = 1

• Calculate likelihood 𝑝(𝑧1|𝑥1) for every possible state 𝑥1:

(21)

The Bayes Filter / Convolute and Multiply

𝑡 = 1

• Calculate the posterior by multiplying with the prior and normalizing

• This reduces uncertainty

p(x_1 | z_{0:1}) = p(x_1 | z_0) p(z_1 | x_1) / Σ_x p(x_1 = x | z_0) p(z_1 | x_1 = x)

(22)

The Bayes Filter / Convolute and Multiply

𝑡 = 1

• Calculate the prior by convolution with the motion model

• This adds uncertainty

p(x_2 | z_{0:1}) = Σ_x p(x_1 = x | z_{0:1}) p(x_2 | x_1 = x)

(23)

The Bayes Filter / Convolute and Multiply

𝑡 = 2

• Make new measurement: 𝑧2

(24)

The Bayes Filter / Convolute and Multiply

𝑡 = 2

• Calculate likelihood 𝑝(𝑧2|𝑥2) for every possible state 𝑥2:

(25)

The Bayes Filter / Convolute and Multiply

𝑡 = 2

• Calculate the posterior by multiplying with the prior and normalizing

• This reduces uncertainty

p(x_2 | z_{0:2}) = p(x_2 | z_{0:1}) p(z_2 | x_2) / Σ_x p(x_2 = x | z_{0:1}) p(z_2 | x_2 = x)

(26)

The Bayes Filter / Convolute and Multiply

𝑡 = 2

(27)

The Bayes Filter / Convolute and Multiply

Algorithm:

1. Make an observation

2. Calculate the likelihood for every position

3. Multiply with the last prior and normalize (this gives the posterior)

4. Convolve with the motion model (this gives the new prior)

5. Go to 1.
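This convolute-and-multiply loop can be sketched on a 1-D discrete grid. The depth profile, measurements, and motion kernel below are made up for illustration, and `bayes_filter` is a hypothetical helper, not code from the lecture:

```python
import numpy as np

def gaussian_likelihood(z, depth_map, sigma):
    """Unnormalized p(z_t | x_t) for every grid position at once."""
    return np.exp(-(z - depth_map) ** 2 / (2 * sigma ** 2))

def bayes_filter(measurements, depth_map, motion_kernel, sigma=0.5):
    """1-D discrete Bayes filter: multiply the prior with the likelihood,
    normalize, then convolve with the motion model for the next prior."""
    n = len(depth_map)
    prior = np.full(n, 1.0 / n)          # uniform prior p(x_0)
    posteriors = []
    for z in measurements:               # 1. make an observation
        lik = gaussian_likelihood(z, depth_map, sigma)   # 2. likelihood
        post = prior * lik               # 3. multiply with last prior...
        post /= post.sum()               #    ...and normalize (posterior)
        posteriors.append(post)
        prior = np.convolve(post, motion_kernel, mode="same")  # 4. predict
        prior /= prior.sum()             #    renormalize after edge loss
    return posteriors

# Hypothetical depth profile; the ship drifts and measures depth each step
depth_map = np.array([1.0, 2.0, 3.0, 3.0, 1.0, 2.0, 2.0, 1.0])
kernel = np.array([0.25, 0.5, 0.25])     # mostly stay, maybe drift one cell
posts = bayes_filter([3.1, 3.0, 1.1], depth_map, kernel, sigma=0.3)
# After seeing depth ~3 twice then ~1, the belief peaks at position 4
assert posts[-1].argmax() == 4
```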

(28)

Calculating the Posterior

p(x_0 | z_0) = p(x_0) p(z_0 | x_0) / Σ_x p(x_0 = x) p(z_0 | x_0 = x)

(Multiply prior and observation model, then normalize; see proof 1)

(29)

Proof 1

(30)

Calculating the next Prior

𝑝(𝑥

0

) 𝑝 𝑥

0

𝑧

0

𝑝 𝑥

1

𝑧

0

...

=

𝑥

𝑝 𝑥0 = 𝑥 𝑧0 𝑝 𝑥1 𝑥0 = 𝑥

(Convolution)

Observation model

z

0

x

1

x

0

z

1

(see proof 2)

(31)

Proof 2

(32)

Calculating the Posterior

p(x_1 | z_{0:1}) = p(x_1 | z_0) p(z_1 | x_1) / Σ_x p(x_1 = x | z_0) p(z_1 | x_1 = x)

(Multiply prior and observation model, then normalize)

(33)

Calculating the Posterior (General Case)

p(x_t | z_{0:t}) = p(x_t | z_{0:t−1}) p(z_t | x_t) / Σ_x p(x_t = x | z_{0:t−1}) p(z_t | x_t = x)

(Multiply prior and observation model, then normalize)

(34)

Proof 3

(35)

Calculating the next Prior (General Case)

p(x_{t+1} | z_{0:t}) = Σ_x p(x_t = x | z_{0:t}) p(x_{t+1} | x_t = x)

(Convolution with the motion model)

(36)

Proof 4

(37)

Particle Filter

• How can the Bayes filter be applied in a continuous state space?

• Two popular alternatives:

• Particle filter

Represent distributions with samples

• Kalman Filter

Represent distributions as Gaussians
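The sample-based idea can be illustrated in miniature: approximate a belief by particles drawn from it, and read off quantities such as the mean directly from the sample set (the Gaussian "belief" below is made up for illustration):

```python
import random

random.seed(0)

# Represent a distribution by samples instead of a grid: draw N particles
# from a Gaussian belief, then estimate its mean as the particle average
# (the core particle-filter idea in miniature).
particles = [random.gauss(5.0, 1.0) for _ in range(100_000)]
estimate = sum(particles) / len(particles)
assert abs(estimate - 5.0) < 0.05
```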
