Machine Learning I
Introduction
Prerequisites: Math
One should at least be able to guess what the following mean.
Examples, in particular: linear algebra (vectors, matrices, SVD, scalar products), a bit of geometry, functions (derivatives, gradients, integrals, series), optimization, probability theory …
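As an illustrative sketch of the expected level (not from the slides; matrices and values below are made up), one should be comfortable with snippets like the following NumPy check of an SVD, a scalar product, and a numerical gradient:

```python
import numpy as np

# A small matrix and its singular value decomposition: A = U @ diag(s) @ Vt
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)
assert np.allclose(A, U @ np.diag(s) @ Vt)  # reconstruction holds

# Scalar product of two vectors
x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
dot = x @ y  # 1*3 + 2*(-1) = 1

# Gradient of f(v) = v^T v is 2v; verify with central finite differences
f = lambda v: v @ v
eps = 1e-6
grad_num = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                     for e in np.eye(2)])
assert np.allclose(grad_num, 2 * x, atol=1e-5)
```

If each line here is at least guessable, the math prerequisites should be sufficient.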
Topics
1. Probability theory: probabilistic inference and learning (3 DS)
2. Discriminative learning (1 DS)
3. Linear classifiers, complex classifiers by combination, basic algorithms, learning (2 DS)
4. Support Vector Machines: large margin learning, complex classifiers by generalization, kernels, a bit of statistical learning theory, empirical risk minimization (3 DS)
5. Decision trees, regression trees, randomized forests (1-2 DS)
6. Introduction to graphical models, MRFs (1 DS)
Seminars
• 2 groups. Please partition yourselves
• Practical assignments (no computers, on the board) supplementing the lectures
• Assignments are posted on the page a couple of days in advance
• Homework!!!
• Credits: active participation is assessed (points during the semester); optionally a written test
Exam: oral (graded); with seminars: 4 SWS, without: 2 SWS
Miscellaneous
• Scripts, info, etc.:
http://www.inf.tu-dresden.de/index.php?node_id=2092&ln=de
• Literature:
• Christopher M. Bishop: "Pattern Recognition and Machine Learning" (practically all the material)
• Michail I. Schlesinger, Václav Hlaváč: "Ten Lectures on Statistical and Structural Pattern Recognition" (especially statistical PR)
• During the semester: papers (see www1.inf...)
• Forum:
https://auditorium.inf.tu-dresden.de/courses/2154651
• Comments, requests, questions, and criticism are welcome (anonymously via the mail form as well).