(1)

Machine Learning 2020

Volker Roth

Department of Mathematics & Computer Science University of Basel

17th February 2020

Volker Roth (University of Basel) Machine Learning 2020 17th February 2020 1 / 1

(2)

Place & Time

Tue, 10.15–12.00, Spiegelgasse 1, Seminarraum 00.003
Wed, 14.15–16.00, Alte Universität, Seminarraum -201

Exercises: Wed, 16.15-18.00 Spiegelgasse 5, Seminarraum 05.001.

How do I get my credits?

70% of the problem sets (exercises) must be “edited in a meaningful way”

Oral exam


(3)

Overview

Probability

Supervised Learning

I Generative models for discrete data

I Classification: classical linear methods & extensions

I Regression estimation: classical linear methods, regularization, sparsity & feature selection

I Bayesian model selection

I Neural networks & deep learning, interpretability of deep architectures

I Elements of statistical learning theory

I Support Vector Machines and kernel methods

I Probabilistic kernel models: Gaussian Processes

Unsupervised Learning

I Mixture models, mixtures of experts

I Linear latent models (FA, PCA, CCA)

I Nonlinear latent models (VAE, IB)
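As a tiny preview of the unsupervised topics above, here is a minimal PCA sketch (one of the linear latent models, FA/PCA/CCA, listed in the overview). The toy data and all names are illustrative, not from the lecture: 2D points are generated from a 1D latent variable, and PCA via the SVD recovers that most of the variance lies along a single direction.

```python
import numpy as np

# Hypothetical toy data: 100 points in 2D generated from a 1D latent
# variable z, plus a little noise (not from the lecture slides).
rng = np.random.default_rng(0)
z = rng.normal(size=(100, 1))                      # 1D latent variable
X = z @ np.array([[2.0, 1.0]]) + 0.1 * rng.normal(size=(100, 2))

# PCA: center the data, then take the SVD of the centered matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

pc1 = Vt[0]                                        # first principal direction
explained = S[0] ** 2 / np.sum(S ** 2)             # variance fraction of PC1

print(f"fraction of variance on PC1: {explained:.3f}")
```

Because the data were built from a single latent direction, the first principal component should capture nearly all of the variance; the later lectures treat PCA as the linear special case of the latent-variable models (FA, VAE) in this list.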


(4)

Textbooks

Kevin P. Murphy: Machine Learning. A Probabilistic Perspective. MIT Press, 2012.

Ian Goodfellow, Yoshua Bengio and Aaron Courville:

Deep Learning. MIT Press, 2016.

Bernhard Schölkopf and Alexander J. Smola: Learning with Kernels. Support Vector Machines, Regularization, Optimization, and Beyond. MIT Press, 2002.

