This course is based on the Machine Learning book by Univ.-Prof. Dr. Elmar Rueckert.
It is written for experienced undergraduates or for first-semester graduate students.
This lecture with integrated exercises provides the basic knowledge needed to apply modern machine learning methods. It includes an introduction to the fundamentals of data modelling and probability theory. Classical probabilistic linear and non-linear regression methods are derived and discussed using practical examples.
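To give a flavour of these practical examples, here is a minimal line-fitting sketch (not one of the course notebooks; the data and parameter values are invented for illustration). With Gaussian noise, the maximum-likelihood fit of a line reduces to least squares:

```python
# Minimal sketch: maximum-likelihood fit of a line y = w0 + w1*x to noisy data.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.5 + 2.0 * x + rng.normal(scale=0.1, size=x.shape)  # synthetic noisy observations

Phi = np.column_stack([np.ones_like(x), x])              # design matrix [1, x]
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)              # ML weights = least-squares solution
sigma2 = np.mean((y - Phi @ w) ** 2)                     # ML estimate of the noise variance

print("weights:", w, "noise variance:", sigma2)
```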
Links and Resources
Location & Time
Lecture
- Location: HS 1 Studienzentrum
- Dates: Wednesdays 13:15-15:00 Summer semester
Exercise
- Location: HS 2 Studienzentrum
- Dates: Fridays 13:15-15:00 Summer semester
Slides
- 06.03.2024 Introduction & Organisation + Python crash course using our Jupyter Hub & Jupyter Notebooks
- 13.03.2024 Machine Learning Fundamentals I + Python Perceptron Example
- 20.03.2024 Machine Learning Fundamentals II
- 10.04.2024 Linear Algebra for ML + Jupyter NB Line Fitting Example + Jupyter NB Perceptron Iterative Update Example
- 17.04.2024 Probability Theory
- 24.04.2024 KL divergence & Linear Feature Regression I
- 08.05.2024 Bayesian Feature Regression
- 15.05.2024 Gaussian Process Regression + GPy Example + Jupyter NB GP sklearn Example + Online Tutorial on GPs with interactive animations
- 22.05.2024 Probabilistic Time Series Models, Video Recording
- 29.05.2024 Bayesian Optimization
- 05.06.2024 Optional Date
- 14.06.2024 IML Lab Feedback Discussion, Exam Preparation & Q&A
- 19.06.2024 Written Exam
- 26.06.2024 Exam Results & Best Practices & Feedback Discussion
Course Topics
- Introduction to Machine Learning (Data and modelling fundamentals).
- Introduction to Probability Theory (Statistics refresher, Bayes' theorem, common probability distributions, Gaussian calculus).
- Linear Probabilistic Regression (Linear models, maximum likelihood, Bayesian & logistic regression).
- Nonlinear Probabilistic Regression (Radial basis function networks, Gaussian processes, recent research results in robotic movement primitives, hierarchical Bayesian & mixture models); a short Gaussian process example follows this list.
- Probabilistic Inference for Time Series (Time series data, basis function models, learning).
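The Gaussian process example referenced above is a minimal sketch in the spirit of the GP sklearn notebook from the slide list; the toy data, kernel choice, and parameter values are assumptions for illustration only:

```python
# Minimal sketch (not the course notebook): Gaussian Process regression with scikit-learn.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 5.0, size=(20, 1))                  # training inputs
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=20)   # noisy training targets

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel)
gp.fit(X, y)                                             # kernel hyperparameters are optimized here

X_test = np.linspace(0.0, 5.0, 100).reshape(-1, 1)
mean, std = gp.predict(X_test, return_std=True)          # posterior mean and uncertainty
print(mean[:3], std[:3])
```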
Learning objectives / qualifications
- Students gain a comprehensive understanding of basic probability theory concepts and methods.
- Students learn to analyze the challenges in a task and to identify promising machine learning approaches.
- Students understand the difference between deterministic and probabilistic algorithms and can identify the underlying assumptions and requirements.
- Students understand and can apply advanced regression, inference and optimization techniques to real world problems.
- Students know how to analyze a model's results, improve its parameters, and interpret its predictions and their relevance.
- Students understand how the basic concepts are used in current state-of-the-art research in robot movement primitive learning and in neural planning.
Grading
The course will be graded based on a written exam (100 points). At least 50% of the questions must be answered correctly to pass. The exam will take place in the classroom or online, depending on the current university regulations.
In addition, up to 10 bonus points from regular quiz sessions in the classroom and 20% of the points achieved in the Machine Learning Lab will be added to your exam result. Note that bonus points can only be obtained by attending the lectures in person.
Grading scheme: 0-49.9 Pts (5), 50-65.9 Pts (4), 66-79 Pts (3), 80-91 Pts (2), 92-100 Pts (1).
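For orientation only, a minimal sketch of how the points combine under this scheme (the handling of totals falling between the listed bands and the cap at 100 points are assumptions, not official rules):

```python
# Sketch of the grading arithmetic: exam points + up to 10 quiz bonus points
# + 20% of the ML Lab points, mapped to the grading scheme above.
def final_grade(exam_points, quiz_bonus=0.0, lab_points=0.0):
    total = exam_points + min(quiz_bonus, 10.0) + 0.2 * lab_points
    total = min(total, 100.0)      # assumption: the result is capped at 100 points
    if total >= 92: return 1
    if total >= 80: return 2
    if total >= 66: return 3
    if total >= 50: return 4
    return 5                       # below 50 points: negative grade

print(final_grade(62, quiz_bonus=8, lab_points=50))   # 62 + 8 + 10 = 80 -> grade 2
```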
Forthcoming exam dates are:
- XX.06.2025 at 13:15 HS 1 Studienzentrum
- XX.10.2025 at 13:15 – 14:45 (location not fixed)
- More dates upon request via email to cps@unileoben.ac.at (send your request one month in advance of the desired exam date).
Literature
- The Probabilistic Machine Learning book by Univ.-Prof. Dr. Elmar Rueckert.
- James-A. Goulet. Probabilistic Machine Learning for Civil Engineers. ISBN 978-0-262-53870-1.
- Daphne Koller, Nir Friedman. Probabilistic Graphical Models: Principles and Techniques. ISBN 978-0-262-01319-2.
- Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer (2006). ISBN 978-0-387-31073-2.
- David Barber. Bayesian Reasoning and Machine Learning, Cambridge University Press (2012). ISBN 978-0-521-51814-7.
- Kevin P. Murphy. Machine Learning: A Probabilistic Perspective. ISBN 978-0-262-01802-9.
Note that all books are available at our library or at the chair of CPS.