DT8122 - Probabilistic Artificial Intelligence
About
Examination arrangement
Examination arrangement: Approved report
Grade: Passed / Not Passed
Evaluation | Weighting | Duration | Grade deviation | Examination aids
---|---|---|---|---
Approved report | 100/100 | | |
Course content
DT8122 is a summer school course and, currently, only participants of the Nordic Probabilistic AI School (ProbAI) can register for the course.
The course is organized as a series of lectures followed by hands-on tutorials, complemented by talks covering research and application areas related to the main topics. We will (tentatively) cover the following topics:

1. Probabilistic models, variational inference and probabilistic programming

Introduction to probabilistic modelling:
- Bayesian modelling: prior, likelihood and posterior
- Concepts of Bayesian networks and latent-variable models
- Posterior inference and parameter learning
- Modelling techniques

Variational inference:
- Mean-field, CAVI and conjugate models
- Stochastic variational inference and optimization
- Black-box variational inference
- Automatic Differentiation Variational Inference

Probabilistic programming:
- Introduction to the concept of probabilistic programming
- Language syntax and semantics
- Inference mechanisms

2. Deep Generative Models

Introduction to deep learning:
- Examples of models
- Stochastic optimization and backpropagation

Variational Auto-Encoders

Bayesian Neural Networks

Combining classical neural networks and probabilistic models
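As a flavour of the first topic block, the interplay of prior, likelihood and posterior can be shown with the simplest conjugate model. The sketch below is an illustration, not part of the course materials: a Beta-Bernoulli model, where the Beta prior is conjugate to the Bernoulli likelihood so the posterior has a closed form.

```python
# Illustration (not course material): exact Bayesian updating in a
# Beta-Bernoulli model. With a Beta(alpha, beta) prior on the coin
# bias and Bernoulli observations, conjugacy gives the posterior
# Beta(alpha + #heads, beta + #tails) in closed form.

def beta_bernoulli_posterior(alpha, beta, data):
    """Return the posterior Beta parameters after observing 0/1 data."""
    heads = sum(data)
    tails = len(data) - heads
    return alpha + heads, beta + tails

# Uniform prior Beta(1, 1); observe 7 heads and 3 tails.
a, b = beta_bernoulli_posterior(1.0, 1.0, [1] * 7 + [0] * 3)
post_mean = a / (a + b)  # posterior mean of the coin bias
print(a, b, post_mean)   # 8.0 4.0 0.666...
```

Models without such conjugate structure are exactly where the variational inference and probabilistic programming tools covered later come in.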
Learning outcome
The main outcome of the course is learning the principles of probabilistic models and deep generative models in machine learning and artificial intelligence, and acquiring skills for using existing tools that implement those principles (probabilistic programming languages). Knowledge: The student will learn the theory of probabilistic modelling, variational inference, probabilistic programming and deep generative models. Skills: Designing models, performing inference, and programming with probabilistic models and deep generative models for a range of problems. General competence: Approaching machine learning and artificial intelligence problems and tasks from the principles of probabilistic modelling.
Learning methods and activities
All teaching is done at the annual Probabilistic AI Summer School (https://probabilistic.ai). The only way to take the course is to participate in the summer school in the same year you take the course.
Further on evaluation
The evaluation consists of a written report and programming code from each individual student. The student receives a problem statement with a task and should use the methods and models studied during the summer school to work on the problem. The problem will be a data analysis or machine learning task. The report will be evaluated and graded as passed or failed.
Recommended previous knowledge
We expect the student to have a good knowledge of linear algebra, probability theory (random variables, expectation, distributions), statistics (a basic understanding of inference, e.g. maximum likelihood), and a general idea of neural networks (definition and backpropagation). Chapters 1 and 2 of Christopher Bishop's book Pattern Recognition and Machine Learning roughly cover the necessary basics. This material will be reviewed, but the course curriculum goes much deeper, so we expect students to have at least been exposed to these topics in courses or individual study.
Course materials
The curriculum will be given during the summer school. The level to expect will be roughly as follows:
Books:
- Christopher M. Bishop. Pattern Recognition and Machine Learning. Chapter 10 (63 pages).
- Kevin P. Murphy. Machine Learning: A Probabilistic Perspective. Chapters 9, 10 and 21 (92 pages).
- Goodfellow et al. Deep Learning. Part III, Deep Learning Research (Ch. 13-20) (235 pages).

Papers:
- Ranganath, R., Gerrish, S., & Blei, D. Black box variational inference. In Artificial Intelligence and Statistics. 2014.
- Kingma, D., & Welling, M. Auto-encoding variational Bayes. In International Conference on Learning Representations. 2014.
- Cheng Zhang et al. Advances in Variational Inference. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2019.
- Ruslan Salakhutdinov. Learning Deep Generative Models. Annual Review of Statistics and Its Application, 2015.
- I. Goodfellow et al. Generative adversarial nets. Advances in Neural Information Processing Systems. 2014.
- Bingham et al. Pyro: Deep Universal Probabilistic Programming. Journal of Machine Learning Research. 2018.

Other practical online resources:
- Examples of probabilistic modelling with Pyro: https://pyro.ai/examples/
- Tutorials, getting-started guides and examples for deep learning with PyTorch: https://pytorch.org/tutorials/
Version: 1
Credits: 7.5 SP
Study level: Doctoral degree level
Term no.: 1
Teaching semester: AUTUMN 2024
Language of instruction: English
Location: Trondheim
- Computer and Information Science
Department with academic responsibility
Department of Computer Science
Examination
Examination arrangement: Approved report
Term | Status code | Evaluation | Weighting | Examination aids | Date | Time | Examination system | Room *
---|---|---|---|---|---|---|---|---
Autumn | ORD | Approved report | 100/100 | | Submission 2024-09-16 | 14:00 | INSPERA |
Spring | ORD | Approved report | 100/100 | | | | INSPERA |
* The location (room) for a written examination is published 3 days before the examination date. If more than one room is listed, you will find your room at Studentweb.
For more information regarding registration for examination and examination procedures, see "Innsida - Exams"