Teaching & Example Course
Basic Concepts of Deep Learning

AI-readiness is an important aspect of the next funding phase of the Helmholtz centres. This workshop provides a practical introduction to machine and deep learning for researchers from diverse backgrounds, focussing on principles, hands-on PyTorch exercises, and workflows relevant to environmental science.

I currently offer courses through my main affiliation, where you can contact me. I am a certified Carpentries instructor and have been teaching for several years, usually in workshop-style courses.

Taught Courses and Workshops

Basic Concepts of Deep Learning

This course introduces the fundamentals of machine and deep learning, practical PyTorch usage, and hands-on exercises for training and evaluating neural networks. It is designed for environmental researchers who already have basic Python skills and want to make their workflows AI-ready. Note: this is a course concept that we will first test in 2026.

Format: Online or hybrid — full-week workshop with slides, instructor notebooks and Jupyter exercises.


Introduction to Deep Learning
Learn the core ideas of learning theory and when deep learning is an appropriate tool, including some statistical properties of learners.
Topics: basics of learning theory; estimation and regression; discriminative learning.

PyTorch Basics
Gain practical experience with PyTorch: perform basic tensor operations, get to know its environment, and build a simple learning setup.
Topics: PyTorch tensors; tensor operations; loss functions; perceptron learning.
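A minimal sketch of the kind of exercise this module builds towards: a single perceptron trained on the logical AND function using only tensor operations and the classic perceptron update rule. The task and all variable names are illustrative, not taken from the actual course material.

```python
import torch

torch.manual_seed(0)

# Four input pairs and their AND labels
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([0., 0., 0., 1.])

w = torch.zeros(2)   # weights
b = torch.zeros(1)   # bias
lr = 0.1             # learning rate

# Perceptron learning rule: nudge weights by the prediction error
for _ in range(20):
    for xi, yi in zip(X, y):
        pred = ((xi @ w + b) > 0).float()
        err = yi - pred
        w += lr * err * xi
        b += lr * err

preds = ((X @ w + b) > 0).float()
print(preds.tolist())  # [0.0, 0.0, 0.0, 1.0]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating weight vector.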

Deep Learning Basics
Understand differences between supervised and unsupervised learning, what tasks are learnable, and how to prepare datasets for effective model training.
Topics: supervised learning; learnability; unsupervised learning; data preparation for DL.

PyTorch for Backpropagation
Implement and train a multi-layer perceptron, learn how the training loop works, and how to inspect training progress.
Topics: MLP building; training basics; PyTorch learning mechanisms; data pipelines.
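A sketch of the full training loop this module covers: a small MLP fit to the XOR problem (which a single perceptron cannot solve), with the canonical zero-grad / forward / backward / step cycle. Hyperparameters and architecture are illustrative choices, not the course's reference solution.

```python
import torch
from torch import nn

torch.manual_seed(0)

# XOR: not linearly separable, so a hidden layer is needed
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

model = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

losses = []
for epoch in range(500):
    optimizer.zero_grad()          # clear gradients from the last step
    loss = loss_fn(model(X), y)    # forward pass and loss
    loss.backward()                # backpropagation
    optimizer.step()               # parameter update
    losses.append(loss.item())

print(f"first loss {losses[0]:.3f}, last loss {losses[-1]:.4f}")
```

Recording `losses` per epoch is the simplest way to inspect training, e.g. to spot a diverging learning rate.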

Deep Learning Architectures
Survey common deep neural network architectures, including what they were designed for and how they function.
Topics: DL research overview; convolutional neural networks; latent and memory models; attention and transformers.
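As a taste of the convolutional part of this module, here is a minimal CNN for 28x28 single-channel images (e.g. MNIST-sized inputs) with the per-layer shapes annotated; the layer sizes are an illustrative choice.

```python
import torch
from torch import nn

# Tiny CNN: two conv/pool stages followed by a linear classifier
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                              # -> 16x14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x14x14
    nn.ReLU(),
    nn.MaxPool2d(2),                              # -> 32x7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                    # 10 class logits
)

logits = model(torch.randn(4, 1, 28, 28))  # batch of 4 images
print(logits.shape)  # torch.Size([4, 10])
```

Passing a dummy batch through the model like this is a quick sanity check that all layer shapes line up.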

PyTorch Applied Learning
Apply pretrained backbones and train common architectures on real datasets, learning transfer-learning workflows and dataset-specific training strategies.
Topics: building common architectures; pretrained models and checkpoints; ImageNet workflows; custom data sources and training strategies.

Bonus
Preparing Data for AI
Learn how to create reliable training datasets through cleaning, annotation and augmentation so models generalise well to real-world data.
Topics: data cleaning; annotation strategies; augmentation techniques.
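One augmentation from this module, sketched by hand rather than via a library so the mechanics are visible: a random horizontal flip applied per image to a batch. The helper name and probability are illustrative.

```python
import torch

torch.manual_seed(0)

def random_hflip(batch: torch.Tensor, p: float = 0.5) -> torch.Tensor:
    """Mirror each image in an (N, C, H, W) batch left-right with probability p."""
    flip = torch.rand(batch.shape[0]) < p   # one coin flip per image
    out = batch.clone()
    out[flip] = torch.flip(out[flip], dims=[-1])  # reverse the width axis
    return out

# Two tiny 1-channel "images" with distinguishable pixel values
images = torch.arange(2 * 1 * 2 * 3, dtype=torch.float32).reshape(2, 1, 2, 3)
augmented = random_hflip(images)
print(augmented.shape)  # unchanged shape, some images mirrored
```

Applied fresh every epoch, such transforms let the model see varied views of the same data, which typically improves generalisation.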

Bonus
Model-based Training
Understand hybrid approaches that combine physical models with data-driven learning to leverage prior knowledge and improve generalisation.
Topics: physics-informed approaches; hybrid modeling; integration strategies.

Bonus
Synthetic Data Generation
Explore methods to generate synthetic datasets and when to use them to mitigate data scarcity and improve model robustness.
Topics: simulation pipelines; domain randomization; render-to-data strategies; Synavis Framework.

Bonus
Generative AI
Understand the capabilities of generative models and how they can augment data and workflows for more robust training.
Topics: generative models; GANs and diffusion models; generative data augmentation.

Bonus
PyTorch-based libraries
Explore modern libraries in the deep learning ecosystem: training frameworks, meta-analysis tools, and common architecture providers.
Topics: library ecosystem; meta-analysis; pytorch-lightning; torchvision; MONAI.

Bonus
Uncertainty & Bayesian Learning
Learn basic techniques for quantifying model uncertainty and how Bayesian approaches can lead to more reliable, probabilistic predictions.
Topics: uncertainty quantification; Bayesian neural networks; probabilistic predictions.
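One of the simplest techniques from this module, sketched here with illustrative sizes: Monte Carlo dropout. Keeping dropout active at prediction time (via `model.train()`, whereas `model.eval()` would disable it) and averaging repeated stochastic forward passes yields a mean prediction plus a spread that serves as a rough uncertainty estimate.

```python
import torch
from torch import nn

torch.manual_seed(0)

model = nn.Sequential(
    nn.Linear(1, 32), nn.ReLU(),
    nn.Dropout(p=0.5),           # source of stochasticity at test time
    nn.Linear(32, 1),
)
model.train()  # keep dropout active; eval() would make passes deterministic

x = torch.tensor([[0.5]])
# 100 stochastic forward passes for the same input
samples = torch.stack([model(x) for _ in range(100)])
mean, std = samples.mean(), samples.std()
print(f"prediction {mean:.3f} +/- {std:.3f}")
```

The standard deviation across passes is not a calibrated posterior, but it is a cheap first step towards the Bayesian treatments this module discusses.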