This short course provides a rigorous overview of the current state-of-the-art in generative modeling, transitioning from foundational adversarial techniques to modern diffusion and flow-based paradigms.
Designed for senior undergraduate and graduate students, the curriculum balances theoretical derivation (SDEs, ODEs, Flow Matching) with practical architectural implementation (Diffusion Transformers, LoRA, ControlNet). The course concludes with an exploration of frontier applications in engineering sciences and the ethical implications of synthetic media.
Introduction and Motivation: why study generative models?, Taxonomy of Generative Models (Implicit vs. Explicit), Likelihood Maximization.
Adversarial Learning Dynamics (GAN Min-Max objective), WGAN, and the StyleGAN Paradigm (Disentanglement, AdaIN), GAN Applications.
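For reference, the min-max objective covered in this lecture, in the standard notation (G the generator, D the discriminator, p_z the latent prior):

```latex
% GAN min-max objective (Goodfellow et al., 2014)
\min_G \max_D \;
  \mathbb{E}_{x \sim p_{\text{data}}}\!\left[\log D(x)\right]
  + \mathbb{E}_{z \sim p_z}\!\left[\log\big(1 - D(G(z))\big)\right]
```

WGAN replaces the log-likelihood terms with a Wasserstein-1 critic objective to stabilize these dynamics.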
GANs Summary, Auto-encoders, Diffusion Models, Mathematical derivation of DDPM objective.
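The derivation in this lecture arrives at the simplified noise-prediction objective, reproduced here as a compact reference (standard notation; \(\epsilon_\theta\) is the learned denoiser, \(\bar\alpha_t\) the cumulative noise schedule):

```latex
% Simplified DDPM training objective (Ho et al., 2020)
L_{\text{simple}} = \mathbb{E}_{t,\, x_0,\, \epsilon \sim \mathcal{N}(0, I)}
  \left[ \left\| \epsilon - \epsilon_\theta\!\left(
    \sqrt{\bar\alpha_t}\, x_0 + \sqrt{1 - \bar\alpha_t}\, \epsilon,\; t
  \right) \right\|^2 \right]
```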
DDPM Implementation, Conditional Generation with Diffusion Models, Latent Diffusion Models, Faster Inference and Distillation.
Math Preliminaries of Flow Matching (ODEs, Vector Fields, and Probability Paths), Diffusion Transformers (DiT).
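As a pointer to the material covered, the conditional flow matching loss and the sampling ODE it trains, in the notation standard in the literature (\(v_\theta\) the learned vector field, \(u_t(x \mid x_1)\) the conditional target field along the probability path \(p_t\)):

```latex
% Conditional Flow Matching loss (Lipman et al., 2023)
\mathcal{L}_{\text{CFM}}(\theta) =
  \mathbb{E}_{t,\, x_1,\, x \sim p_t(x \mid x_1)}
  \left\| v_\theta(t, x) - u_t(x \mid x_1) \right\|^2

% Samples are drawn by integrating the learned ODE from the prior:
\frac{dx}{dt} = v_\theta(t, x), \qquad x(0) \sim p_0
```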
Supplementary Lecture covering the applications of Diffusion Models.
Supplementary Lecture covering the theory of Diffusion Models.
Discussion on cross-cultural performance gaps, fairness metrics, bias mitigation, and safety in generative AI.