This short course provides a rigorous overview of the state of the art in generative modeling, progressing from foundational adversarial techniques to modern diffusion and flow-based paradigms.
Designed for senior undergraduate and graduate students, the curriculum balances theoretical derivation (SDEs, ODEs, Flow Matching) with practical architectural implementation (Diffusion Transformers, LoRA, ControlNet). The course concludes with an exploration of frontier applications in engineering sciences and the ethical implications of synthetic media.
Taxonomy of Generative Models (Implicit vs. Explicit), The Manifold Hypothesis, Adversarial Learning Dynamics (Min-Max, Nash Equilibrium), WGAN, and the StyleGAN Paradigm (Disentanglement, AdaIN).
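The min-max objective above has a closed-form optimum for the discriminator: for a fixed generator, D*(x) = p_data(x) / (p_data(x) + p_g(x)), and at the Nash equilibrium (p_g = p_data) the value function collapses to -log 4. A minimal pure-Python sketch over two discrete toy distributions (the distributions and symbol alphabet are illustrative assumptions, not from the course materials):

```python
import math

def value_function(p_data, p_g, D):
    # V(D, G) = E_{x~p_data}[log D(x)] + E_{x~p_g}[log(1 - D(x))]
    return (sum(p * math.log(D[x]) for x, p in p_data.items())
            + sum(p * math.log(1.0 - D[x]) for x, p in p_g.items()))

def optimal_discriminator(p_data, p_g):
    # For a fixed generator, D*(x) = p_data(x) / (p_data(x) + p_g(x))
    xs = set(p_data) | set(p_g)
    return {x: p_data.get(x, 0.0) / (p_data.get(x, 0.0) + p_g.get(x, 0.0))
            for x in xs}

# Two toy discrete distributions over four symbols (illustrative only).
p_data = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
p_g    = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

# Mismatched generator: value exceeds -log 4 (by 2 * JSD(p_data || p_g)).
v = value_function(p_data, p_g, optimal_discriminator(p_data, p_g))
print(round(v, 4))

# At equilibrium (p_g == p_data), D* = 1/2 everywhere and V = -log 4.
v_eq = value_function(p_data, p_data, optimal_discriminator(p_data, p_data))
print(round(v_eq, 4), round(-math.log(4), 4))
```

This numeric check also shows why the original objective saturates: once p_g matches p_data, the optimal discriminator is uninformative (D* = 1/2), which is the gradient-signal problem WGAN's critic formulation addresses.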
Non-Equilibrium Thermodynamics, Langevin Dynamics, Forward Process (Markovian Degradation), Reverse Process (Score-Function Learning), DDPM vs. DDIM Sampling, and SDEs.
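The Markovian forward process admits a closed form: q(x_t | x_0) = N(sqrt(ᾱ_t) x_0, (1 - ᾱ_t) I), where ᾱ_t is the cumulative product of (1 - β_s). A pure-Python sketch in one dimension, assuming the linear β schedule from the DDPM paper (the endpoint values 1e-4 and 0.02 are that paper's choice, used here for illustration):

```python
import math
import random

random.seed(0)

T = 1000
# Linear beta schedule (endpoints assumed per DDPM: 1e-4 to 0.02).
betas = [1e-4 + (0.02 - 1e-4) * t / (T - 1) for t in range(T)]

# alpha_bar_t = prod_{s <= t} (1 - beta_s)
alpha_bars = []
prod = 1.0
for b in betas:
    prod *= (1.0 - b)
    alpha_bars.append(prod)

def q_sample(x0, t):
    """Closed-form forward jump: x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * eps."""
    abar = alpha_bars[t]
    eps = random.gauss(0.0, 1.0)
    return math.sqrt(abar) * x0 + math.sqrt(1.0 - abar) * eps

x0 = 3.0
print(q_sample(x0, 10))                 # early step: still close to x0
print(q_sample(x0, T - 1))              # final step: nearly pure noise
print(alpha_bars[0], alpha_bars[-1])    # signal coefficient decays toward 0
```

Because ᾱ_t is monotonically decreasing, the signal coefficient sqrt(ᾱ_t) shrinks toward zero while the noise coefficient grows toward one, which is exactly the "degradation" the reverse-process score model learns to undo.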
Perceptual Compression (VQ-GAN/VAE), The U-Net Inductive Bias, Cross-Attention Conditioning, and mathematical derivation of Classifier-Free Guidance (CFG).
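The outcome of the CFG derivation is a one-line combination rule applied at every sampling step: the guided noise prediction is ε̂ = ε_uncond + w · (ε_cond − ε_uncond). A minimal sketch on plain Python lists (the particular ε values and the guidance scale 7.5 are illustrative assumptions):

```python
def cfg_combine(eps_uncond, eps_cond, w):
    """Classifier-free guidance: eps_hat = eps_u + w * (eps_c - eps_u).

    w = 0 recovers the unconditional prediction, w = 1 the conditional one;
    w > 1 extrapolates past the conditional prediction, trading sample
    diversity for prompt fidelity.
    """
    return [eu + w * (ec - eu) for eu, ec in zip(eps_uncond, eps_cond)]

# Toy 3-component noise predictions (illustrative values).
eps_u = [0.1, -0.2, 0.3]
eps_c = [0.5,  0.0, 0.1]

print(cfg_combine(eps_u, eps_c, 0.0))  # unconditional
print(cfg_combine(eps_u, eps_c, 1.0))  # conditional
print(cfg_combine(eps_u, eps_c, 7.5))  # strong guidance (scale assumed)
```

In practice both predictions come from the same network, batched as one forward pass with and without the conditioning embedding (the null condition being a learned "empty" token).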
Part A: Parameter-Efficient Fine-Tuning (LoRA), Structural Conditioning (ControlNet), and Zero-Shot Adapters (IP-Adapter). Part B: Math Preliminaries of Flow Matching (ODEs, Vector Fields, and Probability Paths).
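The parameter-efficiency argument behind LoRA is concrete: instead of updating a frozen d×d weight W, one trains two low-rank factors so the effective weight is W' = W + (α/r) · B·A, with B initialized to zero so training starts from the base model. A pure-Python sketch with toy dimensions (d = 64, r = 4, α = 8 are illustrative choices, not values fixed by the method):

```python
import random

random.seed(0)

def matmul(A, B):
    # Plain Python matrix multiply: (n x k) @ (k x m) -> (n x m)
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

d, r, alpha = 64, 4, 8  # hidden size, LoRA rank, scaling (illustrative)

# Frozen base weight W (d x d); trainable factors A (r x d) and B (d x r).
W = [[random.gauss(0.0, 0.02) for _ in range(d)] for _ in range(d)]
A = [[random.gauss(0.0, 0.02) for _ in range(d)] for _ in range(r)]
B = [[0.0] * r for _ in range(d)]  # zero init => delta starts at exactly 0

# Effective weight: W' = W + (alpha / r) * (B @ A)
scale = alpha / r
delta = matmul(B, A)
W_eff = [[W[i][j] + scale * delta[i][j] for j in range(d)] for i in range(d)]

full_params = d * d          # a full fine-tune updates every entry of W
lora_params = d * r + r * d  # LoRA trains only A and B
print(full_params, lora_params)  # 4096 vs 512 trainable parameters
```

At rank 4 this is an 8× reduction in trainable parameters for a single layer; at the ranks and hidden sizes used in large diffusion backbones the savings are several orders of magnitude, which is what makes per-concept adapters cheap to train and distribute.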
Part A: Rectified Flows, Conditional Flow Matching, and Reflow procedures. Part B: Scaling Laws, Diffusion Transformers (DiT), Patchification, and Adaptive Layer Norm (AdaLN). Case Studies: Flux, SD3.
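Rectified flows make the flow-matching construction especially transparent: the probability path is the straight line x_t = (1 − t)·x_0 + t·x_1, and the conditional regression target for the velocity field is simply v = x_1 − x_0. A one-dimensional pure-Python sketch, Euler-integrating the oracle conditional velocity (the bimodal toy target distribution and the use of the oracle in place of a trained network are illustrative assumptions):

```python
import random

random.seed(0)

def sample_pair():
    # Source x0 ~ N(0, 1) (noise); target x1 from a toy bimodal distribution.
    x0 = random.gauss(0.0, 1.0)
    x1 = random.choice([-2.0, 2.0]) + random.gauss(0.0, 0.1)
    return x0, x1

def interpolate(x0, x1, t):
    # Rectified-flow (linear) probability path: x_t = (1 - t) * x0 + t * x1
    return (1.0 - t) * x0 + t * x1

def target_velocity(x0, x1):
    # Conditional flow-matching regression target along the straight path.
    return x1 - x0

# Euler-integrate the velocity field from t = 0 to t = 1.
x0, x1 = sample_pair()
steps = 10
x, dt = x0, 1.0 / 10
for _ in range(steps):
    v = target_velocity(x0, x1)  # a trained model would predict v from (x, t)
    x += v * dt

print(round(x, 6), round(x1, 6))  # straight path: Euler lands on x1
```

Because the path is a straight line, the oracle velocity is constant in t and Euler integration is exact regardless of step count; this is the intuition behind Reflow, which straightens a learned flow so that few-step (even one-step) sampling stays accurate.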
Engineering: Biomedical (MRI/CT), Materials (Inverse Design), Civil (Topology Optimization/Digital Twins). Ethics: Cross-Cultural Performance, Fairness & Bias Metrics, Safety & Provenance (Glaze, SynthID, C2PA).