This repository contains my coursework for the Generative Models course conducted by Professor Abdollah Safari in Spring 2025.
All models are trained on the MNIST dataset, a benchmark image dataset for generative modeling.
- Autoregressive Model – NADE
  - Implemented the Neural Autoregressive Distribution Estimator (NADE).
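As a concrete illustration of the autoregressive factorization NADE uses, here is a minimal NumPy sketch (not the coursework code) of its likelihood recurrence; the parameter names `W`, `V`, `b`, `c` follow the original NADE paper, and the toy sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nade_log_likelihood(x, W, V, b, c):
    """Log-likelihood of a binary vector x under a NADE model.

    Uses NADE's O(DH) recurrence: the hidden pre-activation for
    dimension d is a running sum a_d = c + sum_{k<d} W[:, k] * x[k].
    """
    D = x.shape[0]
    a = c.copy()                            # pre-activation for dimension 0
    ll = 0.0
    for d in range(D):
        h = sigmoid(a)                      # hidden state given x_{<d}
        p = sigmoid(b[d] + V[d] @ h)        # p(x_d = 1 | x_{<d})
        ll += x[d] * np.log(p) + (1 - x[d]) * np.log(1 - p)
        a += W[:, d] * x[d]                 # extend the running sum
    return ll

# Toy check: random parameters, a 4-"pixel" image, 8 hidden units.
D, H = 4, 8
W = 0.1 * rng.standard_normal((H, D))
V = 0.1 * rng.standard_normal((D, H))
b = np.zeros(D)
c = np.zeros(H)

# Probabilities over all 2^D binary vectors must sum to 1.
total = sum(np.exp(nade_log_likelihood(np.array(x, dtype=float), W, V, b, c))
            for x in np.ndindex(*([2] * D)))
print(round(total, 6))  # → 1.0
```

The final check is what makes NADE a proper density model: any valid autoregressive factorization normalizes exactly.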
- Variational Autoencoders (VAE Family)
  - Implemented VAE and β-VAE.
  - Built a Conditional VAE (CVAE).
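To make the VAE/β-VAE distinction concrete, here is a minimal NumPy sketch of the β-weighted negative ELBO (Bernoulli reconstruction plus the closed-form Gaussian KL); this is an illustrative loss computation, not the repository's training code:

```python
import numpy as np

def gaussian_kl(mu, logvar):
    """KL( N(mu, diag(exp(logvar))) || N(0, I) ), per sample, closed form."""
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=-1)

def beta_vae_loss(x, x_recon, mu, logvar, beta=1.0):
    """Negative ELBO with the KL term scaled by beta (beta = 1 is a plain VAE).

    Bernoulli reconstruction term, as is standard for binarized MNIST pixels.
    """
    eps = 1e-7  # numerical guard for log(0)
    recon = -np.sum(x * np.log(x_recon + eps)
                    + (1 - x) * np.log(1 - x_recon + eps), axis=-1)
    return recon + beta * gaussian_kl(mu, logvar)

# Sanity check on a toy batch: 2 "images" of 3 pixels, 2 latent dims.
# With mu = 0 and logvar = 0 the KL is zero, so only reconstruction remains.
x = np.array([[1., 0., 1.], [0., 1., 0.]])
x_recon = np.full_like(x, 0.5)
mu = np.zeros((2, 2))
logvar = np.zeros((2, 2))
loss = beta_vae_loss(x, x_recon, mu, logvar, beta=4.0)
print(loss)  # each entry ≈ 3 * log(2)
```

Raising `beta` above 1 penalizes latent capacity more heavily, which is the whole of the β-VAE modification.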
- Model Evaluation
  - Evaluated generative models using Inception Score (IS), Kernel Inception Distance (KID), and Fréchet Inception Distance (FID).
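FID is the Fréchet distance between two Gaussians fitted to Inception features of real and generated images. The distance itself can be sketched in a few lines of NumPy (illustrative only; real FID first extracts the means and covariances from an Inception network). The matrix square root is computed via eigendecomposition, using the identity tr((Σ₁Σ₂)^½) = tr((Σ₂^½ Σ₁ Σ₂^½)^½) to stay on symmetric matrices:

```python
import numpy as np

def _sqrtm_psd(A):
    """Matrix square root of a symmetric PSD matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(A)
    vals = np.clip(vals, 0.0, None)   # guard tiny negative eigenvalues
    return (vecs * np.sqrt(vals)) @ vecs.T

def frechet_distance(mu1, sigma1, mu2, sigma2):
    """Fréchet distance between two Gaussians, the formula behind FID."""
    s2_half = _sqrtm_psd(sigma2)
    covmean = _sqrtm_psd(s2_half @ sigma1 @ s2_half)  # symmetric stand-in for (Σ1Σ2)^½
    diff = mu1 - mu2
    return diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean)

# Identical Gaussians give distance 0; a shifted mean adds its squared norm.
mu, sigma = np.zeros(3), np.eye(3)
print(round(frechet_distance(mu, sigma, mu, sigma), 6))        # → 0.0
print(round(frechet_distance(mu, sigma, mu + 2.0, sigma), 6))  # → 12.0
```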
- GANs, EBMs, and Normalizing Flows
  - Implemented Generative Adversarial Networks (GANs).
  - Implemented Energy-Based Models (EBMs).
  - Implemented Normalizing Flow models (NICE).
  - Compared results with the previous models.
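NICE builds an invertible map out of additive coupling layers whose Jacobian determinant is 1, so the change-of-variables likelihood is tractable. A minimal NumPy sketch of one such layer and its exact inverse (the coupling function `m` is an arbitrary tanh-of-linear placeholder here, not the coursework architecture):

```python
import numpy as np

rng = np.random.default_rng(1)

def coupling_forward(x, weights):
    """Additive coupling layer from NICE: split x, shift one half by m(other)."""
    x1, x2 = x[: len(x) // 2], x[len(x) // 2:]
    y2 = x2 + np.tanh(weights @ x1)   # m() can be any function of x1
    return np.concatenate([x1, y2])

def coupling_inverse(y, weights):
    """Exact inverse: subtract the same shift. Jacobian determinant is 1."""
    y1, y2 = y[: len(y) // 2], y[len(y) // 2:]
    x2 = y2 - np.tanh(weights @ y1)
    return np.concatenate([y1, x2])

x = rng.standard_normal(6)
weights = rng.standard_normal((3, 3))
y = coupling_forward(x, weights)
print(np.allclose(coupling_inverse(y, weights), x))  # → True
```

Stacking such layers (alternating which half is shifted) gives an expressive yet exactly invertible model, which is what distinguishes flows from GANs and EBMs in the comparison above.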
- Diffusion Models
  - Implemented Denoising Diffusion Probabilistic Models (DDPM) and DDIM.
  - Explored distillation techniques to accelerate diffusion sampling.
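For reference, the closed-form DDPM forward process and a deterministic DDIM update can be sketched in NumPy. The β schedule below uses the DDPM paper's default linear range; the "oracle" noise predictor stands in for a trained network, which is what lets a single large DDIM jump land back near x₀ (the same property that few-step and distilled samplers exploit):

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear beta schedule with the DDPM paper's default range.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)

def q_sample(x0, t, eps):
    """Closed-form forward process: x_t = sqrt(abar_t) x0 + sqrt(1 - abar_t) eps."""
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

def ddim_step(xt, eps_pred, t, t_prev):
    """Deterministic DDIM update (eta = 0): jump from step t to t_prev."""
    x0_pred = (xt - np.sqrt(1.0 - alpha_bar[t]) * eps_pred) / np.sqrt(alpha_bar[t])
    return (np.sqrt(alpha_bar[t_prev]) * x0_pred
            + np.sqrt(1.0 - alpha_bar[t_prev]) * eps_pred)

# With an oracle noise predictor, one big DDIM jump inverts the forward process.
x0 = rng.standard_normal(4)
eps = rng.standard_normal(4)
xt = q_sample(x0, 999, eps)
x_back = ddim_step(xt, eps, 999, 0)
print(np.allclose(x_back, q_sample(x0, 0, eps)))  # → True
```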
The final project focused on a literature-based study of generative models.
We selected a paper from NeurIPS/ICML 2024 that was reproducible and computationally feasible, then performed forward and backward citation analysis to trace its origins and evaluate its academic impact.
The full report (PDF) is available in the Project/ folder.