Minimal PyTorch unconditional DDPM project (MNIST default) with:
- U-Net denoiser + timestep embedding
- Linear beta schedule
- Epsilon-prediction MSE training
- Ancestral sampling script
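The linear beta schedule and the closed-form forward (noising) process can be sketched as below. This is a minimal illustration, not the project's actual code; the function names (`linear_beta_schedule`, `q_sample`) and the defaults (`T=1000`, `beta_start=1e-4`, `beta_end=0.02`, the values from the original DDPM paper) are assumptions.

```python
import torch

def linear_beta_schedule(T: int = 1000,
                         beta_start: float = 1e-4,
                         beta_end: float = 0.02) -> torch.Tensor:
    """Linearly spaced noise variances beta_1..beta_T (DDPM-paper defaults)."""
    return torch.linspace(beta_start, beta_end, T)

def q_sample(x0: torch.Tensor, t: torch.Tensor,
             alpha_bar: torch.Tensor, noise: torch.Tensor) -> torch.Tensor:
    """Closed-form forward process:
    x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * eps."""
    ab = alpha_bar[t].view(-1, 1, 1, 1)  # broadcast per-sample abar_t over CHW
    return ab.sqrt() * x0 + (1.0 - ab).sqrt() * noise

betas = linear_beta_schedule()
alphas = 1.0 - betas
alpha_bar = torch.cumprod(alphas, dim=0)  # abar_t = prod_{s<=t} alpha_s
```

Because `q(x_t | x_0)` has this closed form, training can noise images at arbitrary timesteps without simulating the chain step by step.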
Setup and training:

```bash
cd unconditional-ddpm
python -m pip install -r requirements.txt
python train.py --outdir ./runs --epochs 20 --batch_size 128
```

Outputs:
- Checkpoints: `./runs/checkpoints/`
- Sample grids: `./runs/samples/`
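The epsilon-prediction MSE objective that `train.py` optimizes can be sketched as a single loss function. This is a hedged sketch, not the repository's code: the helper name `ddpm_loss` and the assumed model signature `model(x_t, t)` (noisy batch plus integer timesteps) are mine.

```python
import torch
import torch.nn.functional as F

def ddpm_loss(model, x0: torch.Tensor, alpha_bar: torch.Tensor) -> torch.Tensor:
    """Simple DDPM objective: sample t and eps uniformly, noise x0
    with the closed-form forward process, and regress eps with MSE."""
    b = x0.shape[0]
    T = alpha_bar.shape[0]
    t = torch.randint(0, T, (b,), device=x0.device)   # one timestep per sample
    eps = torch.randn_like(x0)                        # target noise
    ab = alpha_bar[t].view(-1, 1, 1, 1)
    x_t = ab.sqrt() * x0 + (1.0 - ab).sqrt() * eps    # noised input
    return F.mse_loss(model(x_t, t), eps)             # eps-prediction MSE
```

A training step is then just `loss = ddpm_loss(model, batch, alpha_bar); loss.backward(); opt.step()`.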
Sampling:

```bash
python sample.py \
  --checkpoint ./runs/checkpoints/model_epoch_20.pt \
  --out ./runs/final_samples.png \
  --num_samples 64
```

Notes:
- This is unconditional: class labels are not used.
- Inputs are normalized to `[-1, 1]`.
- The default dataset is MNIST resized to `32x32`.
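The ancestral sampling loop behind `sample.py` can be sketched as follows, assuming the same `model(x_t, t)` signature as in training; the function name `ancestral_sample` and the sigma choice `sigma_t = sqrt(beta_t)` (one of the two variants in the DDPM paper) are assumptions, not necessarily what the script uses.

```python
import torch

@torch.no_grad()
def ancestral_sample(model, shape, betas: torch.Tensor) -> torch.Tensor:
    """Run the reverse chain x_T -> x_0 starting from pure Gaussian noise."""
    alphas = 1.0 - betas
    alpha_bar = torch.cumprod(alphas, dim=0)
    x = torch.randn(shape)                            # x_T ~ N(0, I)
    for i in reversed(range(len(betas))):
        t = torch.full((shape[0],), i, dtype=torch.long)
        eps = model(x, t)                             # predicted noise
        # Posterior mean: (x_t - beta_t / sqrt(1 - abar_t) * eps) / sqrt(alpha_t)
        mean = (x - betas[i] / (1.0 - alpha_bar[i]).sqrt() * eps) / alphas[i].sqrt()
        if i > 0:
            x = mean + betas[i].sqrt() * torch.randn_like(x)  # add sigma_t * z
        else:
            x = mean                                  # no noise at the final step
    return x
```

Samples come out in the `[-1, 1]` range used for training inputs, so they should be rescaled (e.g. `(x + 1) / 2`) before saving as an image grid.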