Parameter-Efficient Fine-Tuning for Continual Learning: A Neural Tangent Kernel Perspective


Jingren Liu, Zhong Ji, Yunlong Yu, Jiale Cao, Yanwei Pang, Jungong Han, Xuelong Li


🔥 News

  • [2026-03-08] Code and scripts released.
  • [2026-03-08] Reproducibility guide updated.

🧩 Overview

This repository provides the official implementation for:

Parameter-Efficient Fine-Tuning for Continual Learning: A Neural Tangent Kernel Perspective.

We study PEFT-CL via Neural Tangent Kernel (NTK) theory and introduce NTK-CL, a framework that:

  • avoids task-specific parameter storage,
  • adaptively generates task-relevant features,
  • reduces task-interplay and task-specific generalization gaps,
  • achieves strong performance on established PEFT-CL benchmarks.
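The NTK perspective above can be illustrated with a toy example. The empirical NTK of a model f(x; w) is the inner product of its parameter gradients, K(x, x') = ⟨∇_w f(x), ∇_w f(x')⟩. The sketch below (not the repository's code, just a minimal stdlib illustration) computes this for a linear model f(x; w) = w · x, whose parameter gradient is x itself, so the kernel reduces to x · x':

```python
# Toy empirical Neural Tangent Kernel (NTK) for a linear model
# f(x; w) = sum_i w_i * x_i. Purely illustrative; the actual NTK-CL
# implementation lives in models/ and trainer.py.

def grad_f(w, x):
    """Gradient of f(x; w) = sum_i w_i * x_i with respect to w (it is x)."""
    return list(x)

def empirical_ntk(w, x1, x2):
    """K(x1, x2) = <grad_w f(x1), grad_w f(x2)>."""
    g1, g2 = grad_f(w, x1), grad_f(w, x2)
    return sum(a * b for a, b in zip(g1, g2))

w  = [0.5, -1.0, 2.0]
x1 = [1.0, 2.0, 3.0]
x2 = [0.0, 1.0, -1.0]
print(empirical_ntk(w, x1, x2))  # x1 . x2 = 0 + 2 - 3 = -1.0
```

For deep networks the gradient is no longer the input itself, but the same kernel structure governs how updates on one task perturb predictions on another, which is what the task-interplay gap measures.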

✅ Key Features

  • Training / evaluation scripts for PEFT-CL benchmarks
  • NTK-CL implementation
  • Dataset preparation scripts

🗂️ Repository Structure

.
├── data/                       # datasets / prepared data
├── dataloader/                 # dataset & dataloader implementations
├── models/                     # model definitions / PEFT modules / backbones
├── trainer.py                  # training loop / continual learning pipeline
├── requirements.txt            # python dependencies for reproducing the environment
└── main.py                     # main entry for training/evaluation

🧰 Installation

1) Create environment

conda create -n ntk_cl python=3.11 -y
conda activate ntk_cl

2) Install dependencies

pip install -r requirements.txt

📦 Datasets

To facilitate reproduction, we provide downloadable dataset packages/links for the benchmarks used in this project, including: CIFAR-100, ImageNet-R, ImageNet-A, DomainNet, Oxford Pets, EuroSAT, PlantVillage, VTAB, and Kvasir.

Download

You can download the datasets from the following link:

  • Baidu Netdisk: https://pan.baidu.com/s/1PwDAytZY5sKyUInecc5dkg?pwd=arrd

Please download the required datasets and place them under the data/ directory following the structure expected by the code.

Directory Structure

A recommended dataset layout is:

data/
├── cifar-100-python/
├── imagenet-r/
├── imagenet-a/
├── DomainNet/
├── OxfordPets/
├── EuroSAT/
├── PlantVillage/
├── VTAB/
└── Kvasir/
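Before training, it can save time to verify the layout matches what the code expects. The following sanity check is a hypothetical helper (not part of the repository) that reports which of the folders listed above are missing under data/:

```python
# Hypothetical sanity check: list expected dataset folders absent under data/.
from pathlib import Path

EXPECTED = [
    "cifar-100-python", "imagenet-r", "imagenet-a", "DomainNet",
    "OxfordPets", "EuroSAT", "PlantVillage", "VTAB", "Kvasir",
]

def missing_datasets(root="data"):
    """Return the expected dataset folder names not present under `root`."""
    base = Path(root)
    return [name for name in EXPECTED if not (base / name).is_dir()]

if __name__ == "__main__":
    missing = missing_datasets()
    if missing:
        print("Missing under data/:", ", ".join(missing))
    else:
        print("All expected dataset folders found.")
```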

🚀 Quick Start

Train and Evaluate

Example (replace the arguments with your actual argparse options; see main.py for the full list):

python main.py \
  --dataset cifar224_in21k \
  --suffix cifar224_in21k \
  --model_name both \
  --seed [0, 1, 2, 3, 4] \
  --device 0
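If main.py accepts a single seed per invocation (an assumption; check its argparse definition), the multi-seed run above can be expressed as one command per seed. `seed_commands` below is a hypothetical helper, not part of the repository:

```python
# Hypothetical helper: build one main.py command line per seed,
# assuming --seed takes a single integer per run.
def seed_commands(seeds, dataset="cifar224_in21k", device=0):
    """Return a list of argv-style command lines, one per seed."""
    return [
        ["python", "main.py",
         "--dataset", dataset,
         "--suffix", dataset,
         "--model_name", "both",
         "--seed", str(seed),
         "--device", str(device)]
        for seed in seeds
    ]

for cmd in seed_commands([0, 1, 2, 3, 4]):
    print(" ".join(cmd))  # dry run; pass cmd to subprocess.run() to execute
```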

📈 Results

The main experimental results of NTK-CL are reported in the paper.


🧾 Citation

If you find this repository useful, please cite our paper:

@article{liu2024parameter,
  title={Parameter-efficient fine-tuning for continual learning: A neural tangent kernel perspective},
  author={Liu, Jingren and Ji, Zhong and Yu, Yunlong and Cao, Jiale and Pang, Yanwei and Han, Jungong and Li, Xuelong},
  journal={arXiv preprint arXiv:2407.17120},
  year={2024}
}


📬 Contact

For questions and discussions:


About

Official repository for Parameter-Efficient Fine-Tuning for Continual Learning: A Neural Tangent Kernel Perspective (TPAMI 2026).
