
PACE: marrying generalization in PArameter-efficient fine-tuning with Consistency rEgularization (NeurIPS 2024 Spotlight)

Yao Ni, Shan Zhang, Piotr Koniusz

Links: Paper (arXiv) | Slides | Video (YouTube)

Overview



Below are the key insights from our work:

💡 Lower gradient norms improve model generalization.

💡 Consistency regularization across different perturbations reduces gradient norms, improving generalization.

💡 Consistency regularization on adapter features aligns fine-tuned models with pre-trained ones, preserving knowledge (see the sketch after this list).
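
A minimal PyTorch-style sketch of the idea behind these insights, assuming a model whose adapter path is stochastic in training mode (e.g., dropout or multiplicative noise on adapter features). The names here (pace_style_loss, lambda_consistency) are illustrative placeholders, not the repository's API.

import torch.nn.functional as F

def pace_style_loss(model, x, y, lambda_consistency=0.1):
    # Two stochastic forward passes: each sees an independent perturbation
    # (e.g., a different dropout / multiplicative-noise mask on adapter features).
    logits_a = model(x)
    logits_b = model(x)

    # Standard task loss on one branch.
    task_loss = F.cross_entropy(logits_a, y)

    # Consistency term: penalize disagreement between the two perturbed outputs,
    # which implicitly lowers gradient norms and keeps the fine-tuned features
    # close to the pre-trained ones.
    consistency = F.mse_loss(logits_a, logits_b)

    return task_loss + lambda_consistency * consistency

In practice, the consistency weight (lambda_consistency here) is a hyperparameter; see the paper and the released PACE-Vision code for the exact formulation and settings used in our experiments.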


Code for PACE-Vision is released.

Code for PACE-NLP is being reorganized.

We will add PACE to Hugging Face soon.

Citation

If you find the theory or code helpful for your work, please cite our paper:

@inproceedings{
    ni2024pace,
    title={{PACE}: marrying the generalization of {PA}rameter-efficient fine-tuning with Consistency rEgularization},
    author={Yao Ni and Shan Zhang and Piotr Koniusz},
    booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
    year={2024},
    url={https://openreview.net/forum?id=cOuLbPhOT1}
}
