
# Parameter-efficient Federated Knowledge Graph Embedding Learning and Unlearning

This is the official code release of the following paper:

Xiangrong Zhu, Yuexiang Xie, Yang Liu, Yaliang Li, Wei Hu. Parameter-efficient Federated Knowledge Graph Embedding Learning and Unlearning, ISWC 2025.

Knowledge graph (KG) embedding aims to learn vector representations for entities and relations. To support distributed and secure training, federated KG embedding collaboratively learns an embedding model among multiple organizations without directly sharing raw data. A federated KG embedding framework should be capable of incorporating both learning and unlearning paradigms, due to the dynamic nature of KGs and the potential need for visibility adjustment by KG owners. However, there are still several challenges in developing a federated KG embedding framework, including overcoming the communication and computation overhead caused by the huge number of entities and controlling the scope of influence when unlearning. In this paper, we propose a novel parameter-efficient federated KG embedding learning and unlearning framework, named PFLU. Specifically, we incorporate an anchor-based federated KG embedding technique to reduce computation and communication overhead when transferring knowledge. We address the anchor selection problem by formulating it as a maximum coverage problem and designing a greedy strategy for its resolution. Besides, to achieve a desirable balance between propagation and retention of unlearning effects, we adopt a two-stage mechanism combining structural and semantic unlearning. Extensive experiments on widely-used datasets show the superior parameter efficiency of PFLU over several baselines in both federated KG embedding learning and unlearning. Furthermore, we provide empirical evidence and discussions to show the effectiveness of the proposed anchor selection strategy.


## Quick Start

### Data Preparation

Please refer to the `data` directory for instructions on preparing the datasets.

### Run PFLU

To run federated training with DistMult:

```bash
bash code/model/run_distmult.sh
```

To perform knowledge unlearning with DistMult:

```bash
bash code/model/run_unlearn_distmult.sh
```

Other embedding models are supported via their respective scripts in `code/model`.

If you have any difficulty or questions running the code or reproducing the experimental results, please email xrzhu.nju@gmail.com.

## Citation

If you find the repository helpful, please cite the following paper.

```bibtex
@inproceedings{PFLU,
  title = {Parameter-efficient Federated Knowledge Graph Embedding Learning and Unlearning},
  author = {Zhu, Xiangrong and
            Xie, Yuexiang and
            Liu, Yang and
            Li, Yaliang and
            Hu, Wei},
  booktitle = {ISWC},
  year = {2025}
}
```
