We propose a deep-learning-based model and a well-organized dataset for context-aware paper citation recommendation. Our model comprises a document encoder and a context encoder, built on Graph Convolutional Network (GCN) layers and Bidirectional Encoder Representations from Transformers (BERT), a pretrained language model for textual data. By processing the related PeerRead and AAN datasets, we propose new datasets, FullTextPeerRead and FullTextAAN, which pair the context sentences of cited references with paper metadata.
The code is based on the reference BERT and GCN implementations.
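To make the two-encoder design concrete, here is a minimal NumPy sketch of how a GCN layer over the citation graph can be combined with a context vector to score candidate papers. This is an illustration only, not the authors' implementation: the shapes, the random stand-in for the BERT context embedding, and the dot-product scorer are all assumptions.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])                      # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy citation graph: 4 papers, symmetric adjacency, 8-dim node features
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
H = np.random.rand(4, 8)
W = np.random.rand(8, 16)
paper_emb = gcn_layer(A, H, W)        # (4, 16) graph-side paper embeddings

# In the full model the context side comes from BERT; a random 768-dim
# vector stands in for the encoded citation context here.
context_emb = np.random.rand(768)

# Hypothetical scorer: project the context into the paper-embedding space
# and take a dot product with each candidate paper.
proj = np.random.rand(768, 16)
scores = paper_emb @ (context_emb @ proj)   # one score per candidate paper
print(scores.shape)
```

The real model learns the projection and combines both encoders inside a classifier; this sketch only shows how the graph view and the context view meet.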
A Context-Aware Citation Recommendation Model with BERT and Graph Convolutional Networks
- Full Context PeerRead: created by processing allenai-PeerRead
- Full Context AAN: created by processing arXiv Vanity (not released due to copyright)
The data comes in two variants, AAN and PeerRead; both use identical columns.
| Header | Description |
|---|---|
| target_id | citing paper id |
| source_id | cited paper id |
| left_citated_text | text to the left of the citation tag in the citing sentence |
| right_citated_text | text to the right of the citation tag in the citing sentence |
| target_year | publication year of the citing (target) paper |
| source_year | publication year of the cited (source) paper |
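A short pandas sketch of working with this schema: the rows below are toy data in the columns listed above, and the `[CITATION]` placeholder tag is an assumption for illustration, not necessarily the marker used in the released files.

```python
import io
import pandas as pd

# Toy data in the FullText* schema (values are invented for illustration)
csv = io.StringIO(
    "target_id,source_id,left_citated_text,right_citated_text,target_year,source_year\n"
    'P1,P9,"Following the attention mechanism of "," we adopt ...",2018,2015\n'
)
df = pd.read_csv(csv)

# Rebuild the full citation context around a placeholder citation tag
df["context"] = df["left_citated_text"] + "[CITATION]" + df["right_citated_text"]
print(df.loc[0, "context"])
```

Reassembling `left_citated_text` and `right_citated_text` like this yields the input sentence a context encoder would consume.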
The main script to train the BERT and BERT-GCN models:
`python run_classifier.py [options]`
General Parameters:
- `--model` (required): which model `run_classifier.py` trains; possible values: `bert` or `bert_gcn`
- `--dataset` (required): which dataset to train on; possible values: `AAN` or `PeerRead`
- `--frequency` (required): frequency threshold used when parsing the datasets
- `--max_seq_length`: length of the cited context text to use
- `--gpu`: which GPU to run on
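Putting the required flags together, a training run might look like the following; the flag values are illustrative, not recommended defaults.

```shell
# Train the combined BERT-GCN model on the PeerRead variant
# (frequency and sequence-length values are examples only)
python run_classifier.py \
  --model bert_gcn \
  --dataset PeerRead \
  --frequency 5 \
  --max_seq_length 128 \
  --gpu 0
```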
BERT Parameters (see the original BERT documentation for details):
`--do_train`, `--do_predict`, `--data_dir`, `--vocab_file`, `--bert_config_file`, `--init_checkpoint`, ...
To use the BERT-GCN model, you must first pretrain the GCN:
`python gcn_pretrain.py [options]`
- GCN Parameters (see the GCN documentation for details):
`--gcn_model`, `--gcn_lr`, `--gcn_epochs`, `--gcn_hidden1`, `--gcn_hidden2`, ...
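An example pretraining invocation using the flags above; the values are illustrative assumptions, not tuned settings from the paper.

```shell
# Pretrain the GCN before training the combined BERT-GCN model
# (learning rate, epochs, and hidden sizes are examples only)
python gcn_pretrain.py \
  --gcn_model gcn \
  --gcn_lr 0.01 \
  --gcn_epochs 200 \
  --gcn_hidden1 64 \
  --gcn_hidden2 32
```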

