audreyberquand committed
Commit 093a749 · 1 parent: 33fea58

Update README.md

Files changed (1): README.md (+5 -3)
README.md CHANGED
@@ -1,11 +1,12 @@
-**SpaceSciBERT**
+### SpaceSciBERT

This is one of the 3 further pre-trained models from the SpaceTransformers family presented in [SpaceTransformers: Language Modeling for Space Systems](https://ieeexplore.ieee.org/document/9548078). The original Git repo is [strath-ace/smart-nlp](https://github.com/strath-ace/smart-nlp).

The further pre-training corpus includes publication abstracts, books, and Wikipedia pages related to space systems. The corpus size is 14.3 GB. SpaceSciBERT was further pre-trained on this domain-specific corpus from [SciBERT-SciVocab (uncased)](https://huggingface.co/allenai/scibert_scivocab_uncased). In our paper, it is then fine-tuned for a Concept Recognition task.

-If using this model, please cite the following paper:
+### BibTeX entry and citation info

+```
@ARTICLE{
9548078,
author={Berquand, Audrey and Darm, Paul and Riccardi, Annalisa},
@@ -16,4 +17,5 @@ volume={9},
number={},
pages={133111-133122},
doi={10.1109/ACCESS.2021.3115659}
-}
+}
+```
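
The updated card documents a standard BERT-style checkpoint, so it can be loaded with the Hugging Face transformers library. Below is a minimal sketch of masked-token prediction with this model; the Hub model ID `icelab/spacescibert` is an assumption (substitute the ID of the repository this README belongs to), and the rest is the standard fill-mask pipeline.

```python
# Minimal sketch: masked-token prediction with SpaceSciBERT.
# Assumption: the checkpoint is published as "icelab/spacescibert";
# substitute the actual Hub ID of the repository this README belongs to.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="icelab/spacescibert")

# The model inherits SciBERT's uncased SciVocab, so lowercase input is safest.
for pred in fill_mask("the spacecraft attitude is controlled by [MASK] wheels."):
    print(f"{pred['token_str']}\t{pred['score']:.3f}")
```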
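The card also notes that the paper fine-tunes this model for a Concept Recognition task, which is a token-classification problem. A hedged sketch of that downstream setup follows; the Hub ID, the label count, and the use of the generic token-classification head are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the downstream setup: the further pre-trained encoder plus a
# token-classification head for Concept Recognition. The Hub ID and the
# number of labels below are illustrative assumptions.
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "icelab/spacescibert"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id, num_labels=5)

# From here, fine-tuning proceeds as for any BERT-style token-classification
# task, e.g. with the transformers Trainer on a labeled space-systems corpus.
```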