**SpaceBERT**

This is one of the three further pre-trained models from the SpaceTransformers family presented in [SpaceTransformers: Language Modeling for Space Systems](https://ieeexplore.ieee.org/document/9548078). The original Git repository is [strath-ace/smart-nlp](https://github.com/strath-ace/smart-nlp).

The further pre-training corpus includes publication abstracts, books, and Wikipedia pages related to space systems; the corpus size is 14.3 GB. SpaceBERT was further pre-trained on this domain-specific corpus starting from [BERT-Base (uncased)](https://huggingface.co/bert-base-uncased). In our paper, it is then fine-tuned for a Concept Recognition task.
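The checkpoint can be used directly with the Hugging Face `transformers` library. The sketch below is not part of the original card: it loads the model for masked-language-model inference, and the repository id `icelab/spacebert` is an assumption that should be replaced with this model's actual Hub id. For the Concept Recognition fine-tuning described above, the same checkpoint can instead be loaded with `AutoModelForTokenClassification`.

```python
# Minimal usage sketch, assuming the model is published on the Hugging Face Hub.
# "icelab/spacebert" is a placeholder id; replace it with this model's actual repo id.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "icelab/spacebert"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# BERT-style fill-mask inference on a space-systems sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("The spacecraft uses a [MASK] propulsion system."):
    print(prediction["token_str"], round(prediction["score"], 3))
```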
If using this model, please cite the following paper:

```bibtex
@ARTICLE{9548078,
  author={Berquand, Audrey and Darm, Paul and Riccardi, Annalisa},
  journal={IEEE Access},
  title={SpaceTransformers: Language Modeling for Space Systems},
  year={2021},
  volume={9},
  number={},
  pages={133111-133122},
  doi={10.1109/ACCESS.2021.3115659}
}
```