---
license: agpl-3.0
datasets:
- vesteinn/FC3
- vesteinn/IC3
- mideind/icelandic-common-crawl-corpus-IC3
- DDSC/partial-danish-gigaword-no-twitter
- NbAiLab/NCC
widget:
- text: Býir vaksa <mask> enn nakað annað búøki á jørðini.
language:
- fo
---

This is a Faroese language model. It was trained by adapting the [ScandiBERT-no-faroese](https://huggingface.co/vesteinn/ScandiBERT-no-faroese) model on the [FC3 corpus](https://huggingface.co/datasets/vesteinn/FC3) for 50 epochs.
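
A minimal fill-mask sketch using the `transformers` pipeline. The card does not state this model's own Hub repo id, so the snippet loads the base `vesteinn/ScandiBERT-no-faroese` checkpoint named above purely as a stand-in; substitute this model's id. The example sentence is the one from the widget ("Cities grow <mask> than any other habitat on Earth").

```python
from transformers import pipeline

# Stand-in checkpoint: replace with this model's Hub repo id.
fill = pipeline("fill-mask", model="vesteinn/ScandiBERT-no-faroese")

# XLM-RoBERTa-style models use <mask> as the mask token.
for pred in fill("Býir vaksa <mask> enn nakað annað búøki á jørðini."):
    print(pred["token_str"], round(pred["score"], 3))
```

Each prediction is a dict with the filled-in token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).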

If you find this model useful, please cite:

```bibtex
@inproceedings{snaebjarnarson-etal-2023-transfer,
    title = "{T}ransfer to a Low-Resource Language via Close Relatives: The Case Study on Faroese",
    author = "Snæbjarnarson, Vésteinn  and
      Simonsen, Annika  and
      Glavaš, Goran  and
      Vulić, Ivan",
    booktitle = "Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)",
    month = "may 22--24",
    year = "2023",
    address = "Tórshavn, Faroe Islands",
    publisher = {Link{\"o}ping University Electronic Press, Sweden},
}
```