---
language:
  - ga
  - sga
  - mga
  - ghc
  - la
library_name: transformers
license: cc-by-nc-sa-4.0
pipeline_tag: fill-mask
datasets:
  - ancatmara/CELT
  - ancatmara/CSnaG
base_model:
  - DCU-NLP/bert-base-irish-cased-v1
metrics:
  - name: perplexity
    type: perplexity
    value: 125.4
---

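Since the card declares `pipeline_tag: fill-mask` and `library_name: transformers`, the model can be used through the standard fill-mask pipeline. A minimal sketch follows; the repo id of this fine-tuned model is not shown in the card excerpt, so the example loads the declared base model, `DCU-NLP/bert-base-irish-cased-v1`, and you would substitute this repository's id to run the fine-tuned checkpoint.

```python
# Usage sketch for a BERT-style fill-mask model. The fine-tuned repo id is
# not given in this excerpt, so the declared base model is used here as a
# stand-in; replace it with this repository's id for the fine-tuned weights.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="DCU-NLP/bert-base-irish-cased-v1")

# Predict candidates for the masked token in an Irish sentence.
results = fill_mask("Tá an [MASK] go maith.")
for r in results:
    print(f"{r['token_str']}: {r['score']:.3f}")
```

Each result is a dict containing the predicted token, its score, and the completed sequence, sorted from most to least likely.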
## Citation