---
language:
  - ga
  - sga
  - mga
  - ghc
  - la
library_name: transformers
license: cc-by-nc-sa-4.0
pipeline_tag: fill-mask
datasets:
  - ancatmara/CELT
  - ancatmara/CSnaG
base_model:
  - DCU-NLP/bert-base-irish-cased-v1
metrics:
  - name: perplexity
    type: perplexity
    value: 125.4
---

## Citation