---
datasets:
- ctu-aic/csfever_nearestp
languages:
- cs
license: cc-by-sa-4.0
tags:
- natural-language-inference
---

# 🦾 bert-base-multilingual-cased-csfever_nearestp

Transformer model for Natural Language Inference in Czech (cs), fine-tuned on the ctu-aic/csfever_nearestp dataset.

## 🧰 Usage

### 👾 Using UKPLab sentence_transformers CrossEncoder

The model was trained with the sentence_transformers CrossEncoder API, which is also the recommended way to use it.

```python
from sentence_transformers.cross_encoder import CrossEncoder

# Load the cross-encoder and score (context, hypothesis) pairs.
model = CrossEncoder('ctu-aic/bert-base-multilingual-cased-csfever_nearestp')
scores = model.predict([["My first context.", "My first hypothesis."],
                        ["Second context.", "Hypothesis."]])
```

### 🤗 Using Hugging Face transformers

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned classification model and its tokenizer from the Hub.
model = AutoModelForSequenceClassification.from_pretrained("ctu-aic/bert-base-multilingual-cased-csfever_nearestp")
tokenizer = AutoTokenizer.from_pretrained("ctu-aic/bert-base-multilingual-cased-csfever_nearestp")
```
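A minimal inference sketch with the loaded model and tokenizer, assuming PyTorch; the index-to-label mapping comes from the model's config (`id2label`) and is not restated here. The example pair is a placeholder:

```python
import torch

# Hypothetical example pair; replace with your own context and hypothesis.
context = "My first context."
hypothesis = "My first hypothesis."

# Encode the pair and run a forward pass without gradients.
inputs = tokenizer(context, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Softmax over the label dimension, then pick the most likely class.
probs = torch.softmax(logits, dim=-1)[0]
predicted_id = int(probs.argmax())
print(model.config.id2label[predicted_id], probs.tolist())
```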

## 🌳 Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

## 👬 Authors

The model was trained and uploaded by ullriher (e-mail: ullriher@fel.cvut.cz).

The code was co-developed by the NLP team at the Artificial Intelligence Center of CTU in Prague (AIC).

πŸ” License

cc-by-sa-4.0

## 💬 Citation

If you find this repository helpful, feel free to cite our publication:


```bibtex
@article{DBLP:journals/corr/abs-2201-11115,
  author     = {Herbert Ullrich and
                Jan Drchal and
                Martin R{\'{y}}par and
                Hana Vincourov{\'{a}} and
                V{\'{a}}clav Moravec},
  title      = {CsFEVER and CTKFacts: Acquiring Czech Data for Fact Verification},
  journal    = {CoRR},
  volume     = {abs/2201.11115},
  year       = {2022},
  url        = {https://arxiv.org/abs/2201.11115},
  eprinttype = {arXiv},
  eprint     = {2201.11115},
  timestamp  = {Tue, 01 Feb 2022 14:59:01 +0100},
  biburl     = {https://dblp.org/rec/journals/corr/abs-2201-11115.bib},
  bibsource  = {dblp computer science bibliography, https://dblp.org}
}
```