RuBERT-tiny for NLI (natural language inference)

This is the cointegrated/rubert-tiny model fine-tuned to predict the logical relationship between two short texts: entailment or non-entailment.

For more details, see the card for a related model: https://huggingface.co/cointegrated/rubert-base-cased-nli-threeway
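Usage

The model can be loaded and queried like any other transformers sequence-classification checkpoint. The sketch below is illustrative rather than taken from this card: the example sentence pair is made up, and the label names are read from the model's config instead of being hardcoded.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed checkpoint name; requires `torch` and `transformers` installed.
model_checkpoint = 'cointegrated/rubert-tiny-bilingual-nli'
tokenizer = AutoTokenizer.from_pretrained(model_checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(model_checkpoint)
if torch.cuda.is_available():
    model.cuda()

# Illustrative sentence pair (premise, hypothesis).
text1 = 'Сократ - человек, а все люди смертны.'
text2 = 'Сократ никогда не умрёт.'

with torch.inference_mode():
    inputs = tokenizer(text1, text2, truncation=True, return_tensors='pt').to(model.device)
    proba = torch.softmax(model(**inputs).logits, -1).cpu().numpy()[0]

# Map class probabilities to their label names from the model config.
print({model.config.id2label[i]: float(p) for i, p in enumerate(proba)})
```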

Model size: 11.8M parameters