Tags: Zero-Shot Classification · Transformers · PyTorch · Safetensors · 27 languages · deberta-v2 · text-classification · mdeberta-v3-base · nli · natural-language-inference · multitask · multi-task · pipeline · extreme-multi-task · extreme-mtl · tasksource · zero-shot · rlhf · Inference Endpoints
sileod committed
Commit 4b26406
Parent: 71d356e

Update README.md

Files changed (1):
1. README.md (+1 -1)
README.md CHANGED
````diff
@@ -78,7 +78,7 @@ pipeline_tag: zero-shot-classification
 
 # Model Card for mDeBERTa-v3-base-tasksource-nli
 
-Multilingual [mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) with 30k steps multi-task training on [mtasksource](https://github.com/sileod/tasksource/blob/main/src/tasksource/mtasks.py)
+Multilingual [mdeberta-v3-base](https://huggingface.co/microsoft/mdeberta-v3-base) with 30k steps multi-task training on [mtasksource](https://github.com/sileod/tasksource/blob/main/mtasks.md)
 This model can be used as a stable starting-point for further fine-tuning, or directly in zero-shot NLI model or a zero-shot pipeline.
 In addition, you can use the provided [adapters](https://huggingface.co/sileod/mdeberta-v3-base-tasksource-adapters) to directly load a model for hundreds of tasks.
 ```python
````
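The commit itself only updates the mtasksource link in the model card. For context, below is a minimal sketch of the zero-shot pipeline usage the README refers to; the input sentence and candidate labels are illustrative assumptions, not taken from this commit.

```python
# Minimal sketch of the zero-shot pipeline mentioned in the README
# (example text and labels are illustrative, not from the commit).
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="sileod/mdeberta-v3-base-tasksource-nli",
)

result = classifier(
    "The new GPU delivers twice the throughput of its predecessor.",
    candidate_labels=["technology", "politics", "sports"],
)
print(result["labels"][0], result["scores"][0])  # labels are returned sorted by descending score
```

Because the checkpoint is an NLI model, the pipeline scores each candidate label as an entailment hypothesis against the input text, which is what enables zero-shot classification without task-specific fine-tuning.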