Update README.md
README.md
@@ -32,7 +32,7 @@ In this repository, we present our German zeroshot classification model.
 
 This model was trained on the basis of the German BERT large model from [deepset.ai](https://huggingface.co/deepset/gbert-large) and fine-tuned for natural language inference on 847,862 machine-translated NLI sentence pairs from the [mnli](https://huggingface.co/datasets/multi_nli), [anli](https://huggingface.co/datasets/anli) and [snli](https://huggingface.co/datasets/snli) datasets. For this purpose, we translated the sentence pairs in these datasets to German.
 
-If you are a German speaker, you may also have a look at our blog post about this model and about zero-shot classification.
+If you are a German speaker, you may also have a look at our [blog post](https://focus.sva.de/zeroshot-klassifikation/) about this model and about zero-shot classification.
 
 ### Model Details
 
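For readers who want to try the model described above, here is a minimal usage sketch with the Hugging Face `zero-shot-classification` pipeline, the standard way to drive an NLI model for zero-shot classification. The model identifier, the example text, the candidate labels, and the German hypothesis template are illustrative assumptions, not values taken from this commit; replace them with the identifiers given on the model card.

```python
from transformers import pipeline

# Hypothetical placeholder: substitute the actual model ID of this repository.
MODEL_ID = "your-org/german-zeroshot-nli"

# Build a zero-shot classifier on top of the NLI model.
classifier = pipeline("zero-shot-classification", model=MODEL_ID)

sequence = "Ich habe ein Problem mit meinem Laptop, der Akku lädt nicht mehr."
candidate_labels = ["Computer", "Sport", "Politik", "Kochen"]

# A German hypothesis template (an assumption here) keeps the generated
# premise/hypothesis pairs in the language the model was fine-tuned on.
result = classifier(
    sequence,
    candidate_labels,
    hypothesis_template="In diesem Text geht es um {}.",
)

# Labels are returned sorted by score; print the top prediction.
print(result["labels"][0], result["scores"][0])
```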