---
language:
- ru
- ru-RU
tags:
- t5
inference:
  parameters:
    no_repeat_ngram_size: 4
datasets:
- samsum
widget:
- text: |
    Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
    Philipp: Sure you can use the new Hugging Face Deep Learning Container.
    Jeff: ok.
    Jeff: and how can I get started?
    Jeff: where can I find documentation?
    Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
model-index:
- name: mbart_ruDialogSum
  results:
  - task:
      name: Abstractive Dialogue Summarization
      type: abstractive-text-summarization
    dataset:
      name: "SAMSum Corpus (translated to Russian)"
      type: samsum
    metrics:
    - name: Validation ROUGE-1
      type: rouge-1
      value: 30
    - name: Validation ROUGE-L
      type: rouge-l
      value: 30
    - name: Test ROUGE-1
      type: rouge-1
      value: 31
    - name: Test ROUGE-L
      type: rouge-l
      value: 31
---
### 📝 Description
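The front matter pins `no_repeat_ngram_size: 4` for the hosted inference widget, which blocks the model from emitting any 4-gram it has already generated. As a minimal sketch of what that constraint does during decoding (a simplified, pure-Python version of the logic behind the `transformers` `NoRepeatNGramLogitsProcessor`; the function name here is illustrative, not part of any library):

```python
def banned_next_tokens(generated, n=4):
    """Return the tokens that would complete an n-gram already in `generated`.

    Simplified sketch of the `no_repeat_ngram_size` constraint: at each
    decoding step, any token whose addition would repeat an n-gram seen
    earlier in the sequence is masked out of the next-token distribution.
    """
    if len(generated) < n - 1:
        return set()
    # Map each (n-1)-gram prefix to the set of tokens that followed it.
    prefix_to_next = {}
    for i in range(len(generated) - n + 1):
        prefix = tuple(generated[i : i + n - 1])
        prefix_to_next.setdefault(prefix, set()).add(generated[i + n - 1])
    # The trailing (n-1) tokens determine which continuations are banned now.
    current_prefix = tuple(generated[-(n - 1):])
    return prefix_to_next.get(current_prefix, set())
```

For example, with token ids `[1, 2, 3, 4, 9, 1, 2, 3]` and `n=4`, the sequence already contains the 4-gram `(1, 2, 3, 4)`, so `4` is banned as the next token. In practice this constraint is applied simply by passing `no_repeat_ngram_size=4` to `model.generate(...)`.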