Commit 9dfb713 by JCarlos (parent: 033a226): Create README.md
---
language:
- es
- qu
tags:
- quechua
- translation
- spanish
license: apache-2.0
---

# t5-small-finetuned-spanish-to-quechua

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) for Spanish-to-Quechua translation.

## Model description

## Intended uses & limitations

### How to use

You can load this model as follows:

```python
>>> from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

>>> model_name = 'hackathon-pln-es/t5-small-finetuned-spanish-to-quechua'
>>> model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
>>> tokenizer = AutoTokenizer.from_pretrained(model_name)
```

To translate a sentence, you can do:

```python
>>> sentence = "Entonces dijo"
>>> inputs = tokenizer(sentence, return_tensors="pt")
>>> output = model.generate(inputs["input_ids"], max_length=40, num_beams=4, early_stopping=True)
>>> print('Original sentence: {}\nTranslated sentence: {}'.format(sentence, tokenizer.decode(output[0], skip_special_tokens=True)))
```
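The same tokenize/generate/decode steps above can also be driven through the high-level `pipeline` API. This is a sketch of an assumed alternative, not part of the original card; it presumes the `translation` pipeline accepts this checkpoint directly:

```python
# Hypothetical sketch: wrapping the model card's checkpoint in a
# transformers pipeline instead of calling generate() by hand.
from transformers import pipeline

model_name = "hackathon-pln-es/t5-small-finetuned-spanish-to-quechua"
translator = pipeline("translation", model=model_name)

# The pipeline tokenizes, generates, and decodes in one call;
# generation arguments such as num_beams are passed through.
result = translator("Entonces dijo", max_length=40, num_beams=4)
print(result[0]["translation_text"])
```

The pipeline returns a list with one dict per input sentence, so passing a list of sentences translates a batch in one call.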

### Limitations and bias

## Training data

## Evaluation results

We obtained the following metrics during the training process:

- `eval_bleu = 2.9691`
- `eval_loss = 1.2064628601074219`