minhtoan committed
Commit e821f18 · verified · 1 Parent(s): 3fb19fc

Update README.md

Files changed (1)
  1. README.md +21 -5
README.md CHANGED
@@ -14,25 +14,41 @@ pipeline_tag: translation
 library_name: transformers
 ---
 # Lao - English Translation
+Welcome to the forefront of linguistic innovation with our groundbreaking T5 model designed specifically for Lao to English translation. In a rapidly globalizing world where effective communication is paramount, our T5 model stands as a beacon of excellence, offering unparalleled accuracy, fluency, and efficiency in bridging the language gap between Lao and English.
 
+Built on state-of-the-art deep learning architecture and trained on vast datasets of Lao and English texts, our T5 model harnesses the power of transformer-based technology to deliver seamless and precise translations. Whether you're a business expanding into Laotian markets, a researcher seeking to access Lao-language resources, or an individual connecting with Lao-speaking communities, our T5 model is your ultimate solution for unlocking linguistic barriers and fostering meaningful cross-cultural exchanges.
 
+With a commitment to quality and innovation, our T5 model not only translates words but also preserves context, tone, and cultural nuances, ensuring that the essence of the original message remains intact in every translated sentence. Whether it's documents, websites, or multimedia content, our T5 model offers unmatched versatility and reliability, empowering users to communicate effortlessly across languages and borders.
-## How to use
-
-
 
+## How to use
+### On GPU
 ```python
 from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
 tokenizer = AutoTokenizer.from_pretrained("minhtoan/t5-translation-lao-english")
 model = AutoModelForSeq2SeqLM.from_pretrained("minhtoan/t5-translation-lao-english")
 model.cuda()
-src = "ຂ້ອຍຢາກຊື້ປຶ້ມ"
+src = "ຂ້ອຍ​ຮັກ​ເຈົ້າ"
 tokenized_text = tokenizer.encode(src, return_tensors="pt").cuda()
 model.eval()
 translate_ids = model.generate(tokenized_text, max_length=140)
 output = tokenizer.decode(translate_ids[0], skip_special_tokens=True)
 output
 ```
-'I want to buy book'
+'I love you'
+
+### On CPU
+```python
+from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
+tokenizer = AutoTokenizer.from_pretrained("minhtoan/t5-translation-lao-english")
+model = AutoModelForSeq2SeqLM.from_pretrained("minhtoan/t5-translation-lao-english")
+src = "ຂ້ອຍ​ຮັກ​ເຈົ້າ"
+input_ids = tokenizer(src, max_length=200, return_tensors="pt", padding="max_length", truncation=True).input_ids
+outputs = model.generate(input_ids=input_ids, max_new_tokens=140)
+output = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
+output
+```
+'I love you'
+
 
 
 ## Author
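
Supplementary note: the GPU and CPU recipes in the updated README each translate a single sentence. The sketch below is a minimal, illustrative way to run several Lao sentences through the same checkpoint in one CPU batch, reusing only the Transformers calls shown above; the helper name `translate_lo_en`, the dynamic padding, and the explicit attention mask are assumptions of this sketch, not documented behavior of the model repo.

```python
# Illustrative sketch only. It reuses the checkpoint and API calls shown in the
# README; the helper name `translate_lo_en` and the dynamic-padding choice are
# assumptions, not part of the model's documented usage.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_ID = "minhtoan/t5-translation-lao-english"  # checkpoint named in the README

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
model.eval()


def translate_lo_en(sentences, max_new_tokens=140):
    """Translate a list of Lao sentences to English in a single CPU batch."""
    # Pad to the longest sentence in the batch and pass the attention mask so
    # padding tokens are ignored during generation.
    enc = tokenizer(sentences, return_tensors="pt", padding=True,
                    truncation=True, max_length=200)
    out_ids = model.generate(input_ids=enc.input_ids,
                             attention_mask=enc.attention_mask,
                             max_new_tokens=max_new_tokens)
    return tokenizer.batch_decode(out_ids, skip_special_tokens=True)


print(translate_lo_en(["ຂ້ອຍ​ຮັກ​ເຈົ້າ"]))  # expected, per the README: ['I love you']
```

Compared with `padding="max_length"` in the README's CPU example, dynamic padding (`padding=True`) only pads to the longest sentence in the batch, which keeps the input tensors small; with the attention mask passed to `generate`, the translations themselves should be unaffected.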