mervinpraison committed
Commit 4987a8f
Parent(s): 9b3a947
Update README.md
README.md
CHANGED
@@ -12,12 +12,28 @@ tags:
base_model: google/gemma-7b
---

# Tamil Large Language Model

Developing a large language model for Tamil is a significant step toward supporting and enriching Tamil digital content. Created by Mervin Praison, the model is designed to understand, interpret, and generate Tamil text, offering practical tools for businesses, educators, researchers, and Tamil speakers around the globe. It opens new opportunities for content creation, helps automate and improve customer support, and contributes to preserving and promoting the Tamil language in the digital age.

## Uploaded model

- **Developed by:** mervinpraison
- **License:** apache-2.0
- **Fine-tuned from model:** google/gemma-7b
- **Instruction fine-tuned on:** Alpaca Tamil dataset

## How to use

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("mervinpraison/tamil-large-language-model-7b-v1.0")
model = AutoModelForCausalLM.from_pretrained("mervinpraison/tamil-large-language-model-7b-v1.0")

# Example query in Tamil: "Two ways to stay healthy"
query_to_llm = "ஆரோக்கியமாக இருப்பதற்கான இரண்டு வழிகள்"

# Tokenize the query, generate up to 200 tokens (prompt included), and decode the result
inputs = tokenizer.encode(query_to_llm, return_tensors="pt")
outputs = model.generate(inputs, max_length=200)
response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(response)
```
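
Since the card states the model was instruction fine-tuned on an Alpaca Tamil dataset, wrapping the query in an Alpaca-style instruction template may produce more focused answers. The template below is the generic Alpaca format, not one documented for this model, so treat it as an assumption; the snippet continues from the example above and reuses `tokenizer`, `model`, and `query_to_llm`.

```python
# Sketch (assumption): wrap the query in a generic Alpaca-style template.
# The exact template used during fine-tuning is not documented in this card.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(instruction=query_to_llm)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```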
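
The snippets above load full-precision weights on CPU by default; a 7B-parameter model needs roughly 28 GB of memory in 32-bit precision. Below is a minimal sketch for loading in half precision on a GPU, assuming `torch` with CUDA and the `accelerate` package are installed.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "mervinpraison/tamil-large-language-model-7b-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Load the weights in float16 and let accelerate place them on the available GPU(s)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the `accelerate` package
)

query_to_llm = "ஆரோக்கியமாக இருப்பதற்கான இரண்டு வழிகள்"  # "Two ways to stay healthy"
inputs = tokenizer(query_to_llm, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On smaller GPUs, quantized loading (for example 4-bit via bitsandbytes) is another common option.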