---
license: apache-2.0
datasets:
- VishnuPJ/Alpaca_Instruct_Malayalam
language:
- ml
- en
pipeline_tag: text-generation
---

# MalayaLLM [മലയാളം/Malayalam]

<img src="https://cdn-uploads.huggingface.co/production/uploads/64e65800e44b2668a56f9731/bipVMulaNJ9um46ecYpR4.png" alt="Baby MalayaLLM" width="300" height="200">

# MalayaLLM_7B_Instruct_v0.1

This is an attempt to build a Large Language Model (LLM) focused on **generative AI for the Malayalam language**. While several LLMs support multiple languages, including Malayalam, their performance on tasks such as content generation and question answering in Malayalam can be improved through dedicated training on a Malayalam dataset. To that end, I have undertaken **continual pre-training of the LLaMA-2 model on a comprehensive Malayalam dataset**.

The model is still in its early stages; further training and fine-tuning on a larger dataset are needed to improve its performance. Updated revisions will be released regularly.
# GitHub Repo:
For comprehensive insights into model training, fine-tuning, and other advanced techniques, see the MalayaLLM GitHub repository:
https://github.com/VishnuPJ/MalayaLLM
# Introducing the Developer:
Discover the mind behind this model and stay updated on their contributions to the field:
https://www.linkedin.com/in/vishnu-prasad-j/
# Model description
The MalayaLLM models have been customized to incorporate a comprehensive Malayalam vocabulary of approximately 18,000 tokens, expanding on the groundwork laid by the original LLaMA-2 (a quick tokenizer check is sketched after the list below).

- **Model type:** A 7B LLaMA-2 model fine-tuned on Malayalam instruction data (Alpaca_Instruct_Malayalam).
- **Language(s):** Malayalam and English
- **Datasets:**  [Alpaca_Instruct_Malayalam](https://huggingface.co/datasets/VishnuPJ/Alpaca_Instruct_Malayalam)
- **Source Model:** [MalayaLLM_7B_Base](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Base)
- **Training Precision:** `float16`
- **Code:** [GitHub](https://github.com/VishnuPJ/MalayaLLM)
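
To verify the expanded vocabulary from your own environment, you can inspect the tokenizer directly. A minimal sketch, assuming the v0.2 instruct repository; the printed values are illustrative:

```python
from transformers import AutoTokenizer

# Load the MalayaLLM tokenizer, which extends LLaMA-2's vocabulary with Malayalam tokens.
tokenizer = AutoTokenizer.from_pretrained("VishnuPJ/MalayaLLM_7B_Instruct_v0.2")

# Total vocabulary size, including the added Malayalam tokens.
print(len(tokenizer))

# With the added tokens, Malayalam text should segment into far fewer pieces
# than the base LLaMA-2 tokenizer would produce.
print(tokenizer.tokenize("മലയാളം"))
```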

**Prompt Template Without Input**

```
{system_prompt}
### Instruction:
{instruction or query}
### Response:
{response}
```

**Prompt Template With Input**

```
{system_prompt}
### Instruction:
{instruction or query}
### Input:
{input}
### Response:
{response}
```
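
To apply these templates in code, a small helper can assemble the prompt string. A minimal sketch (the function name `build_prompt` is illustrative, not part of the released code):

```python
def build_prompt(system_prompt: str, instruction: str, input_text: str = "") -> str:
    """Assemble an Alpaca-style prompt following the templates above."""
    prompt = f"{system_prompt}\n### Instruction:\n{instruction}\n"
    if input_text:
        prompt += f"### Input:\n{input_text}\n"
    return prompt + "### Response:\n"
```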



## Available Models
| Model                    | Type                        | Data              | Base Model           | # Params | Download Links                                                         |
|--------------------------|-----------------------------|-------------------|----------------------|------|------------------------------------------------------------------------|
| MalayaLLM 7B Base #v0.1   | Base model                  | 12GB              | LLaMA-2 7B           | 7B   | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Base)     |
| MalayaLLM 7B Instruct #v0.1| Instruction following model | 52k instructions | MalayaLLM 7B Base  | 7B   | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Instruct_v0.1) |
| ***MalayaLLM 7B Instruct #v0.2***| Instruction following model | 52k instructions | MalayaLLM 7B Base  | 7B   | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Instruct_v0.2) |

**Note:** MalayaLLM 7B Instruct v0.2 is the latest model.

### Quantized Version of Available Models
| Model                    | Format | Bits                 | Download Links                                                               |
|--------------------------|--------|----------------------|------------------------------------------------------------------------------|
| MalayaLLM 7B Instruct   #v0.1  | GGUF   | Q8_0 | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Instruct_v0.1_GGUF)      |
| MalayaLLM 7B Instruct   #v0.2  | GGUF   | Q8_0 | [HF Hub](https://huggingface.co/VishnuPJ/MalayaLLM_7B_Instruct_v0.2_GGUF)      |
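
The GGUF files can be run without PyTorch via llama.cpp or its Python bindings. A minimal sketch using `llama-cpp-python`; the exact `.gguf` filename is an assumption, so check the repository's file listing:

```python
from llama_cpp import Llama

# Load the Q8_0 quantized model (filename is an assumption; verify it on the HF repo).
llm = Llama(model_path="MalayaLLM_7B_Instruct_v0.2.Q8_0.gguf", n_ctx=2048)

sys_prompt = "ഒരു ടാസ്ക് വിവരിക്കുന്ന ഒരു നിർദ്ദേശം ചുവടെയുണ്ട്. അഭ്യർത്ഥന ശരിയായി പൂർത്തിയാക്കുന്ന ഒരു പ്രതികരണം എഴുതുക."
prompt = f"{sys_prompt} ### Instruction: Where does the Sun rise? ### Response:"

# Stop before the model starts hallucinating a follow-up instruction.
output = llm(prompt, max_tokens=128, stop=["### Instruction:"])
print(output["choices"][0]["text"])
```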

## A simple code example

```python
import torch
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    pipeline,
)

model_name = "VishnuPJ/MalayaLLM_7B_Instruct_v0.2"
print("Loading model...")

# Load the instruct model in float16, spread across available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    low_cpu_mem_usage=True,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)

tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"

# max_length caps prompt plus generated tokens at 200; raise it for longer answers.
pipe = pipeline(task="text-generation", model=model, tokenizer=tokenizer, max_length=200)

# Alpaca-style system prompt in Malayalam: "Below is an instruction that describes
# a task. Write a response that appropriately completes the request."
sys_prompt = "ഒരു ടാസ്ക് വിവരിക്കുന്ന ഒരു നിർദ്ദേശം ചുവടെയുണ്ട്. അഭ്യർത്ഥന ശരിയായി പൂർത്തിയാക്കുന്ന ഒരു പ്രതികരണം എഴുതുക."

while True:
    inst = input("Enter instruction (or 'exit' to end): ")
    if inst.lower() == "exit":
        break
    # Generate a response for the user-provided instruction.
    result = pipe(f"{sys_prompt} ### Instruction: {inst} ### Response:")
    # The pipeline returns the full text, prompt included.
    print(result[0]["generated_text"])
```
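
Note that the pipeline echoes the prompt along with the completion, as the example output below shows. To print only the model's answer, the `text-generation` pipeline accepts `return_full_text=False`:

```python
# Return only the newly generated text, without echoing the prompt.
result = pipe(f"{sys_prompt} ### Instruction: {inst} ### Response:", return_full_text=False)
print(result[0]["generated_text"])
```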

## Example Output
```
Enter instruction (or 'exit' to end): സൂര്യൻ ഉദിക്കുന്ന ദിശ ഏതെന്നു പറയുക .
ഒരു ടാസ്ക് വിവരിക്കുന്ന ഒരു നിർദ്ദേശം ചുവടെയുണ്ട്. അഭ്യർത്ഥന ശരിയായി പൂർത്തിയാക്കുന്ന ഒരു പ്രതികരണം എഴുതുക. ### Instruction: സൂര്യൻ ഉദിക്കുന്ന ദിശ ഏതെന്നു പറയുക . ### Response: സൂര്യൻ ഉദിക്കുന്ന ദിശ കിഴക്കായിരിക്കും.
Enter instruction (or 'exit' to end): Where does the Sun rise?
ഒരു ടാസ്ക് വിവരിക്കുന്ന ഒരു നിർദ്ദേശം ചുവടെയുണ്ട്. അഭ്യർത്ഥന ശരിയായി പൂർത്തിയാക്കുന്ന ഒരു പ്രതികരണം എഴുതുക. ### Instruction: Where does the Sun rise? ### Response: The Sun rises in the east.
Enter instruction (or 'exit' to end):
```

(In the first exchange, the user asks in Malayalam which direction the sun rises, and the model answers that the sun rises in the east.)
## Demo Video

Below is a brief video highlighting the model's bilingual ability to converse in both Malayalam and English.
In this demonstration, I use Google's transliteration tool to convert Manglish to Malayalam, then paste the transliterated text into the prompt console for further interaction.

<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/64e65800e44b2668a56f9731/fxVZiCeArF1so6tw9Unpc.mp4"></video>

# 🌟Happy coding💻🌟