---
language:
- bn
license: apache-2.0
tags:
- text-generation-inference
- transformers
- mistral
- trl
- sft
base_model: unsloth/mistral-7b-v0.3-bnb-4bit
pipeline_tag: text-generation
datasets:
- iamshnoo/alpaca-cleaned-bengali
---

# How to Use:

You can use the model with a pipeline as a high-level helper, or load it directly. Here's how:

```python
# Use a pipeline as a high-level helper.
# The underlying model is a causal language model, so the
# "text-generation" task is the one that applies here.
from transformers import pipeline

pipe = pipeline("text-generation", model="asif00/mistral-bangla-4bit")
```
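
As a quick sanity check, a call like the one below should work; the Bengali string and the `max_new_tokens` value here are illustrative and not part of the original card:

```python
# Illustrative call: the text-generation pipeline returns a list of dicts.
result = pipe("বাংলাদেশের রাজধানী কী?", max_new_tokens=64)  # "What is the capital of Bangladesh?"
print(result[0]["generated_text"])
```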

```python
# Load the model and tokenizer directly.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("asif00/mistral-bangla-4bit")
# device_map="auto" places the 4-bit weights on the available GPU,
# which the generation example below assumes.
model = AutoModelForCausalLM.from_pretrained("asif00/mistral-bangla-4bit", device_map="auto")
```

# General Prompt Structure: 

```python
prompt = """Below is an instruction in Bengali language that describes a task, paired with an input also in Bengali language that provides further context. Write a response in Bengali language that appropriately completes the request.

### Instruction:
{}

### Input:
{}

### Response:
{}
"""
```
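
The three `{}` placeholders are filled positionally: the instruction first, then the input, then an empty string for the response slot so the model completes it. A minimal sketch, with illustrative Bengali strings that are not from the original card:

```python
# Fill the template positionally; leave the response slot empty.
instruction = "প্রশ্নের উত্তর দিন।"  # "Answer the question." (illustrative)
context = "এখানে প্রাসঙ্গিক বাংলা অনুচ্ছেদ বসবে।"  # "The relevant Bengali passage goes here."
filled_prompt = prompt.format(instruction, context, "")
```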

# To get a cleaned-up version of the response, use the `generate_response` function:

```python
def generate_response(question, context):
    # Fill the prompt template, leaving the response slot empty for generation.
    inputs = tokenizer([prompt.format(question, context, "")], return_tensors="pt").to("cuda")
    outputs = model.generate(**inputs, max_new_tokens=1024, use_cache=True)
    # Decode the full sequence and keep only the text after the response marker.
    responses = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
    response_start = responses.find("### Response:") + len("### Response:")
    return responses[response_start:].strip()
```
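
This slices the decoded string on the `### Response:` marker. An equivalent approach at the same step inside the function, shown here only as a sketch, is to drop the prompt tokens before decoding, since `generate` returns the prompt followed by the completion:

```python
# Alternative (sketch): decode only the newly generated tokens.
prompt_len = inputs["input_ids"].shape[1]
response = tokenizer.decode(outputs[0][prompt_len:], skip_special_tokens=True).strip()
```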

# Example Usage:

```python
question = "ভারতীয় বাঙালি কথাসাহিত্যিক মহাশ্বেতা দেবীর মৃত্যু কবে হয় ?"
# "When did the Indian Bengali fiction writer Mahasweta Devi die?"
context = "২০১৬ সালের ২৩ জুলাই হৃদরোগে আক্রান্ত হয়ে মহাশ্বেতা দেবী কলকাতার বেল ভিউ ক্লিনিকে ভর্তি হন। সেই বছরই ২৮ জুলাই একাধিক অঙ্গ বিকল হয়ে তাঁর মৃত্যু ঘটে। তিনি মধুমেহ, সেপ্টিসেমিয়া ও মূত্র সংক্রমণ রোগেও ভুগছিলেন।"
# "On 23 July 2016, after a heart attack, Mahasweta Devi was admitted to the
# Belle Vue Clinic in Kolkata. She died there on 28 July of that year from
# multiple organ failure. She also suffered from diabetes, septicemia, and a
# urinary infection."
answer = generate_response(question, context)
print(answer)
```


# Disclaimer:

The asif00/mistral-bangla-4bit model was trained on a limited dataset, so its responses may not always be accurate. Its performance depends on the quality and quantity of the training data; with more resources, such as higher-quality data and longer training time, it could be improved significantly.


# Resources: 
Work in progress...