asif00 committed on
Commit
bcd9fa7
1 Parent(s): f9b7c30

Update README.md

Files changed (1): README.md +18 -18
README.md CHANGED
````diff
@@ -1,18 +1,18 @@
----
-language:
-- bn
-license: apache-2.0
-tags:
-- text-generation-inference
-- transformers
-- mistral
-- trl
-- sft
-base_model: unsloth/mistral-7b-v0.3-bnb-4bit
-pipeline_tag: question-answering
-datasets:
-- iamshnoo/alpaca-cleaned-bengali
----
+---
+language:
+- bn
+license: apache-2.0
+tags:
+- text-generation-inference
+- transformers
+- mistral
+- trl
+- sft
+base_model: unsloth/mistral-7b-v0.3-bnb-4bit
+pipeline_tag: question-answering
+datasets:
+- iamshnoo/alpaca-cleaned-bengali
+---
 
 # How to Use:
 
@@ -21,14 +21,14 @@ You can use the model with a pipeline for a high-level helper or load the model
 ```python
 # Use a pipeline as a high-level helper
 from transformers import pipeline
-pipe = pipeline("question-answering", model="asif00/bangla-llama-4bit")
+pipe = pipeline("question-answering", model="asif00/mistral-bangla-4bit")
 ```
 
 ```python
 # Load model directly
 from transformers import AutoTokenizer, AutoModelForCausalLM
-tokenizer = AutoTokenizer.from_pretrained("asif00/bangla-llama-4bit")
-model = AutoModelForCausalLM.from_pretrained("asif00/bangla-llama-4bit")
+tokenizer = AutoTokenizer.from_pretrained("asif00/mistral-bangla-4bit")
+model = AutoModelForCausalLM.from_pretrained("asif00/mistral-bangla-4bit")
 ```
 
 # General Prompt Structure:
````
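The substantive change in this commit is a repository-id rename, from `asif00/bangla-llama-4bit` to `asif00/mistral-bangla-4bit`, in the three usage snippets. A minimal sketch of applying that rename with a plain string replace; the `readme` string below is an illustrative stand-in for the affected lines, not the full file:

```python
# The commit swaps the model repo id in three places in README.md.
OLD_ID = "asif00/bangla-llama-4bit"
NEW_ID = "asif00/mistral-bangla-4bit"

# Stand-in for the three affected README lines (not the full file).
readme = (
    'pipe = pipeline("question-answering", model="asif00/bangla-llama-4bit")\n'
    'tokenizer = AutoTokenizer.from_pretrained("asif00/bangla-llama-4bit")\n'
    'model = AutoModelForCausalLM.from_pretrained("asif00/bangla-llama-4bit")\n'
)

updated = readme.replace(OLD_ID, NEW_ID)
print(updated.count(NEW_ID))  # 3
```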