pankajmathur committed
Commit ebb11b6
1 Parent(s): 5c90323

Update README.md

Files changed (1): README.md (+63 -3)
README.md CHANGED
The previous README contained only the YAML front matter `license: apache-2.0`; this commit replaces it with the full model card below.

---
license: llama3
language:
- en
library_name: transformers
pipeline_tag: text2text-generation
---

**Model Name: Llama 3 orca_mini_v6_8b_dpo**

# Llama 3 orca_mini_v6_8b_dpo is trained with various DPO Datasets

<img src="https://huggingface.co/pankajmathur/orca_mini_v5_8b/resolve/main/orca_minis_small.jpeg" width="auto" />

## NOTICE

Provided you give proper credit and attribution, you are granted permission to use this model as a foundational base for further full fine-tuning, DPO, PPO, or ORPO tuning, and for any kind of merge.
I actively encourage users to customize and enhance the model according to their specific needs, as this version is designed to be a comprehensive general-purpose model.
Dive in and innovate!
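
Since the model is offered as a base for further preference tuning, here is a minimal sketch of continued DPO training with Hugging Face TRL. It assumes `trl`, `datasets`, and `accelerate` are installed; `your_dpo_dataset` is a placeholder for any dataset with `prompt`, `chosen`, and `rejected` columns, and the hyperparameters are illustrative rather than the recipe used to produce this model.

```python
# Hedged sketch: continued DPO tuning on top of orca_mini_v6_8b_dpo with TRL.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_slug = "pankajmathur/orca_mini_v6_8b_dpo"
model = AutoModelForCausalLM.from_pretrained(model_slug, torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(model_slug)

# Placeholder dataset name; any "prompt"/"chosen"/"rejected" preference dataset works.
train_dataset = load_dataset("your_dpo_dataset", split="train")

args = DPOConfig(
    output_dir="orca_mini_v6_8b_dpo-continued",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    learning_rate=5e-7,
    beta=0.1,        # strength of the preference regularization toward the reference model
    max_length=2048,
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,               # TRL builds a frozen reference copy when None
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,   # older TRL releases call this argument `tokenizer`
)
trainer.train()
```

ORPO tuning follows the same pattern with TRL's `ORPOTrainer` and `ORPOConfig`.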

## Evaluation

Coming Soon..


<br>

## Example Usage

Here is the ChatML prompt format:

```
<|im_start|>system
You are Orca Mini, a helpful AI assistant.<|im_end|>
<|im_start|>user
Hello Orca Mini, what can you do for me?<|im_end|>
<|im_start|>assistant
```

Below is a code example showing how to use this model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_slug = "pankajmathur/orca_mini_v6_8b_dpo"
model = AutoModelForCausalLM.from_pretrained(model_slug, torch_dtype=torch.bfloat16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_slug)

messages = [
    {"role": "system", "content": "You are Orca Mini, a helpful AI assistant."},
    {"role": "user", "content": "Hello Orca Mini, what can you do for me?"}
]

# Build the ChatML prompt shown above and append the assistant header
gen_input = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
output = model.generate(gen_input, max_new_tokens=256)
print(tokenizer.decode(output[0][gen_input.shape[-1]:], skip_special_tokens=True))
```
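
Until the GGUF and AWQ quants listed below are published, one interim option on memory-constrained GPUs is on-the-fly 4-bit loading with bitsandbytes. This is a minimal sketch, assuming `bitsandbytes` and `accelerate` are installed; it is not an official quantized release, and the settings shown are common defaults rather than values from the model author.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_slug = "pankajmathur/orca_mini_v6_8b_dpo"

# NF4 4-bit quantization; these are common defaults, not the author's settings.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    model_slug,
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_slug)
```

The quantized model drops into the generation example above unchanged.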

This model is governed by the [META LLAMA 3 COMMUNITY LICENSE AGREEMENT](LICENSE).

**Quants**

GGUF: Coming Soon

AWQ: Coming Soon