MaziyarPanahi committed
Commit 8371058
1 Parent(s): 1e379c2

Update README.md

Files changed (1): README.md (+65 -0)
README.md CHANGED
@@ -27,8 +27,73 @@ datasets:

A fine-tuned version of [v2ray/Mixtral-8x22B-v0.1](https://huggingface.co/v2ray/Mixtral-8x22B-v0.1), trained on the `philschmid/guanaco-sharegpt-style` dataset. The model has 141B parameters in total, of which only 35B are active.

## How to use it

**Use a pipeline as a high-level helper:**

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="MaziyarPanahi/Goku-8x22B-v0.1")
```
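
As a quick sanity check, the pipeline can be called directly on a prompt. This is only a sketch: the prompt and sampling settings below are illustrative, not values taken from this model card.

```python
# illustrative prompt and sampling settings, not from the model card
outputs = pipe(
    "Write a short story about Goku meeting Kirby for the first time.",
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
print(outputs[0]["generated_text"])
```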

**Load model directly:**

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("MaziyarPanahi/Goku-8x22B-v0.1")
model = AutoModelForCausalLM.from_pretrained("MaziyarPanahi/Goku-8x22B-v0.1")
```
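
With the tokenizer and model loaded directly, generation follows the usual `generate` pattern. A minimal sketch with placeholder prompt and decoding settings; in practice a 141B-parameter model will typically also need `device_map="auto"` (via Accelerate) or quantization to fit in memory.

```python
# minimal generation sketch; the prompt and decoding settings are placeholders
inputs = tokenizer("Who is Goku?", return_tensors="pt").to(model.device)

output_ids = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```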

**Load via Adapter:**

You can also use PEFT to load only the adapter, if you have already downloaded one of these base models: [v2ray/Mixtral-8x22B-v0.1](https://huggingface.co/v2ray/Mixtral-8x22B-v0.1) or [mistral-community/Mixtral-8x22B-v0.1](https://huggingface.co/mistral-community/Mixtral-8x22B-v0.1) (they are the same).

```python
# assuming you have already downloaded the base model
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer, pipeline

model_id = "v2ray/Mixtral-8x22B-v0.1"
peft_model_id = "~/.cache/huggingface/hub/models--MaziyarPanahi--Goku-8x22B-v0.1/adapter"

tokenizer = AutoTokenizer.from_pretrained(peft_model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# I have added 2 new tokens for the ChatML template, so resizing the
# token embeddings (vocab) is required if you are using the PEFT adapter
model.resize_token_embeddings(len(tokenizer))
model.load_adapter(peft_model_id)

# you can even use a TextStreamer and a text-generation pipeline with your adapter
streamer = TextStreamer(tokenizer)

pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=750,
    temperature=0.6,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    repetition_penalty=1.1,
    return_full_text=False,
    add_special_tokens=False,
    streamer=streamer
)
```
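
Because the adapter adds the two ChatML tokens, prompts should follow the ChatML layout. The snippet below is a sketch of how the pipeline above might be called; the hand-built prompt and the system message are only examples and may need adjusting if the tokenizer ships its own chat template.

```python
# ChatML-style prompt built by hand; adjust if the tokenizer provides its own chat template
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWrite a short story about Goku.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

outputs = pipe(prompt)
print(outputs[0]["generated_text"])
```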

## Examples

`Goku-8x22B-v0.1` has been tested on text generation, question answering over long contexts, coding, and some reasoning. In the next version, I will use more `math`- and `coding`-related datasets.

This is a sample story written by [MaziyarPanahi/Goku-8x22B-v0.1](https://huggingface.co/MaziyarPanahi/Goku-8x22B-v0.1/)

```
Goku had heard a commotion from his house but when he went to check he saw nothing. He thought to himself, "I'll let it go, it was probably just a bird or something. I'm sure it will be fine." But that was when he heard the commotion again, so he went outside and this time he saw two figures on the horizon. One of the figures was a giant pinkish-purple creature, while the other was small, pink, ball-shaped thing.