NickyNicky committed on
Commit 0ebcc3e
1 Parent(s): 88c0d5f

Update README.md

Files changed (1): README.md (+141, -1)

README.md CHANGED. The previous version contained only a minimal YAML front matter block (`---`, `license: apache-2.0`, `---`); the full updated file follows.

---
library_name: transformers
tags:
- merge
language:
- en
- es
- ru
- zh
- de
- fr
- th
- ca
- it
- ja
- pl
- eo
- eu
- vi
- fi
- hu
- ar
- nl
- da
- tr
- ko
- he
- id
- cs
- bn
- sv
widget:
- text: |
    <|im_start|>system
    You are a helpful AI assistant.<|im_end|>
    <|im_start|>user
    podrias escribir un codigo de ejemplo en Python<|im_end|>
    <|im_start|>assistant
license: apache-2.0
---

# Model Card for MixLlama

<!-- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/641b435ba5f876fe30c5ae0a/d4yUGFC5XZz41aA3_-kGC.png) -->

<!-- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/641b435ba5f876fe30c5ae0a/mZx6OGCHfm92udQfNFcGD.png) -->

![image/png](https://cdn-uploads.huggingface.co/production/uploads/641b435ba5f876fe30c5ae0a/CW8JrvB58GSt_6B5XPcGZ.png)

<!-- Provide a quick summary of what the model is/does. -->

```yaml
experts:
  - source_model: NickyNicky/TinyDolphin-2.8-1.1b_oasst2_chatML_Cluster_1_V1
    positive_prompts:
      - ""

  - source_model: NickyNicky/TinyDolphin-2.8-1.1b_oasst2_chatML_Cluster_2_V1
    positive_prompts:
      - ""

  - source_model: NickyNicky/TinyDolphin-2.8-1.1b_oasst2_chatML_Cluster_3_V1
    positive_prompts:
      - ""

base_model: NickyNicky/TinyDolphin-2.8-1.1b_oasst2_chatML_Cluster_1_V1
gate_mode: random  # one of "hidden", "cheap_embed", or "random"
dtype: bfloat16    # output dtype (float32, float16, or bfloat16)
```
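
This appears to be a mergekit-style MoE merge configuration: several TinyDolphin cluster fine-tunes are combined as experts on top of a shared base model, with random gate initialization and bfloat16 output. As a quick sanity check (not part of the original card), the MoE layout of the published checkpoint can be read back from its config; the sketch below assumes the repository exposes a Mixtral-style configuration, as the model name suggests:

```python
from transformers import AutoConfig

# Hypothetical check: inspect the MoE structure of the merged checkpoint.
cfg = AutoConfig.from_pretrained(
    "NickyNicky/Mixtral-4x1.1B-TinyDolphin-2.8-1.1b_oasst2_chatML_Cluster",
    trust_remote_code=True,
)
print(cfg.model_type)           # expected: "mixtral"
print(cfg.num_local_experts)    # experts per MoE layer
print(cfg.num_experts_per_tok)  # experts routed per token
```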

```python
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    GenerationConfig,
    TextIteratorStreamer,
)
import torch

new_model = "NickyNicky/Mixtral-4x1.1B-TinyDolphin-2.8-1.1b_oasst2_chatML_Cluster"

# Load the merged MoE checkpoint in bfloat16.
model = AutoModelForCausalLM.from_pretrained(
    new_model,
    device_map="auto",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
    # use_flash_attention_2=False,
)
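
# Optional alternative (not in the original card): load the checkpoint in 4-bit with
# bitsandbytes via the BitsAndBytesConfig imported above; parameter names follow the
# standard transformers quantization API. Uncomment to use.
# bnb_config = BitsAndBytesConfig(
#     load_in_4bit=True,
#     bnb_4bit_quant_type="nf4",
#     bnb_4bit_compute_dtype=torch.bfloat16,
# )
# model = AutoModelForCausalLM.from_pretrained(
#     new_model,
#     device_map="auto",
#     trust_remote_code=True,
#     quantization_config=bnb_config,
# )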

tokenizer = AutoTokenizer.from_pretrained(
    new_model,
    max_length=2048,
    trust_remote_code=True,
    use_fast=True,
)

# Reuse the EOS token for padding and pad on the right.
tokenizer.pad_token = tokenizer.eos_token
# tokenizer.padding_side = 'left'
tokenizer.padding_side = 'right'
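
# Prompt-construction alternative (assumption, not shown in the original card): if this
# tokenizer ships a ChatML chat template, the prompt below could also be built with
# tokenizer.apply_chat_template, e.g.:
# messages = [
#     {"role": "system", "content": "You are a helpful AI assistant."},
#     {"role": "user", "content": "escribe una historia de amor."},
# ]
# prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)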

# ChatML prompt ("escribe una historia de amor." = "write a love story.").
prompt = """<|im_start|>system
You are a helpful AI assistant.<|im_end|>
<|im_start|>user
escribe una historia de amor.<|im_end|>
<|im_start|>assistant
"""

inputs = tokenizer.encode(
    prompt,
    return_tensors="pt",
    add_special_tokens=False,
).cuda()

generation_config = GenerationConfig(
    max_new_tokens=700,
    temperature=0.5,
    top_p=0.9,
    top_k=40,
    repetition_penalty=1.1,  # 1.0 means no penalty; 1.2 was suggested in the CTRL paper
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,
    eos_token_id=tokenizer.eos_token_id,
)

outputs = model.generate(
    generation_config=generation_config,
    input_ids=inputs,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```
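
`TextIteratorStreamer` is imported in the snippet above but never used there. As a small addition (not part of the original card), here is a minimal streaming sketch that reuses `model`, `tokenizer`, `inputs`, and `generation_config` from that snippet and follows the standard transformers streamer pattern:

```python
from threading import Thread

from transformers import TextIteratorStreamer

# Stream decoded text as it is generated, skipping the prompt and special tokens.
streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

generation_kwargs = dict(
    input_ids=inputs,
    generation_config=generation_config,
    streamer=streamer,
)

# generate() blocks, so run it in a background thread and consume the streamer here.
thread = Thread(target=model.generate, kwargs=generation_kwargs)
thread.start()

for new_text in streamer:
    print(new_text, end="", flush=True)
thread.join()
```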