Kquant03 committed on
Commit b7b0e79
1 Parent(s): c617829

Upload 11 files

README.md CHANGED
@@ -1,81 +1,125 @@
  ---
- license: mit
- language:
- - en
- thumbnail: "https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/TqnMpteVAyfiiNHx4lVkU.png"
- ---
- # You are welcome here, traveler.
- ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/TqnMpteVAyfiiNHx4lVkU.png)
-
- ### Named after the method used to create it: interleaving the layers of its predecessor to make it far larger, giving it much more potential.
-
- [Elothir](https://wowpedia.fandom.com/wiki/Elothir) was an ancient treant, and I couldn't think of a better name for a model created using the passthrough method.
-
- The passthrough method differs significantly from the previous ones. By concatenating layers from different LLMs, it can produce models with an exotic number of parameters (e.g., 9B from two 7B-parameter models). These models are often referred to as "frankenmerges" or "Frankenstein models" by the community.
-
- Many thanks to [Abacaj](https://huggingface.co/abacaj) for providing the [fine-tuned weights](https://huggingface.co/abacaj/phi-2-super) that were used in the creation of this base model. You can find the full script for how the model was merged [here](https://huggingface.co/Replete-AI/Phi-Elothir/blob/main/mergekit_config.yml). Thanks to [KatyTheCutie](https://huggingface.co/KatyTheCutie) for helping me figure out how to make the model as big as I possibly could.
-
- ## This idea was brought to me by [The Face of Goonery](https://huggingface.co/The-Face-Of-Goonery), also known as Caleb Morgan. I have him to thank if fine-tuning this model turns out to be a success.
- # How to run inference:
-
- ```python
- import transformers
- import torch
-
- if __name__ == "__main__":
-     model_name = "abacaj/phi-2-super"
-     tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)
-
-     model = (
-         transformers.AutoModelForCausalLM.from_pretrained(
-             model_name,
-         )
-         .to("cuda:0")
-         .eval()
-     )
-
-     messages = [
-         {"role": "user", "content": "Hello, who are you?"}
-     ]
-     inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
-     input_ids_cutoff = inputs.size(dim=1)
-
-     with torch.no_grad():
-         generated_ids = model.generate(
-             input_ids=inputs,
-             use_cache=True,
-             max_new_tokens=512,
-             temperature=0.2,
-             top_p=0.95,
-             do_sample=True,
-             eos_token_id=tokenizer.eos_token_id,
-             pad_token_id=tokenizer.pad_token_id,
-         )
-
-     completion = tokenizer.decode(
-         generated_ids[0][input_ids_cutoff:],
-         skip_special_tokens=True,
-     )
-
-     print(completion)
- ```
-
- # Chat template
-
- The model uses the same chat template as found in Mistral instruct models:
-
- ```python
- text = "<|endoftext|>[INST] What is your favourite condiment? [/INST]"
- "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!<|endoftext|> "
- "[INST] Do you have mayonnaise recipes? [/INST]"
  ```
-
- You don't need to do it manually if you use the HF transformers tokenizer:
-
- ```python
- messages = [
-     {"role": "user", "content": "Hello, who are you?"},
-     {"role": "assistant", "content": "I am ..."}
- ]
- inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
- ```
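A note on the chat template in the removed README above: it can be inspected without running generation. This is a minimal sketch, assuming the tokenizer shipped with abacaj/phi-2-super defines the Mistral-style template, that renders the messages to the prompt string instead of token ids:

```python
from transformers import AutoTokenizer

# Minimal sketch: render the chat template to text instead of token ids,
# assuming the tokenizer shipped with abacaj/phi-2-super defines it.
tokenizer = AutoTokenizer.from_pretrained("abacaj/phi-2-super")

messages = [
    {"role": "user", "content": "What is your favourite condiment?"},
    {"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice."},
    {"role": "user", "content": "Do you have mayonnaise recipes?"},
]

# tokenize=False returns the rendered prompt string; it should follow the
# [INST] ... [/INST] layout shown above, with <|endoftext|> as BOS/EOS.
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(prompt)
```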
 
  ---
+ base_model:
+ - abacaj/phi-2-super
+ tags:
+ - mergekit
+ - merge
+ ---
+ # Teldrassil
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [abacaj/phi-2-super](https://huggingface.co/abacaj/phi-2-super)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ dtype: float16
+ merge_method: passthrough
+ slices:
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [0,2]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [1,3]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [2,4]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [3,5]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [4,6]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [5,7]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [6,8]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [7,9]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [8,10]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [9,11]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [10,12]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [11,13]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [12,14]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [13,15]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [14,16]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [15,17]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [16,18]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [17,19]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [18,20]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [19,21]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [20,22]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [21,23]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [22,24]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [23,25]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [24,26]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [25,27]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [26,28]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [27,29]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [28,30]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [29,31]
+ - sources:
+   - model: abacaj/phi-2-super
+     layer_range: [30,32]
  ```
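The 31 overlapping two-layer slices above duplicate each interior layer of the 32-layer donor, yielding the 62-layer stack recorded as `num_hidden_layers: 62` in the merged config.json further down. A minimal sketch of how such a slice list could be generated and sanity-checked rather than written by hand (the helper names are illustrative, not part of mergekit):

```python
import yaml  # pip install pyyaml

# Sketch: generate the passthrough slice list above programmatically.
MODEL = "abacaj/phi-2-super"
N_LAYERS = 32   # depth of the donor model
WINDOW = 2      # layers per slice
STRIDE = 1      # consecutive slices overlap by WINDOW - STRIDE layers

slices = [
    {"sources": [{"model": MODEL, "layer_range": [start, start + WINDOW]}]}
    for start in range(0, N_LAYERS - WINDOW + 1, STRIDE)
]
config = {"dtype": "float16", "merge_method": "passthrough", "slices": slices}

# Each slice contributes WINDOW layers to the merged stack.
total_layers = sum(hi - lo for s in slices
                   for (lo, hi) in [s["sources"][0]["layer_range"]])
assert len(slices) == 31 and total_layers == 62  # matches num_hidden_layers below

print(yaml.safe_dump(config, sort_keys=False))
```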
added_tokens.json ADDED
@@ -0,0 +1,40 @@
+ {
+   "\t\t": 50294,
+   "\t\t\t": 50293,
+   "\t\t\t\t": 50292,
+   "\t\t\t\t\t": 50291,
+   "\t\t\t\t\t\t": 50290,
+   "\t\t\t\t\t\t\t": 50289,
+   "\t\t\t\t\t\t\t\t": 50288,
+   "\t\t\t\t\t\t\t\t\t": 50287,
+   "  ": 50286,
+   "   ": 50285,
+   "    ": 50284,
+   "     ": 50283,
+   "      ": 50282,
+   "       ": 50281,
+   "        ": 50280,
+   "         ": 50279,
+   "          ": 50278,
+   "           ": 50277,
+   "            ": 50276,
+   "             ": 50275,
+   "              ": 50274,
+   "               ": 50273,
+   "                ": 50272,
+   "                 ": 50271,
+   "                  ": 50270,
+   "                   ": 50269,
+   "                    ": 50268,
+   "                     ": 50267,
+   "                      ": 50266,
+   "                       ": 50265,
+   "                        ": 50264,
+   "                         ": 50263,
+   "                          ": 50262,
+   "                           ": 50261,
+   "                            ": 50260,
+   "                             ": 50259,
+   "                              ": 50258,
+   "                               ": 50257
+ }
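The whitespace keys above are literal runs of spaces whose lengths an HTML view collapses. Assuming the CodeGen-style whitespace tokens that phi-2 inherits, ids 50257 through 50286 presumably map runs of 31 down to 2 spaces, mirroring the visible tab series (9 down to 2 tabs for 50287 through 50294). A sketch regenerating the mapping under that assumption:

```python
import json

# Assumption: CodeGen-style whitespace tokens, added longest-first, so longer
# runs received lower ids (this mirrors the visible tab entries above).
added_tokens = {}
for token_id, n_spaces in zip(range(50257, 50287), reversed(range(2, 32))):
    added_tokens[" " * n_spaces] = token_id   # 31 spaces -> 50257, ..., 2 spaces -> 50286
for token_id, n_tabs in zip(range(50287, 50295), reversed(range(2, 10))):
    added_tokens["\t" * n_tabs] = token_id    # 9 tabs -> 50287, ..., 2 tabs -> 50294

assert added_tokens["\t" * 2] == 50294 and added_tokens[" " * 2] == 50286
print(json.dumps(added_tokens, indent=2))
```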
config.json ADDED
@@ -0,0 +1,34 @@
+ {
+   "_name_or_path": "abacaj/phi-2-super",
+   "architectures": [
+     "PhiForCausalLM"
+   ],
+   "attention_dropout": 0.0,
+   "auto_map": {
+     "AutoConfig": "microsoft/phi-2--configuration_phi.PhiConfig",
+     "AutoModelForCausalLM": "microsoft/phi-2--modeling_phi.PhiForCausalLM"
+   },
+   "bos_token_id": 50256,
+   "embd_pdrop": 0.0,
+   "eos_token_id": 50256,
+   "hidden_act": "gelu_new",
+   "hidden_size": 2560,
+   "initializer_range": 0.02,
+   "intermediate_size": 10240,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 2048,
+   "model_type": "phi",
+   "num_attention_heads": 32,
+   "num_hidden_layers": 62,
+   "num_key_value_heads": 32,
+   "partial_rotary_factor": 0.4,
+   "qk_layernorm": false,
+   "resid_pdrop": 0.1,
+   "rope_scaling": null,
+   "rope_theta": 10000.0,
+   "tie_word_embeddings": false,
+   "torch_dtype": "float16",
+   "transformers_version": "4.38.2",
+   "use_cache": true,
+   "vocab_size": 51200
+ }
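One practical consequence of the config above: `auto_map` routes `AutoConfig`/`AutoModelForCausalLM` to the `microsoft/phi-2` remote code, so loading this checkpoint presumably needs `trust_remote_code=True` on transformers versions without built-in phi support. A minimal loading sketch, using the `Replete-AI/Phi-Elothir` repo id that the old README links to (adjust if the merge was published under a different name):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id taken from the merge-script link in the old README; illustrative.
repo = "Replete-AI/Phi-Elothir"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,   # matches "torch_dtype": "float16" above
    trust_remote_code=True,      # auto_map points at microsoft/phi-2 remote code
).to("cuda:0").eval()
```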
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:121b7f5714504562fac7d4c6a7462faed3c7bfc2fcfad7c094acc821f60d1a1d
+ size 9991282024
model-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a57b9351ea4cbb5b75676f5d14777cae62123be5eeb5145cf878d9044ad6f5f4
+ size 288463400
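The two entries above are git-lfs pointer files: `oid` is the SHA-256 digest of the actual shard and `size` is its byte count (roughly 10.0 GB and 288 MB). A sketch for verifying a downloaded shard against its pointer:

```python
import hashlib
import os

def verify_lfs_pointer(path: str, oid: str, size: int) -> bool:
    """Check a downloaded file against a git-lfs pointer's sha256 oid and size."""
    if os.path.getsize(path) != size:
        return False
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest() == oid

# Values copied from the pointer for shard 1 of 2:
print(verify_lfs_pointer(
    "model-00001-of-00002.safetensors",
    "121b7f5714504562fac7d4c6a7462faed3c7bfc2fcfad7c094acc821f60d1a1d",
    9991282024,
))
```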
model.safetensors.index.json ADDED
@@ -0,0 +1 @@
+ {"metadata": {"mergekit_version": "0.0.4"}, "weight_map": {"model.layers.60.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.59.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.60.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.59.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.58.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.57.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.58.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.57.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.58.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.57.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.58.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.57.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.58.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.57.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.58.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.57.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.58.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.57.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.56.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.55.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.56.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.55.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.56.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.55.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.56.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.55.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.55.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.55.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.55.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.k_proj.bias": 
"model-00001-of-00002.safetensors", "model.layers.55.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.55.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.55.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.55.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.56.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.55.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.56.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.55.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.56.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.55.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.54.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.53.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.54.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.53.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.54.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.53.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.54.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.53.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.54.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.53.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.54.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.53.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.54.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.53.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.52.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.51.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.52.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.51.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.52.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.51.mlp.fc1.weight": 
"model-00001-of-00002.safetensors", "model.layers.52.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.51.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.52.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.51.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.52.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.51.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.52.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.51.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.50.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.49.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.50.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.49.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.50.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.49.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.50.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.49.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.50.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.49.self_attn.dense.bias": "model-00001-of-00002.safetensors", 
"model.layers.50.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.49.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.50.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.49.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.48.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.47.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.48.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.47.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.48.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.47.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.48.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.47.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.48.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.47.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.48.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.47.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.48.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.47.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.46.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.45.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.46.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.45.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.46.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.45.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.46.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.45.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.46.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.46.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.46.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.46.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.k_proj.bias": 
"model-00001-of-00002.safetensors", "model.layers.46.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.46.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.46.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.46.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.45.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.46.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.45.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.46.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.45.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.44.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.43.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.44.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.43.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.44.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.43.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.44.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.43.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.44.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.43.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.44.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.43.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.44.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.43.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.42.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.41.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.42.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.41.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.42.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.41.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.42.mlp.fc1.bias": "model-00001-of-00002.safetensors", 
"model.layers.41.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.42.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.41.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.42.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.41.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.42.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.41.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.40.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.39.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.40.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.39.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.40.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.39.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.40.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.39.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.40.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.39.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.40.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.39.input_layernorm.weight": 
"model-00001-of-00002.safetensors", "model.layers.40.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.39.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.38.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.37.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.38.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.37.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.38.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.37.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.38.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.37.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.38.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.37.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.38.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.37.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.38.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.37.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.36.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.35.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.36.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.35.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.36.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.35.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.36.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.35.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.35.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.35.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.35.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.35.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", 
"model.layers.35.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.35.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.35.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.36.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.35.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.36.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.35.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.36.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.35.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.34.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.33.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.34.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.33.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.34.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.33.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.34.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.33.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.34.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.33.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.34.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.33.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.34.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.33.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.32.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.31.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.32.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.31.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.32.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.31.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.32.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.31.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.v_proj.weight": 
"model-00001-of-00002.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.31.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.31.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.31.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.31.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.32.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.31.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.32.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.31.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.32.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.31.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.30.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.29.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.30.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.29.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.30.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.29.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.30.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.29.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.30.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.29.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.30.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.29.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.30.input_layernorm.bias": "model-00001-of-00002.safetensors", 
"model.layers.29.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.28.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.27.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.28.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.27.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.28.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.27.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.28.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.27.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.28.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.27.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.28.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.27.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.28.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.27.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.26.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.25.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.26.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.25.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.26.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.25.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.26.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.25.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.25.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.25.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.q_proj.bias": 
"model-00001-of-00002.safetensors", "model.layers.25.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.25.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.26.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.25.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.26.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.25.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.26.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.25.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.24.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.23.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.24.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.23.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.24.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.23.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.24.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.23.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.24.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.23.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.24.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.23.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.24.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.23.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.22.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.21.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.22.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.21.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.22.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.21.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.22.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.21.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", 
"model.layers.22.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.22.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.22.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.22.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.22.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.21.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.22.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.21.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.22.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.21.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.20.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.19.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.20.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.19.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.20.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.19.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.20.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.19.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.20.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.19.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.20.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.19.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.20.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.19.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.18.mlp.fc2.weight": 
"model-00001-of-00002.safetensors", "model.layers.17.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.18.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.17.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.18.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.17.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.18.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.17.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.18.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.17.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.18.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.17.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.18.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.17.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.16.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.15.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.16.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.15.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.16.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.15.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.16.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.15.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.16.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.16.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.16.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.q_proj.bias": 
"model-00001-of-00002.safetensors", "model.layers.16.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.16.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.15.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.16.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.15.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.16.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.15.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.14.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.13.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.14.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.13.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.14.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.13.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.14.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.13.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.14.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.13.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.14.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.13.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.14.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.13.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.12.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.11.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.12.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.11.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.12.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.11.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.12.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.11.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", 
"model.layers.11.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.11.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.11.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.11.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.12.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.11.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.12.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.11.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.12.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.11.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.10.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.9.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.10.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.9.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.10.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.9.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.10.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.9.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.10.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.9.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.10.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.9.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.10.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.9.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.8.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.7.mlp.fc2.weight": "model-00001-of-00002.safetensors", 
"model.layers.8.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.7.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.8.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.7.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.8.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.7.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.8.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.7.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.8.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.7.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.8.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.7.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.6.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.5.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.6.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.5.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.6.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.5.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.6.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.5.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.6.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.6.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.6.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.6.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.dense.weight": 
"model-00001-of-00002.safetensors", "model.layers.6.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.5.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.6.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.5.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.6.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.5.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.4.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.3.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.4.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.3.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.4.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.3.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.4.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.3.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.4.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.3.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.4.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.3.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.4.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.3.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.2.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.1.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.2.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.1.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.2.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.1.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.2.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.1.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.2.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.k_proj.weight": 
"model-00001-of-00002.safetensors", "model.layers.2.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.2.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.2.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.2.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.1.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.2.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.1.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.2.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.1.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.0.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.0.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.0.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.0.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.0.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.0.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.embed_tokens.weight": "model-00001-of-00002.safetensors", "model.final_layernorm.weight": "model-00001-of-00002.safetensors", "model.final_layernorm.bias": "model-00001-of-00002.safetensors", "lm_head.weight": "model-00001-of-00002.safetensors", "lm_head.bias": "model-00001-of-00002.safetensors", "model.layers.61.mlp.fc2.weight": "model-00001-of-00002.safetensors", "model.layers.61.mlp.fc2.bias": "model-00001-of-00002.safetensors", "model.layers.61.mlp.fc1.weight": "model-00001-of-00002.safetensors", "model.layers.61.mlp.fc1.bias": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.v_proj.weight": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.v_proj.bias": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.k_proj.weight": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.k_proj.bias": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.q_proj.weight": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.q_proj.bias": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.dense.weight": "model-00001-of-00002.safetensors", "model.layers.61.self_attn.dense.bias": "model-00001-of-00002.safetensors", "model.layers.61.input_layernorm.weight": "model-00001-of-00002.safetensors", "model.layers.61.input_layernorm.bias": "model-00001-of-00002.safetensors", "model.layers.60.mlp.fc2.weight": 
"model-00002-of-00002.safetensors", "model.layers.59.mlp.fc2.weight": "model-00002-of-00002.safetensors", "model.layers.60.mlp.fc2.bias": "model-00002-of-00002.safetensors", "model.layers.59.mlp.fc2.bias": "model-00002-of-00002.safetensors", "model.layers.60.mlp.fc1.weight": "model-00002-of-00002.safetensors", "model.layers.59.mlp.fc1.weight": "model-00002-of-00002.safetensors", "model.layers.60.mlp.fc1.bias": "model-00002-of-00002.safetensors", "model.layers.59.mlp.fc1.bias": "model-00002-of-00002.safetensors", "model.layers.60.self_attn.v_proj.weight": "model-00002-of-00002.safetensors", "model.layers.59.self_attn.v_proj.weight": "model-00002-of-00002.safetensors", "model.layers.60.self_attn.v_proj.bias": "model-00002-of-00002.safetensors", "model.layers.59.self_attn.v_proj.bias": "model-00002-of-00002.safetensors", "model.layers.60.self_attn.k_proj.weight": "model-00002-of-00002.safetensors", "model.layers.59.self_attn.k_proj.weight": "model-00002-of-00002.safetensors", "model.layers.60.self_attn.k_proj.bias": "model-00002-of-00002.safetensors", "model.layers.59.self_attn.k_proj.bias": "model-00002-of-00002.safetensors", "model.layers.60.self_attn.dense.weight": "model-00002-of-00002.safetensors", "model.layers.59.self_attn.dense.weight": "model-00002-of-00002.safetensors", "model.layers.60.self_attn.dense.bias": "model-00002-of-00002.safetensors", "model.layers.59.self_attn.dense.bias": "model-00002-of-00002.safetensors", "model.layers.60.input_layernorm.weight": "model-00002-of-00002.safetensors", "model.layers.59.input_layernorm.weight": "model-00002-of-00002.safetensors", "model.layers.60.input_layernorm.bias": "model-00002-of-00002.safetensors", "model.layers.59.input_layernorm.bias": "model-00002-of-00002.safetensors"}}
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|endoftext|>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|endoftext|>",
11
+ "lstrip": false,
12
+ "normalized": false,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "pad_token": {
17
+ "content": "<|endoftext|>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "unk_token": {
24
+ "content": "<|endoftext|>",
25
+ "lstrip": false,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ }
30
+ }
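
Every special slot in this map points at the same `<|endoftext|>` token, which is the token phi-2 uses for BOS, EOS, padding, and unknowns alike. A quick hedged check (assuming the merged repo id `Replete-AI/Phi-Elothir`; a local path containing these files works the same way):

```python
from transformers import AutoTokenizer

# special_tokens_map.json collapses every special role onto <|endoftext|>.
tok = AutoTokenizer.from_pretrained("Replete-AI/Phi-Elothir")
assert tok.bos_token == tok.eos_token == tok.pad_token == tok.unk_token == "<|endoftext|>"
print(tok.eos_token_id)  # the single id shared by all four roles
```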
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,325 @@
1
+ {
2
+ "add_prefix_space": false,
3
+ "added_tokens_decoder": {
4
+ "50256": {
5
+ "content": "<|endoftext|>",
6
+ "lstrip": false,
7
+ "normalized": false,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ },
12
+ "50257": {
13
+ "content": " ",
14
+ "lstrip": false,
15
+ "normalized": true,
16
+ "rstrip": false,
17
+ "single_word": false,
18
+ "special": false
19
+ },
20
+ "50258": {
21
+ "content": " ",
22
+ "lstrip": false,
23
+ "normalized": true,
24
+ "rstrip": false,
25
+ "single_word": false,
26
+ "special": false
27
+ },
28
+ "50259": {
29
+ "content": " ",
30
+ "lstrip": false,
31
+ "normalized": true,
32
+ "rstrip": false,
33
+ "single_word": false,
34
+ "special": false
35
+ },
36
+ "50260": {
37
+ "content": " ",
38
+ "lstrip": false,
39
+ "normalized": true,
40
+ "rstrip": false,
41
+ "single_word": false,
42
+ "special": false
43
+ },
44
+ "50261": {
45
+ "content": " ",
46
+ "lstrip": false,
47
+ "normalized": true,
48
+ "rstrip": false,
49
+ "single_word": false,
50
+ "special": false
51
+ },
52
+ "50262": {
53
+ "content": " ",
54
+ "lstrip": false,
55
+ "normalized": true,
56
+ "rstrip": false,
57
+ "single_word": false,
58
+ "special": false
59
+ },
60
+ "50263": {
61
+ "content": " ",
62
+ "lstrip": false,
63
+ "normalized": true,
64
+ "rstrip": false,
65
+ "single_word": false,
66
+ "special": false
67
+ },
68
+ "50264": {
69
+ "content": " ",
70
+ "lstrip": false,
71
+ "normalized": true,
72
+ "rstrip": false,
73
+ "single_word": false,
74
+ "special": false
75
+ },
76
+ "50265": {
77
+ "content": " ",
78
+ "lstrip": false,
79
+ "normalized": true,
80
+ "rstrip": false,
81
+ "single_word": false,
82
+ "special": false
83
+ },
84
+ "50266": {
85
+ "content": " ",
86
+ "lstrip": false,
87
+ "normalized": true,
88
+ "rstrip": false,
89
+ "single_word": false,
90
+ "special": false
91
+ },
92
+ "50267": {
93
+ "content": " ",
94
+ "lstrip": false,
95
+ "normalized": true,
96
+ "rstrip": false,
97
+ "single_word": false,
98
+ "special": false
99
+ },
100
+ "50268": {
101
+ "content": " ",
102
+ "lstrip": false,
103
+ "normalized": true,
104
+ "rstrip": false,
105
+ "single_word": false,
106
+ "special": false
107
+ },
108
+ "50269": {
109
+ "content": " ",
110
+ "lstrip": false,
111
+ "normalized": true,
112
+ "rstrip": false,
113
+ "single_word": false,
114
+ "special": false
115
+ },
116
+ "50270": {
117
+ "content": " ",
118
+ "lstrip": false,
119
+ "normalized": true,
120
+ "rstrip": false,
121
+ "single_word": false,
122
+ "special": false
123
+ },
124
+ "50271": {
125
+ "content": " ",
126
+ "lstrip": false,
127
+ "normalized": true,
128
+ "rstrip": false,
129
+ "single_word": false,
130
+ "special": false
131
+ },
132
+ "50272": {
133
+ "content": " ",
134
+ "lstrip": false,
135
+ "normalized": true,
136
+ "rstrip": false,
137
+ "single_word": false,
138
+ "special": false
139
+ },
140
+ "50273": {
141
+ "content": " ",
142
+ "lstrip": false,
143
+ "normalized": true,
144
+ "rstrip": false,
145
+ "single_word": false,
146
+ "special": false
147
+ },
148
+ "50274": {
149
+ "content": " ",
150
+ "lstrip": false,
151
+ "normalized": true,
152
+ "rstrip": false,
153
+ "single_word": false,
154
+ "special": false
155
+ },
156
+ "50275": {
157
+ "content": " ",
158
+ "lstrip": false,
159
+ "normalized": true,
160
+ "rstrip": false,
161
+ "single_word": false,
162
+ "special": false
163
+ },
164
+ "50276": {
165
+ "content": " ",
166
+ "lstrip": false,
167
+ "normalized": true,
168
+ "rstrip": false,
169
+ "single_word": false,
170
+ "special": false
171
+ },
172
+ "50277": {
173
+ "content": " ",
174
+ "lstrip": false,
175
+ "normalized": true,
176
+ "rstrip": false,
177
+ "single_word": false,
178
+ "special": false
179
+ },
180
+ "50278": {
181
+ "content": " ",
182
+ "lstrip": false,
183
+ "normalized": true,
184
+ "rstrip": false,
185
+ "single_word": false,
186
+ "special": false
187
+ },
188
+ "50279": {
189
+ "content": " ",
190
+ "lstrip": false,
191
+ "normalized": true,
192
+ "rstrip": false,
193
+ "single_word": false,
194
+ "special": false
195
+ },
196
+ "50280": {
197
+ "content": " ",
198
+ "lstrip": false,
199
+ "normalized": true,
200
+ "rstrip": false,
201
+ "single_word": false,
202
+ "special": false
203
+ },
204
+ "50281": {
205
+ "content": " ",
206
+ "lstrip": false,
207
+ "normalized": true,
208
+ "rstrip": false,
209
+ "single_word": false,
210
+ "special": false
211
+ },
212
+ "50282": {
213
+ "content": " ",
214
+ "lstrip": false,
215
+ "normalized": true,
216
+ "rstrip": false,
217
+ "single_word": false,
218
+ "special": false
219
+ },
220
+ "50283": {
221
+ "content": " ",
222
+ "lstrip": false,
223
+ "normalized": true,
224
+ "rstrip": false,
225
+ "single_word": false,
226
+ "special": false
227
+ },
228
+ "50284": {
229
+ "content": " ",
230
+ "lstrip": false,
231
+ "normalized": true,
232
+ "rstrip": false,
233
+ "single_word": false,
234
+ "special": false
235
+ },
236
+ "50285": {
237
+ "content": " ",
238
+ "lstrip": false,
239
+ "normalized": true,
240
+ "rstrip": false,
241
+ "single_word": false,
242
+ "special": false
243
+ },
244
+ "50286": {
245
+ "content": " ",
246
+ "lstrip": false,
247
+ "normalized": true,
248
+ "rstrip": false,
249
+ "single_word": false,
250
+ "special": false
251
+ },
252
+ "50287": {
253
+ "content": "\t\t\t\t\t\t\t\t\t",
254
+ "lstrip": false,
255
+ "normalized": true,
256
+ "rstrip": false,
257
+ "single_word": false,
258
+ "special": false
259
+ },
260
+ "50288": {
261
+ "content": "\t\t\t\t\t\t\t\t",
262
+ "lstrip": false,
263
+ "normalized": true,
264
+ "rstrip": false,
265
+ "single_word": false,
266
+ "special": false
267
+ },
268
+ "50289": {
269
+ "content": "\t\t\t\t\t\t\t",
270
+ "lstrip": false,
271
+ "normalized": true,
272
+ "rstrip": false,
273
+ "single_word": false,
274
+ "special": false
275
+ },
276
+ "50290": {
277
+ "content": "\t\t\t\t\t\t",
278
+ "lstrip": false,
279
+ "normalized": true,
280
+ "rstrip": false,
281
+ "single_word": false,
282
+ "special": false
283
+ },
284
+ "50291": {
285
+ "content": "\t\t\t\t\t",
286
+ "lstrip": false,
287
+ "normalized": true,
288
+ "rstrip": false,
289
+ "single_word": false,
290
+ "special": false
291
+ },
292
+ "50292": {
293
+ "content": "\t\t\t\t",
294
+ "lstrip": false,
295
+ "normalized": true,
296
+ "rstrip": false,
297
+ "single_word": false,
298
+ "special": false
299
+ },
300
+ "50293": {
301
+ "content": "\t\t\t",
302
+ "lstrip": false,
303
+ "normalized": true,
304
+ "rstrip": false,
305
+ "single_word": false,
306
+ "special": false
307
+ },
308
+ "50294": {
309
+ "content": "\t\t",
310
+ "lstrip": false,
311
+ "normalized": true,
312
+ "rstrip": false,
313
+ "single_word": false,
314
+ "special": false
315
+ }
316
+ },
317
+ "bos_token": "<|endoftext|>",
318
+ "chat_template": "{{ bos_token }}{% for message in messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% if message['role'] == 'user' %}{{ '[INST] ' + message['content'] + ' [/INST]' }}{% elif message['role'] == 'assistant' %}{{ message['content'] + eos_token + ' ' }}{% else %}{{ raise_exception('Only user and assistant roles are supported!') }}{% endif %}{% endfor %}",
319
+ "clean_up_tokenization_spaces": true,
320
+ "eos_token": "<|endoftext|>",
321
+ "model_max_length": 2048,
322
+ "pad_token": "<|endoftext|>",
323
+ "tokenizer_class": "CodeGenTokenizer",
324
+ "unk_token": "<|endoftext|>"
325
+ }
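
The most important field here is `chat_template`: it is the Jinja string that `tokenizer.apply_chat_template` uses to build the `[INST] ... [/INST]` prompt format. A minimal sketch of rendering it outside of `transformers`, just to see the string it produces (assumes `tokenizer_config.json` is in the working directory and `jinja2` is installed):

```python
import json
from jinja2 import Template

with open("tokenizer_config.json") as f:
    cfg = json.load(f)

# The template calls raise_exception() when roles do not alternate
# user/assistant, so we supply a small stand-in for it here.
def raise_exception(message):
    raise ValueError(message)

prompt = Template(cfg["chat_template"]).render(
    messages=[{"role": "user", "content": "Hi there"}],
    bos_token=cfg["bos_token"],
    eos_token=cfg["eos_token"],
    raise_exception=raise_exception,
)
print(prompt)  # -> <|endoftext|>[INST] Hi there [/INST]
```

Note also the `model_max_length` of 2048 and the `CodeGenTokenizer` class, both inherited from the phi-2 base this merge was built from.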
vocab.json ADDED
The diff for this file is too large to render. See raw diff