matchaaaaa committed
Commit a4fdf26
1 Parent(s): 3dfcbf1

Upload 52 files

This view is limited to 50 files because it contains too many changes. See raw diff.
Files changed (50)
  1. README.md +84 -3
  2. chaifighter-cute.png +0 -0
  3. config.json +25 -0
  4. model-00001-of-00044.safetensors +3 -0
  5. model-00002-of-00044.safetensors +3 -0
  6. model-00003-of-00044.safetensors +3 -0
  7. model-00004-of-00044.safetensors +3 -0
  8. model-00005-of-00044.safetensors +3 -0
  9. model-00006-of-00044.safetensors +3 -0
  10. model-00007-of-00044.safetensors +3 -0
  11. model-00008-of-00044.safetensors +3 -0
  12. model-00009-of-00044.safetensors +3 -0
  13. model-00010-of-00044.safetensors +3 -0
  14. model-00011-of-00044.safetensors +3 -0
  15. model-00012-of-00044.safetensors +3 -0
  16. model-00013-of-00044.safetensors +3 -0
  17. model-00014-of-00044.safetensors +3 -0
  18. model-00015-of-00044.safetensors +3 -0
  19. model-00016-of-00044.safetensors +3 -0
  20. model-00017-of-00044.safetensors +3 -0
  21. model-00018-of-00044.safetensors +3 -0
  22. model-00019-of-00044.safetensors +3 -0
  23. model-00020-of-00044.safetensors +3 -0
  24. model-00021-of-00044.safetensors +3 -0
  25. model-00022-of-00044.safetensors +3 -0
  26. model-00023-of-00044.safetensors +3 -0
  27. model-00024-of-00044.safetensors +3 -0
  28. model-00025-of-00044.safetensors +3 -0
  29. model-00026-of-00044.safetensors +3 -0
  30. model-00027-of-00044.safetensors +3 -0
  31. model-00028-of-00044.safetensors +3 -0
  32. model-00029-of-00044.safetensors +3 -0
  33. model-00030-of-00044.safetensors +3 -0
  34. model-00031-of-00044.safetensors +3 -0
  35. model-00032-of-00044.safetensors +3 -0
  36. model-00033-of-00044.safetensors +3 -0
  37. model-00034-of-00044.safetensors +3 -0
  38. model-00035-of-00044.safetensors +3 -0
  39. model-00036-of-00044.safetensors +3 -0
  40. model-00037-of-00044.safetensors +3 -0
  41. model-00038-of-00044.safetensors +3 -0
  42. model-00039-of-00044.safetensors +3 -0
  43. model-00040-of-00044.safetensors +3 -0
  44. model-00041-of-00044.safetensors +3 -0
  45. model-00042-of-00044.safetensors +3 -0
  46. model-00043-of-00044.safetensors +3 -0
  47. model-00044-of-00044.safetensors +3 -0
  48. model.safetensors.index.json +1 -0
  49. special_tokens_map.json +23 -0
  50. tokenizer.json +0 -0
README.md CHANGED
@@ -1,3 +1,84 @@
- ---
- license: apache-2.0
- ---
+ ---
+ base_model:
+ - Sao10K/Fimbulvetr-11B-v2
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+
+ ![cute](https://huggingface.co/matchaaaaa/chaifighter-20b/resolve/main/chaifighter-cute.png)
+
+ # Chaifighter 20B
+
+ Meet Chaifighter 20B. This is my shot at making [Fimbulvetr 11B v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2) a bit more creative and verbose while retaining its incredible coherence and intelligence. It also shows that SOLAR-based and Mistral-based models can be merged, as SOLAR 10.7B is itself a Mistral 7B frankenmerge that was then finetuned.
+
+ I also wanted to provide an alternative to [Psyonic Cetacean 20B](https://huggingface.co/jebcarter/psyonic-cetacean-20B), which is a fantastic model that you should check out if you haven't already! The issue with that model is that it's based on Llama 2, which is outdated now. The older architecture lacks many of the performance enhancements introduced by the Mistral architecture, and on my 16 GB RTX 4060 Ti, those enhancements were the difference between decently speedy and intolerably sluggish. I wanted to cater to those who can run more than a 13B but not a 34B, so this is a good middle ground.
+
+ Chaifighter 20B is geared towards long-form roleplay chats rather than short-form IRC/Discord RP chats. It loves verbosity and detail, and its quality will depend on how much "ammunition" you can give it. While it can sorta-kinda do short-form with some swiping, it isn't really ideal. But for those essay-writing powerhouses that love typing up a storm in the character card, this one's for you.
+
+ Chaifighter 20B natively supports a context window of only 4096 tokens. I tried RoPE scaling, but the model was not happy with it in the limited testing I did. Your mileage may vary, and if anyone manages to get it working at a higher context, I'd love to hear about it!
+
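If you want to run the unquantized weights with 🤗 Transformers, a minimal loading sketch looks something like this (bfloat16 and `device_map="auto"` are just sensible defaults here, not requirements, and you'll need room for roughly 41 GB of weights):

```python
# Minimal sketch: load Chaifighter 20B with Hugging Face Transformers.
# bfloat16 matches the stored dtype; device_map="auto" spreads the weights
# across available GPUs/CPU RAM. Keep prompts within the 4096-token window.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "matchaaaaa/Chaifighter-20B"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
```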
+ ## Mergekit
+
+ Chaifighter 20B is a frankenmerge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Prompt Template: Alpaca
+
+ ```
+ Below is an instruction that describes a task. Write a response that appropriately completes the request.
+
+ ### Instruction:
+ {prompt}
+
+ ### Response:
+ ```
+
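For example, filling in that template in Python might look like this (the helper name and the sample instruction are purely illustrative):

```python
# Illustrative helper: build an Alpaca-style prompt for Chaifighter.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str) -> str:
    """Return the full prompt string to tokenize and pass to the model."""
    return ALPACA_TEMPLATE.format(prompt=instruction)

print(build_prompt("Describe a rainy evening in a quiet teahouse."))
```

The resulting string is what you tokenize and feed to `model.generate`, or what your front-end's Alpaca instruct preset produces for you.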
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
+ * [SanjiWatsuki/Kunoichi-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-7B)
+ * [Gryphe/MythoMist-7b](https://huggingface.co/Gryphe/MythoMist-7b)
+ * [Undi95/Toppy-M-7B](https://huggingface.co/Undi95/Toppy-M-7B)
+
+ ### The Sauceeeeee
+
+ ```yaml
+ slices:
+ - sources:
+   - model: Sao10K/Fimbulvetr-11B-v2
+     layer_range: [0, 40] # all but last 8 layers
+ - sources:
+   - model: SanjiWatsuki/Kunoichi-7B
+     layer_range: [0, 24] # all but last 8 layers
+ - sources:
+   - model: Undi95/Toppy-M-7B
+     layer_range: [16, 24] # 16 layers of Toppy and MythoMist split and interleaved to (in theory) boost the model's coherence
+ - sources:
+   - model: Gryphe/MythoMist-7b
+     layer_range: [16, 24]
+ - sources:
+   - model: Undi95/Toppy-M-7B
+     layer_range: [25, 32]
+ - sources:
+   - model: Gryphe/MythoMist-7b
+     layer_range: [25, 32]
+ merge_method: passthrough
+ dtype: bfloat16
+ ```
+ Yeah, it's mad sussy. I know what I did, but I'm not sorry.
+
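If you're counting along at home: `layer_range` is end-exclusive in mergekit, so the slices above contribute 40 + 24 + 8 + 8 + 7 + 7 = 94 layers, which is exactly the `num_hidden_layers` you'll find in this repo's `config.json`. A one-line sanity check:

```python
# Layer-count sanity check for the slice ranges in the YAML above
# (mergekit's layer_range is end-exclusive: [start, end) gives end - start layers).
slices = [(0, 40), (0, 24), (16, 24), (16, 24), (25, 32), (25, 32)]
print(sum(end - start for start, end in slices))  # 94, matching config.json
```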
78
+ ## Other stuff
79
+
80
+ Okay! Fine! It's not really a 20B, it's a 21B, but I did everything planning for a 20B before deciding to add 4 more layers to the model to make it more stable. It made a big difference.
81
+
82
+ Yapping time. As far as the name is concerned, I'm going for a tea/coffee/hot drink motif for my models, and one of the names I was debating on using for this model was Chai-Latte. As I worked on this merge, I got the idea of naming it "Chaifighter" as a play on "Psyfighter2", one of the models making up Psyonic Cetacean and also a play on a model called "Tiefighter" from which it was derived. Both are fantastic models, especially given their age. They're both worth checking out too if you haven't done so. "Chai" itself is a play on a certain AI chatting website (CAI) that got me into this lovely mess in the first place. So I guess it's fitting to name the first model of the series after it.
83
+
84
+ And lastly, of course, thank you for checking out my model! Have a great day and please take care of yourself, alright? :)
chaifighter-cute.png ADDED
config.json ADDED
@@ -0,0 +1,25 @@
+ {
+   "_name_or_path": "matchaaaaa/Chaifighter-20B",
+   "architectures": [
+     "MistralForCausalLM"
+   ],
+   "attention_dropout": 0.0,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "hidden_act": "silu",
+   "hidden_size": 4096,
+   "initializer_range": 0.02,
+   "intermediate_size": 14336,
+   "max_position_embeddings": 4096,
+   "model_type": "mistral",
+   "num_attention_heads": 32,
+   "num_hidden_layers": 94,
+   "num_key_value_heads": 8,
+   "rms_norm_eps": 1e-05,
+   "rope_theta": 10000.0,
+   "tie_word_embeddings": false,
+   "torch_dtype": "bfloat16",
+   "transformers_version": "4.40.2",
+   "use_cache": true,
+   "vocab_size": 32000
+ }
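In case you're wondering where the parameter count lands (hence the "okay, fine, it's really a 21B" note in the card): plugging the values above into the usual Mistral-style count (token embeddings plus an untied LM head, 94 decoder layers, norm weights ignored) gives roughly 20.8B parameters, about 41.5 GB in bfloat16, which lines up with the sharded safetensors below. A quick sketch of the arithmetic:

```python
# Rough parameter count from the config above (norm weights ignored).
hidden, inter, layers = 4096, 14336, 94
vocab, heads, kv_heads = 32000, 32, 8
kv_dim = kv_heads * (hidden // heads)              # 8 KV heads of size 128

embed = vocab * hidden * 2                         # embed_tokens + untied lm_head
attn = 2 * hidden * hidden + 2 * hidden * kv_dim   # q/o + k/v projections per layer
mlp = 3 * hidden * inter                           # gate, up, down projections per layer
total = embed + layers * (attn + mlp)

print(f"{total / 1e9:.2f}B params, ~{total * 2 / 1e9:.1f} GB in bfloat16")
# -> 20.76B params, ~41.5 GB
```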
model-00001-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:391ade9e99dd7928c3af501d1994c919a937121e2250b655d8a10704569dcd92
+ size 994067336
model-00002-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fbe38e471f3733496062fb4889727b643541f7c3b219e41bcbbdfc6b3c7039fa
+ size 989890720
model-00003-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c18487ebe34602b79eb216d642601604bfdf5b2dd9cc18b13bc8bd02129a6eb6
+ size 998296064
model-00004-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:45be0fae510296eb327ceb4adc1bb75e122d83b79b4b74ff3c30ec392ef35010
+ size 981502456
model-00005-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9218325bb1a36c197a8b23e32795f772d8c980354b3eec2deaf24f2bb2a0dbc7
+ size 922798696
model-00006-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5088aa4546196645dfe337ad8fa72306bd8e9dc1e0083294f95f9b4d8f2bac43
+ size 989890728
model-00007-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9adc1b2713a0b23bbfba4e369ac20111685e0edbecbd38e0b5dd28b2b743abf1
+ size 989890728
model-00008-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:026ffa4da95b5e6a18101ad92017646764c1b94bd1862d9a087f7ff29ce1338f
+ size 989907328
model-00009-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:35f18260dd339a05e0a47fd6344a1c3fec4c1630ad3cd8258f33af454ae247a8
+ size 939525040
model-00010-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8efd46047779389e1002420669fc3a1361ef2d97bf43ae6d04baf375a9f7058b
+ size 922815792
model-00011-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:efa2d8a3f208f5d248b013455fc257d9d6f2afe240f4c3e2220a0e124e60cf70
+ size 973113176
model-00012-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ac66d30f698eb2b710a138f05e48fefea96089fdba4d64043ede748bc83a7e5c
+ size 889227648
model-00013-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1a06c8afb43550136e2be474bff073f0d13a99cd83d95996eaf158b832d158f7
+ size 989890744
model-00014-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dbc3b05eade7f35581b84a42fa75a0b6ea4ed12ae5bcbf993d1e192a943e8185
+ size 989890704
model-00015-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aa0f6002906aae8632894f6eaaa78e4b0a522151315ac54752d8edb8f662c35d
+ size 922799184
model-00016-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a5d5aca896bc0752d1c56f75ca8bcf911f2a321ff281c605a1a6bedc94c89d5b
+ size 989907320
model-00017-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c836ea254dd6dadda4fff2e40732a93d571aeb82dfa34146d2d1113ca132cfe5
+ size 939525040
model-00018-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f04ff38a61c1d08159ca4b4208b650bc070de941cb2efc43b0eb8357d09dd84d
+ size 922815784
model-00019-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a7f9d7e1b1ad683f1442a99bbc5e625838222cd8e016b42db854893567feb6e0
+ size 939525040
model-00020-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6e858971773a00994e941f2cb941c7b1ae111ba7a79350166ada9c4fba6f1cb3
+ size 922815792
model-00021-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:65d521abc197e49d66dc40d98b1a820ab700517715f6bc5fd07c678cdb405474
+ size 939525040
model-00022-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f6a3c6abc7c1ce7fe210b93fe89eff7eb2d32a895a88a54c7c8106c8461edd50
+ size 922815792
model-00023-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:345f12d6111dffbbf0fdc7487af77bc56b312c337cb9037a890bcd6087598439
+ size 973113176
model-00024-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1f2fb3f98e6ed98235117f5fb8b62a74e4ad2ac6db0d9a1ec63ce89c2b292406
+ size 973122496
model-00025-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e46b04c04a96dc4b1aa089c38dcf540ac6dc8a222e52c054097d77a855020ea3
+ size 981493712
model-00026-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:96e030e6b86315215de599d41d171b71ca4efbf69afdb3c3f8c0f4df2a589cc3
+ size 914384584
model-00027-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2ace59277085dc67503b7b6553ab6398872d755ac7a85b8a9f32579a59144683
+ size 956353256
model-00028-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8cc91641239086d46b9545ee6f5677472b1eb78c987b05e7043d3aa7acac0932
+ size 956353256
model-00029-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b5cb3189f1cd093aa9a49ae77d8d576d66b138d5dd3bdb49b71ff4004b315414
+ size 998270688
model-00030-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:44fd9b998fb0aa72d7ba678fda9c942c3e3fc52fe2cd0873ed5ea7407fb1ac98
+ size 897607600
model-00031-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1c5ea4b7321a25f3a91fc312e1e3a6eb0a91ffb7e84466c745547edc02de6c9e
+ size 956344952
model-00032-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c7bd9b46732d9c2d4efb130395de37fe3bc32ee94f390e5d6f3f4ab9068eb55c
+ size 989899024
model-00033-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:93e3eddfd0ae9e14c05485865155b2e1631206a9a3ecbf220cb14d444e2ff490
+ size 956353256
model-00034-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:142de3df4fd42281ee2365191ea2a1a4744976ffdc0883792e729c6cd8e8e936
+ size 998270688
model-00035-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c66a4494ffd7561de78216f6d7d04f282aa0b81df29eeda695eabeba8ebacca6
+ size 989891184
model-00036-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d0ade5dce06ce6d7036d4de48eaaf322daf5166a7870393c1b01275c3a9c2857
+ size 947956216
model-00037-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c0e46973cbdf02c729f0e25b1ab6ae8dd21df452e328eb1980d7638fef06ed4f
+ size 989890728
model-00038-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4c48129aa4819760e25fc61fa59f2fdc9605a2ef189ac77c8e447a2ffb1b71f7
+ size 989899024
model-00039-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6765ee2b8d5c0efae906132ba42687e810c5e4f32495d5961a136f206f2a5993
+ size 989890720
model-00040-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:46669f6918352b6509e485f76624c64e66e9726572071e4113302980c3545572
+ size 989890720
model-00041-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c1c5182246a37242e3db2b88057fcaa11c9d12f94732f3ce964c65074078c854
+ size 989890720
model-00042-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:da214364c112fe4c12968db76df2a591aae90ff084db5afad350595d9b342425
+ size 998296040
model-00043-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2193c278d9e659251c04be5730fed02d84d0098a310aa0006a4a7083c2463c68
+ size 981502448
model-00044-of-00044.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:93ca2a9ac1c3e4ad4587223d5e14fa9877f63002259872760cd216e8d7c1bdd9
+ size 50340304
model.safetensors.index.json ADDED
@@ -0,0 +1 @@
+ {"metadata": {"mergekit_version": "0.0.4.2", "total_size": 41529352192}, "weight_map": {"lm_head.weight": "model-00001-of-00044.safetensors", "model.embed_tokens.weight": "model-00001-of-00044.safetensors", "model.layers.40.input_layernorm.weight": "model-00001-of-00044.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00044.safetensors", "model.layers.40.mlp.down_proj.weight": "model-00001-of-00044.safetensors", "model.layers.0.mlp.down_proj.weight": "model-00001-of-00044.safetensors", "model.layers.40.mlp.gate_proj.weight": "model-00001-of-00044.safetensors", "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00044.safetensors", "model.layers.40.mlp.up_proj.weight": "model-00002-of-00044.safetensors", "model.layers.0.mlp.up_proj.weight": "model-00002-of-00044.safetensors", "model.layers.40.post_attention_layernorm.weight": "model-00002-of-00044.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00002-of-00044.safetensors", "model.layers.40.self_attn.k_proj.weight": "model-00002-of-00044.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00002-of-00044.safetensors", "model.layers.40.self_attn.o_proj.weight": "model-00002-of-00044.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00002-of-00044.safetensors", "model.layers.40.self_attn.q_proj.weight": "model-00002-of-00044.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00002-of-00044.safetensors", "model.layers.40.self_attn.v_proj.weight": "model-00002-of-00044.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00002-of-00044.safetensors", "model.layers.41.input_layernorm.weight": "model-00002-of-00044.safetensors", "model.layers.1.input_layernorm.weight": "model-00002-of-00044.safetensors", "model.layers.41.mlp.down_proj.weight": "model-00002-of-00044.safetensors", "model.layers.1.mlp.down_proj.weight": "model-00002-of-00044.safetensors", "model.layers.41.mlp.gate_proj.weight": "model-00002-of-00044.safetensors", "model.layers.1.mlp.gate_proj.weight": "model-00002-of-00044.safetensors", "model.layers.41.mlp.up_proj.weight": "model-00002-of-00044.safetensors", "model.layers.1.mlp.up_proj.weight": "model-00003-of-00044.safetensors", "model.layers.41.post_attention_layernorm.weight": "model-00003-of-00044.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00003-of-00044.safetensors", "model.layers.41.self_attn.k_proj.weight": "model-00003-of-00044.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00003-of-00044.safetensors", "model.layers.41.self_attn.o_proj.weight": "model-00003-of-00044.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00003-of-00044.safetensors", "model.layers.41.self_attn.q_proj.weight": "model-00003-of-00044.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00003-of-00044.safetensors", "model.layers.41.self_attn.v_proj.weight": "model-00003-of-00044.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00003-of-00044.safetensors", "model.layers.50.input_layernorm.weight": "model-00003-of-00044.safetensors", "model.layers.10.input_layernorm.weight": "model-00003-of-00044.safetensors", "model.layers.50.mlp.down_proj.weight": "model-00003-of-00044.safetensors", "model.layers.10.mlp.down_proj.weight": "model-00003-of-00044.safetensors", "model.layers.50.mlp.gate_proj.weight": "model-00003-of-00044.safetensors", "model.layers.10.mlp.gate_proj.weight": "model-00003-of-00044.safetensors", "model.layers.50.mlp.up_proj.weight": "model-00003-of-00044.safetensors", 
"model.layers.10.mlp.up_proj.weight": "model-00003-of-00044.safetensors", "model.layers.50.post_attention_layernorm.weight": "model-00003-of-00044.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00003-of-00044.safetensors", "model.layers.50.self_attn.k_proj.weight": "model-00003-of-00044.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00004-of-00044.safetensors", "model.layers.50.self_attn.o_proj.weight": "model-00004-of-00044.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00004-of-00044.safetensors", "model.layers.50.self_attn.q_proj.weight": "model-00004-of-00044.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00004-of-00044.safetensors", "model.layers.50.self_attn.v_proj.weight": "model-00004-of-00044.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00004-of-00044.safetensors", "model.layers.51.input_layernorm.weight": "model-00004-of-00044.safetensors", "model.layers.11.input_layernorm.weight": "model-00004-of-00044.safetensors", "model.layers.51.mlp.down_proj.weight": "model-00004-of-00044.safetensors", "model.layers.11.mlp.down_proj.weight": "model-00004-of-00044.safetensors", "model.layers.51.mlp.gate_proj.weight": "model-00004-of-00044.safetensors", "model.layers.11.mlp.gate_proj.weight": "model-00004-of-00044.safetensors", "model.layers.51.mlp.up_proj.weight": "model-00004-of-00044.safetensors", "model.layers.11.mlp.up_proj.weight": "model-00004-of-00044.safetensors", "model.layers.51.post_attention_layernorm.weight": "model-00004-of-00044.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00004-of-00044.safetensors", "model.layers.51.self_attn.k_proj.weight": "model-00004-of-00044.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00004-of-00044.safetensors", "model.layers.51.self_attn.o_proj.weight": "model-00004-of-00044.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00004-of-00044.safetensors", "model.layers.51.self_attn.q_proj.weight": "model-00004-of-00044.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00005-of-00044.safetensors", "model.layers.51.self_attn.v_proj.weight": "model-00005-of-00044.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00005-of-00044.safetensors", "model.layers.52.input_layernorm.weight": "model-00005-of-00044.safetensors", "model.layers.12.input_layernorm.weight": "model-00005-of-00044.safetensors", "model.layers.52.mlp.down_proj.weight": "model-00005-of-00044.safetensors", "model.layers.12.mlp.down_proj.weight": "model-00005-of-00044.safetensors", "model.layers.52.mlp.gate_proj.weight": "model-00005-of-00044.safetensors", "model.layers.12.mlp.gate_proj.weight": "model-00005-of-00044.safetensors", "model.layers.52.mlp.up_proj.weight": "model-00005-of-00044.safetensors", "model.layers.12.mlp.up_proj.weight": "model-00005-of-00044.safetensors", "model.layers.52.post_attention_layernorm.weight": "model-00005-of-00044.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00005-of-00044.safetensors", "model.layers.52.self_attn.k_proj.weight": "model-00005-of-00044.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00005-of-00044.safetensors", "model.layers.52.self_attn.o_proj.weight": "model-00005-of-00044.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00005-of-00044.safetensors", "model.layers.52.self_attn.q_proj.weight": "model-00005-of-00044.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00005-of-00044.safetensors", 
"model.layers.52.self_attn.v_proj.weight": "model-00005-of-00044.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00005-of-00044.safetensors", "model.layers.53.input_layernorm.weight": "model-00005-of-00044.safetensors", "model.layers.13.input_layernorm.weight": "model-00005-of-00044.safetensors", "model.layers.53.mlp.down_proj.weight": "model-00006-of-00044.safetensors", "model.layers.13.mlp.down_proj.weight": "model-00006-of-00044.safetensors", "model.layers.53.mlp.gate_proj.weight": "model-00006-of-00044.safetensors", "model.layers.13.mlp.gate_proj.weight": "model-00006-of-00044.safetensors", "model.layers.53.mlp.up_proj.weight": "model-00006-of-00044.safetensors", "model.layers.13.mlp.up_proj.weight": "model-00006-of-00044.safetensors", "model.layers.53.post_attention_layernorm.weight": "model-00006-of-00044.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00006-of-00044.safetensors", "model.layers.53.self_attn.k_proj.weight": "model-00006-of-00044.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00006-of-00044.safetensors", "model.layers.53.self_attn.o_proj.weight": "model-00006-of-00044.safetensors", "model.layers.13.self_attn.o_proj.weight": "model-00006-of-00044.safetensors", "model.layers.53.self_attn.q_proj.weight": "model-00006-of-00044.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00006-of-00044.safetensors", "model.layers.53.self_attn.v_proj.weight": "model-00006-of-00044.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00006-of-00044.safetensors", "model.layers.54.input_layernorm.weight": "model-00006-of-00044.safetensors", "model.layers.14.input_layernorm.weight": "model-00006-of-00044.safetensors", "model.layers.54.mlp.down_proj.weight": "model-00006-of-00044.safetensors", "model.layers.14.mlp.down_proj.weight": "model-00007-of-00044.safetensors", "model.layers.54.mlp.gate_proj.weight": "model-00007-of-00044.safetensors", "model.layers.14.mlp.gate_proj.weight": "model-00007-of-00044.safetensors", "model.layers.54.mlp.up_proj.weight": "model-00007-of-00044.safetensors", "model.layers.14.mlp.up_proj.weight": "model-00007-of-00044.safetensors", "model.layers.54.post_attention_layernorm.weight": "model-00007-of-00044.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00007-of-00044.safetensors", "model.layers.54.self_attn.k_proj.weight": "model-00007-of-00044.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00007-of-00044.safetensors", "model.layers.54.self_attn.o_proj.weight": "model-00007-of-00044.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00007-of-00044.safetensors", "model.layers.54.self_attn.q_proj.weight": "model-00007-of-00044.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00007-of-00044.safetensors", "model.layers.54.self_attn.v_proj.weight": "model-00007-of-00044.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00007-of-00044.safetensors", "model.layers.55.input_layernorm.weight": "model-00007-of-00044.safetensors", "model.layers.15.input_layernorm.weight": "model-00007-of-00044.safetensors", "model.layers.55.mlp.down_proj.weight": "model-00007-of-00044.safetensors", "model.layers.15.mlp.down_proj.weight": "model-00007-of-00044.safetensors", "model.layers.55.mlp.gate_proj.weight": "model-00008-of-00044.safetensors", "model.layers.15.mlp.gate_proj.weight": "model-00008-of-00044.safetensors", "model.layers.55.mlp.up_proj.weight": "model-00008-of-00044.safetensors", 
"model.layers.15.mlp.up_proj.weight": "model-00008-of-00044.safetensors", "model.layers.55.post_attention_layernorm.weight": "model-00008-of-00044.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00008-of-00044.safetensors", "model.layers.55.self_attn.k_proj.weight": "model-00008-of-00044.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00008-of-00044.safetensors", "model.layers.55.self_attn.o_proj.weight": "model-00008-of-00044.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00008-of-00044.safetensors", "model.layers.55.self_attn.q_proj.weight": "model-00008-of-00044.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00008-of-00044.safetensors", "model.layers.55.self_attn.v_proj.weight": "model-00008-of-00044.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00008-of-00044.safetensors", "model.layers.72.input_layernorm.weight": "model-00008-of-00044.safetensors", "model.layers.64.input_layernorm.weight": "model-00008-of-00044.safetensors", "model.layers.56.input_layernorm.weight": "model-00008-of-00044.safetensors", "model.layers.16.input_layernorm.weight": "model-00008-of-00044.safetensors", "model.layers.72.mlp.down_proj.weight": "model-00008-of-00044.safetensors", "model.layers.64.mlp.down_proj.weight": "model-00008-of-00044.safetensors", "model.layers.56.mlp.down_proj.weight": "model-00008-of-00044.safetensors", "model.layers.16.mlp.down_proj.weight": "model-00009-of-00044.safetensors", "model.layers.72.mlp.gate_proj.weight": "model-00009-of-00044.safetensors", "model.layers.64.mlp.gate_proj.weight": "model-00009-of-00044.safetensors", "model.layers.56.mlp.gate_proj.weight": "model-00009-of-00044.safetensors", "model.layers.16.mlp.gate_proj.weight": "model-00009-of-00044.safetensors", "model.layers.72.mlp.up_proj.weight": "model-00009-of-00044.safetensors", "model.layers.64.mlp.up_proj.weight": "model-00009-of-00044.safetensors", "model.layers.56.mlp.up_proj.weight": "model-00009-of-00044.safetensors", "model.layers.16.mlp.up_proj.weight": "model-00010-of-00044.safetensors", "model.layers.72.post_attention_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.64.post_attention_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.56.post_attention_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.72.self_attn.k_proj.weight": "model-00010-of-00044.safetensors", "model.layers.64.self_attn.k_proj.weight": "model-00010-of-00044.safetensors", "model.layers.56.self_attn.k_proj.weight": "model-00010-of-00044.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00010-of-00044.safetensors", "model.layers.72.self_attn.o_proj.weight": "model-00010-of-00044.safetensors", "model.layers.64.self_attn.o_proj.weight": "model-00010-of-00044.safetensors", "model.layers.56.self_attn.o_proj.weight": "model-00010-of-00044.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00010-of-00044.safetensors", "model.layers.72.self_attn.q_proj.weight": "model-00010-of-00044.safetensors", "model.layers.64.self_attn.q_proj.weight": "model-00010-of-00044.safetensors", "model.layers.56.self_attn.q_proj.weight": "model-00010-of-00044.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00010-of-00044.safetensors", "model.layers.72.self_attn.v_proj.weight": "model-00010-of-00044.safetensors", "model.layers.64.self_attn.v_proj.weight": "model-00010-of-00044.safetensors", 
"model.layers.56.self_attn.v_proj.weight": "model-00010-of-00044.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00010-of-00044.safetensors", "model.layers.73.input_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.65.input_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.57.input_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.17.input_layernorm.weight": "model-00010-of-00044.safetensors", "model.layers.73.mlp.down_proj.weight": "model-00010-of-00044.safetensors", "model.layers.65.mlp.down_proj.weight": "model-00010-of-00044.safetensors", "model.layers.57.mlp.down_proj.weight": "model-00010-of-00044.safetensors", "model.layers.17.mlp.down_proj.weight": "model-00010-of-00044.safetensors", "model.layers.73.mlp.gate_proj.weight": "model-00011-of-00044.safetensors", "model.layers.65.mlp.gate_proj.weight": "model-00011-of-00044.safetensors", "model.layers.57.mlp.gate_proj.weight": "model-00011-of-00044.safetensors", "model.layers.17.mlp.gate_proj.weight": "model-00011-of-00044.safetensors", "model.layers.73.mlp.up_proj.weight": "model-00011-of-00044.safetensors", "model.layers.65.mlp.up_proj.weight": "model-00011-of-00044.safetensors", "model.layers.57.mlp.up_proj.weight": "model-00011-of-00044.safetensors", "model.layers.17.mlp.up_proj.weight": "model-00011-of-00044.safetensors", "model.layers.73.post_attention_layernorm.weight": "model-00011-of-00044.safetensors", "model.layers.65.post_attention_layernorm.weight": "model-00011-of-00044.safetensors", "model.layers.57.post_attention_layernorm.weight": "model-00011-of-00044.safetensors", "model.layers.17.post_attention_layernorm.weight": "model-00011-of-00044.safetensors", "model.layers.73.self_attn.k_proj.weight": "model-00011-of-00044.safetensors", "model.layers.65.self_attn.k_proj.weight": "model-00011-of-00044.safetensors", "model.layers.57.self_attn.k_proj.weight": "model-00011-of-00044.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00011-of-00044.safetensors", "model.layers.73.self_attn.o_proj.weight": "model-00012-of-00044.safetensors", "model.layers.65.self_attn.o_proj.weight": "model-00012-of-00044.safetensors", "model.layers.57.self_attn.o_proj.weight": "model-00012-of-00044.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00012-of-00044.safetensors", "model.layers.73.self_attn.q_proj.weight": "model-00012-of-00044.safetensors", "model.layers.65.self_attn.q_proj.weight": "model-00012-of-00044.safetensors", "model.layers.57.self_attn.q_proj.weight": "model-00012-of-00044.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00012-of-00044.safetensors", "model.layers.73.self_attn.v_proj.weight": "model-00012-of-00044.safetensors", "model.layers.65.self_attn.v_proj.weight": "model-00012-of-00044.safetensors", "model.layers.57.self_attn.v_proj.weight": "model-00012-of-00044.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00012-of-00044.safetensors", "model.layers.74.input_layernorm.weight": "model-00012-of-00044.safetensors", "model.layers.66.input_layernorm.weight": "model-00012-of-00044.safetensors", "model.layers.58.input_layernorm.weight": "model-00012-of-00044.safetensors", "model.layers.18.input_layernorm.weight": "model-00012-of-00044.safetensors", "model.layers.74.mlp.down_proj.weight": "model-00012-of-00044.safetensors", "model.layers.66.mlp.down_proj.weight": "model-00012-of-00044.safetensors", "model.layers.58.mlp.down_proj.weight": "model-00012-of-00044.safetensors", 
"model.layers.18.mlp.down_proj.weight": "model-00012-of-00044.safetensors", "model.layers.74.mlp.gate_proj.weight": "model-00012-of-00044.safetensors", "model.layers.66.mlp.gate_proj.weight": "model-00013-of-00044.safetensors", "model.layers.58.mlp.gate_proj.weight": "model-00013-of-00044.safetensors", "model.layers.18.mlp.gate_proj.weight": "model-00013-of-00044.safetensors", "model.layers.74.mlp.up_proj.weight": "model-00013-of-00044.safetensors", "model.layers.66.mlp.up_proj.weight": "model-00013-of-00044.safetensors", "model.layers.58.mlp.up_proj.weight": "model-00013-of-00044.safetensors", "model.layers.18.mlp.up_proj.weight": "model-00013-of-00044.safetensors", "model.layers.74.post_attention_layernorm.weight": "model-00013-of-00044.safetensors", "model.layers.66.post_attention_layernorm.weight": "model-00013-of-00044.safetensors", "model.layers.58.post_attention_layernorm.weight": "model-00013-of-00044.safetensors", "model.layers.18.post_attention_layernorm.weight": "model-00013-of-00044.safetensors", "model.layers.74.self_attn.k_proj.weight": "model-00013-of-00044.safetensors", "model.layers.66.self_attn.k_proj.weight": "model-00013-of-00044.safetensors", "model.layers.58.self_attn.k_proj.weight": "model-00013-of-00044.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00013-of-00044.safetensors", "model.layers.74.self_attn.o_proj.weight": "model-00013-of-00044.safetensors", "model.layers.66.self_attn.o_proj.weight": "model-00013-of-00044.safetensors", "model.layers.58.self_attn.o_proj.weight": "model-00013-of-00044.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00013-of-00044.safetensors", "model.layers.74.self_attn.q_proj.weight": "model-00014-of-00044.safetensors", "model.layers.66.self_attn.q_proj.weight": "model-00014-of-00044.safetensors", "model.layers.58.self_attn.q_proj.weight": "model-00014-of-00044.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00014-of-00044.safetensors", "model.layers.74.self_attn.v_proj.weight": "model-00014-of-00044.safetensors", "model.layers.66.self_attn.v_proj.weight": "model-00014-of-00044.safetensors", "model.layers.58.self_attn.v_proj.weight": "model-00014-of-00044.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00014-of-00044.safetensors", "model.layers.75.input_layernorm.weight": "model-00014-of-00044.safetensors", "model.layers.67.input_layernorm.weight": "model-00014-of-00044.safetensors", "model.layers.59.input_layernorm.weight": "model-00014-of-00044.safetensors", "model.layers.19.input_layernorm.weight": "model-00014-of-00044.safetensors", "model.layers.75.mlp.down_proj.weight": "model-00014-of-00044.safetensors", "model.layers.67.mlp.down_proj.weight": "model-00014-of-00044.safetensors", "model.layers.59.mlp.down_proj.weight": "model-00014-of-00044.safetensors", "model.layers.19.mlp.down_proj.weight": "model-00014-of-00044.safetensors", "model.layers.75.mlp.gate_proj.weight": "model-00014-of-00044.safetensors", "model.layers.67.mlp.gate_proj.weight": "model-00014-of-00044.safetensors", "model.layers.59.mlp.gate_proj.weight": "model-00014-of-00044.safetensors", "model.layers.19.mlp.gate_proj.weight": "model-00015-of-00044.safetensors", "model.layers.75.mlp.up_proj.weight": "model-00015-of-00044.safetensors", "model.layers.67.mlp.up_proj.weight": "model-00015-of-00044.safetensors", "model.layers.59.mlp.up_proj.weight": "model-00015-of-00044.safetensors", "model.layers.19.mlp.up_proj.weight": "model-00015-of-00044.safetensors", "model.layers.75.post_attention_layernorm.weight": 
"model-00015-of-00044.safetensors", "model.layers.67.post_attention_layernorm.weight": "model-00015-of-00044.safetensors", "model.layers.59.post_attention_layernorm.weight": "model-00015-of-00044.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00015-of-00044.safetensors", "model.layers.75.self_attn.k_proj.weight": "model-00015-of-00044.safetensors", "model.layers.67.self_attn.k_proj.weight": "model-00015-of-00044.safetensors", "model.layers.59.self_attn.k_proj.weight": "model-00015-of-00044.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00015-of-00044.safetensors", "model.layers.75.self_attn.o_proj.weight": "model-00015-of-00044.safetensors", "model.layers.67.self_attn.o_proj.weight": "model-00015-of-00044.safetensors", "model.layers.59.self_attn.o_proj.weight": "model-00015-of-00044.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00015-of-00044.safetensors", "model.layers.75.self_attn.q_proj.weight": "model-00015-of-00044.safetensors", "model.layers.67.self_attn.q_proj.weight": "model-00015-of-00044.safetensors", "model.layers.59.self_attn.q_proj.weight": "model-00015-of-00044.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00015-of-00044.safetensors", "model.layers.75.self_attn.v_proj.weight": "model-00015-of-00044.safetensors", "model.layers.67.self_attn.v_proj.weight": "model-00015-of-00044.safetensors", "model.layers.59.self_attn.v_proj.weight": "model-00015-of-00044.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00015-of-00044.safetensors", "model.layers.42.input_layernorm.weight": "model-00015-of-00044.safetensors", "model.layers.2.input_layernorm.weight": "model-00015-of-00044.safetensors", "model.layers.42.mlp.down_proj.weight": "model-00016-of-00044.safetensors", "model.layers.2.mlp.down_proj.weight": "model-00016-of-00044.safetensors", "model.layers.42.mlp.gate_proj.weight": "model-00016-of-00044.safetensors", "model.layers.2.mlp.gate_proj.weight": "model-00016-of-00044.safetensors", "model.layers.42.mlp.up_proj.weight": "model-00016-of-00044.safetensors", "model.layers.2.mlp.up_proj.weight": "model-00016-of-00044.safetensors", "model.layers.42.post_attention_layernorm.weight": "model-00016-of-00044.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00016-of-00044.safetensors", "model.layers.42.self_attn.k_proj.weight": "model-00016-of-00044.safetensors", "model.layers.2.self_attn.k_proj.weight": "model-00016-of-00044.safetensors", "model.layers.42.self_attn.o_proj.weight": "model-00016-of-00044.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00016-of-00044.safetensors", "model.layers.42.self_attn.q_proj.weight": "model-00016-of-00044.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00016-of-00044.safetensors", "model.layers.42.self_attn.v_proj.weight": "model-00016-of-00044.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00016-of-00044.safetensors", "model.layers.76.input_layernorm.weight": "model-00016-of-00044.safetensors", "model.layers.68.input_layernorm.weight": "model-00016-of-00044.safetensors", "model.layers.60.input_layernorm.weight": "model-00016-of-00044.safetensors", "model.layers.20.input_layernorm.weight": "model-00016-of-00044.safetensors", "model.layers.76.mlp.down_proj.weight": "model-00016-of-00044.safetensors", "model.layers.68.mlp.down_proj.weight": "model-00017-of-00044.safetensors", "model.layers.60.mlp.down_proj.weight": "model-00017-of-00044.safetensors", "model.layers.20.mlp.down_proj.weight": 
"model-00017-of-00044.safetensors", "model.layers.76.mlp.gate_proj.weight": "model-00017-of-00044.safetensors", "model.layers.68.mlp.gate_proj.weight": "model-00017-of-00044.safetensors", "model.layers.60.mlp.gate_proj.weight": "model-00017-of-00044.safetensors", "model.layers.20.mlp.gate_proj.weight": "model-00017-of-00044.safetensors", "model.layers.76.mlp.up_proj.weight": "model-00017-of-00044.safetensors", "model.layers.68.mlp.up_proj.weight": "model-00018-of-00044.safetensors", "model.layers.60.mlp.up_proj.weight": "model-00018-of-00044.safetensors", "model.layers.20.mlp.up_proj.weight": "model-00018-of-00044.safetensors", "model.layers.76.post_attention_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.68.post_attention_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.60.post_attention_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.76.self_attn.k_proj.weight": "model-00018-of-00044.safetensors", "model.layers.68.self_attn.k_proj.weight": "model-00018-of-00044.safetensors", "model.layers.60.self_attn.k_proj.weight": "model-00018-of-00044.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00018-of-00044.safetensors", "model.layers.76.self_attn.o_proj.weight": "model-00018-of-00044.safetensors", "model.layers.68.self_attn.o_proj.weight": "model-00018-of-00044.safetensors", "model.layers.60.self_attn.o_proj.weight": "model-00018-of-00044.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00018-of-00044.safetensors", "model.layers.76.self_attn.q_proj.weight": "model-00018-of-00044.safetensors", "model.layers.68.self_attn.q_proj.weight": "model-00018-of-00044.safetensors", "model.layers.60.self_attn.q_proj.weight": "model-00018-of-00044.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00018-of-00044.safetensors", "model.layers.76.self_attn.v_proj.weight": "model-00018-of-00044.safetensors", "model.layers.68.self_attn.v_proj.weight": "model-00018-of-00044.safetensors", "model.layers.60.self_attn.v_proj.weight": "model-00018-of-00044.safetensors", "model.layers.20.self_attn.v_proj.weight": "model-00018-of-00044.safetensors", "model.layers.77.input_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.69.input_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.61.input_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.21.input_layernorm.weight": "model-00018-of-00044.safetensors", "model.layers.77.mlp.down_proj.weight": "model-00018-of-00044.safetensors", "model.layers.69.mlp.down_proj.weight": "model-00018-of-00044.safetensors", "model.layers.61.mlp.down_proj.weight": "model-00019-of-00044.safetensors", "model.layers.21.mlp.down_proj.weight": "model-00019-of-00044.safetensors", "model.layers.77.mlp.gate_proj.weight": "model-00019-of-00044.safetensors", "model.layers.69.mlp.gate_proj.weight": "model-00019-of-00044.safetensors", "model.layers.61.mlp.gate_proj.weight": "model-00019-of-00044.safetensors", "model.layers.21.mlp.gate_proj.weight": "model-00019-of-00044.safetensors", "model.layers.77.mlp.up_proj.weight": "model-00019-of-00044.safetensors", "model.layers.69.mlp.up_proj.weight": "model-00019-of-00044.safetensors", "model.layers.61.mlp.up_proj.weight": "model-00020-of-00044.safetensors", "model.layers.21.mlp.up_proj.weight": "model-00020-of-00044.safetensors", "model.layers.77.post_attention_layernorm.weight": "model-00020-of-00044.safetensors", 
"model.layers.69.post_attention_layernorm.weight": "model-00020-of-00044.safetensors", "model.layers.61.post_attention_layernorm.weight": "model-00020-of-00044.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00020-of-00044.safetensors", "model.layers.77.self_attn.k_proj.weight": "model-00020-of-00044.safetensors", "model.layers.69.self_attn.k_proj.weight": "model-00020-of-00044.safetensors", "model.layers.61.self_attn.k_proj.weight": "model-00020-of-00044.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00020-of-00044.safetensors", "model.layers.77.self_attn.o_proj.weight": "model-00020-of-00044.safetensors", "model.layers.69.self_attn.o_proj.weight": "model-00020-of-00044.safetensors", "model.layers.61.self_attn.o_proj.weight": "model-00020-of-00044.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00020-of-00044.safetensors", "model.layers.77.self_attn.q_proj.weight": "model-00020-of-00044.safetensors", "model.layers.69.self_attn.q_proj.weight": "model-00020-of-00044.safetensors", "model.layers.61.self_attn.q_proj.weight": "model-00020-of-00044.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00020-of-00044.safetensors", "model.layers.77.self_attn.v_proj.weight": "model-00020-of-00044.safetensors", "model.layers.69.self_attn.v_proj.weight": "model-00020-of-00044.safetensors", "model.layers.61.self_attn.v_proj.weight": "model-00020-of-00044.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00020-of-00044.safetensors", "model.layers.78.input_layernorm.weight": "model-00020-of-00044.safetensors", "model.layers.70.input_layernorm.weight": "model-00020-of-00044.safetensors", "model.layers.62.input_layernorm.weight": "model-00020-of-00044.safetensors", "model.layers.22.input_layernorm.weight": "model-00020-of-00044.safetensors", "model.layers.78.mlp.down_proj.weight": "model-00020-of-00044.safetensors", "model.layers.70.mlp.down_proj.weight": "model-00020-of-00044.safetensors", "model.layers.62.mlp.down_proj.weight": "model-00020-of-00044.safetensors", "model.layers.22.mlp.down_proj.weight": "model-00021-of-00044.safetensors", "model.layers.78.mlp.gate_proj.weight": "model-00021-of-00044.safetensors", "model.layers.70.mlp.gate_proj.weight": "model-00021-of-00044.safetensors", "model.layers.62.mlp.gate_proj.weight": "model-00021-of-00044.safetensors", "model.layers.22.mlp.gate_proj.weight": "model-00021-of-00044.safetensors", "model.layers.78.mlp.up_proj.weight": "model-00021-of-00044.safetensors", "model.layers.70.mlp.up_proj.weight": "model-00021-of-00044.safetensors", "model.layers.62.mlp.up_proj.weight": "model-00021-of-00044.safetensors", "model.layers.22.mlp.up_proj.weight": "model-00022-of-00044.safetensors", "model.layers.78.post_attention_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.70.post_attention_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.62.post_attention_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.78.self_attn.k_proj.weight": "model-00022-of-00044.safetensors", "model.layers.70.self_attn.k_proj.weight": "model-00022-of-00044.safetensors", "model.layers.62.self_attn.k_proj.weight": "model-00022-of-00044.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00022-of-00044.safetensors", "model.layers.78.self_attn.o_proj.weight": "model-00022-of-00044.safetensors", "model.layers.70.self_attn.o_proj.weight": 
"model-00022-of-00044.safetensors", "model.layers.62.self_attn.o_proj.weight": "model-00022-of-00044.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00022-of-00044.safetensors", "model.layers.78.self_attn.q_proj.weight": "model-00022-of-00044.safetensors", "model.layers.70.self_attn.q_proj.weight": "model-00022-of-00044.safetensors", "model.layers.62.self_attn.q_proj.weight": "model-00022-of-00044.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00022-of-00044.safetensors", "model.layers.78.self_attn.v_proj.weight": "model-00022-of-00044.safetensors", "model.layers.70.self_attn.v_proj.weight": "model-00022-of-00044.safetensors", "model.layers.62.self_attn.v_proj.weight": "model-00022-of-00044.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00022-of-00044.safetensors", "model.layers.79.input_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.71.input_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.63.input_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.23.input_layernorm.weight": "model-00022-of-00044.safetensors", "model.layers.79.mlp.down_proj.weight": "model-00022-of-00044.safetensors", "model.layers.71.mlp.down_proj.weight": "model-00022-of-00044.safetensors", "model.layers.63.mlp.down_proj.weight": "model-00022-of-00044.safetensors", "model.layers.23.mlp.down_proj.weight": "model-00022-of-00044.safetensors", "model.layers.79.mlp.gate_proj.weight": "model-00023-of-00044.safetensors", "model.layers.71.mlp.gate_proj.weight": "model-00023-of-00044.safetensors", "model.layers.63.mlp.gate_proj.weight": "model-00023-of-00044.safetensors", "model.layers.23.mlp.gate_proj.weight": "model-00023-of-00044.safetensors", "model.layers.79.mlp.up_proj.weight": "model-00023-of-00044.safetensors", "model.layers.71.mlp.up_proj.weight": "model-00023-of-00044.safetensors", "model.layers.63.mlp.up_proj.weight": "model-00023-of-00044.safetensors", "model.layers.23.mlp.up_proj.weight": "model-00023-of-00044.safetensors", "model.layers.79.post_attention_layernorm.weight": "model-00023-of-00044.safetensors", "model.layers.71.post_attention_layernorm.weight": "model-00023-of-00044.safetensors", "model.layers.63.post_attention_layernorm.weight": "model-00023-of-00044.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00023-of-00044.safetensors", "model.layers.79.self_attn.k_proj.weight": "model-00023-of-00044.safetensors", "model.layers.71.self_attn.k_proj.weight": "model-00023-of-00044.safetensors", "model.layers.63.self_attn.k_proj.weight": "model-00023-of-00044.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00023-of-00044.safetensors", "model.layers.79.self_attn.o_proj.weight": "model-00024-of-00044.safetensors", "model.layers.71.self_attn.o_proj.weight": "model-00024-of-00044.safetensors", "model.layers.63.self_attn.o_proj.weight": "model-00024-of-00044.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00024-of-00044.safetensors", "model.layers.79.self_attn.q_proj.weight": "model-00024-of-00044.safetensors", "model.layers.71.self_attn.q_proj.weight": "model-00024-of-00044.safetensors", "model.layers.63.self_attn.q_proj.weight": "model-00024-of-00044.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00024-of-00044.safetensors", "model.layers.79.self_attn.v_proj.weight": "model-00024-of-00044.safetensors", "model.layers.71.self_attn.v_proj.weight": "model-00024-of-00044.safetensors", "model.layers.63.self_attn.v_proj.weight": 
"model-00024-of-00044.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00024-of-00044.safetensors", "model.layers.24.input_layernorm.weight": "model-00024-of-00044.safetensors", "model.layers.24.mlp.down_proj.weight": "model-00024-of-00044.safetensors", "model.layers.24.mlp.gate_proj.weight": "model-00024-of-00044.safetensors", "model.layers.24.mlp.up_proj.weight": "model-00024-of-00044.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00024-of-00044.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00024-of-00044.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00024-of-00044.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00024-of-00044.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00024-of-00044.safetensors", "model.layers.87.input_layernorm.weight": "model-00024-of-00044.safetensors", "model.layers.80.input_layernorm.weight": "model-00024-of-00044.safetensors", "model.layers.25.input_layernorm.weight": "model-00024-of-00044.safetensors", "model.layers.87.mlp.down_proj.weight": "model-00024-of-00044.safetensors", "model.layers.80.mlp.down_proj.weight": "model-00024-of-00044.safetensors", "model.layers.25.mlp.down_proj.weight": "model-00025-of-00044.safetensors", "model.layers.87.mlp.gate_proj.weight": "model-00025-of-00044.safetensors", "model.layers.80.mlp.gate_proj.weight": "model-00025-of-00044.safetensors", "model.layers.25.mlp.gate_proj.weight": "model-00025-of-00044.safetensors", "model.layers.87.mlp.up_proj.weight": "model-00025-of-00044.safetensors", "model.layers.80.mlp.up_proj.weight": "model-00025-of-00044.safetensors", "model.layers.25.mlp.up_proj.weight": "model-00025-of-00044.safetensors", "model.layers.87.post_attention_layernorm.weight": "model-00025-of-00044.safetensors", "model.layers.80.post_attention_layernorm.weight": "model-00025-of-00044.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00025-of-00044.safetensors", "model.layers.87.self_attn.k_proj.weight": "model-00025-of-00044.safetensors", "model.layers.80.self_attn.k_proj.weight": "model-00025-of-00044.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00025-of-00044.safetensors", "model.layers.87.self_attn.o_proj.weight": "model-00025-of-00044.safetensors", "model.layers.80.self_attn.o_proj.weight": "model-00025-of-00044.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00025-of-00044.safetensors", "model.layers.87.self_attn.q_proj.weight": "model-00025-of-00044.safetensors", "model.layers.80.self_attn.q_proj.weight": "model-00026-of-00044.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00026-of-00044.safetensors", "model.layers.87.self_attn.v_proj.weight": "model-00026-of-00044.safetensors", "model.layers.80.self_attn.v_proj.weight": "model-00026-of-00044.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00026-of-00044.safetensors", "model.layers.88.input_layernorm.weight": "model-00026-of-00044.safetensors", "model.layers.81.input_layernorm.weight": "model-00026-of-00044.safetensors", "model.layers.26.input_layernorm.weight": "model-00026-of-00044.safetensors", "model.layers.88.mlp.down_proj.weight": "model-00026-of-00044.safetensors", "model.layers.81.mlp.down_proj.weight": "model-00026-of-00044.safetensors", "model.layers.26.mlp.down_proj.weight": "model-00026-of-00044.safetensors", "model.layers.88.mlp.gate_proj.weight": "model-00026-of-00044.safetensors", "model.layers.81.mlp.gate_proj.weight": 
"model-00026-of-00044.safetensors", "model.layers.26.mlp.gate_proj.weight": "model-00026-of-00044.safetensors", "model.layers.88.mlp.up_proj.weight": "model-00026-of-00044.safetensors", "model.layers.81.mlp.up_proj.weight": "model-00027-of-00044.safetensors", "model.layers.26.mlp.up_proj.weight": "model-00027-of-00044.safetensors", "model.layers.88.post_attention_layernorm.weight": "model-00027-of-00044.safetensors", "model.layers.81.post_attention_layernorm.weight": "model-00027-of-00044.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00027-of-00044.safetensors", "model.layers.88.self_attn.k_proj.weight": "model-00027-of-00044.safetensors", "model.layers.81.self_attn.k_proj.weight": "model-00027-of-00044.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00027-of-00044.safetensors", "model.layers.88.self_attn.o_proj.weight": "model-00027-of-00044.safetensors", "model.layers.81.self_attn.o_proj.weight": "model-00027-of-00044.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00027-of-00044.safetensors", "model.layers.88.self_attn.q_proj.weight": "model-00027-of-00044.safetensors", "model.layers.81.self_attn.q_proj.weight": "model-00027-of-00044.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00027-of-00044.safetensors", "model.layers.88.self_attn.v_proj.weight": "model-00027-of-00044.safetensors", "model.layers.81.self_attn.v_proj.weight": "model-00027-of-00044.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00027-of-00044.safetensors", "model.layers.89.input_layernorm.weight": "model-00027-of-00044.safetensors", "model.layers.82.input_layernorm.weight": "model-00027-of-00044.safetensors", "model.layers.27.input_layernorm.weight": "model-00027-of-00044.safetensors", "model.layers.89.mlp.down_proj.weight": "model-00027-of-00044.safetensors", "model.layers.82.mlp.down_proj.weight": "model-00027-of-00044.safetensors", "model.layers.27.mlp.down_proj.weight": "model-00027-of-00044.safetensors", "model.layers.89.mlp.gate_proj.weight": "model-00027-of-00044.safetensors", "model.layers.82.mlp.gate_proj.weight": "model-00028-of-00044.safetensors", "model.layers.27.mlp.gate_proj.weight": "model-00028-of-00044.safetensors", "model.layers.89.mlp.up_proj.weight": "model-00028-of-00044.safetensors", "model.layers.82.mlp.up_proj.weight": "model-00028-of-00044.safetensors", "model.layers.27.mlp.up_proj.weight": "model-00028-of-00044.safetensors", "model.layers.89.post_attention_layernorm.weight": "model-00028-of-00044.safetensors", "model.layers.82.post_attention_layernorm.weight": "model-00028-of-00044.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00028-of-00044.safetensors", "model.layers.89.self_attn.k_proj.weight": "model-00028-of-00044.safetensors", "model.layers.82.self_attn.k_proj.weight": "model-00028-of-00044.safetensors", "model.layers.27.self_attn.k_proj.weight": "model-00028-of-00044.safetensors", "model.layers.89.self_attn.o_proj.weight": "model-00028-of-00044.safetensors", "model.layers.82.self_attn.o_proj.weight": "model-00028-of-00044.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00028-of-00044.safetensors", "model.layers.89.self_attn.q_proj.weight": "model-00028-of-00044.safetensors", "model.layers.82.self_attn.q_proj.weight": "model-00028-of-00044.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00028-of-00044.safetensors", "model.layers.89.self_attn.v_proj.weight": "model-00028-of-00044.safetensors", "model.layers.82.self_attn.v_proj.weight": 
"model-00028-of-00044.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00028-of-00044.safetensors", "model.layers.90.input_layernorm.weight": "model-00028-of-00044.safetensors", "model.layers.83.input_layernorm.weight": "model-00028-of-00044.safetensors", "model.layers.28.input_layernorm.weight": "model-00028-of-00044.safetensors", "model.layers.90.mlp.down_proj.weight": "model-00028-of-00044.safetensors", "model.layers.83.mlp.down_proj.weight": "model-00029-of-00044.safetensors", "model.layers.28.mlp.down_proj.weight": "model-00029-of-00044.safetensors", "model.layers.90.mlp.gate_proj.weight": "model-00029-of-00044.safetensors", "model.layers.83.mlp.gate_proj.weight": "model-00029-of-00044.safetensors", "model.layers.28.mlp.gate_proj.weight": "model-00029-of-00044.safetensors", "model.layers.90.mlp.up_proj.weight": "model-00029-of-00044.safetensors", "model.layers.83.mlp.up_proj.weight": "model-00029-of-00044.safetensors", "model.layers.28.mlp.up_proj.weight": "model-00029-of-00044.safetensors", "model.layers.90.post_attention_layernorm.weight": "model-00029-of-00044.safetensors", "model.layers.83.post_attention_layernorm.weight": "model-00029-of-00044.safetensors", "model.layers.28.post_attention_layernorm.weight": "model-00029-of-00044.safetensors", "model.layers.90.self_attn.k_proj.weight": "model-00029-of-00044.safetensors", "model.layers.83.self_attn.k_proj.weight": "model-00029-of-00044.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00029-of-00044.safetensors", "model.layers.90.self_attn.o_proj.weight": "model-00029-of-00044.safetensors", "model.layers.83.self_attn.o_proj.weight": "model-00030-of-00044.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00030-of-00044.safetensors", "model.layers.90.self_attn.q_proj.weight": "model-00030-of-00044.safetensors", "model.layers.83.self_attn.q_proj.weight": "model-00030-of-00044.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00030-of-00044.safetensors", "model.layers.90.self_attn.v_proj.weight": "model-00030-of-00044.safetensors", "model.layers.83.self_attn.v_proj.weight": "model-00030-of-00044.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00030-of-00044.safetensors", "model.layers.91.input_layernorm.weight": "model-00030-of-00044.safetensors", "model.layers.84.input_layernorm.weight": "model-00030-of-00044.safetensors", "model.layers.29.input_layernorm.weight": "model-00030-of-00044.safetensors", "model.layers.91.mlp.down_proj.weight": "model-00030-of-00044.safetensors", "model.layers.84.mlp.down_proj.weight": "model-00030-of-00044.safetensors", "model.layers.29.mlp.down_proj.weight": "model-00030-of-00044.safetensors", "model.layers.91.mlp.gate_proj.weight": "model-00030-of-00044.safetensors", "model.layers.84.mlp.gate_proj.weight": "model-00030-of-00044.safetensors", "model.layers.29.mlp.gate_proj.weight": "model-00030-of-00044.safetensors", "model.layers.91.mlp.up_proj.weight": "model-00031-of-00044.safetensors", "model.layers.84.mlp.up_proj.weight": "model-00031-of-00044.safetensors", "model.layers.29.mlp.up_proj.weight": "model-00031-of-00044.safetensors", "model.layers.91.post_attention_layernorm.weight": "model-00031-of-00044.safetensors", "model.layers.84.post_attention_layernorm.weight": "model-00031-of-00044.safetensors", "model.layers.29.post_attention_layernorm.weight": "model-00031-of-00044.safetensors", "model.layers.91.self_attn.k_proj.weight": "model-00031-of-00044.safetensors", "model.layers.84.self_attn.k_proj.weight": 
"model-00031-of-00044.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00031-of-00044.safetensors", "model.layers.91.self_attn.o_proj.weight": "model-00031-of-00044.safetensors", "model.layers.84.self_attn.o_proj.weight": "model-00031-of-00044.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00031-of-00044.safetensors", "model.layers.91.self_attn.q_proj.weight": "model-00031-of-00044.safetensors", "model.layers.84.self_attn.q_proj.weight": "model-00031-of-00044.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00031-of-00044.safetensors", "model.layers.91.self_attn.v_proj.weight": "model-00031-of-00044.safetensors", "model.layers.84.self_attn.v_proj.weight": "model-00031-of-00044.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00031-of-00044.safetensors", "model.layers.43.input_layernorm.weight": "model-00031-of-00044.safetensors", "model.layers.3.input_layernorm.weight": "model-00031-of-00044.safetensors", "model.layers.43.mlp.down_proj.weight": "model-00031-of-00044.safetensors", "model.layers.3.mlp.down_proj.weight": "model-00031-of-00044.safetensors", "model.layers.43.mlp.gate_proj.weight": "model-00031-of-00044.safetensors", "model.layers.3.mlp.gate_proj.weight": "model-00032-of-00044.safetensors", "model.layers.43.mlp.up_proj.weight": "model-00032-of-00044.safetensors", "model.layers.3.mlp.up_proj.weight": "model-00032-of-00044.safetensors", "model.layers.43.post_attention_layernorm.weight": "model-00032-of-00044.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00032-of-00044.safetensors", "model.layers.43.self_attn.k_proj.weight": "model-00032-of-00044.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00032-of-00044.safetensors", "model.layers.43.self_attn.o_proj.weight": "model-00032-of-00044.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00032-of-00044.safetensors", "model.layers.43.self_attn.q_proj.weight": "model-00032-of-00044.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00032-of-00044.safetensors", "model.layers.43.self_attn.v_proj.weight": "model-00032-of-00044.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00032-of-00044.safetensors", "model.layers.92.input_layernorm.weight": "model-00032-of-00044.safetensors", "model.layers.85.input_layernorm.weight": "model-00032-of-00044.safetensors", "model.layers.30.input_layernorm.weight": "model-00032-of-00044.safetensors", "model.layers.92.mlp.down_proj.weight": "model-00032-of-00044.safetensors", "model.layers.85.mlp.down_proj.weight": "model-00032-of-00044.safetensors", "model.layers.30.mlp.down_proj.weight": "model-00032-of-00044.safetensors", "model.layers.92.mlp.gate_proj.weight": "model-00032-of-00044.safetensors", "model.layers.85.mlp.gate_proj.weight": "model-00033-of-00044.safetensors", "model.layers.30.mlp.gate_proj.weight": "model-00033-of-00044.safetensors", "model.layers.92.mlp.up_proj.weight": "model-00033-of-00044.safetensors", "model.layers.85.mlp.up_proj.weight": "model-00033-of-00044.safetensors", "model.layers.30.mlp.up_proj.weight": "model-00033-of-00044.safetensors", "model.layers.92.post_attention_layernorm.weight": "model-00033-of-00044.safetensors", "model.layers.85.post_attention_layernorm.weight": "model-00033-of-00044.safetensors", "model.layers.30.post_attention_layernorm.weight": "model-00033-of-00044.safetensors", "model.layers.92.self_attn.k_proj.weight": "model-00033-of-00044.safetensors", "model.layers.85.self_attn.k_proj.weight": 
"model-00033-of-00044.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00033-of-00044.safetensors", "model.layers.92.self_attn.o_proj.weight": "model-00033-of-00044.safetensors", "model.layers.85.self_attn.o_proj.weight": "model-00033-of-00044.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00033-of-00044.safetensors", "model.layers.92.self_attn.q_proj.weight": "model-00033-of-00044.safetensors", "model.layers.85.self_attn.q_proj.weight": "model-00033-of-00044.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00033-of-00044.safetensors", "model.layers.92.self_attn.v_proj.weight": "model-00033-of-00044.safetensors", "model.layers.85.self_attn.v_proj.weight": "model-00033-of-00044.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00033-of-00044.safetensors", "model.layers.93.input_layernorm.weight": "model-00033-of-00044.safetensors", "model.layers.86.input_layernorm.weight": "model-00033-of-00044.safetensors", "model.layers.31.input_layernorm.weight": "model-00033-of-00044.safetensors", "model.layers.93.mlp.down_proj.weight": "model-00033-of-00044.safetensors", "model.layers.86.mlp.down_proj.weight": "model-00034-of-00044.safetensors", "model.layers.31.mlp.down_proj.weight": "model-00034-of-00044.safetensors", "model.layers.93.mlp.gate_proj.weight": "model-00034-of-00044.safetensors", "model.layers.86.mlp.gate_proj.weight": "model-00034-of-00044.safetensors", "model.layers.31.mlp.gate_proj.weight": "model-00034-of-00044.safetensors", "model.layers.93.mlp.up_proj.weight": "model-00034-of-00044.safetensors", "model.layers.86.mlp.up_proj.weight": "model-00034-of-00044.safetensors", "model.layers.31.mlp.up_proj.weight": "model-00034-of-00044.safetensors", "model.layers.93.post_attention_layernorm.weight": "model-00034-of-00044.safetensors", "model.layers.86.post_attention_layernorm.weight": "model-00034-of-00044.safetensors", "model.layers.31.post_attention_layernorm.weight": "model-00034-of-00044.safetensors", "model.layers.93.self_attn.k_proj.weight": "model-00034-of-00044.safetensors", "model.layers.86.self_attn.k_proj.weight": "model-00034-of-00044.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00034-of-00044.safetensors", "model.layers.93.self_attn.o_proj.weight": "model-00034-of-00044.safetensors", "model.layers.86.self_attn.o_proj.weight": "model-00035-of-00044.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00035-of-00044.safetensors", "model.layers.93.self_attn.q_proj.weight": "model-00035-of-00044.safetensors", "model.layers.86.self_attn.q_proj.weight": "model-00035-of-00044.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00035-of-00044.safetensors", "model.layers.93.self_attn.v_proj.weight": "model-00035-of-00044.safetensors", "model.layers.86.self_attn.v_proj.weight": "model-00035-of-00044.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00035-of-00044.safetensors", "model.layers.32.input_layernorm.weight": "model-00035-of-00044.safetensors", "model.layers.32.mlp.down_proj.weight": "model-00035-of-00044.safetensors", "model.layers.32.mlp.gate_proj.weight": "model-00035-of-00044.safetensors", "model.layers.32.mlp.up_proj.weight": "model-00035-of-00044.safetensors", "model.layers.32.post_attention_layernorm.weight": "model-00035-of-00044.safetensors", "model.layers.32.self_attn.k_proj.weight": "model-00035-of-00044.safetensors", "model.layers.32.self_attn.o_proj.weight": "model-00035-of-00044.safetensors", "model.layers.32.self_attn.q_proj.weight": 
"model-00035-of-00044.safetensors", "model.layers.32.self_attn.v_proj.weight": "model-00035-of-00044.safetensors", "model.layers.33.input_layernorm.weight": "model-00035-of-00044.safetensors", "model.layers.33.mlp.down_proj.weight": "model-00035-of-00044.safetensors", "model.layers.33.mlp.gate_proj.weight": "model-00035-of-00044.safetensors", "model.layers.33.mlp.up_proj.weight": "model-00035-of-00044.safetensors", "model.layers.33.post_attention_layernorm.weight": "model-00035-of-00044.safetensors", "model.layers.33.self_attn.k_proj.weight": "model-00035-of-00044.safetensors", "model.layers.33.self_attn.o_proj.weight": "model-00036-of-00044.safetensors", "model.layers.33.self_attn.q_proj.weight": "model-00036-of-00044.safetensors", "model.layers.33.self_attn.v_proj.weight": "model-00036-of-00044.safetensors", "model.layers.34.input_layernorm.weight": "model-00036-of-00044.safetensors", "model.layers.34.mlp.down_proj.weight": "model-00036-of-00044.safetensors", "model.layers.34.mlp.gate_proj.weight": "model-00036-of-00044.safetensors", "model.layers.34.mlp.up_proj.weight": "model-00036-of-00044.safetensors", "model.layers.34.post_attention_layernorm.weight": "model-00036-of-00044.safetensors", "model.layers.34.self_attn.k_proj.weight": "model-00036-of-00044.safetensors", "model.layers.34.self_attn.o_proj.weight": "model-00036-of-00044.safetensors", "model.layers.34.self_attn.q_proj.weight": "model-00036-of-00044.safetensors", "model.layers.34.self_attn.v_proj.weight": "model-00036-of-00044.safetensors", "model.layers.35.input_layernorm.weight": "model-00036-of-00044.safetensors", "model.layers.35.mlp.down_proj.weight": "model-00036-of-00044.safetensors", "model.layers.35.mlp.gate_proj.weight": "model-00036-of-00044.safetensors", "model.layers.35.mlp.up_proj.weight": "model-00036-of-00044.safetensors", "model.layers.35.post_attention_layernorm.weight": "model-00036-of-00044.safetensors", "model.layers.35.self_attn.k_proj.weight": "model-00036-of-00044.safetensors", "model.layers.35.self_attn.o_proj.weight": "model-00036-of-00044.safetensors", "model.layers.35.self_attn.q_proj.weight": "model-00036-of-00044.safetensors", "model.layers.35.self_attn.v_proj.weight": "model-00036-of-00044.safetensors", "model.layers.36.input_layernorm.weight": "model-00036-of-00044.safetensors", "model.layers.36.mlp.down_proj.weight": "model-00037-of-00044.safetensors", "model.layers.36.mlp.gate_proj.weight": "model-00037-of-00044.safetensors", "model.layers.36.mlp.up_proj.weight": "model-00037-of-00044.safetensors", "model.layers.36.post_attention_layernorm.weight": "model-00037-of-00044.safetensors", "model.layers.36.self_attn.k_proj.weight": "model-00037-of-00044.safetensors", "model.layers.36.self_attn.o_proj.weight": "model-00037-of-00044.safetensors", "model.layers.36.self_attn.q_proj.weight": "model-00037-of-00044.safetensors", "model.layers.36.self_attn.v_proj.weight": "model-00037-of-00044.safetensors", "model.layers.37.input_layernorm.weight": "model-00037-of-00044.safetensors", "model.layers.37.mlp.down_proj.weight": "model-00037-of-00044.safetensors", "model.layers.37.mlp.gate_proj.weight": "model-00037-of-00044.safetensors", "model.layers.37.mlp.up_proj.weight": "model-00037-of-00044.safetensors", "model.layers.37.post_attention_layernorm.weight": "model-00037-of-00044.safetensors", "model.layers.37.self_attn.k_proj.weight": "model-00037-of-00044.safetensors", "model.layers.37.self_attn.o_proj.weight": "model-00037-of-00044.safetensors", "model.layers.37.self_attn.q_proj.weight": 
"model-00037-of-00044.safetensors", "model.layers.37.self_attn.v_proj.weight": "model-00037-of-00044.safetensors", "model.layers.38.input_layernorm.weight": "model-00037-of-00044.safetensors", "model.layers.38.mlp.down_proj.weight": "model-00037-of-00044.safetensors", "model.layers.38.mlp.gate_proj.weight": "model-00038-of-00044.safetensors", "model.layers.38.mlp.up_proj.weight": "model-00038-of-00044.safetensors", "model.layers.38.post_attention_layernorm.weight": "model-00038-of-00044.safetensors", "model.layers.38.self_attn.k_proj.weight": "model-00038-of-00044.safetensors", "model.layers.38.self_attn.o_proj.weight": "model-00038-of-00044.safetensors", "model.layers.38.self_attn.q_proj.weight": "model-00038-of-00044.safetensors", "model.layers.38.self_attn.v_proj.weight": "model-00038-of-00044.safetensors", "model.layers.39.input_layernorm.weight": "model-00038-of-00044.safetensors", "model.layers.39.mlp.down_proj.weight": "model-00038-of-00044.safetensors", "model.layers.39.mlp.gate_proj.weight": "model-00038-of-00044.safetensors", "model.layers.39.mlp.up_proj.weight": "model-00038-of-00044.safetensors", "model.layers.39.post_attention_layernorm.weight": "model-00038-of-00044.safetensors", "model.layers.39.self_attn.k_proj.weight": "model-00038-of-00044.safetensors", "model.layers.39.self_attn.o_proj.weight": "model-00038-of-00044.safetensors", "model.layers.39.self_attn.q_proj.weight": "model-00038-of-00044.safetensors", "model.layers.39.self_attn.v_proj.weight": "model-00038-of-00044.safetensors", "model.layers.44.input_layernorm.weight": "model-00038-of-00044.safetensors", "model.layers.4.input_layernorm.weight": "model-00038-of-00044.safetensors", "model.layers.44.mlp.down_proj.weight": "model-00038-of-00044.safetensors", "model.layers.4.mlp.down_proj.weight": "model-00038-of-00044.safetensors", "model.layers.44.mlp.gate_proj.weight": "model-00039-of-00044.safetensors", "model.layers.4.mlp.gate_proj.weight": "model-00039-of-00044.safetensors", "model.layers.44.mlp.up_proj.weight": "model-00039-of-00044.safetensors", "model.layers.4.mlp.up_proj.weight": "model-00039-of-00044.safetensors", "model.layers.44.post_attention_layernorm.weight": "model-00039-of-00044.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00039-of-00044.safetensors", "model.layers.44.self_attn.k_proj.weight": "model-00039-of-00044.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00039-of-00044.safetensors", "model.layers.44.self_attn.o_proj.weight": "model-00039-of-00044.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00039-of-00044.safetensors", "model.layers.44.self_attn.q_proj.weight": "model-00039-of-00044.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00039-of-00044.safetensors", "model.layers.44.self_attn.v_proj.weight": "model-00039-of-00044.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00039-of-00044.safetensors", "model.layers.45.input_layernorm.weight": "model-00039-of-00044.safetensors", "model.layers.5.input_layernorm.weight": "model-00039-of-00044.safetensors", "model.layers.45.mlp.down_proj.weight": "model-00039-of-00044.safetensors", "model.layers.5.mlp.down_proj.weight": "model-00039-of-00044.safetensors", "model.layers.45.mlp.gate_proj.weight": "model-00039-of-00044.safetensors", "model.layers.5.mlp.gate_proj.weight": "model-00040-of-00044.safetensors", "model.layers.45.mlp.up_proj.weight": "model-00040-of-00044.safetensors", "model.layers.5.mlp.up_proj.weight": "model-00040-of-00044.safetensors", 
"model.layers.45.post_attention_layernorm.weight": "model-00040-of-00044.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00040-of-00044.safetensors", "model.layers.45.self_attn.k_proj.weight": "model-00040-of-00044.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00040-of-00044.safetensors", "model.layers.45.self_attn.o_proj.weight": "model-00040-of-00044.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00040-of-00044.safetensors", "model.layers.45.self_attn.q_proj.weight": "model-00040-of-00044.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00040-of-00044.safetensors", "model.layers.45.self_attn.v_proj.weight": "model-00040-of-00044.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00040-of-00044.safetensors", "model.layers.46.input_layernorm.weight": "model-00040-of-00044.safetensors", "model.layers.6.input_layernorm.weight": "model-00040-of-00044.safetensors", "model.layers.46.mlp.down_proj.weight": "model-00040-of-00044.safetensors", "model.layers.6.mlp.down_proj.weight": "model-00040-of-00044.safetensors", "model.layers.46.mlp.gate_proj.weight": "model-00040-of-00044.safetensors", "model.layers.6.mlp.gate_proj.weight": "model-00040-of-00044.safetensors", "model.layers.46.mlp.up_proj.weight": "model-00041-of-00044.safetensors", "model.layers.6.mlp.up_proj.weight": "model-00041-of-00044.safetensors", "model.layers.46.post_attention_layernorm.weight": "model-00041-of-00044.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00041-of-00044.safetensors", "model.layers.46.self_attn.k_proj.weight": "model-00041-of-00044.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00041-of-00044.safetensors", "model.layers.46.self_attn.o_proj.weight": "model-00041-of-00044.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00041-of-00044.safetensors", "model.layers.46.self_attn.q_proj.weight": "model-00041-of-00044.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00041-of-00044.safetensors", "model.layers.46.self_attn.v_proj.weight": "model-00041-of-00044.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00041-of-00044.safetensors", "model.layers.47.input_layernorm.weight": "model-00041-of-00044.safetensors", "model.layers.7.input_layernorm.weight": "model-00041-of-00044.safetensors", "model.layers.47.mlp.down_proj.weight": "model-00041-of-00044.safetensors", "model.layers.7.mlp.down_proj.weight": "model-00041-of-00044.safetensors", "model.layers.47.mlp.gate_proj.weight": "model-00041-of-00044.safetensors", "model.layers.7.mlp.gate_proj.weight": "model-00041-of-00044.safetensors", "model.layers.47.mlp.up_proj.weight": "model-00041-of-00044.safetensors", "model.layers.7.mlp.up_proj.weight": "model-00042-of-00044.safetensors", "model.layers.47.post_attention_layernorm.weight": "model-00042-of-00044.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00042-of-00044.safetensors", "model.layers.47.self_attn.k_proj.weight": "model-00042-of-00044.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00042-of-00044.safetensors", "model.layers.47.self_attn.o_proj.weight": "model-00042-of-00044.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00042-of-00044.safetensors", "model.layers.47.self_attn.q_proj.weight": "model-00042-of-00044.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00042-of-00044.safetensors", "model.layers.47.self_attn.v_proj.weight": "model-00042-of-00044.safetensors", 
"model.layers.7.self_attn.v_proj.weight": "model-00042-of-00044.safetensors", "model.layers.48.input_layernorm.weight": "model-00042-of-00044.safetensors", "model.layers.8.input_layernorm.weight": "model-00042-of-00044.safetensors", "model.layers.48.mlp.down_proj.weight": "model-00042-of-00044.safetensors", "model.layers.8.mlp.down_proj.weight": "model-00042-of-00044.safetensors", "model.layers.48.mlp.gate_proj.weight": "model-00042-of-00044.safetensors", "model.layers.8.mlp.gate_proj.weight": "model-00042-of-00044.safetensors", "model.layers.48.mlp.up_proj.weight": "model-00042-of-00044.safetensors", "model.layers.8.mlp.up_proj.weight": "model-00042-of-00044.safetensors", "model.layers.48.post_attention_layernorm.weight": "model-00042-of-00044.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00042-of-00044.safetensors", "model.layers.48.self_attn.k_proj.weight": "model-00042-of-00044.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00043-of-00044.safetensors", "model.layers.48.self_attn.o_proj.weight": "model-00043-of-00044.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00043-of-00044.safetensors", "model.layers.48.self_attn.q_proj.weight": "model-00043-of-00044.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00043-of-00044.safetensors", "model.layers.48.self_attn.v_proj.weight": "model-00043-of-00044.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00043-of-00044.safetensors", "model.layers.49.input_layernorm.weight": "model-00043-of-00044.safetensors", "model.layers.9.input_layernorm.weight": "model-00043-of-00044.safetensors", "model.layers.49.mlp.down_proj.weight": "model-00043-of-00044.safetensors", "model.layers.9.mlp.down_proj.weight": "model-00043-of-00044.safetensors", "model.layers.49.mlp.gate_proj.weight": "model-00043-of-00044.safetensors", "model.layers.9.mlp.gate_proj.weight": "model-00043-of-00044.safetensors", "model.layers.49.mlp.up_proj.weight": "model-00043-of-00044.safetensors", "model.layers.9.mlp.up_proj.weight": "model-00043-of-00044.safetensors", "model.layers.49.post_attention_layernorm.weight": "model-00043-of-00044.safetensors", "model.layers.9.post_attention_layernorm.weight": "model-00043-of-00044.safetensors", "model.layers.49.self_attn.k_proj.weight": "model-00043-of-00044.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00043-of-00044.safetensors", "model.layers.49.self_attn.o_proj.weight": "model-00043-of-00044.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00043-of-00044.safetensors", "model.layers.49.self_attn.q_proj.weight": "model-00043-of-00044.safetensors", "model.layers.9.self_attn.q_proj.weight": "model-00044-of-00044.safetensors", "model.layers.49.self_attn.v_proj.weight": "model-00044-of-00044.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00044-of-00044.safetensors", "model.norm.weight": "model-00044-of-00044.safetensors"}}
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
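This is the standard `special_tokens_map.json` for a Mistral-style tokenizer: it declares the BOS (`<s>`), EOS (`</s>`), and unknown (`<unk>`) tokens the tokenizer should treat specially. A minimal sketch of how these values surface once the tokenizer is loaded from this repo (the encode call is just an illustration):

```python
from transformers import AutoTokenizer

# The tokenizer reads special_tokens_map.json (alongside tokenizer.json and
# tokenizer_config.json) and exposes the declared tokens as attributes.
tokenizer = AutoTokenizer.from_pretrained("matchaaaaa/chaifighter-20b")

print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.unk_token)  # <s> </s> <unk>

# With add_special_tokens=True (the default), <s> is prepended because bos_token is set.
ids = tokenizer("Hello there!", add_special_tokens=True).input_ids
print(tokenizer.convert_ids_to_tokens(ids))
```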
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff