mshojaei77 committed · Commit ac105f4 · verified · 1 Parent(s): 9e6a891

Upload folder using huggingface_hub

README.md CHANGED

@@ -1,6 +1,6 @@
  ---
  base_model:
- - SaisExperiments/Gemma-2-2B-Opus-Instruct
+ - mshojaei77/gemma-2-2b-fa-v3
  - mshojaei77/Gemma-2-2b-fa
  library_name: transformers
  tags:
@@ -20,7 +20,7 @@ This model was merged using the [SLERP](https://en.wikipedia.org/wiki/Slerp) mer
  ### Models Merged

  The following models were included in the merge:
- * [SaisExperiments/Gemma-2-2B-Opus-Instruct](https://huggingface.co/SaisExperiments/Gemma-2-2B-Opus-Instruct)
+ * [mshojaei77/gemma-2-2b-fa-v3](https://huggingface.co/mshojaei77/gemma-2-2b-fa-v3)
  * [mshojaei77/Gemma-2-2b-fa](https://huggingface.co/mshojaei77/Gemma-2-2b-fa)

  ### Configuration
@@ -30,10 +30,10 @@ The following YAML configuration was used to produce this model:
  ```yaml

  models:
- - model: SaisExperiments/Gemma-2-2B-Opus-Instruct
  - model: mshojaei77/Gemma-2-2b-fa
+ - model: mshojaei77/gemma-2-2b-fa-v3
  merge_method: slerp
- base_model: mshojaei77/Gemma-2-2b-fa
+ base_model: mshojaei77/gemma-2-2b-fa-v3
  dtype: bfloat16
  parameters:
    t: [0, 0.5, 1, 0.5, 0]
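
For reference, the updated card describes a SLERP merge of two Gemma-2 2B checkpoints. A minimal sketch of loading the merged result with transformers follows; the repo id `mshojaei77/gemma-2-2b-fa-merged` is a placeholder for this repository, not a name taken from the diff.

```python
# Minimal sketch: load the merged checkpoint with transformers and run a short generation.
# NOTE: "mshojaei77/gemma-2-2b-fa-merged" is a placeholder repo id, not a name from this diff.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "mshojaei77/gemma-2-2b-fa-merged"  # placeholder for this repository
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

# Persian prompt: "Hi! Write a short sentence about Tehran."
inputs = tokenizer("سلام! یک جملهٔ کوتاه دربارهٔ تهران بنویس.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
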
config.json CHANGED

@@ -1,5 +1,5 @@
  {
-   "_name_or_path": "mshojaei77/Gemma-2-2b-fa",
+   "_name_or_path": "mshojaei77/gemma-2-2b-fa-v3",
    "architectures": [
      "Gemma2ForCausalLM"
    ],
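
Only the recorded source path changes here. A quick way to confirm what the shipped config.json records, again using the placeholder repo id, is to fetch the raw file from the Hub:

```python
# Sketch: read the raw config.json from the Hub and check the recorded base path.
# "mshojaei77/gemma-2-2b-fa-merged" is a placeholder repo id, not a name from this diff.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download("mshojaei77/gemma-2-2b-fa-merged", "config.json")
with open(path) as f:
    cfg = json.load(f)

print(cfg["_name_or_path"])  # "mshojaei77/gemma-2-2b-fa-v3" after this commit
print(cfg["architectures"])  # ["Gemma2ForCausalLM"]
```
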
mergekit_config.yml CHANGED

@@ -1,9 +1,9 @@

  models:
- - model: SaisExperiments/Gemma-2-2B-Opus-Instruct
  - model: mshojaei77/Gemma-2-2b-fa
+ - model: mshojaei77/gemma-2-2b-fa-v3
  merge_method: slerp
- base_model: mshojaei77/Gemma-2-2b-fa
+ base_model: mshojaei77/gemma-2-2b-fa-v3
  dtype: bfloat16
  parameters:
    t: [0, 0.5, 1, 0.5, 0]
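
The same SLERP configuration is stored as mergekit_config.yml, so the merge should be reproducible from this file alone. Below is a sketch assuming mergekit's Python entry points (MergeConfiguration, run_merge, MergeOptions, as described in the mergekit README); running the `mergekit-yaml` CLI on the same file is the more common route, and the output path is arbitrary.

```python
# Sketch: reproduce the merge from mergekit_config.yml with mergekit's Python API.
# Assumes mergekit's MergeConfiguration / run_merge / MergeOptions entry points;
# the mergekit-yaml CLI on the same file is the usual alternative.
import torch
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("mergekit_config.yml") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./gemma-2-2b-fa-slerp",    # arbitrary output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU when available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
    ),
)
```
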
model-00001-of-00002.safetensors CHANGED

@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2b1b46d8cd26ba53fbd908f8e66f37314e2cdf0f990a7fed11523768fc963794
+ oid sha256:ea2312bba56074d49d73e5a6d77c1afd1b7e9c24212ca11b5a9ec87d83f51758
  size 4959718480
model-00002-of-00002.safetensors CHANGED

@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e886c291f4f6f2a8f30a9d403726c44fdb284b47a495004d5374353e36d6ac8b
+ oid sha256:b27fbba0e33f32f0b5c48c25a39d6fafdd79e5e3a7ed7f5b6eead10c15e426b3
  size 268999016
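
The two .safetensors entries are Git LFS pointer files, so only the sha256 oid changes while the byte sizes stay the same. A small sketch for checking downloaded shards against the new pointer values; the repo id is again a placeholder.

```python
# Sketch: verify downloaded shards against the LFS pointers in this commit.
# "mshojaei77/gemma-2-2b-fa-merged" is a placeholder repo id, not a name from this diff.
import hashlib
import os
from huggingface_hub import hf_hub_download

expected = {  # filename -> (sha256 oid, size in bytes) from the new pointer files
    "model-00001-of-00002.safetensors":
        ("ea2312bba56074d49d73e5a6d77c1afd1b7e9c24212ca11b5a9ec87d83f51758", 4959718480),
    "model-00002-of-00002.safetensors":
        ("b27fbba0e33f32f0b5c48c25a39d6fafdd79e5e3a7ed7f5b6eead10c15e426b3", 268999016),
}

for filename, (oid, size) in expected.items():
    path = hf_hub_download("mshojaei77/gemma-2-2b-fa-merged", filename)
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    assert os.path.getsize(path) == size, f"size mismatch for {filename}"
    assert digest.hexdigest() == oid, f"sha256 mismatch for {filename}"
    print(f"{filename}: OK")
```
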