gmonsoon committed
Commit: d41ca0b
Parent: ee19064

Update README.md

Files changed (1): README.md (+2, -4)
README.md CHANGED
@@ -2,13 +2,9 @@
 license: apache-2.0
 tags:
 - moe
-- frankenmoe
 - merge
 - mergekit
 - TinyLlama/TinyLlama-1.1B-Chat-v1.0
-- vihangd/DopeyTinyLlama-1.1B-v1
-- cognitivecomputations/TinyDolphin-2.8.1-1.1b
-- Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test
 base_model:
 - TinyLlama/TinyLlama-1.1B-Chat-v1.0
 - vihangd/DopeyTinyLlama-1.1B-v1
@@ -16,6 +12,8 @@ base_model:
 - Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test
 ---
 
+![image/jpeg](https://i.imgur.com/rx3ckCc.jpeg)
+
 # TinyUltra-4x1.1B-Base-Alpha
 
 TinyUltra-4x1.1B-Base-Alpha is a Mixure of Experts (MoE) made with the following models using MergeKit:
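For reference, a MergeKit MoE merge of this kind is normally driven by a YAML config passed to `mergekit-moe`. The sketch below is only an illustration built from the four base models listed in the diff; the `gate_mode`, `dtype`, and `positive_prompts` values are assumptions, not the actual TinyUltra-4x1.1B-Base-Alpha config (which is not part of this commit).

```yaml
# Hypothetical mergekit-moe config for a 4x1.1B MoE like this one.
# gate_mode, dtype, and positive_prompts are illustrative assumptions,
# not values taken from this repository.
base_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
gate_mode: hidden        # router init from hidden states; alternatives: cheap_embed, random
dtype: bfloat16
experts:
  - source_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
    positive_prompts: ["chat", "assistant"]
  - source_model: vihangd/DopeyTinyLlama-1.1B-v1
    positive_prompts: ["storytelling"]
  - source_model: cognitivecomputations/TinyDolphin-2.8.1-1.1b
    positive_prompts: ["instruction following"]
  - source_model: Josephgflowers/Tinyllama-Cinder-1.3B-Reason-Test
    positive_prompts: ["reasoning", "math"]
```

A config like this would typically be run with something along the lines of `mergekit-moe config.yaml ./output-model` to produce the merged checkpoint.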