appvoid committed
Commit: 8dd1309
Parent: d0dbc02

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -8,17 +8,17 @@ tags:
 
 ![logo](https://huggingface.co/appvoid/palmer-003/resolve/main/003.png)
 
-Creative writing has never been so accessible; palmer-003 goes beyond what was thought possible for small language models. This model is a "MErging of Experts" (MEoE) biased toward behaving as an assistant without using any prompts. As a result of these efforts, palmer is better than most 1b language models on most benchmarks, despite sometimes being 40% smaller than its counterparts.
+Creative writing has never been so accessible; palmer goes beyond what was thought possible for small language models. This model is a "MErging of Experts" (MEoE) biased toward behaving as an assistant without using any prompts. As a result of these efforts, palmer is better than most 1b language models on most benchmarks, despite sometimes being 40% smaller than its counterparts.
 
 ```
 MMLU ARC-C OBQA HellaSwag PIQA Winogrande Average
 tinyllama-chat | 0.2470 | 0.3285 | 0.3740 | 0.6037 | 0.7448 | 0.6022 | 0.4833 |
 zyte-1b | 0.2397 | 0.3353 | 0.3700 | 0.6086 | 0.7541 | 0.5998 | 0.4845 |
-palmer-meoe-003| 0.2534 | 0.3370 | 0.3740 | 0.6128 | 0.7486 | 0.6535 | 0.4965 |
+palmer-002.5 | 0.2534 | 0.3370 | 0.3740 | 0.6128 | 0.7486 | 0.6535 | 0.4965 |
 qwen-1-8 | 0.4536 | 0.3490 | 0.3320 | 0.5876 | 0.7307 | 0.5896 | 0.5070 |
 ```
 
-This work constitutes, given its compactness, an advancement in small language models (SLMs), easily empowering edge devices such as mobile phones, Raspberry Pis, and automated software/robots. Additionally, palmer-003 deviates from the palmer family's main philosophy, becoming a more powerful model trained on more data instead of less.
+This work constitutes, given its compactness, an advancement in small language models (SLMs), easily empowering edge devices such as mobile phones, Raspberry Pis, and automated software/robots. Additionally, palmer-002.5 deviates from the palmer family's main philosophy, becoming a more powerful model trained on more data instead of less.
 
 ```
 prompt: Reality is but
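
Below, a minimal text-completion sketch of the prompt-free usage the README describes, assuming the checkpoint loads through the standard transformers causal-LM API; the appvoid/palmer-003 repo id (taken from the logo URL above) and the generation settings are illustrative assumptions, not something this commit specifies.

```
# Hypothetical usage sketch; repo id assumed from the logo URL, adjust to the
# actual palmer checkpoint you want to run.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "appvoid/palmer-003"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# No chat template or system prompt: feed raw text, matching the README's
# "without using any prompts" claim.
inputs = tokenizer("Reality is but", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```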