Steelskull committed
Commit: 3bdbce5
1 Parent(s): 3170216

Update README.md

Files changed (1): README.md (+7, -10)
README.md CHANGED
@@ -62,15 +62,13 @@ min-p: 0.02-0.1
 
 ## Evals:
 
-posted soon:
-
-* Avg:
-* ARC:
-* HellaSwag:
-* MMLU:
-* T-QA:
-* Winogrande:
-* GSM8K:
+* Avg: 73.59
+* ARC: 69.11
+* HellaSwag: 87.57
+* MMLU: 66.48
+* T-QA: 66.75
+* Winogrande: 83.11
+* GSM8K: 68.69
 
 ## Examples:
 ```
@@ -112,7 +110,6 @@ Umbra-v2-MoE-4x10.7 is a Mixure of Experts (MoE) made with the following models:
 
 ```python
 !pip install -qU transformers bitsandbytes accelerate
-
 from transformers import AutoTokenizer
 import transformers
 import torch
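
The Python snippet touched by the second hunk is cut off after the imports. For context, here is a minimal sketch of how such a transformers text-generation example is typically completed; the repo ID `Steelskull/Umbra-v2-MoE-4x10.7`, the prompt, and the sampling values are illustrative assumptions, not content taken from this commit.

```python
# Minimal sketch (assumption, not part of this commit): completing the
# README's truncated usage example with a standard transformers pipeline.
from transformers import AutoTokenizer
import transformers
import torch

model_id = "Steelskull/Umbra-v2-MoE-4x10.7"  # assumed Hugging Face repo ID

# Load the tokenizer and build a text-generation pipeline; device_map="auto"
# relies on accelerate (installed by the pip line in the README snippet).
tokenizer = AutoTokenizer.from_pretrained(model_id)
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    model_kwargs={"torch_dtype": torch.float16, "device_map": "auto"},
)

# Placeholder prompt and sampling settings for illustration only.
outputs = pipeline(
    "Write a short introduction to Mixture of Experts models.",
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```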