Text Generation · Transformers · Safetensors · English · falcon_mamba · Eval Results · Inference Endpoints
IChahed committed
Commit: e79efbe
Parent(s): e93d9bf

Update README.md

Files changed (1):
  1. README.md +3 -2
README.md CHANGED
@@ -219,13 +219,14 @@ Also, we evaluate our model on the benchmarks of the first leaderboard using `li
  |:-----------------------------|:------:|:---------:|:-----:|:----------:|:----------:|:-----:|:----------------:|
  | ***Pure SSM models*** | | | | | | | |
  | `FalconMamba-7B`<sup>*</sup> |62.03 | 80.82 | 62.11 | 73.64 | 53.42 | 52.54 | **64.09** |
- | `TRI-ML/mamba-7b-rw`<sup>*</sup> | 51.25 | 80.85 | 33.41 | 71.11 | 23.13 | 4.70 | 44.03 |
+ | `TRI-ML/mamba-7b-rw`<sup>*</sup> | 51.25 | 80.85 | 33.41 | 71.11 | 32.08 | 4.70 | 45.52 |
  |***Hybrid SSM-attention models***| | | | | | | |
  | `recurrentgemma-9b`<sup>**</sup> |52.00 | 80.40 | 60.50 | 73.60 | 38.60 | 42.60 | 57.95 |
- | `Zyphra/Zamba-7B-v1`<sup>*</sup> | 56.14 | 82.23 | 58.11 | 79.87 | 36.23 | 30.78 | 57.23 |
+ | `Zyphra/Zamba-7B-v1`<sup>*</sup> | 56.14 | 82.23 | 58.11 | 79.87 | 52.88 | 30.78 | 60.00 |
  |***Transformer models*** | | | | | | | |
  | `Falcon2-11B` | 59.73 | 82.91 | 58.37 | 78.30 | 52.56 | 53.83 | **64.28** |
  | `Meta-Llama-3-8B` | 60.24 | 82.23 | 66.70 | 78.45 | 42.93 | 45.19 | 62.62 |
+ | `Meta-Llama-3.1-8B` | 58.53 | 82.13 | 66.43 | 74.35 | 44.29 | 47.92 | 62.28 |
  | `Mistral-7B-v0.1` | 59.98 | 83.31 | 64.16 | 78.37 | 42.15 | 37.83 | 60.97 |
  | `gemma-7B` | 61.09 | 82.20 | 64.56 | 79.01 | 44.79 | 50.87 | 63.75 |