OSainz committed on
Commit bcb38ef · verified · 1 Parent(s): 41caf01

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -262,8 +262,8 @@ The model was evaluated using the LM Evaluation harness library from Eleuther AI
  | Mixtral | 8x7B | 52.55 | 50.44 | 45.00 | 26.43 | 37.50 | 42.51 | 39.87 | 41.97 |
  | Yi | 34B | 52.22 | 54.56 | 43.90 | 27.30 | 34.66 | 42.57 | 39.68 | 42.05 |
  | Llama 2 | 70B | 51.62 | 33.56 | 42.55 | 24.16 | 27.84 | 38.43 | 33.08 | 35.47 |
- | **Latxa v1** | 70B | 67.57 | 71.78 | 59.37 | 48.19 | 49.72 | 57.84 | 51.68 | 58.02 |
- | **Latxa v1.1** | 70B | **69.76**| **64.89**| **61.66**| **60.61**| **53.69**| **61.52** | **54.48**| **60.94** |
+ | **Latxa v1** | 70B | 67.57 | **71.78** | 59.37 | 48.19 | 49.72 | 57.84 | 51.68 | 58.02 |
+ | **Latxa v1.1** | 70B | **69.76**| 64.89| **61.66**| **60.61**| **53.69**| **61.52** | **54.48**| **60.94** |
 
 
  # **Environmental Impact**