lucianosb committed on
Commit
47a6044
1 Parent(s): 9d5a1d9

Adds metrics from the Openllm PT Leaderboard

Files changed (1)
  1. README.md +10 -29
README.md CHANGED
@@ -231,35 +231,7 @@ sequences = pipe(
  print(sequences[0]["generated_text"])
  ```

- ## Métricas
-
- | Tasks |Version| Filter |n-shot|Metric|Value | |Stderr|
- |---------|------:|-----------------------|-----:|------|-----:|---|-----:|
- |bluex | 1.1|all | 3|acc |0.0083|± |0.0020|
- |enem | 1.1|all | 3|acc |0.0014|± |0.0006|
- |oab_exams | 1.5|all | 3|acc |0.0096|± |0.0012|
- |assin2_rte| 1.1|all | 15|f1_macro|0.9032|± |0.0042|
- | | |all | 15|acc |0.9032|± |0.0042|
- |assin2_sts| 1.1|all | 15|pearson |0.4912|± |0.0141|
- | | |all | 15|mse |1.3185|± |N/A |
- |faquad_nli| 1.1|all | 15|f1_macro|0.6104|± |0.0137|
- | | |all | 15|acc |0.6292|± |0.0134|
- |hatebr_offensive_binary | 1|all | 25|f1_macro|0.7888|± |0.0078|
- | | |all | 25|acc |0.7936|± |0.0077|
- |portuguese_hate_speech_binary| 1|all | 25|f1_macro|0.5503|± |0.0121|
- | | |all | 25|acc |0.5523|± |0.0121|
-
-
- # Uploaded model
-
- - **Developed by:** lucianosb
- - **License:** apache-2.0
- - **Finetuned from model :** unsloth/mistral-7b-bnb-4bit
-
- This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
-
- [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
- # [Open Portuguese LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard)
+ ## [Open Portuguese LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/eduagarcia/open_pt_llm_leaderboard)
  Detailed results can be found [here](https://huggingface.co/datasets/eduagarcia-temp/llm_pt_leaderboard_raw_results/tree/main/lucianosb/boto-7B)

  | Metric | Value |
@@ -275,3 +247,12 @@ Detailed results can be found [here](https://huggingface.co/datasets/eduagarcia-
  |PT Hate Speech Binary | 58.84|
  |tweetSentBR | 57.20|

+ # Uploaded model
+
+ - **Developed by:** lucianosb
+ - **License:** apache-2.0
+ - **Finetuned from model :** unsloth/mistral-7b-bnb-4bit
+
+ This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
+
+ [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
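
The card above says the model was fine-tuned from unsloth/mistral-7b-bnb-4bit using Unsloth together with Hugging Face's TRL library. As a rough illustration of that workflow only, not the actual boto-7B training script, a minimal Unsloth + TRL fine-tune typically looks like the sketch below; the dataset name and hyperparameters are placeholders, and exact argument names vary across TRL versions.

```python
# Minimal Unsloth + TRL fine-tuning sketch (hypothetical settings, not the
# configuration used for boto-7B; the dataset below is a placeholder).
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

max_seq_length = 2048

# Load the 4-bit base model named in the card.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/mistral-7b-bnb-4bit",
    max_seq_length=max_seq_length,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder instruction dataset with a "text" column.
dataset = load_dataset("some_user/some_portuguese_instructions", split="train")

# Older TRL signature, as used in the Unsloth notebooks; newer TRL versions
# move these options into SFTConfig.
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        fp16=True,
        output_dir="outputs",
    ),
)
trainer.train()
```

After training, the LoRA adapters could be saved with `model.save_pretrained(...)` and the result served with the same `transformers` pipeline shown earlier in the README.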