nicholasKluge committed
Commit
b1035cf
1 Parent(s): 39282c7

Update README.md

Files changed (1)
  1. README.md +10 -13
README.md CHANGED
@@ -11,13 +11,13 @@ pipeline_tag: text-generation
  tags:
  - text-generation-inference
  widget:
- - text: Astronomia é uma ciência natural que estuda
+ - text: "Astronomia é uma ciência natural que estuda"
    example_title: Exemplo
- - text: Em um achado chocante, o cientista descobriu um
+ - text: "Em um achado chocante, o cientista descobriu um"
    example_title: Exemplo
- - text: Python é uma linguagem de
+ - text: "Python é uma linguagem de"
    example_title: Exemplo
- - text: O Gato de Schrödinger é uma experiência mental
+ - text: "O Gato de Botas é conhecido por"
    example_title: Exemplo
  inference:
    parameters:
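The widget prompts quoted above are what the Hub inference widget feeds to the model. The README's own usage snippet is not part of this diff (only the `for i, completion in enumerate(completions):` context line in the last hunk hints at one), so what follows is a minimal sketch, assuming the model loads through the standard `transformers` text-generation pipeline; the repo id is taken from the fine-tuning table further down, and the generation settings are illustrative rather than the values configured under `inference: parameters:`.

```python
# Minimal sketch (not the README's own snippet): run one of the widget prompts
# through the standard transformers text-generation pipeline.
# The repo id comes from the fine-tuning table; generation settings are illustrative.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="nicholasKluge/TeenyTinyLlama-162m",
)

completions = generator(
    "Astronomia é uma ciência natural que estuda",  # first widget prompt
    max_new_tokens=50,            # illustrative, not the widget's configured parameters
    do_sample=True,
    num_return_sequences=2,
)

for i, completion in enumerate(completions):
    print(f"Completion {i}: {completion['generated_text']}")
```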
@@ -67,7 +67,7 @@ This repository has the [source code](https://github.com/Nkluge-correa/Aira) use
  - [Accelerate](https://github.com/huggingface/accelerate)
  - [Codecarbon](https://github.com/mlco2/codecarbon)
 
- - ## Training Set-up
+ ## Training Set-up
 
  These are the main arguments used in the training of this model:
 
@@ -177,14 +177,11 @@ for i, completion in enumerate(completions):
 
  ## Fine Tuning
 
- | Models | Average | [ARC](https://arxiv.org/abs/1803.05457) | [Hellaswag](https://arxiv.org/abs/1905.07830) | [MMLU](https://arxiv.org/abs/2009.03300) | [TruthfulQA](https://arxiv.org/abs/2109.07958) |
- |-------------------------------------------------------------------------------------|---------|-----------------------------------------|-----------------------------------------------|------------------------------------------|------------------------------------------------|
- | [Teeny Tiny Llama 162m](https://huggingface.co/nicholasKluge/Teeny-tiny-llama-162m) | 31.16 | 26.15 | 29.29 | 28.11 | 41.12 |
- | [Pythia-160m](https://huggingface.co/EleutherAI/pythia-160m-deduped) | 31.16 | 24.06 | 31.39 | 24.86 | 44.34 |
- | [OPT-125m](https://huggingface.co/facebook/opt-125m) | 30.80 | 22.87 | 31.47 | 26.02 | 42.87 |
- | [Gpt2-portuguese-small](https://huggingface.co/pierreguillou/gpt2-small-portuguese) | 30.22 | 22.48 | 29.62 | 27.36 | 41.44 |
- | [Gpt2-small](https://huggingface.co/gpt2) | 29.97 | 21.48 | 31.60 | 25.79 | 40.65 |
-
+ | Models | [IMDB](https://huggingface.co/datasets/christykoh/imdb_pt) | [FaQuAD-NLI](https://huggingface.co/datasets/ruanchaves/faquad-nli) | [HateBr](https://huggingface.co/datasets/ruanchaves/hatebr) |
+ |--------------------------------------------------------------------------------------------|------------------------------------------------------------|---------------------------------------------------------------------|-------------------------------------------------------------|
+ | [Teeny Tiny Llama 162m](https://huggingface.co/nicholasKluge/TeenyTinyLlama-162m) | 91.14 | 90.00 | 90.71 |
+ | [Bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased) | 92.22 | 93.07 | 91.28 |
+ | [Gpt2-small-portuguese](https://huggingface.co/pierreguillou/gpt2-small-portuguese) | 91.60 | 86.46 | 87.42 |
 
  ## Cite as 🤗
 
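The updated table reports fine-tuned scores on three Portuguese downstream tasks, but the fine-tuning code itself is not part of this diff. Below is a rough sketch of how such an evaluation could be reproduced with the standard `transformers` Trainer on the linked IMDB-PT dataset; the column names, hyperparameters, and accuracy metric are assumptions for illustration, not the repository's actual setup.

```python
# Rough sketch only: the repository's actual fine-tuning script is not shown in this diff.
# Column names ("text"/"label"), hyperparameters, and the accuracy metric are assumptions.
import numpy as np
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_id = "nicholasKluge/TeenyTinyLlama-162m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Decoder-only models often lack a padding token; fall back to EOS if needed.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model.config.pad_token_id = tokenizer.pad_token_id

# IMDB reviews in Portuguese (the dataset linked in the table above).
dataset = load_dataset("christykoh/imdb_pt")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

dataset = dataset.map(tokenize, batched=True)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": (np.argmax(logits, axis=-1) == labels).mean()}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ttl-162m-imdb", num_train_epochs=3),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,  # lets Trainer pad batches dynamically
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())
```

The same recipe would presumably carry over to the FaQuAD-NLI and HateBr columns by swapping the `load_dataset` id and label count.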