fix table
README.md CHANGED
@@ -160,8 +160,8 @@ You can try with other prompts that are not maths related as well! :hugs:
 
 We benchmarked our model on the following tasks: [BoolQ](https://huggingface.co/datasets/boolq), [PIQA](https://huggingface.co/datasets/piqa), [WinoGrande](https://huggingface.co/datasets/winogrande), [OpenBookQA](https://huggingface.co/datasets/openbookqa).
 
-
-| --- | ---
+| | BoolQ | PIQA | WinoGrande | OpenBookQA | Precision | Inference time (s) |
+| --- | --- | --- | --- | --- | --- | --- |
 | Original LLaMA 7B | 76.5 | 79.8 | 70.1 | 57.2 | fp32 | 3 seconds |
 | Original LLaMA 13B | 78.1 | 80.1 | 73 | 56.4 | fp32 | >5 seconds |
 | LoRA LLaMA 7B | 63.9 | 51.3 | 48.9 | 31.4 | 8bit | 0.65 seconds |