Commit 4c1ddbe (parent: bcc175a) — Update README.md

README.md CHANGED
@@ -33,7 +33,7 @@ According to the leaderboard description, here are the benchmarks used for the e
 - [TruthfulQA](https://arxiv.org/abs/2109.07958) (0-shot) - a test to measure a model’s propensity to reproduce falsehoods commonly found online.
 
 ## Leaderboard Highlights (as of August 17, 2023)
-- Godzilla 2 70B ranks 4th
+- Godzilla 2 70B ranks 4th worldwide in the Open LLM Leaderboard.
 - Godzilla 2 70B ranks #3 in the ARC challenge.
 - Godzilla 2 70B ranks #5 in the TruthfulQA benchmark.
 - *Godzilla 2 70B beats GPT-3.5 (ChatGPT) in terms of average performance and the HellaSwag benchmark (87.53 > 85.5).
@@ -86,6 +86,12 @@ python main.py --model hf-causal-experimental --model_args pretrained=MayaPH/God
 ### Response:
 ```
 
+## Technical Considerations
+
+When using GodziLLa 2 70B, kindly take note of the following:
+- The default precision is `fp32`, and the total file size that would be loaded onto the RAM/VRAM is around 275 GB. Consider using a lower precision (fp16, int8, int4) to save memory.
+- To further save on memory, set the `low_cpu_mem_usage` argument to True.
+
 ## Ethical Considerations
 When using GodziLLa 2 70B, it is important to consider the following ethical considerations:
 
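The memory figures in the added Technical Considerations lines can be sanity-checked with simple arithmetic. A minimal sketch, assuming a parameter count of roughly 69e9 (typical for a Llama-2-70B-class model; the exact count is not stated in the diff), with the precisions mirroring the ones listed:

```python
# Back-of-envelope weight-memory footprint for a ~70B-parameter model,
# illustrating the "around 275 GB" fp32 figure from the new
# Technical Considerations section. The ~69e9 parameter count is an
# assumption, not a number taken from the diff.
BYTES_PER_PARAM = {"fp32": 4.0, "fp16": 2.0, "int8": 1.0, "int4": 0.5}

def footprint_gb(n_params: float, precision: str) -> float:
    """Approximate weight memory in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * BYTES_PER_PARAM[precision] / 1e9

if __name__ == "__main__":
    n_params = 69e9  # assumed parameter count
    for precision in BYTES_PER_PARAM:
        print(f"{precision}: ~{footprint_gb(n_params, precision):.0f} GB")
```

Halving the bytes per parameter (fp32 → fp16) halves the footprint, which is why the diff suggests lower precisions; in `transformers`, passing a reduced `torch_dtype` together with `low_cpu_mem_usage=True` to `from_pretrained` is the standard way to act on both bullets.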