Update README.md
README.md
CHANGED
@@ -6,6 +6,7 @@ This repo contains a low-rank adapter for LLaMA-7b finetuned on Ntropy proprieta
 
 This version of the weights was trained with the following hyperparameters:
 
+- Base Model: decapoda-research/llama-7b-hf
 - Epochs: 10 (load from best epoch)
 - Batch size: 32
 - Cutoff length: 1024
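For reference, the hyperparameters listed in the diff could be collected into a training config dict, as alpaca-lora-style finetuning scripts typically consume them. This is a minimal sketch; the key names are illustrative assumptions, not taken from this repo's training code.

```python
# Sketch of the README's hyperparameters as a finetuning config.
# Key names are assumptions; only the values come from the README.
train_config = {
    "base_model": "decapoda-research/llama-7b-hf",
    "num_epochs": 10,       # the checkpoint from the best epoch is loaded
    "batch_size": 32,
    "cutoff_len": 1024,     # max token length per training example
}

for key, value in train_config.items():
    print(f"{key}: {value}")
```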