2 -> 3 (#4)
opened by meg (HF staff)

README.md CHANGED
@@ -9,7 +9,7 @@ language:
 # gpt4all-lora
 
 An autoregressive transformer trained on [data](https://huggingface.co/datasets/nomic-ai/gpt4all_prompt_generations) curated using [Atlas](https://atlas.nomic.ai/).
-This model is trained with four full epochs of training, while the related [gpt4all-lora-epoch-
+This model is trained with four full epochs of training, while the related [gpt4all-lora-epoch-3 model](https://huggingface.co/nomic-ai/gpt4all-lora-epoch-3) is trained with three.
 Replication instructions and data: [https://github.com/nomic-ai/gpt4all](https://github.com/nomic-ai/gpt4all)
 
 ## Model Details