zpn and meg (HF staff) committed
Commit bcf5a1e
1 parent: c572670

- 2 -> 3 (2cb0eab5a6ff07867e2f236143ae11ea3963d387)


Co-authored-by: Margaret Mitchell <meg@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -9,7 +9,7 @@ language:
 # gpt4all-lora
 
 An autoregressive transformer trained on [data](https://huggingface.co/datasets/nomic-ai/gpt4all_prompt_generations) curated using [Atlas](https://atlas.nomic.ai/).
-This model is trained with four full epochs of training, while the related [gpt4all-lora-epoch-2 model](https://huggingface.co/nomic-ai/gpt4all-lora-epoch-2) is trained with three.
+This model is trained with four full epochs of training, while the related [gpt4all-lora-epoch-3 model](https://huggingface.co/nomic-ai/gpt4all-lora-epoch-3) is trained with three.
 Replication instructions and data: [https://github.com/nomic-ai/gpt4all](https://github.com/nomic-ai/gpt4all)
 
 ## Model Details