meg (HF staff) committed
Commit
3cd745a
1 Parent(s): 5ba998b

Small correction on number of epochs.


Looks like the model description was copied over from https://huggingface.co/nomic-ai/gpt4all-lora , so needs to be slightly tweaked.

Files changed (1): README.md (+2 −2)
README.md CHANGED

```diff
@@ -8,10 +8,10 @@ language:
 
 # gpt4all-lora-epoch-3
 
-This is an intermediate (epoch 3 / 4) checkpoint from `nomic-ai/gpt4all-lora`.
+This is an intermediate (epoch 3 / 4) checkpoint from [`nomic-ai/gpt4all-lora`](https://huggingface.co/nomic-ai/gpt4all-lora).
 
 An autoregressive transformer trained on [data](https://huggingface.co/datasets/nomic-ai/gpt4all_prompt_generations) curated using [Atlas](https://atlas.nomic.ai/).
-This model is trained with four full epochs of training, while the related [gpt4all-lora-epoch-2 model](https://huggingface.co/nomic-ai/gpt4all-lora-epoch-2) is trained with three.
+This model is trained with three epochs of training, while the related [gpt4all-lora model](https://huggingface.co/nomic-ai/gpt4all-lora) is trained with four.
 Replication instructions and data: [https://github.com/nomic-ai/gpt4all](https://github.com/nomic-ai/gpt4all)
 
 ## Model Details
```