mkthoma committed dd785f5 (parent: cc30221)

readme update

Files changed (1): README.md +16 -1
README.md CHANGED
@@ -10,4 +10,19 @@ pinned: false
 license: mit
 ---
 
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+# GPT from scratch
+
+This repo contains code to train a GPT from scratch. The dataset is taken from the [RedPajama 1T sample](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T-Sample); only a sample of the full dataset is used for training. The transformer implementation is similar to [LitGPT](https://github.com/Lightning-AI/lit-gpt).
+
+The trained model has about 160M parameters, and the final training loss was 3.2154.
+
+![image](https://github.com/mkthoma/gpt_from_scratch/assets/135134412/23195bda-97ce-4b13-a96b-53552ba2a57e)
+
+The training details can be found in the attached notebooks. The initial training run was stopped when the loss was around 4.
+
+![image](https://github.com/mkthoma/gpt_from_scratch/assets/135134412/f0122ba2-b9b3-430d-a6f3-cdde5263a674)
+
+
+Training was then resumed from the checkpoint and stopped when the loss went below 3.5.
+
+GitHub link: https://github.com/mkthoma/gpt_from_scratch
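
The README above does not spell out the 160M architecture. As a back-of-the-envelope check, a GPT-2-style config can be counted analytically; every dimension below (vocabulary size, layer count, widths) is an assumption for illustration, not taken from the repo:

```python
def gpt_param_count(vocab_size, n_layer, n_embd, block_size, tie_embeddings=False):
    """Rough parameter count for a GPT-2-style decoder-only transformer.

    Assumes learned positional embeddings, biased linear layers, and a
    4x MLP expansion -- a guessed config, not the repo's actual one.
    """
    d = n_embd
    ln = 2 * d                                     # one LayerNorm: weight + bias
    attn = (d * 3 * d + 3 * d) + (d * d + d)       # fused QKV proj + output proj
    mlp = (d * 4 * d + 4 * d) + (4 * d * d + d)    # up-projection + down-projection
    block = 2 * ln + attn + mlp                    # two LayerNorms per block
    tok_emb = vocab_size * d
    pos_emb = block_size * d
    head = 0 if tie_embeddings else vocab_size * d # untied LM head
    return n_layer * block + tok_emb + pos_emb + ln + head

# A plausible config near 160M: 12 layers, d_model=768, GPT-NeoX-style vocab
n = gpt_param_count(vocab_size=50304, n_layer=12, n_embd=768, block_size=2048)
print(f"{n / 1e6:.0f}M parameters")  # → 164M parameters
```

With embeddings tied to the LM head the same config drops to roughly 125M, so the untied variant is the one that lands near the stated 160M.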
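
The stop-and-resume flow described above can be sketched in PyTorch; the file name, model, and optimizer here are placeholders standing in for the repo's actual training loop, not its real code:

```python
import torch
from torch import nn

# Placeholder model/optimizer standing in for the GPT and its optimizer.
model = nn.Linear(16, 16)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

def save_checkpoint(path, step, loss):
    # Persist everything needed to continue training exactly where it stopped.
    torch.save({
        "step": step,
        "loss": loss,
        "model": model.state_dict(),
        "optimizer": optimizer.state_dict(),
    }, path)

def resume_checkpoint(path):
    # Restore weights and optimizer state, then return the training position.
    ckpt = torch.load(path)
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optimizer"])
    return ckpt["step"], ckpt["loss"]

save_checkpoint("ckpt.pt", step=10_000, loss=4.0)  # first run stops near loss 4
step, loss = resume_checkpoint("ckpt.pt")          # second run continues from here
```

Saving the optimizer state alongside the weights matters for a resumed run: AdamW's moment estimates would otherwise restart cold and briefly spike the loss.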