---
language:
- eng
pipeline_tag: text-generation
Trained: Pretrain
Config file: 1.3B
Data: English News Dataset 2GB
---

# Model Card for Model ID

Pretrained GPT-NeoX model trained on a 2.06 GB English news dataset. Training took about 10 hours to reach 20,000 iterations on a p3.16xlarge instance. Hyperparameter that differs from the 1.3B config: `gradient_accumulation_steps: 4`.

## Model Details

### Model Description

- **Developed by:** Eunyoung Lee
- **Model type:** GPT-NeoX
- **Language(s) (NLP):** English
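
## How to Get Started with the Model

Since the pipeline tag is `text-generation`, a minimal usage sketch is shown below. It assumes the checkpoint has been converted to Hugging Face Transformers format and uploaded to the Hub; the `model_id` is a placeholder, not the actual repository name.

```python
# Minimal text-generation sketch, assuming the checkpoint is available in
# Hugging Face Transformers format. "your-username/your-model-id" is a
# placeholder repository ID -- replace it with the actual repo name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/your-model-id"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation of a news-style prompt.
prompt = "The stock market opened higher today after"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```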