---
base_model: "cpayne1303/cp2024"
language:
- en
datasets:
- teknium/OpenHermes-2.5
library_name: transformers
license: apache-2.0
---

## Model Description

This is a 30-million-parameter model using the Llama 2 architecture. It was finetuned from the base model cpayne1303/cp2024 on approximately 85 million tokens of instruct data, drawn from the first 20,000 rows of the OpenHermes 2.5 dataset, using a low learning rate of 2e-6 and a context length of 512 tokens.
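
Below is a minimal inference sketch using the standard `transformers` causal-LM API. The model id shown is a hypothetical placeholder; substitute this repository's actual id.

```python
# Minimal usage sketch; the model id below is a hypothetical placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cpayne1303/cp2024-instruct"  # hypothetical: replace with this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain what a tokenizer does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")

# Keep prompt plus generation within the 512-token context used during finetuning.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```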