aloobun committed on
Commit
755fda8
1 Parent(s): 5ddf948

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -15,7 +15,7 @@ datasets:
 
 ![Reyna aloobun qwen4B](https://i.imgur.com/QfbOY6c.jpeg)
 - Finetuned [Qwen/Qwen1.5-4B](https://huggingface.co/Qwen/Qwen1.5-4B), with SFT on variety of CoT tasks including Reasoning, Closed Book Question Answering, Ethics, and more.
-- Datasets : Curated from - [kaist-ai/CoT-Collection](https://huggingface.co/datasets/kaist-ai/CoT-Collection), [euclaise/TinyCoT](https://huggingface.co/datasets/euclaise/TinyCoT) and a very small subset from private data + [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5).
+- Datasets : Curated from - [kaist-ai/CoT-Collection](https://huggingface.co/datasets/kaist-ai/CoT-Collection), [euclaise/TinyCoT](https://huggingface.co/datasets/euclaise/TinyCoT) and a very small subset from [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5).
 - This marks the fourth model in this series. This experiment aims to improve Chain of Thought (CoT) capabilities on smaller language models.
 - In the next run, I may rerun the finetuning experiment using an iterative rationale-bootstrapping procedure inspired by euclaise/Memphis-CoT-3B.
 - Hyperparameter: adamw with eps of 1e-8, cosine decay with 20% warmup, lr=2e-5
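The hyperparameter line in the README (AdamW with eps 1e-8, lr 2e-5, cosine decay with 20% warmup) can be sketched as a learning-rate schedule. This is a minimal illustrative implementation, not code from the repo: the function name, the per-step accounting, and the assumption that warmup is linear and that the cosine decays to zero are all assumptions; `ADAM_EPS` is listed only to document the optimizer setting.

```python
import math

# Hyperparameters stated in the README (assumed to apply per optimizer step).
LR = 2e-5          # peak learning rate
ADAM_EPS = 1e-8    # epsilon passed to AdamW in a real training run; unused here
WARMUP_FRAC = 0.2  # 20% of total steps spent in warmup

def lr_at_step(step: int, total_steps: int) -> float:
    """Illustrative schedule: linear warmup over the first 20% of steps,
    then cosine decay from LR down to 0 over the remaining steps."""
    warmup_steps = int(total_steps * WARMUP_FRAC)
    if step < warmup_steps:
        # Linear ramp from 0 up to LR.
        return LR * step / max(1, warmup_steps)
    # Cosine decay: progress goes 0 -> 1 over the post-warmup steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return LR * 0.5 * (1.0 + math.cos(math.pi * progress))
```

In a real run this shape is what `transformers`' cosine-with-warmup scheduler produces when paired with an AdamW optimizer configured with `lr=2e-5` and `eps=1e-8`.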