---
tags:
- generated_from_trainer
model-index:
- name: TinyStories-3M-val-Hebrew
  results: []
license: mit
language:
- he
widget:
- text: היה פעם
- text: פעם אחת
- text: החתול שלך מאוד חמוד ו
pipeline_tag: text-generation
---

# TinyStories-3M-val-Hebrew

This model was trained on [Norod78/TinyStoriesV2-GPT4-valid_heb-lineByLine-EoT](https://huggingface.co/datasets/Norod78/TinyStoriesV2-GPT4-valid_heb-lineByLine-EoT), a machine translation of [TinyStoriesV2-GPT4-valid.txt](https://huggingface.co/datasets/roneneldan/TinyStories/blob/main/TinyStoriesV2-GPT4-valid.txt) by [roneneldan](https://huggingface.co/roneneldan). The translation was produced with [this](https://huggingface.co/datasets/Norod78/TinyStoriesV2-GPT4-valid_heb-lineByLine-EoT/blob/main/translate_file_2.py) script.

The original [TinyStories dataset](https://huggingface.co/datasets/roneneldan/TinyStories) contains short stories synthetically generated by GPT-3.5 and GPT-4 using only a small vocabulary.

## Model description

A very small model (8M params) trained on a very small dataset.

A [sample inference script](https://huggingface.co/Norod78/TinyStories-3M-val-Hebrew/blob/main/TinyStories-3M-val-Hebrew-inference.py) is available.

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 500
- num_epochs: 300.0

### Framework versions

- Transformers 4.31.0.dev0
- Pytorch 2.0.0
- Datasets 2.13.1
- Tokenizers 0.13.3
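## Example usage

A minimal generation sketch using the standard `transformers` API, as an alternative to the sample inference script linked above. The sampling parameters (`max_new_tokens`, `top_p`, `temperature`) are illustrative assumptions, not values taken from the repo's script.

```python
# Minimal text-generation sketch for this model.
# Assumes the `transformers` and `torch` packages are installed;
# the sampling parameters below are illustrative choices.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Norod78/TinyStories-3M-val-Hebrew"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "פעם אחת"  # "Once upon a time", one of the widget prompts above
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation of the Hebrew prompt.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```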