---
license: apache-2.0
base_model: mistralai/Mistral-7B-v0.1
tags:
- generated_from_trainer
- GEITje
- conversational
model-index:
- name: Mistral-7B-v0.1-chat-nl
  results: []
datasets:
- Rijgersberg/no_robots_nl
- Rijgersberg/ultrachat_10k_nl
language:
- nl
pipeline_tag: text-generation
---

# Mistral-7B-v0.1-chat-nl

This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the [Rijgersberg/no_robots_nl](https://huggingface.co/datasets/Rijgersberg/no_robots_nl) and [Rijgersberg/ultrachat_10k_nl](https://huggingface.co/datasets/Rijgersberg/ultrachat_10k_nl) datasets.
It achieves the following results on the evaluation set:
- Loss: 1.0263

## Model description

To investigate the effect that the Dutch pretraining of [Rijgersberg/GEITje-7B](https://huggingface.co/Rijgersberg/GEITje-7B) has on the subsequent finetuning of [Rijgersberg/GEITje-7B-chat](https://huggingface.co/Rijgersberg/GEITje-7B-chat), I also subjected the base model Mistral 7B v0.1 to exactly the same chat training. The resulting model is Mistral-7B-v0.1-chat-nl. A minimal usage sketch is included at the end of this card.

## More info

Read more about GEITje and GEITje-chat in the [📄 README](https://github.com/Rijgersberg/GEITje/blob/main/README-en.md) on GitHub.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction is sketched at the end of this card):
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.2404        | 0.2   | 236  | 1.1166          |
| 1.2103        | 0.4   | 472  | 1.1101          |
| 1.0357        | 0.6   | 708  | 1.0739          |
| 1.27          | 0.8   | 944  | 1.0540          |
| 1.3557        | 1.0   | 1180 | 1.0330          |
| 0.7919        | 1.2   | 1416 | 1.0368          |
| 0.8701        | 1.4   | 1652 | 1.0193          |
| 0.8851        | 1.6   | 1888 | 1.0009          |
| 0.7562        | 1.8   | 2124 | 0.9791          |
| 0.6838        | 2.0   | 2360 | 0.9823          |
| 0.5011        | 2.2   | 2596 | 1.0271          |
| 0.4495        | 2.39  | 2832 | 1.0267          |
| 0.5625        | 2.59  | 3068 | 1.0250          |
| 0.4486        | 2.79  | 3304 | 1.0262          |
| 0.5706        | 2.99  | 3540 | 1.0263          |

### Framework versions

- Transformers 4.36.0.dev0
- PyTorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
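
### Training configuration sketch

The hyperparameters above map onto `transformers.TrainingArguments` roughly as below. This is a hedged reconstruction for orientation only, not the original training script: dataset preparation, chat formatting, and the `Trainer` wiring are omitted, and the precision and evaluation settings are assumptions. The Adam betas and epsilon listed above are the `TrainingArguments` defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed in this card;
# not the original training script.
training_args = TrainingArguments(
    output_dir="Mistral-7B-v0.1-chat-nl",
    learning_rate=1e-5,
    per_device_train_batch_size=2,   # train_batch_size: 2
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    gradient_accumulation_steps=8,   # total_train_batch_size: 2 * 8 = 16
    seed=42,
    lr_scheduler_type="cosine",      # cosine schedule with 10% warmup
    warmup_ratio=0.1,
    num_train_epochs=3,
    bf16=True,                       # assumption: precision is not stated in the card
    evaluation_strategy="steps",     # inferred: the table logs validation loss
    eval_steps=236,                  # every 236 steps (0.2 epoch)
)
```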
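
## Usage

A minimal inference sketch with 🤗 Transformers, assuming the repo id `Rijgersberg/Mistral-7B-v0.1-chat-nl`, a tokenizer that ships with a chat template, and an available GPU; none of these are confirmed by this card, so adjust as needed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Rijgersberg/Mistral-7B-v0.1-chat-nl"  # assumed repo id for this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: half precision on a modern GPU
    device_map="auto",
)

# Dutch prompt: "Write a haiku about goats."
messages = [{"role": "user", "content": "Schrijf een haiku over geiten."}]

# Assumes the tokenizer defines a chat template; if it does not,
# format the conversation string manually instead.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=True, top_p=0.95)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Since the model was finetuned exclusively on Dutch conversations, prompts in Dutch should give the best results.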