---
license: apache-2.0
datasets:
- Locutusque/hercules-v1.0
- Open-Orca/SlimOrca-Dedup
language:
- en
base_model: Locutusque/TinyMistral-248M-v2.5
tags:
- chemistry
- biology
- not-for-all-audiences
- merge
- code
inference:
  parameters:
    do_sample: true
    renormalize_logits: false
    temperature: 0.8
    top_p: 0.14
    top_k: 12
    min_new_tokens: 2
    max_new_tokens: 96
    repetition_penalty: 1.15
    no_repeat_ngram_size: 5
    epsilon_cutoff: 0.002
widget:
- text: "<|im_start|>user\nWrite me a Python program that calculates the factorial of n. <|im_end|>\n<|im_start|>assistant\n"
---
# Model description
This model is Locutusque/TinyMistral-248M-v2.5 fine-tuned on SlimOrca-Dedup and Hercules-v1.0. It averaged a loss of 1.5 during training, and its performance is excellent considering its size.
This model may output X-rated content. You and you alone are responsible for downloading and using the model and its outputs. You have been warned.
You can use the ChatML prompt format for this model.
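As a sketch, a ChatML prompt for this model can be built like the widget example above. The snippet below assumes the `transformers` library; the repository id is a placeholder, and the generation parameters mirror the inference config in the frontmatter.

```python
def build_chatml_prompt(user_message: str) -> str:
    """Wrap a user message in the ChatML turns this model expects."""
    return (
        "<|im_start|>user\n"
        f"{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

# Example usage (uncomment to run; downloads model weights — replace
# "<repo-id>" with this model's actual Hugging Face repository id):
# from transformers import pipeline
# generator = pipeline("text-generation", model="<repo-id>")
# out = generator(
#     build_chatml_prompt("Write me a Python program that calculates the factorial of n."),
#     do_sample=True, temperature=0.8, top_p=0.14, top_k=12,
#     repetition_penalty=1.15, no_repeat_ngram_size=5,
#     min_new_tokens=2, max_new_tokens=96,
# )
# print(out[0]["generated_text"])
```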
# Evaluation
This model will be submitted to the Open LLM Leaderboard.