---
base_model: Felladrin/TinyMistral-248M-SFT-v4
datasets:
- OpenAssistant/oasst_top1_2023-08-25
inference: false
license: apache-2.0
model_creator: Felladrin
model_name: TinyMistral-248M-SFT-v4
pipeline_tag: text-generation
quantized_by: afrideva
tags:
- text-generation
- gguf
- ggml
- quantized
- q2_k
- q3_k_m
- q4_k_m
- q5_k_m
- q6_k
- q8_0
widget:
- text: '<|im_start|>user

    Invited some friends to come home today. Give me some ideas for games to play
    with them!<|im_end|>

    <|im_start|>assistant'
- text: '<|im_start|>user

    How do meteorologists predict how much air pollution will be produced in the
    next year?<|im_end|>

    <|im_start|>assistant'
- text: '<|im_start|>user

    Who is Mona Lisa?<|im_end|>

    <|im_start|>assistant'
- text: '<|im_start|>user

    Heya!<|im_end|>

    <|im_start|>assistant

    Hi! How may I help you today?<|im_end|>

    <|im_start|>user

    I need to build a simple website. Where should I start learning about web
    development?<|im_end|>

    <|im_start|>assistant'
- text: '<|im_start|>user

    What are some potential applications for quantum computing?<|im_end|>

    <|im_start|>assistant'
- text: '<|im_start|>user

    Write the specs of a game about dragons and warriors in a fantasy world.<|im_end|>

    <|im_start|>assistant'
- text: "<|im_start|>user\nGot a question for you!<|im_end|>\n<|im_start|>assistant\nSure! What's it?<|im_end|>\n<|im_start|>user\nWhy do you love cats so much!? \U0001F408<|im_end|>\n<|im_start|>assistant"
- text: '<|im_start|>user

    Tell me about the pros and cons of social media.<|im_end|>

    <|im_start|>assistant'
---

# Felladrin/TinyMistral-248M-SFT-v4-GGUF

Quantized GGUF model files for [TinyMistral-248M-SFT-v4](https://huggingface.co/Felladrin/TinyMistral-248M-SFT-v4) from [Felladrin](https://huggingface.co/Felladrin)

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [tinymistral-248m-sft-v4.fp16.gguf](https://huggingface.co/afrideva/TinyMistral-248M-SFT-v4-GGUF/resolve/main/tinymistral-248m-sft-v4.fp16.gguf) | fp16 | 497.75 MB |
| [tinymistral-248m-sft-v4.q2_k.gguf](https://huggingface.co/afrideva/TinyMistral-248M-SFT-v4-GGUF/resolve/main/tinymistral-248m-sft-v4.q2_k.gguf) | q2_k | 116.20 MB |
| [tinymistral-248m-sft-v4.q3_k_m.gguf](https://huggingface.co/afrideva/TinyMistral-248M-SFT-v4-GGUF/resolve/main/tinymistral-248m-sft-v4.q3_k_m.gguf) | q3_k_m | 131.01 MB |
| [tinymistral-248m-sft-v4.q4_k_m.gguf](https://huggingface.co/afrideva/TinyMistral-248M-SFT-v4-GGUF/resolve/main/tinymistral-248m-sft-v4.q4_k_m.gguf) | q4_k_m | 156.60 MB |
| [tinymistral-248m-sft-v4.q5_k_m.gguf](https://huggingface.co/afrideva/TinyMistral-248M-SFT-v4-GGUF/resolve/main/tinymistral-248m-sft-v4.q5_k_m.gguf) | q5_k_m | 180.16 MB |
| [tinymistral-248m-sft-v4.q6_k.gguf](https://huggingface.co/afrideva/TinyMistral-248M-SFT-v4-GGUF/resolve/main/tinymistral-248m-sft-v4.q6_k.gguf) | q6_k | 205.20 MB |
| [tinymistral-248m-sft-v4.q8_0.gguf](https://huggingface.co/afrideva/TinyMistral-248M-SFT-v4-GGUF/resolve/main/tinymistral-248m-sft-v4.q8_0.gguf) | q8_0 | 265.26 MB |
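Any of the files in the table above can be fetched individually. Below is a minimal sketch assuming the `huggingface_hub` Python client; the q4_k_m file is picked only as an example, and the other quantizations download the same way.

```python
from huggingface_hub import hf_hub_download

# Download a single quantized file from this repository into the local
# Hugging Face cache and return its path (q4_k_m chosen as an example).
model_path = hf_hub_download(
    repo_id="afrideva/TinyMistral-248M-SFT-v4-GGUF",
    filename="tinymistral-248m-sft-v4.q4_k_m.gguf",
)
print(model_path)
```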
## Original Model Card:

# Locutusque's TinyMistral-248M trained on OpenAssistant TOP-1 Conversation Threads

- Base model: [Locutusque/TinyMistral-248M](https://huggingface.co/Locutusque/TinyMistral-248M)
- Dataset: [OpenAssistant/oasst_top1_2023-08-25](https://huggingface.co/datasets/OpenAssistant/oasst_top1_2023-08-25)
- Availability in other ML formats:
  - GGUF: [Felladrin/gguf-TinyMistral-248M-SFT-v4](https://huggingface.co/Felladrin/gguf-TinyMistral-248M-SFT-v4)
  - ONNX: [Felladrin/onnx-TinyMistral-248M-SFT-v4](https://huggingface.co/Felladrin/onnx-TinyMistral-248M-SFT-v4)

## Recommended Prompt Format

```
<|im_start|>user
{message}<|im_end|>
<|im_start|>assistant
```
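To illustrate the prompt format in use, here is a minimal sketch assuming the `llama-cpp-python` bindings and one of the GGUF files from the table above already downloaded locally; the question is a placeholder.

```python
from llama_cpp import Llama

# Load a locally downloaded quantized file from this repository
# (any of the quantizations listed above will work).
llm = Llama(model_path="tinymistral-248m-sft-v4.q4_k_m.gguf")

# Build the prompt exactly as in the recommended format and stop
# generation when the model emits the end-of-turn token.
prompt = (
    "<|im_start|>user\n"
    "Who is Mona Lisa?<|im_end|>\n"
    "<|im_start|>assistant"
)
output = llm(prompt, max_tokens=256, stop=["<|im_end|>"])
print(output["choices"][0]["text"])
```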