---
language:
- sv
- en
license: mit
tags:
- pretrained
- flashback
- web
- conversational
- chat
datasets:
- timpal0l/OpenHermes-2.5-sv
- teknium/OpenHermes-2.5
pipeline_tag: text-generation
---
# 🐈‍⬛ Mistral-7B-v0.1-flashback-v2-instruct

[Mistral-7B-v0.1-flashback-v2-instruct](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2-instruct) is an instruct-tuned version of the base model [timpal0l/Mistral-7B-v0.1-flashback-v2](https://huggingface.co/timpal0l/Mistral-7B-v0.1-flashback-v2). It has been fine-tuned on the machine-translated instruct dataset [OpenHermes2.5](https://huggingface.co/datasets/timpal0l/OpenHermes-2.5-sv).

## How to use:
```python
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    "timpal0l/Mistral-7B-v0.1-flashback-v2-instruct",
    device_map="auto"
)

# Swedish prompt: "How many eggs do I have? I had 10 eggs, then I gave
# away 5 eggs. Then I got 3 eggs from a friend."
text = """
Hur många ägg har jag?
Jag hade 10 ägg, sen gav jag bort 5 ägg.
Sen fick jag 3 ägg av en kompis.
"""

generated = pipe(f"USER:{text}ASSISTANT:", max_length=512, temperature=0.6)
# Keep only the model's answer, i.e. the text after the "ASSISTANT:" marker.
print(generated[0]["generated_text"].split("ASSISTANT:")[1].strip())
```
Output:
```text
Du har 8 ägg. Här är resonemanget:
1. Du börjar med 10 ägg.
2. Du ger bort 5 ägg, vilket lämnar dig med 10 - 5 = 5 ägg.
3. Sedan får du 3 ägg av en kompis, vilket gör att du har 5 + 3 = 8 ägg.
```
(In English: "You have 8 eggs," followed by the step-by-step reasoning.)
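
The `USER: ... ASSISTANT:` prompt template and the answer extraction can be factored into small helpers so formatting stays consistent across calls. This is a minimal sketch; the function names below are hypothetical and not part of the model repository:

```python
def format_prompt(text: str) -> str:
    # Assumed template: the model was fine-tuned on prompts of the
    # form "USER:<question>ASSISTANT:".
    return f"USER:{text}ASSISTANT:"


def extract_answer(generated_text: str) -> str:
    # The pipeline returns the prompt plus the completion; keep only
    # the text after the final "ASSISTANT:" marker.
    return generated_text.split("ASSISTANT:")[-1].strip()


# Example with a mocked completion (no model call):
prompt = format_prompt("Hur många ägg har jag?")
answer = extract_answer(prompt + " Du har 8 ägg.")
print(answer)  # → Du har 8 ägg.
```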