---
license: apache-2.0
language:
- en
tags:
- mistral
- instruct
- finetune
- chatml
- gpt4
---
# Open Autolycus

Support me at Ko-fi

Autolycus is a son of Hermes. Autolycus-Mistral is a language and content refinement of OpenHermes 2.5 Mistral, intended to take its output from stilted, robotic GPT-4 gobbledygook to something approaching natural English - at the cost of only a very slight increase in prevarication, exaggeration and downright BS. 7-billion-parameter models are not known for their complete honesty. The most brazen examples of 'making things up' were those occasions where Autolycus actually quoted a source - usually a book title or author, sometimes a date - which turns out to be nothing more than a load of hogwash when you check it for yourself.

## Example

Compare this example (Llama-precise preset, with low top_p), where Autolycus (bottom image) improves on the response by adding extra material - making it more informative, more relevant and personal ('Visit Japan') - while giving the whole thing an earthy, almost human touch. OpenHermes Mistral (top image) responds, rather impersonally, in the dry tones of GPT-4.

- Original model: [OpenHermes 2.5 Mistral 7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B)
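
For anyone who wants to try a comparison like the one above, here is a minimal sketch using Hugging Face transformers. The repo id is a placeholder, the prompt is illustrative, and the sampling values only approximate a low-top_p "Llama-precise"-style preset; none of these settings are confirmed by this card, and the ChatML prompt format is assumed from the base OpenHermes 2.5 Mistral model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/Autolycus-Mistral-7B"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# ChatML prompt format, as used by the base OpenHermes 2.5 Mistral model.
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Tell me about cherry blossoms in Japan.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.1,  # low top_p, in the spirit of the "Llama-precise" preset
    top_k=40,
)

# Print only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```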