# Mayonnaise LLM
Mayo is a language model fine-tuned on the Mayo dataset with Supervised Fine-Tuning (SFT) using the TRL (Transformer Reinforcement Learning) library. It is based on mistralai/Mistral-7B-Instruct-v0.3.
## Features
- Uses SFT via the TRL library for improved performance over the base model (see the training sketch after this list)
- Supports English
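
For reference, SFT of this kind is typically run through TRL's `SFTTrainer`. Below is a minimal sketch, assuming the Mayo dataset is hosted at `nroggendorff/mayo` in a format TRL can consume; the dataset ID and hyperparameters are illustrative, not the exact training recipe:

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Illustrative dataset ID and settings, not the model's actual training setup.
dataset = load_dataset("nroggendorff/mayo", split="train")

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-Instruct-v0.3",  # base model per the model card
    train_dataset=dataset,
    args=SFTConfig(output_dir="mayo-sft", per_device_train_batch_size=1),
)
trainer.train()
```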
## Usage
To use the Mayo LLM, load it with the Hugging Face Transformers library:
```python
from transformers import pipeline

# Load the model behind a text-generation pipeline.
pipe = pipeline("text-generation", model="nroggendorff/mayo")

# Chat-style input: a list of {role, content} messages.
question = "What color is the sky?"
conv = [{"role": "user", "content": question}]

# The pipeline returns the whole conversation; the last message is the reply.
response = pipe(conv, max_new_tokens=32)[0]["generated_text"][-1]["content"]
print(response)
```
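
If you need more control over generation (dtype, device placement, sampling), the same conversation can be run through the model and tokenizer directly. This is a minimal sketch, assuming the checkpoint ships a chat template; the generation settings are illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("nroggendorff/mayo")
model = AutoModelForCausalLM.from_pretrained(
    "nroggendorff/mayo", torch_dtype=torch.float16, device_map="auto"
)

# Format the chat with the model's template, appending the assistant prompt.
conv = [{"role": "user", "content": "What color is the sky?"}]
inputs = tokenizer.apply_chat_template(
    conv, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate a short reply and decode only the newly produced tokens.
output = model.generate(inputs, max_new_tokens=32)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```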
## License
This project is licensed under the MIT License.