---
language:
- en
license: mit
pipeline_tag: text-generation
model-index:
- name: chef-gpt-en
  results: []
---
# chef-gpt
GPT-2 fine-tuned for recipe generation. This is the dataset it was trained on.
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "auhide/chef-gpt-en"

# Load the tokenizer and the fine-tuned model.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
chef_gpt = AutoModelForCausalLM.from_pretrained(MODEL_ID)

ingredients = ", ".join([
    "spaghetti",
    "tomatoes",
    "basel",
    "salt",
    "chicken",
])
prompt = f"Ingredients: {ingredients}; Recipe:"

# Tokenize the prompt, generate token IDs, and decode them back into text.
tokens = tokenizer(prompt, return_tensors="pt")
output_ids = chef_gpt.generate(**tokens, max_length=124)
recipe = tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0]
print(recipe)
```
Here is a sample result for the prompt above:
```
Ingredients: spaghetti, tomatoes, basel, salt, chicken; Recipe: Bring a large pot of water to a boil in a medium heat; add enough water to cover the bottom of the pot. Squeeze cooked pasta out of the water,
```
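
The call above uses greedy decoding, so a given ingredient list always produces the same recipe. As a minimal sketch (not part of the original card), you can sample the model instead to get several varied recipes; the decoding parameters below are illustrative assumptions, not recommended values:

```python
from transformers import pipeline

# Illustrative sketch: sample multiple candidate recipes instead of greedy decoding.
# The decoding parameters here are assumptions, not values from the model card.
generator = pipeline("text-generation", model="auhide/chef-gpt-en")

prompt = "Ingredients: spaghetti, tomatoes, basel, salt, chicken; Recipe:"
outputs = generator(
    prompt,
    max_length=124,
    do_sample=True,          # enable sampling
    top_p=0.95,              # nucleus sampling
    temperature=0.8,
    num_return_sequences=3,  # return three candidate recipes
)
for out in outputs:
    print(out["generated_text"])
```

Raising `temperature` makes the recipes more varied; lowering it keeps them closer to the greedy output.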