---
language: en
tags:
  - autotrain
  - text-generation
  - llm
  - memes
library_name: transformers
model_type: llama
widget:
  - text: 'When you try to code without coffee, '
---

# Llama 2 Meme Generator

## Model Description

This model is a fine-tuned version of Llama 2, tailored for generating meme captions. It captures the humor and tone of popular internet memes and offers a distinctive approach to meme creation. Provide a prompt or a meme context, and the model generates a fitting caption.

## Training Data

The model was trained using a diverse dataset of meme captions, spanning various internet trends, jokes, and pop culture references. This ensures a wide range of meme generation capabilities, from classic meme formats to contemporary internet humor.

## Training Procedure

The model was fine-tuned with the `autotrain llm` command using hyperparameters chosen for meme generation. Care was taken to avoid overfitting, so the model generalizes across a range of meme contexts.
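For reference, an `autotrain llm` fine-tuning run is launched roughly as follows. This is a hedged sketch only: the flag names vary between `autotrain-advanced` versions, and the base model, data path, and hyperparameter values below are illustrative placeholders, not the configuration actually used for this model.

```shell
# Hypothetical invocation -- values are placeholders, not the real training run.
# Requires: pip install autotrain-advanced
autotrain llm \
  --train \
  --project-name meme-llama \
  --model meta-llama/Llama-2-7b-hf \
  --data-path ./meme_captions \
  --text-column text \
  --lr 2e-4 \
  --batch-size 4 \
  --epochs 3 \
  --use-peft
```

The `--use-peft` flag enables parameter-efficient (LoRA-style) fine-tuning, which is the usual way to adapt a 7B model on modest hardware.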

## Usage

To generate a meme caption using this model, you can use the following code:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelWithLMHead is deprecated; use AutoModelForCausalLM for Llama-style models
tokenizer = AutoTokenizer.from_pretrained("bickett/meme-llama")
model = AutoModelForCausalLM.from_pretrained("bickett/meme-llama")

input_text = "When you try to code without coffee"
input_ids = tokenizer.encode(input_text, return_tensors="pt")

# Sample a caption; adjust max_new_tokens and temperature to taste
output = model.generate(input_ids, max_new_tokens=40, do_sample=True, temperature=0.8)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```