---
language: en
tags:
- autotrain
- text-generation
- llm
- memes
library_name: transformers
library_version: [latest version at the time of training]
model_type: llama
widget:
- text: "When you try to code without coffee, "
---

# Llama 2 Meme Generator

## Model Description

This model is a fine-tuned version of Llama 2, tailored for generating meme captions. It captures the humor and phrasing common in popular internet memes. Provide a prompt or meme context, and the model generates a fitting caption.

## Training Data

The model was trained using a diverse dataset of meme captions, spanning various internet trends, jokes, and pop culture references. This ensures a wide range of meme generation capabilities, from classic meme formats to contemporary internet humor.

## Training Procedure

The model was fine-tuned using the `autotrain llm` command, with hyperparameters chosen for meme-caption generation. Care was taken to avoid overfitting so the model generalizes across a variety of meme contexts.
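For reference, an AutoTrain invocation along these lines can be used for this kind of fine-tune. This is an illustrative sketch only: flag names differ across AutoTrain versions, and the base model name, data path, and hyperparameter values below are placeholders rather than the actual values used for this model.

```shell
# Sketch of an AutoTrain LLM fine-tuning run (flags vary by version;
# model, data path, and hyperparameters here are placeholders).
autotrain llm --train \
  --project-name meme-llama \
  --model meta-llama/Llama-2-7b-hf \
  --data-path ./meme_captions \
  --text-column text \
  --lr 2e-4 \
  --batch-size 4 \
  --epochs 3 \
  --peft
```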

## Usage

To generate a meme caption using this model, you can use the following code:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("bickett/meme-llama")
model = AutoModelForCausalLM.from_pretrained("bickett/meme-llama")

input_text = "When you try to code without coffee"
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
output = model.generate(input_ids, max_new_tokens=50, do_sample=True)

print(tokenizer.decode(output[0], skip_special_tokens=True))
```