GPT2-Home

This model was fine-tuned from GPT-2 on Amazon home-product metadata. Given a text prompt, it generates descriptions for home products. https://github.com/HamidRezaAttar/GPT2-Home

Model description

GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text. The diversity of the dataset causes this simple goal to contain naturally occurring demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.
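
As a rough illustration of that next-word objective (a sketch added here, not part of the original card), passing the input ids back in as labels makes the model report the standard next-token cross-entropy loss; the prompt below is only an example:

>>> # Sketch of the causal language-modeling objective (illustrative, assumed usage):
>>> # labels are the input ids themselves; the model shifts them internally by one position.
>>> import torch
>>> from transformers import AutoTokenizer, AutoModelForCausalLM
>>> tokenizer = AutoTokenizer.from_pretrained("HamidRezaAttar/gpt2-product-description-generator")
>>> model = AutoModelForCausalLM.from_pretrained("HamidRezaAttar/gpt2-product-description-generator")
>>> inputs = tokenizer("This bed is very comfortable.", return_tensors="pt")
>>> outputs = model(**inputs, labels=inputs["input_ids"])
>>> float(outputs.loss)  # average negative log-likelihood per predicted token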

Live Demo

To test the model with a custom generation configuration, please visit the Demo.

Blog Post

For more detailed information about the project's development, please refer to my blog post.

How to use

For the best experience and cleanest outputs, you can use the Live Demo mentioned above; you can also use the notebook available in my GitHub repository.

You can use this model directly with a pipeline for text generation.

>>> from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
>>> tokenizer = AutoTokenizer.from_pretrained("HamidRezaAttar/gpt2-product-description-generator")
>>> model = AutoModelForCausalLM.from_pretrained("HamidRezaAttar/gpt2-product-description-generator")
>>> generator = pipeline("text-generation", model=model, tokenizer=tokenizer)
>>> # generation options such as max_length are passed at call time
>>> generated_text = generator("This bed is very comfortable.", max_length=100)
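
If you want to steer the outputs yourself rather than rely on the demo's configuration, the same pipeline accepts the standard generation arguments; the sampling values below are illustrative assumptions, not the settings used by the demo:

>>> # Illustrative sampling settings (assumed values, not the demo's configuration)
>>> outputs = generator(
...     "This office chair",
...     max_length=100,
...     do_sample=True,
...     top_k=50,
...     top_p=0.95,
...     num_return_sequences=2,
... )
>>> for out in outputs:
...     print(out["generated_text"])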

Citation info

@misc{GPT2-Home,
  author = {HamidReza Fatollah Zadeh Attar},
  title = {GPT2-Home the English home product description generator},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/HamidRezaAttar/GPT2-Home}},
}