
News Title (Headline) Generator 📰

This document details the development of our innovative News Title Generator, designed to produce compelling and informative titles for your news articles. Check the Live Demo Here.

I've tested several other news headline generators on Hugging Face and around the web, and in my comparisons this one performed best. 🤗

Model Architecture:

  • Foundation: The T5 base model from the Transformers library is our title generator's foundation. This powerful pre-trained model is adept at various text-to-text tasks, making it an ideal choice for our application.
  • Fine-Tuning: To optimize performance specifically for news title generation, we fine-tuned the T5 base model on a curated Hugging Face dataset (https://huggingface.co/datasets/Ateeqq/news-title-generator). This dataset provides over 78,000 training examples, helping the model learn the structure and phrasing of effective news titles.
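T5 is a text-to-text model, so each training pair can be framed as a prefixed source string and a target title. The sketch below shows one plausible way to prepare an example; the `summarize: ` task prefix and the field names are assumptions based on common T5 fine-tuning practice, not confirmed details of the actual training script.

```python
def build_example(article_text: str, title: str) -> dict:
    """Frame an (article, title) pair as a T5 text-to-text example.

    The "summarize: " task prefix is an assumption drawn from common
    T5 fine-tuning recipes; the exact prefix used for this model is
    not documented.
    """
    return {
        "source": "summarize: " + article_text.strip(),
        "target": title.strip(),
    }

example = build_example(
    "A group of scientists discovered a new planet.",
    "Scientists Discover New Planet",
)
print(example["source"])
```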

How to use?

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Ateeqq/news-title-generator")
model = AutoModelForSeq2SeqLM.from_pretrained("Ateeqq/news-title-generator")

def generate_title(input_text):
    # Tokenize the article text for the model.
    input_ids = tokenizer.encode(input_text, return_tensors="pt")
    # Titles were capped at 50 tokens during training, so limit generation accordingly.
    output = model.generate(input_ids, max_new_tokens=50)
    return tokenizer.decode(output[0], skip_special_tokens=True)

input_text = "A group of scientists discovered a new planet."
generated_title = generate_title(input_text)

print(f"Generated Title: {generated_title}")

Technical Specifications

  • Framework: The model is developed and run in PyTorch, via the Transformers library.
  • Dataset Split: The data is split into 78,720 training examples and 19,681 test examples (roughly 80/20), so the model is trained on the bulk of the data while a held-out portion evaluates its generalization.
  • Model Parameters: The fine-tuned model has 223 million trainable parameters, allowing it to capture the relationships between text elements that contribute to strong news titles.
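The split figures above work out to roughly 80/20, which can be checked directly:

```python
train_examples = 78_720
test_examples = 19_681
total = train_examples + test_examples  # 98,401 examples overall

train_fraction = train_examples / total
print(f"Train fraction: {train_fraction:.2%}")  # ≈ 80%
```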

Training Configuration

  • Batch Size: 8
  • Maximum Epochs: 3 — the training process passes over the entire dataset three times to ensure thorough learning.
  • Global Seed: A fixed random seed (42) is set to guarantee reproducibility of training results.
  • Token Length Limits: The source text (article content) is restricted to a maximum of 128 tokens, while the generated titles are capped at 50 tokens.
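For reference, the hyperparameters listed above can be collected into a single configuration. The dictionary keys below are illustrative names, not the exact arguments of the original training script:

```python
training_config = {
    "batch_size": 8,            # examples per training step
    "max_epochs": 3,            # full passes over the training set
    "seed": 42,                 # fixed random seed for reproducibility
    "max_source_length": 128,   # article tokens kept after truncation
    "max_target_length": 50,    # cap on generated title length
}
print(training_config)
```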

Key Takeaways

Our News Title Generator leverages the T5 base model, fine-tuned on a comprehensive news title dataset, to deliver strong results. The model's architecture and training configuration are designed to produce high-quality, informative titles within an appropriate token length. This tool empowers creators and journalists to craft impactful headlines that effectively capture readers' attention.

Contact us (at exnrt.com/contact-us) today to learn more about integrating the News Title Generator into your editorial workflow and unlock the full potential of AI-driven journalism.
