---
license: llama2
language:
- en
---

Base Model: https://huggingface.co/meta-llama/Llama-2-7b-chat-hf

---

Model fine-tuned on a real news dataset and optimized for neural news generation.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Load the tokenizer (shared with the base model) and the fine-tuned causal language model
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
model = AutoModelForCausalLM.from_pretrained("tum-nlp/neural-news-generator-llama-2-7b-chat-en")

# Create the pipeline for neural news generation and set the repetition penalty >1.1 to discourage repetition
generator = pipeline("text-generation", model=model, tokenizer=tokenizer, repetition_penalty=1.2)

# Define the prompt
prompt = "Headline: UK headline inflation rate drops sharply to 6.8% in July, in line with expectations Article: LONDON U.K. headline inflation cooled sharply in July to [EOP]"

# Generate
generator(prompt, max_length=1000, num_return_sequences=1)
```

Trained on 6k datapoints (including all splits) from: https://paperswithcode.com/dataset/cc-news
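
The `text-generation` pipeline returns the prompt together with the generated continuation. As a minimal sketch, continuing from the snippet above and assuming the `[EOP]` marker delimits the end of the prompt, the generated article can be separated from the prompt like this:

```python
# Continues from the generation example above (reuses `generator` and `prompt`).
outputs = generator(prompt, max_length=1000, num_return_sequences=1)

# The pipeline returns a list of dicts with the full text under "generated_text".
generated = outputs[0]["generated_text"]

# Assumption: everything after the [EOP] marker is the model-written article body.
article = generated.split("[EOP]", 1)[-1].strip()
print(article)
```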