---
datasets:
  - reddit
metrics:
  - rouge
---

distilbart-cnn-6-6 fine-tuned on the Reddit dataset for TL;DR-style summarization.

Example usage:

```python
from transformers import BartTokenizer, BartForConditionalGeneration

# Load the fine-tuned model and tokenizer
tokenizer = BartTokenizer.from_pretrained("NielsV/distilbart-cnn-6-6-reddit")
model = BartForConditionalGeneration.from_pretrained("NielsV/distilbart-cnn-6-6-reddit")

input_text = "..."  # The text you want summarized

# Tokenize the text, generate a summary and decode the result
inputs = tokenizer(input_text, max_length=1024, return_tensors="pt")
summary_ids = model.generate(inputs["input_ids"], num_beams=2, min_length=0, max_length=60)
summary = tokenizer.batch_decode(summary_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0]

# The string `summary` contains the TL;DR
```

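Alternatively, the same checkpoint can be used through the `summarization` pipeline. This is a minimal sketch, not taken from the original card; it assumes the same model name and generation settings as above:

```python
from transformers import pipeline

# Build a summarization pipeline from the fine-tuned checkpoint
summarizer = pipeline("summarization", model="NielsV/distilbart-cnn-6-6-reddit")

input_text = "..."  # The text you want summarized

# Generate the summary; the result is a list with one dict per input
result = summarizer(input_text, num_beams=2, min_length=0, max_length=60)
summary = result[0]["summary_text"]
```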
For more information, check out this repository.