
Fine-tuned facebook/bart-large-cnn model

This model is intended to summarize political texts on climate change, the environment, and energy. It was fine-tuned on 7k political party press releases from 66 parties in 12 countries, and it aims to identify the primary issue of a press release, the party's position on that issue, and produce a 1-2 sentence summary. The training data consists primarily of GPT-4 responses to prompts asking for summaries of the press releases; small modifications were made to the GPT-4 summaries during validation. All training summaries were also lowercased by accident, so model outputs are lowercase.

Note: the model is fairly good at identifying the primary issue of any text, but it will refer to the author of the text as 'the party' and summarize the author's 'position' as the party's position. A usage sketch follows the country list below.

Countries included in the training data: Italy, Sweden, Switzerland, Netherlands, Germany, Denmark, Spain, UK, Austria, Poland, Ireland, France.
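A minimal usage sketch, assuming the standard transformers summarization pipeline; the generation parameters (max_length, min_length) and the placeholder press-release text are illustrative, not taken from the card:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub
summarizer = pipeline(
    "summarization",
    model="z-dickson/bart-large-cnn-climate-change-summarization",
)

# Placeholder: substitute the text of a party press release on climate/energy policy
press_release = "..."

# Outputs are lowercase because the training summaries were lowercased (see note above)
result = summarizer(press_release, max_length=130, min_length=30, do_sample=False)
print(result[0]["summary_text"])
```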

Citation:

@online{dickson2023,
  title = {Bart-large CNN Climate Change Summarization},
  url = {https://huggingface.co/z-dickson/bart-large-cnn-climate-change-summarization},
  author = {Dickson, Zachary P},
  year = 2023,
  urldate = {\today},
}
Model size: 406M parameters (F32, Safetensors)