
t5-base-weekly-diary-summarization

This model is a fine-tuned version of google-t5/t5-base on a private daily-log dataset from the Bangkit bootcamp in Indonesia. It achieves the following ROUGE scores:

  • Rouge1: 0.6392
  • Rouge2: 0.4563
  • RougeL: 0.5747
  • RougeLsum: 0.5747
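
Scores like these can be computed with the Hugging Face evaluate library. The snippet below is only a minimal sketch; the predictions and references lists are placeholders, not the actual evaluation data.

import evaluate

# Load the ROUGE metric (requires: pip install evaluate rouge_score)
rouge = evaluate.load("rouge")

# Placeholder examples: model outputs vs. reference weekly summaries
predictions = ["Organized and ran a large professional conference end to end."]
references = ["Planned, promoted, and executed a large professional conference."]

# Returns rouge1, rouge2, rougeL, and rougeLsum F-measures
scores = rouge.compute(predictions=predictions, references=references)
print(scores)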

Intended use and limitations:

This model can be used to summarize a set of daily diary logs into a weekly summary.

How to use:

!pip install transformers sentencepiece  # sentencepiece is required by T5Tokenizer

from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load the model and tokenizer
model_name = "avisena/t5-base-weekly-diary-summarization"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Set up generation arguments
model_args = {
    "max_length": 512,       # allow long summaries
    "length_penalty": -9.7,  # negative values encourage shorter outputs
    "num_beams": 5,          # beam search for better results
    "early_stopping": True,  # stop once all beams have finished
    "temperature": 1.7       # only has an effect when sampling (do_sample=True)
}

# Tokenize input text
input_text = """summarize: 
- I organized a large-scale professional conference and managed all logistical details, including venue selection, scheduling, and coordination with speakers. I ensured all necessary permits and insurance were in place to cover the event.
- I conducted a detailed review of the conference objectives to ensure they aligned with the industry’s standards and goals. This involved working with the conference committee to define the agenda, target audience, and key outcomes.
- I coordinated with a diverse group of speakers and panelists, reviewing their presentations and ensuring they were aligned with the conference themes. I also worked with suppliers to arrange audiovisual equipment, catering, and other event essentials.
- The conference was structured into three main segments, starting with the most intensive one, which required meticulous planning due to its complexity and the need for precise timing and coordination.
- In our final planning session, we reviewed the conference layout, assigned roles to team members, and established backup plans for potential issues such as speaker cancellations or technical failures.
- We developed extensive contingency plans, including alternative session formats and additional technical support, to address any potential disruptions.
- To ensure the conference ran smoothly, I organized several rehearsals and pre-event briefings to test all aspects of the event and make necessary adjustments. We also coordinated with volunteers to ensure everyone was prepared for their roles.
- I managed the marketing and promotion of the conference, including designing promotional materials, managing social media outreach, and engaging with industry publications to boost attendance and interest.
- On the day of the conference, I oversaw all activities, ensured that the schedule was adhered to, and addressed any issues that arose promptly. I worked closely with speakers, staff, and attendees to ensure a successful and productive event.
- The setup for the first segment was particularly challenging due to its complexity and the need for precise execution. Despite facing several hurdles, I implemented effective solutions and worked closely with the team to ensure a successful start to the conference.
- After the conference, I conducted a thorough review to evaluate its success and gather feedback from attendees, speakers, and staff. This feedback provided valuable insights for future conferences and highlighted areas for improvement.
"""
input_ids = tokenizer.encode(input_text, return_tensors="pt", max_length=250, truncation=True)  # inputs longer than 250 tokens are truncated

# Generate summary
summary_ids = model.generate(
    input_ids,
    max_length=model_args["max_length"],
    length_penalty=model_args["length_penalty"],
    num_beams=model_args["num_beams"],
    early_stopping=model_args["early_stopping"],
    temperature=model_args["temperature"]
)

# Decode summary
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
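
For quick experiments, the same checkpoint can also be used through the high-level summarization pipeline. This is a minimal sketch, assuming the checkpoint loads directly from the Hub; it reuses input_text from above and mirrors the generation settings:

from transformers import pipeline

# Build a summarization pipeline around the same checkpoint
summarizer = pipeline("summarization", model="avisena/t5-base-weekly-diary-summarization")

# Extra keyword arguments are forwarded to model.generate()
weekly_summary = summarizer(
    input_text,
    max_length=512,
    num_beams=5,
    early_stopping=True,
)[0]["summary_text"]
print(weekly_summary)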
Model size: 223M params (Safetensors, F32)

Model tree for avisena/t5-base-weekly-diary-summarization

  • Base model: google-t5/t5-base (fine-tuned to produce this model)