# Billsum-T5-Small

## Model Description
Billsum-T5-Small is a fine-tuned version of the T5-small model for summarization of U.S. Congressional bills. It was trained on the BillSum dataset, which pairs the full texts of U.S. Congressional bills with their reference summaries, and generates concise, informative summaries of lengthy legislative texts. This makes it useful for legal professionals, policymakers, and researchers who need to grasp the essence of complex documents quickly.
## Model Details
- Model Architecture: T5 (encoder-decoder Transformer)
- Base Model: t5-small
- Fine-tuning Dataset: BillSum
- Number of Parameters: approximately 60 million
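The parameter count can be checked against the base checkpoint directly; the quick sketch below loads the public t5-small weights (not the fine-tuned model) and counts parameters:

```python
from transformers import AutoModelForSeq2SeqLM

# Load the public t5-small base checkpoint and count its parameters.
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")  # roughly 60M for t5-small
```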
## Training Data
The model was fine-tuned on the BillSum dataset's training split, which contains the texts of U.S. Congressional bills and their summaries. We used 80% of these examples for training and the remaining 20% for evaluation.
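The exact preprocessing pipeline is not included in this card; the following is a minimal sketch of how the BillSum training split can be loaded and divided 80/20 with the `datasets` library (the random seed is an illustrative assumption):

```python
from datasets import load_dataset

# BillSum examples carry the bill text in "text" and the reference summary in "summary".
billsum = load_dataset("billsum", split="train")

# 80/20 train/evaluation split; the seed is arbitrary and used only for reproducibility.
splits = billsum.train_test_split(test_size=0.2, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]
```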
## Training Procedure
- Optimizer: AdamW
- Learning Rate: 2e-5
- Batch Size: 4
- Epochs: 3
- Evaluation Metric: ROUGE (ROUGE-1, ROUGE-2, ROUGE-L)
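The original training script is not part of this card. The sketch below shows one way the hyperparameters listed above could map onto a Hugging Face `Seq2SeqTrainer` run (AdamW is the Trainer default optimizer); the `summarize:` prefix, the truncation lengths, and the output directory are assumptions, and `train_ds`/`eval_ds` refer to the split from the previous sketch:

```python
import numpy as np
import evaluate
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
rouge = evaluate.load("rouge")

def preprocess(batch):
    # T5 is a text-to-text model, so the task is signalled with a prefix (assumed here).
    inputs = ["summarize: " + doc for doc in batch["text"]]
    model_inputs = tokenizer(inputs, max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

def compute_metrics(eval_pred):
    # Decode generated summaries and references, then score them with ROUGE.
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    return rouge.compute(predictions=decoded_preds, references=decoded_labels,
                         use_stemmer=True)

tokenized_train = train_ds.map(preprocess, batched=True)
tokenized_eval = eval_ds.map(preprocess, batched=True)

args = Seq2SeqTrainingArguments(
    output_dir="billsum-t5-small",     # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=3,
    predict_with_generate=True,        # needed so ROUGE is computed on generated text
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized_train,
    eval_dataset=tokenized_eval,
    tokenizer=tokenizer,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    compute_metrics=compute_metrics,
)
trainer.train()
```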
## Direct Use
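The model can be used directly for abstractive summarization of legislative text. A minimal inference sketch with the `transformers` pipeline follows; the repository id is a placeholder to be replaced with the actual Hub id of this model, and the input snippet is purely illustrative:

```python
from transformers import pipeline

# Placeholder repository id: replace with the actual Hub id hosting Billsum-T5-Small.
summarizer = pipeline("summarization", model="Billsum-T5-Small")

bill_text = (
    "SECTION 1. SHORT TITLE. This Act may be cited as the Example Act of 2024. "
    "SECTION 2. PURPOSE. The purpose of this Act is to direct the Secretary to "
    "submit an annual report to Congress on the implementation of the program."
)

# T5 checkpoints are conventionally prompted with a task prefix.
result = summarizer("summarize: " + bill_text, max_length=128, min_length=30, do_sample=False)
print(result[0]["summary_text"])
```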
## Model Card Authors

Mouwiya S. A. Al-Qaisieh