mstatt committed
Commit: f768d1f
Parent: 94ef452

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -19,7 +19,7 @@ The **T5 Large for Medical Text Summarization** is a specialized variant of the
 
  The T5 Large model, known as "t5-large," is pre-trained on a broad range of medical literature, enabling it to capture intricate medical terminology, extract crucial information, and produce meaningful summaries. The fine-tuning process for this model is meticulous, with attention to hyperparameter settings, including batch size and learning rate, to ensure optimal performance in the field of medical text summarization.
 
- During the fine-tuning process, a batch size of 4 is chosen for efficiency, and a learning rate of 1e-5 is selected to strike a balance between convergence speed and model optimization. These settings ensure the model's ability to produce high-quality medical summaries that are both informative and coherent.
+ During the fine-tuning process, a batch size of 8 is chosen for efficiency, and a learning rate of 2e-5 is selected to strike a balance between convergence speed and model optimization. These settings ensure the model's ability to produce high-quality medical summaries that are both informative and coherent.
 
  The fine-tuning dataset consists of diverse medical documents, clinical studies, and healthcare research, along with human-generated summaries. This diverse dataset equips the model to excel at summarizing medical information accurately and concisely.
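
For context, below is a minimal fine-tuning sketch (not the author's actual training script) showing where the hyperparameters introduced in this commit, a batch size of 8 and a learning rate of 2e-5, would be set using the Hugging Face Transformers `Seq2SeqTrainer` API. The model name `t5-large` comes from the README; the output directory, epoch count, and the toy in-memory dataset are placeholders, since the real medical corpus is not part of this repository.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "t5-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Toy stand-in for the fine-tuning corpus of medical documents and
# human-written summaries described in the README (placeholder data).
raw = Dataset.from_dict({
    "document": ["summarize: Patient presented with chest pain and shortness of breath."],
    "summary": ["Patient evaluated for chest pain and dyspnea."],
})

def preprocess(batch):
    # Tokenize source documents and target summaries for seq2seq training.
    model_inputs = tokenizer(batch["document"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["summary"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_dataset = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-large-medical-summarization",  # assumed output path
    per_device_train_batch_size=8,  # value introduced in this commit (was 4)
    learning_rate=2e-5,             # value introduced in this commit (was 1e-5)
    num_train_epochs=1,             # assumed; not stated in the README
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

The two arguments that correspond to this diff are `per_device_train_batch_size` and `learning_rate`; everything else is boilerplate needed to make the snippet self-contained.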