yatharth97 committed on
Commit
e9dc102
1 Parent(s): 64754bd

Update README.md

Files changed (1): README.md (+11, -14)
README.md CHANGED
```diff
@@ -1,31 +1,28 @@
 ---
-license: apache-2.0
-base_model: t5-base
 tags:
 - generated_from_trainer
+- summarization
+- finance
 model-index:
-- name: T5-base-10K-summarization
+- name: T5-Base-10K-Summarization
   results: []
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
-
-# T5-base-10K-summarization
-
-This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unknown dataset.
+# T5-Base-10K-Summarization
+
+This model is a fine-tuned version of Google's T5-Base model, tailored for summarizing sections of financial 10-K reports.
 
 ## Model description
 
-More information needed
+T5-Base-10K-Summarization is optimized to condense lengthy 10-K reports into manageable summaries, enabling quick insight into financial data and trends.
 
 ## Intended uses & limitations
 
-More information needed
+Intended for financial analysts and regulatory agencies needing rapid insight into 10-K reports. It may not be suited to summarizing non-financial documents or informal text.
 
 ## Training and evaluation data
 
-More information needed
+Trained on a diverse collection of 10-K reports from various industries, annotated for summarization to ensure broad applicability and accuracy.
 
 ## Training procedure
 
@@ -42,6 +39,6 @@ The following hyperparameters were used during training:
 
 ### Framework versions
 
-- Transformers 4.39.3
-- Pytorch 2.2.2+cu121
-- Tokenizers 0.15.2
+- Transformers 4.40.0
+- Pytorch 2.2.1+cu121
+- Tokenizers 0.19.1
```
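The updated card describes a T5-based summarization model. A minimal sketch of how it might be used follows; the repo id `yatharth97/T5-Base-10K-Summarization` is an assumption inferred from the commit author and model name, and the `summarize:` task prefix is the standard T5 convention, not something stated in this card.

```python
def build_t5_input(document: str) -> str:
    """Prepend the conventional T5 summarization task prefix to a 10-K section."""
    return "summarize: " + document.strip()

# Inference with the fine-tuned model would then look roughly like this
# (repo id is a guess; network download required, so it is left commented out):
# from transformers import pipeline
# summarizer = pipeline("summarization", model="yatharth97/T5-Base-10K-Summarization")
# section = "Item 7. Management's Discussion and Analysis of Financial Condition..."
# print(summarizer(build_t5_input(section), max_length=128, min_length=32)[0]["summary_text"])
```

The explicit prefix helper mirrors how T5 checkpoints are typically conditioned on a task string before generation.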