Commit ecd8dcd (parent dac16b5) by ahmedrachid: Update README.md
**FinancialBERT** is a BERT model pre-trained on a large corpus of financial communications. Its purpose is to advance financial NLP research and practice in the financial domain; we hope financial practitioners and researchers can benefit from our model without needing the significant computational resources required to train it.

Our model was trained on a large corpus of financial texts:
- *TRC2-financial*: 1.8M news articles published by Reuters between 2008 and 2010.
- *Bloomberg News*: 400,000 articles published between 2006 and 2013.
- *Corporate Reports*: 192,000 transcripts (10-K & 10-Q filings).
- *Earning Calls*: 42,156 documents.

More details on `FinancialBERT`'s pre-training process can be found at: https://wandb.ai/ahmedrachid/huggingface/reports/Financial-BERT--VmlldzoxMzQwMTgy