---
language: en
widget:
- text: Tesla remains one of the highest [MASK] stocks on the market. Meanwhile, Aurora Innovation is a pre-revenue upstart that shows promise.
- text: Asian stocks [MASK] from a one-year low on Wednesday as U.S. share futures and oil recovered from the previous day's selloff, but uncertainty over the impact of the Omicron
- text: U.S. stocks were set to rise on Monday, led by [MASK] in Apple which neared $3 trillion in market capitalization, while investors braced for a Federal Reserve meeting later this week.
tags:
- fill-mask
---
**FinancialBERT** is a BERT model pre-trained on a large corpus of financial texts. Its purpose is to advance NLP research and practice in the financial domain, so that financial practitioners and researchers can benefit from the model without the significant computational resources required to train it from scratch.

The model was trained on a large corpus of financial texts:
 - *TRC2-financial*: 1.8M news articles published by Reuters between 2008 and 2010.
 - *Bloomberg News*: 400,000 articles published between 2006 and 2013.
 - *Corporate Reports*: 192,000 documents (10-K & 10-Q filings).
 - *Earnings Calls*: 42,156 documents.
 
More details on `FinancialBERT` can be found at: https://www.researchgate.net/publication/358284785_FinancialBERT_-_A_Pretrained_Language_Model_for_Financial_Text_Mining
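A minimal usage sketch with the Hugging Face `transformers` fill-mask pipeline, matching the widget examples above (the model identifier below is an assumption; check this repository's page on the Hub for the exact id):

```python
from transformers import pipeline

# Assumed model id -- replace with the actual repository id from the Hub.
fill_mask = pipeline("fill-mask", model="ahmedrachid/FinancialBERT")

sentence = "Tesla remains one of the highest [MASK] stocks on the market."

# Each prediction carries the filled token and its score.
preds = fill_mask(sentence, top_k=3)
for pred in preds:
    print(pred["token_str"], round(pred["score"], 3))
```

Because the model was pre-trained with the masked-language-modeling objective, this is the most direct way to probe it; for downstream tasks (e.g. sentiment classification) it would typically be fine-tuned first.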


> Created by [Ahmed Rachid Hazourli](https://www.linkedin.com/in/ahmed-rachid/)