---
language: en
tags:
  - Financial Language Modelling
widget:
  - text: Stocks rallied and the British pound <mask>.
license: afl-3.0
---

## FLANG

FLANG is a set of large language models for Financial LANGuage tasks. These models use domain-specific pre-training with preferential masking to build more robust representations for the domain. The models in the set are:

- FLANG-BERT
- FLANG-SpanBERT
- FLANG-DistilBERT
- FLANG-Roberta
- FLANG-ELECTRA

## FLANG-Roberta

FLANG-Roberta is a pre-trained language model that uses financial keywords and phrases for preferential masking of domain-specific terms. It is built by further training the RoBERTa language model on financial-domain text, and it improves over previous models through the use of domain knowledge and vocabulary.

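The model can be used directly for masked-token prediction with the `transformers` library. Below is a minimal sketch using the widget example above; the hub ID `SALT-NLP/FLANG-Roberta` is an assumption, so substitute the actual repository path if it differs.

```python
# Minimal fill-mask sketch for FLANG-Roberta.
# Assumes the model is hosted under the "SALT-NLP/FLANG-Roberta" hub ID.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="SALT-NLP/FLANG-Roberta")

# Predict the masked financial term from the widget example.
for prediction in fill_mask("Stocks rallied and the British pound <mask>."):
    print(prediction["token_str"], round(prediction["score"], 4))
```
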
## Contact Information

Please contact Raj Sanjay Shah (rajsanjayshah[at]gatech[dot]edu), Sudheer Chava (schava6[at]gatech[dot]edu), or Diyi Yang (diyiy[at]stanford[dot]edu) with any FLANG-Roberta-related issues and questions.

