FinText-Small (further) Collection
This collection contains 17 FinText (base version) models, built on the RoBERTa architecture with 51.48 million parameters, that have undergone further pre-training.
This page features the Small (further) version of FinText for 2021. For details on FinText models and access to other versions and years, visit this link.
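Because the models in this collection follow the RoBERTa architecture, they can in principle be loaded with the Hugging Face transformers library. Below is a minimal sketch; the repository id is a hypothetical placeholder and should be replaced with the actual id of the 2021 Small (further) checkpoint from this collection.

```python
# Minimal sketch: loading a collection checkpoint with Hugging Face transformers.
# The model id below is a placeholder, not a confirmed repository name.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "FinText/FinText-Small-Further-2021"  # hypothetical id; check the collection page

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Example: fill-mask inference using the RoBERTa-style <mask> token.
inputs = tokenizer("The central bank raised interest <mask>.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the highest-scoring token at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = outputs.logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```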
For further reading on FinText and for citation details, please refer to the paper:
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4963618