izumi-lab committed
Commit b12854a (1 parent: fe4d803)

Update README.md

Files changed (1): README.md +9 -15
README.md CHANGED
@@ -8,12 +8,6 @@ tags:
 
 - finance
 
-datasets:
-
-- wikipedia
-- securities reports
-- summaries of financial results
-
 widget:
 
 - text: 流動[MASK]は、1億円となりました。
@@ -61,15 +55,15 @@ The models are trained with the same configuration as BERT small in the [origina
 **There will be another paper for this pretrained model. Be sure to check here again when you cite.**
 
 ```
-@inproceedings{suzuki2021fin-bert-electra,
-  title={金融文書を用いた事前学習言語モデルの構築と検証},
-  % title={Construction and Validation of a Pre-Trained Language Model Using Financial Documents},
-  author={鈴木 雅弘 and 坂地 泰紀 and 平野 正徳 and 和泉 潔},
-  % author={Masahiro Suzuki and Hiroki Sakaji and Masanori Hirano and Kiyoshi Izumi},
-  booktitle={人工知能学会第27回金融情報学研究会(SIG-FIN)},
-  % booktitle={Proceedings of JSAI Special Interest Group on Financial Infomatics (SIG-FIN) 27},
-  pages={5-10},
-  year={2021}
+@article{Suzuki-etal-2023-ipm,
+  title = {Constructing and analyzing domain-specific language model for financial text mining},
+  author = {Masahiro Suzuki and Hiroki Sakaji and Masanori Hirano and Kiyoshi Izumi},
+  journal = {Information Processing & Management},
+  volume = {60},
+  number = {2},
+  pages = {103194},
+  year = {2023},
+  doi = {10.1016/j.ipm.2022.103194}
 }
 ```
 
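For reference, the README front matter after this commit, as a sketch reconstructed only from the lines visible in this diff (the full file may contain additional metadata keys not shown here), would look like:

```yaml
# Reconstructed from the diff hunks above; keys outside the hunks are omitted.
tags:
- finance
widget:
- text: 流動[MASK]は、1億円となりました。
```

The `widget` entry is the fill-mask example shown on the model's Hub page; the `datasets` list was dropped entirely by this commit.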
69