Fill-Mask · Transformers · PyTorch · Japanese · deberta-v2 · Inference Endpoints
retarfi committed on
Commit 3976856
1 Parent(s): 78c2ada

Update README.md

Files changed (1)
  1. README.md +0 -6
README.md CHANGED
@@ -48,12 +48,6 @@ We used the following corpora for pre-training:
  - [Japanese Wikinews as of July 28, 2023](https://huggingface.co/datasets/izumi-lab/wikinews-ja-20230728)
 
 
- We pretrained with the corpora mentioned above for 900k steps, and additionally pretrained with the following financial corpora for 100k steps:
- - Summaries of financial results from October 9, 2012, to December 31, 2022
- - Securities reports from February 8, 2018, to December 31, 2022
- - News articles
-
-
  ## Training Parameters
 
  learning_rate in parentheses indicate the learning rate for additional pre-training with the financial corpus.
 
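The page tags this repository as a fill-mask model with PyTorch weights and a deberta-v2 architecture, served through Transformers. As a rough orientation only, the sketch below shows how such a model is typically queried with the Transformers fill-mask pipeline; the repository ID is a placeholder (the commit page does not name it) and the example sentence is purely illustrative.

```python
from transformers import pipeline

# Placeholder repo ID: the commit page does not name the repository,
# so substitute the actual Hugging Face model ID before running.
MODEL_ID = "<owner>/<deberta-v2-japanese-model>"

# Standard Transformers fill-mask pipeline (PyTorch backend).
fill_mask = pipeline("fill-mask", model=MODEL_ID)

# Read the mask token from the tokenizer instead of hard-coding it.
mask = fill_mask.tokenizer.mask_token

# "The capital of Japan is [MASK]." -- print top predictions with scores.
for pred in fill_mask(f"日本の首都は{mask}です。"):
    print(pred["token_str"], round(pred["score"], 3))
```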