Citation paper supplement
README.md (changed)
```diff
@@ -46,7 +46,7 @@ You can view the introduction of the **Chinese version** through [this link](htt
 
 
 
-## Further Pre-training
+## **Further Pre-training**
 
 **Compared with the previous pre-trained models, `bert-ancient-chinese` mainly has the following characteristics:**
 
@@ -57,7 +57,7 @@ You can view the introduction of the **Chinese version** through [this link](htt
 
 
 
-## How to use
+## **How to use**
 
 ### Huggingface Transformers
 
@@ -73,7 +73,7 @@ model = AutoModel.from_pretrained("Jihuai/bert-ancient-chinese")
 
 
 
-## Download PTM
+## **Download PTM**
 
 The model we provide is the `PyTorch` version.
 
```
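For readers following the "How to use" section quoted in the hunk context above, the README shows the model being loaded with `AutoModel.from_pretrained("Jihuai/bert-ancient-chinese")`. A minimal sketch of loading the checkpoint with Hugging Face Transformers might look like the following; the example sentence and the printed output are illustrative assumptions, not part of the README.

```python
# Minimal sketch: load bert-ancient-chinese from the Hugging Face Hub.
# The model id "Jihuai/bert-ancient-chinese" comes from the diff hunk above;
# the example sentence is an illustrative assumption.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Jihuai/bert-ancient-chinese")
model = AutoModel.from_pretrained("Jihuai/bert-ancient-chinese")

text = "孟子見梁惠王"  # an illustrative Classical Chinese sentence
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings for each token, shape: (1, seq_len, hidden_size)
print(outputs.last_hidden_state.shape)
```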
```diff
@@ -93,7 +93,7 @@ Download address:
 
 
 
-## Evaluation & Results
+## **Evaluation & Results**
 
 We tested and compared different pre-trained models on the training and test sets provided by the competition [EvaHan 2022](https://circse.github.io/LT4HALA/2022/EvaHan). We compare the performance of the models by fine-tuning them on the downstream tasks of `Chinese Word Segmentation(CWS)` and `part-of-speech tagging(POS Tagging)`.
 
@@ -138,19 +138,19 @@ We use `BERT+CRF` as the baseline model to compare the performance of `siku-bert
 </table>
 
 
-## Citing
+## **Citing**
 
 If our content is helpful for your research work, please quote it in the paper.
 
 
 
-## Disclaim
+## **Disclaim**
 
 The experimental results presented in the report only show the performance under a specific data set and hyperparameter combination, and cannot represent the essence of each model. The experimental results may change due to random number seeds and computing equipment. **Users can use the model arbitrarily within the scope of the license, but we are not responsible for the direct or indirect losses caused by using the content of the project.**
 
 
 
-## Acknowledgment
+## **Acknowledgment**
 
 `bert-ancient-chinese` is based on [bert-base-chinese](https://huggingface.co/bert-base-chinese) to continue training.
 
```
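The evaluation hunks above describe comparing pre-trained models by fine-tuning them on CWS and POS tagging, with `BERT+CRF` as the baseline. As a rough illustration only (not the authors' EvaHan 2022 setup), a checkpoint like this can be plugged into a standard token-classification head; the CRF layer used in the README's baseline is omitted here, and the label scheme, data handling, and hyperparameters are assumptions.

```python
# Hedged sketch: fine-tuning bert-ancient-chinese for sequence labelling
# (e.g. CWS cast as BMES tagging). This is NOT the README's BERT+CRF baseline;
# it uses a plain token-classification head for brevity, and the label set
# below is an illustrative assumption.
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["B", "M", "E", "S"]  # assumed BMES segmentation tags
model = AutoModelForTokenClassification.from_pretrained(
    "Jihuai/bert-ancient-chinese",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
tokenizer = AutoTokenizer.from_pretrained("Jihuai/bert-ancient-chinese")

# From here, an encoded corpus with per-character labels aligned to the
# tokenizer's output would be passed to transformers.Trainer or a custom
# PyTorch training loop.
```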
```diff
@@ -158,6 +158,6 @@ Thanks to Prof. [Xipeng Qiu](https://xpqiu.github.io/) and the [Natural Language
 
 
 
-## Contact us
+## **Contact us**
 
 Pengyu Wang:wpyjihuai@gmail.com
```