w775739733 committed on
Commit 16bb34f • 1 Parent(s): eceb38d

Update README.md


![img.png](https://s3.amazonaws.com/moonup/production/uploads/1661697350102-621a2b96100edd793f521ab6.png)

Files changed (1)
  1. README.md +5 -2
README.md CHANGED
@@ -1,9 +1,12 @@
---
license: afl-3.0
---
- This project page is about the PyTorch code implementation of GlyphBERT by the HITsz-TMG research group.
+ This project page is about the PyTorch code implementation of GlyphBERT by the HITsz-TMG research group.

- GlyphBERT is a Chinese pre-training model that incorporates Chinese character glyph features. It renders the input characters into images, arranged as multi-channel location feature maps, and uses a two-layer residual convolutional neural network module to extract the characters' image features during training.
+
+ GlyphBERT is a Chinese pre-training model that incorporates Chinese character glyph features. It renders the input characters into images, arranged as multi-channel location feature maps, and uses a two-layer residual convolutional neural network module to extract the characters' image features during training.
+
+ ![img.png](https://s3.amazonaws.com/moonup/production/uploads/1661697350102-621a2b96100edd793f521ab6.png)

The experimental results show that the pre-trained model's performance is clearly improved by fusing Chinese glyph features. GlyphBERT outperforms BERT on multiple downstream tasks and has strong transferability.
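The README paragraph above describes the glyph pipeline only at a high level. As a rough illustration (a minimal sketch, not the HITsz-TMG implementation), the PyTorch snippet below renders one character to a bitmap and passes it through a small two-block residual CNN. The font path, image size, channel counts, and the 768-dimensional output are all assumptions, and the sketch uses a single grayscale channel where GlyphBERT itself uses multi-channel location feature maps.

```python
# Minimal sketch (assumptions, not the authors' code): rasterize a character
# with Pillow and extract a glyph feature vector with a two-block residual CNN.
import torch
import torch.nn as nn
from PIL import Image, ImageDraw, ImageFont

def render_char(ch, size=32, font_path="simhei.ttf"):
    """Rasterize a single character into a (1, size, size) float tensor.
    The font file path is an assumption; any CJK-capable TTF works."""
    img = Image.new("L", (size, size), color=0)
    draw = ImageDraw.Draw(img)
    font = ImageFont.truetype(font_path, size - 4)
    draw.text((2, 2), ch, fill=255, font=font)
    pixels = torch.tensor(list(img.getdata()), dtype=torch.float32) / 255.0
    return pixels.view(1, size, size)

class ResidualBlock(nn.Module):
    """One convolutional block with a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
    def forward(self, x):
        return torch.relu(x + self.conv(x))

class GlyphEncoder(nn.Module):
    """Two residual blocks, global pooling, and a projection to a glyph embedding."""
    def __init__(self, in_channels=1, channels=32, embed_dim=768):
        super().__init__()
        self.stem = nn.Conv2d(in_channels, channels, kernel_size=3, padding=1)
        self.blocks = nn.Sequential(ResidualBlock(channels), ResidualBlock(channels))
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.proj = nn.Linear(channels, embed_dim)
    def forward(self, x):               # x: (batch, in_channels, H, W)
        h = self.blocks(self.stem(x))
        h = self.pool(h).flatten(1)     # (batch, channels)
        return self.proj(h)             # (batch, embed_dim)

glyph = render_char("汉").unsqueeze(0)   # (1, 1, 32, 32)
features = GlyphEncoder()(glyph)         # (1, 768) glyph feature vector
```

In GlyphBERT itself these glyph features are presumably fused with the usual token embeddings during pre-training; the sketch stops at producing the per-character glyph embedding.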