Erlangshen-Longformer-110M

简介 Brief Introduction

善于处理长文本,采用旋转位置编码的中文版1.1亿参数的Longformer-base

The Chinese Longformer-base (110M), which uses rotary position embedding (RoPE), is adept at handling long texts.

模型分类 Model Taxonomy

| 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra |
| :----: | :----: | :----: | :----: | :----: | :----: |
| 通用 General | 自然语言理解 NLU | 二郎神 Erlangshen | Longformer | 110M | 中文 Chinese |

模型信息 Model Information

遵循Longformer-base的设计,我们基于chinese_roformer_L-12_H-768_A-12,在悟道语料库(180 GB版本)上进行了继续预训练。特别地,我们采用旋转位置嵌入(RoPE)来避免预训练语料库的不均匀序列长度问题。

Following the design of Longformer-base, we performed continual pre-training on the WuDao corpus (180 GB version) based on chinese_roformer_L-12_H-768_A-12. In particular, we employed rotary position embedding (RoPE) to handle the uneven sequence lengths in the pre-training corpus.
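For reference, the sketch below is a generic illustration of RoPE, not the Fengshenbang implementation: each even/odd pair of query/key dimensions is rotated by an angle proportional to the token's position, so attention scores depend only on relative positions and no fixed-length position table is required.

```python
import torch

def rotary_position_embedding(x):
    # x: (seq_len, dim) query or key vectors; dim must be even.
    seq_len, dim = x.shape
    # Standard RoPE frequencies: theta_i = 10000^(-2i/dim).
    inv_freq = 1.0 / (10000 ** (torch.arange(0, dim, 2).float() / dim))
    pos = torch.arange(seq_len).float()
    angles = torch.outer(pos, inv_freq)        # (seq_len, dim/2)
    sin, cos = angles.sin(), angles.cos()
    x1, x2 = x[..., 0::2], x[..., 1::2]        # split dims into 2D pairs
    # Rotate each 2D pair by its position-dependent angle.
    rotated = torch.stack([x1 * cos - x2 * sin,
                           x1 * sin + x2 * cos], dim=-1)
    return rotated.flatten(-2)                 # back to (seq_len, dim)
```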

使用 Usage

因为transformers库中是没有Longformer-base相关的模型结构的,所以你可以在我们的Fengshenbang-LM中找到并且运行代码。

Since the Hugging Face transformers library does not include this Longformer-base structure, you can find the model code and run it in our Fengshenbang-LM repository:

```bash
git clone https://github.com/IDEA-CCNL/Fengshenbang-LM.git
```
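The fengshen package is imported from the cloned repository rather than installed from PyPI, so one simple option (assuming the clone sits in your current working directory) is to put the repository root on your Python path:

```python
import sys

# Adjust this path to wherever you cloned Fengshenbang-LM
# (assumption: the current working directory).
sys.path.insert(0, "./Fengshenbang-LM")
```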

加载模型 Loading Models

```python
from fengshen import LongformerModel
from fengshen import LongformerConfig
from transformers import BertTokenizer

# The tokenizer is a standard BERT tokenizer; the model and config
# classes come from the fengshen package cloned above.
tokenizer = BertTokenizer.from_pretrained("IDEA-CCNL/Erlangshen-Longformer-110M")
config = LongformerConfig.from_pretrained("IDEA-CCNL/Erlangshen-Longformer-110M")
model = LongformerModel.from_pretrained("IDEA-CCNL/Erlangshen-Longformer-110M")
```
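Once loaded, inference follows the standard Hugging Face pattern. The snippet below is a minimal sketch that assumes the fengshen `LongformerModel` accepts the usual `input_ids`/`attention_mask` inputs and returns `last_hidden_state`; check the Fengshenbang-LM source if your version differs.

```python
import torch

# A hypothetical long input; Longformer is designed for sequences
# far beyond BERT's 512-token limit.
text = "饕餮是中国古代神话中的神兽。" * 200

# 4096 is an assumed maximum window; adjust to the model config.
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=4096)
with torch.no_grad():
    outputs = model(**inputs)

# Hidden size is 768, matching chinese_roformer_L-12_H-768_A-12.
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```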

引用 Citation

如果您在您的工作中使用了我们的模型,可以引用我们的论文

If you use this resource in your work, please cite our paper:

```
@article{fengshenbang,
  author    = {Jiaxing Zhang and Ruyi Gan and Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen},
  title     = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
  journal   = {CoRR},
  volume    = {abs/2209.02970},
  year      = {2022}
}
```

也可以引用我们的网站:

You can also cite our website:

```
@misc{Fengshenbang-LM,
  title={Fengshenbang-LM},
  author={IDEA-CCNL},
  year={2021},
  howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
```