
Alibaba PAI BERT Base Chinese

This project provides Chinese pre-trained language models and various NLP tools. The models are pre-trained on large-scale corpora hosted by the Alibaba PAI team and are developed with the EasyNLP framework (https://github.com/alibaba/EasyNLP).
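
The checkpoint can be loaded with the Hugging Face Transformers library. Below is a minimal sketch; the repository identifier `alibaba-pai/pai-bert-base-zh` is an assumption, so substitute the model ID shown on this page if it differs.

```python
# Minimal usage sketch with Hugging Face Transformers.
# NOTE: the model identifier below is assumed; replace it with the ID of this repository.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "alibaba-pai/pai-bert-base-zh"  # assumed repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a short Chinese sentence and compute contextual embeddings.
inputs = tokenizer("阿里巴巴提供中文预训练语言模型。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```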

Citation

If you find this resource useful, please cite the following paper in your work:

@article{easynlp,
  title     = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing},
  author    = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
  publisher = {arXiv},
  url       = {https://arxiv.org/abs/2205.00258},
  year      = {2022}
}
Model details

Format: Safetensors
Parameters: 103M
Tensor type: F32