ERNIE-Gram-zh

Introduction

ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding

For more details, see https://arxiv.org/abs/2010.12148
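As a toy illustration of the core idea (a minimal sketch, not the paper's actual sampling scheme, which is more sophisticated and predicts n-grams using explicit n-gram identities): instead of masking tokens independently, contiguous n-grams are selected, and each whole n-gram is replaced by a single [MASK] symbol and predicted as one unit.

```python
import random

def ngram_mask(tokens, mask_ratio=0.15, max_n=3, seed=0):
    """Toy explicitly n-gram masking: each selected n-gram becomes ONE [MASK]."""
    rng = random.Random(seed)
    budget = max(1, int(len(tokens) * mask_ratio))  # rough masking budget
    masked, targets, i = [], [], 0
    while i < len(tokens):
        if budget > 0 and rng.random() < mask_ratio:
            n = rng.randint(1, min(max_n, len(tokens) - i))  # n-gram length
            targets.append(tokens[i:i + n])  # the whole n-gram is one target
            masked.append("[MASK]")          # one symbol covers the n-gram
            budget -= n
            i += n
        else:
            masked.append(tokens[i])
            i += 1
    return masked, targets

tokens = list("上海是一座国际化大都市")  # "Shanghai is an international metropolis"
print(ngram_mask(tokens))
```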

Released Model Info

| Model Name | Language | Model Structure |
| --- | --- | --- |
| ernie-gram-zh | Chinese | Layer:12, Hidden:768, Heads:12 |

This released PyTorch model was converted from the officially released PaddlePaddle ERNIE model, and a series of experiments were conducted to verify the accuracy of the conversion.
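One simple way to sanity-check such a conversion is a cloze test (a sketch, assuming the converted checkpoint retains its masked-LM head; this is not the author's actual verification protocol):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-gram-zh")
model = AutoModelForMaskedLM.from_pretrained("nghuyong/ernie-gram-zh")

# "Beijing is the capital of China.", with one character masked out
inputs = tokenizer("北京是中国的[MASK]都。", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Find the masked position and decode the top prediction
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero().item()
predicted_id = logits[0, mask_pos].argmax(-1).item()
print(tokenizer.decode([predicted_id]))  # a faithful conversion should print "首"
```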

How to use

```python
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and the converted encoder from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-gram-zh")
model = AutoModel.from_pretrained("nghuyong/ernie-gram-zh")
```
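As a quick usage example, the loaded encoder can be run on a short Chinese sentence (the sentence is illustrative):

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-gram-zh")
model = AutoModel.from_pretrained("nghuyong/ernie-gram-zh")

# Encode "Baidu is a high-tech company" and run a forward pass
inputs = tokenizer("百度是一家高科技公司", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The last dimension matches the structure listed above (Hidden:768)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```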