# Pangu-Alpha 2.6B

## Usage

The Pangu model is not currently supported natively by the transformers library,
so `trust_remote_code=True` is required to load the custom model code.

```python
from transformers import TextGenerationPipeline, AutoTokenizer, AutoModelForCausalLM

# trust_remote_code=True lets transformers load the custom Pangu model and tokenizer
# code that ships with the repository.
tokenizer = AutoTokenizer.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)

text_generator = TextGenerationPipeline(model, tokenizer)
# Prompt: "What are the capitals of China, the US, Japan, France, Canada, and Australia?"
text_generator("中国和美国和日本和法国和加拿大和澳大利亚的首都分别是哪里?")
```
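
Standard `generate()` keyword arguments can be forwarded through the pipeline call to control decoding. The sketch below is illustrative only; the parameter values are assumptions, not recommendations from the model authors.

```python
# Forward generation parameters through the pipeline call (values are illustrative).
text_generator(
    "中国和美国和日本和法国和加拿大和澳大利亚的首都分别是哪里?",
    max_length=100,   # total length of prompt + generated text, in tokens
    do_sample=True,   # sample instead of greedy decoding
    top_p=0.9,
    temperature=0.8,
)
```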