
# Pangu-Alpha 2.6B

## Usage

The Pangu model is currently not natively supported by `transformers`, so `trust_remote_code=True` is required to load and run the custom model code shipped with this repository.

```python
from transformers import TextGenerationPipeline, AutoTokenizer, AutoModelForCausalLM

# trust_remote_code=True allows transformers to execute the custom Pangu
# model and tokenizer code bundled with the repository.
tokenizer = AutoTokenizer.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)

text_generator = TextGenerationPipeline(model, tokenizer)
# Prompt: "What are the capitals of China, the US, Japan, France, Canada, and Australia?"
text_generator("中国和美国和日本和法国和加拿大和澳大利亚的首都分别是哪里?")
```