This is a beginner's test model.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_name = "MingZhuang/mz_model_merged"
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

generate = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

result = generate(
    "</s>Human: How can I find a girlfriend?<s> Assistant:",
    max_new_tokens=120,
)
print(result)
```
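The example above hard-codes the `</s>Human: … <s> Assistant:` prompt template and leaves the pipeline output unparsed. As a rough sketch, two small helpers (hypothetical, not part of this model's code) could build prompts in that template and pull the completion out of the standard `text-generation` pipeline output, which is a list of dicts with a `"generated_text"` key:

```python
def build_prompt(turns):
    """Format (role, text) turns with the </s>Human ... <s> Assistant template.

    Illustration only: the template is inferred from the single example
    prompt in this card, so adjust it if the model expects something else.
    """
    parts = []
    for role, text in turns:
        if role == "human":
            parts.append(f"</s>Human: {text}")
        else:
            parts.append(f"<s> Assistant: {text}")
    # Trailing "<s> Assistant:" cues the model to answer as the assistant.
    parts.append("<s> Assistant:")
    return "".join(parts)


def extract_reply(result, prompt):
    """Strip the echoed prompt from a text-generation pipeline result."""
    text = result[0]["generated_text"]
    return text[len(prompt):] if text.startswith(prompt) else text
```

For the single-question example above, `build_prompt([("human", "How can I find a girlfriend?")])` reproduces the prompt string exactly, and `extract_reply(result, prompt)` returns just the assistant's continuation.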