---
language:
- zh
license: apache-2.0
metrics:
- accuracy
pipeline_tag: text-generation
---
> This is not a chat model. It was trained on the WizardLM-Chinese-instruct-evol dataset for a small number of steps to test the model's Chinese ability. This is version 1; version 2 will add a longer context window and a chat model.
____________________________
Training setup:
- context length: 2k
- dataset: WizardLM-Chinese-instruct-evol
- batch size: 8
- steps: 500
- epochs: 2
____________________________________________________
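The setup above can be collected into a single config for reproducibility. A minimal sketch, using `transformers.TrainingArguments`-style field names; the `output_dir` value and the exact 2048-token sequence length are assumptions, not taken from the card:

```python
# Hyperparameters from the training setup above, gathered into a dict
# using transformers.TrainingArguments-style keys.
train_config = {
    "output_dir": "llama3-ft",          # assumed checkpoint directory
    "per_device_train_batch_size": 8,   # batch size: 8
    "max_steps": 500,                   # steps: 500
    "num_train_epochs": 2,              # epochs: 2
    "max_seq_length": 2048,             # "2k context" (assumed to mean 2048 tokens)
}

print(train_config)
```

Passing such a dict to `transformers.TrainingArguments(**...)` (dropping `max_seq_length`, which belongs to the tokenizer/collator) would reproduce the stated schedule.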
How to use?
The standard Hugging Face `transformers` API is enough, or use another framework such as vLLM. The checkpoint also supports continued training.
____________________________________________________
```python
import transformers
import torch

model_id = "/aml/llama3-ft"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

pipeline("川普和拜登谁能赢得大选?")
# >> [{'generated_text': '川普和拜登谁能赢得大选?](https://www.voachinese.com'}]
```
Contact: WeChat 18618377979, Gmail zhouboyang1983@gmail.com