---
language:
- zh
license: apache-2.0
metrics:
- accuracy
pipeline_tag: text-generation
---

> This is not a chat model. It was trained on the Wizard-LM-Chinese-instruct-evol dataset for a small number of steps to test the model's general Chinese ability. This is version 1; version 2 will be released with a longer context window and a chat model.

____________________________

Training setup:

- Context length: 2k
- Dataset: Wizard-LM-Chinese-instruct-evol
- Batch size: 8
- Steps: 500
- Epochs: 2

____________________________________________________

How to use?

The standard Hugging Face API is enough, or use another framework such as vLLM. Continued training is supported.

____________________________________________________

```python
import transformers
import torch

# Local path to the fine-tuned checkpoint
model_id = "/aml/llama3-ft"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

# Prompt: "Who will win the election, Trump or Biden?"
pipeline("川普和拜登谁能赢得大选??")
```

Example output:

```
[{'generated_text': '川普和拜登谁能赢得大选?](https://www.voachinese.com'}]
```

WeChat: 18618377979, Gmail: zhouboyang1983@gmail.com
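
____________________________________________________

As noted above, the model can also be run with vLLM. Below is a minimal sketch, assuming vLLM is installed and the same local checkpoint path; the sampling parameters are illustrative, not values from the original setup.

```python
from vllm import LLM, SamplingParams

# Load the checkpoint from the local path used above (adjust as needed).
llm = LLM(model="/aml/llama3-ft", dtype="bfloat16")

# Illustrative sampling settings; tune for your use case.
sampling_params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=256)

# Same example prompt as in the transformers snippet.
outputs = llm.generate(["川普和拜登谁能赢得大选??"], sampling_params)
print(outputs[0].outputs[0].text)
```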
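
Continued training is mentioned above; here is a hedged sketch of one way to continue fine-tuning with the Hugging Face `Trainer`. The dataset file `my_instructions.jsonl` (with a `text` column), the output directory, and the hyperparameters are placeholders for illustration, not the original training recipe.

```python
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "/aml/llama3-ft"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Llama tokenizers typically ship without a pad token; reuse EOS for padding.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Hypothetical JSONL file with a "text" column of instruction-formatted samples.
dataset = load_dataset("json", data_files="my_instructions.jsonl", split="train")

def tokenize(batch):
    # 2k context, matching the training setup above.
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="llama3-ft-continued",
    per_device_train_batch_size=8,   # batch size from the setup above
    num_train_epochs=2,              # epochs from the setup above
    bf16=True,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```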