---
license: mit
datasets:
- iryneko571/CCMatrix-v1-Ja_Zh-fused
language:
- ja
- zh
library_name: transformers
pipeline_tag: translation
widget:
- text: <-ja2zh-> フェルディナント・ラッサール \n は、プロイセンの政治学者、哲学者、法学者、社会主義者、労働運動指導者。ドイツ社会民主党の母体となる全ドイツ労働者同盟の創設者である。社会主義共和政の統一ドイツを目指しつつも、……
---
# Test Colab notebook

No need to set up an environment yourself, just open the notebook in Colab:
https://colab.research.google.com/drive/1PA30HPgRooCTV-H9Wr_DZXHqC42PrgTO?usp=sharing
At this point the model's translation ability is more artificial monkey than artificial intelligence: the vocabulary is not the problem, it simply cannot learn any further.
It struggles to learn more, limited by its 300M parameter size and my modest training techniques.

# Release Notes
* This model is finetuned from mt5-small; it was inspired by (in fact, adapted directly from) larryvrh/mt5-translation-ja_zh and follows that project's training methods and datasets. The overall model is small.
* Training used a trimmed and fused dataset, CCMatrix-v1-Ja_Zh, at a 1e-4 learning rate for 7 epochs with no weight decay, arriving at a val loss of about 1.7, where it stalls.

# Usage guide
A more precise example of using the model:
```python
from transformers import pipeline

model_name = "iryneko571/mt5-small-translation-ja_zh"
#pipe = pipeline("translation",model=model_name,tokenizer=model_name,repetition_penalty=1.4,batch_size=1,max_length=256)
pipe = pipeline("translation",
    model=model_name,
    repetition_penalty=1.4,
    batch_size=1,
    max_length=256
)

def translate_batch(batch, language='<-ja2zh->'):  # batch is a list of strings
    # prepend the direction tag the model expects to every input line
    i = 0
    while i < len(batch):
        batch[i] = f'{language} {batch[i]}'
        i += 1
    # run the pipeline and unwrap the generated text
    results = pipe(batch)
    return [r['translation_text'] for r in results]
```
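A quick way to sanity-check the function on a couple of sentences (the sample lines are arbitrary; pass a copy of the list, since `translate_batch` tags its input in place):

```python
lines = [
    'フェルディナント・ラッサールは、プロイセンの政治学者、哲学者、法学者。',
    'ドイツ社会民主党の母体となる全ドイツ労働者同盟の創設者である。',
]
results = translate_batch(list(lines))  # copy, so the originals stay untagged
for src, dst in zip(lines, results):
    print(src)
    print(dst)
```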
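For reference, below is a minimal sketch of the finetuning setup described in the release notes, built on the standard transformers Seq2SeqTrainer. The column names ('ja', 'zh'), split name, batch size, and output directory are assumptions for illustration, not the exact script used to train this model.

```python
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

tokenizer = AutoTokenizer.from_pretrained('google/mt5-small')
model = AutoModelForSeq2SeqLM.from_pretrained('google/mt5-small')
dataset = load_dataset('iryneko571/CCMatrix-v1-Ja_Zh-fused')  # split name assumed below

def preprocess(example):
    # column names 'ja' / 'zh' are assumptions; prepend the same
    # direction tag that is used at inference time
    inputs = tokenizer(f"<-ja2zh-> {example['ja']}", truncation=True, max_length=256)
    labels = tokenizer(text_target=example['zh'], truncation=True, max_length=256)
    inputs['labels'] = labels['input_ids']
    return inputs

tokenized = dataset.map(preprocess, remove_columns=dataset['train'].column_names)

args = Seq2SeqTrainingArguments(
    output_dir='mt5-small-ja_zh',    # hypothetical path
    learning_rate=1e-4,              # as stated in the release notes
    num_train_epochs=7,              # as stated in the release notes
    weight_decay=0.0,                # "no weight decay"
    per_device_train_batch_size=32,  # assumption; not stated in the card
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized['train'],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```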