---
language: zh
tags:
- roformer-v2
- pytorch
- tf2.0
inference: False
---

## Introduction

### TensorFlow version
https://github.com/ZhuiyiTechnology/roformer-v2

### PyTorch + TF2.0 version
https://github.com/JunnYu/RoFormer_pytorch

### Installation
- pip install roformer==0.4.0

## PyTorch & TF2.0 usage

```python
import torch
import tensorflow as tf
from transformers import BertTokenizer
from roformer import RoFormerForMaskedLM, TFRoFormerForMaskedLM

text = "今天[MASK]很好,我[MASK]去公园玩。"
tokenizer = BertTokenizer.from_pretrained("junnyu/roformer_v2_chinese_char_base")
pt_model = RoFormerForMaskedLM.from_pretrained("junnyu/roformer_v2_chinese_char_base")
tf_model = TFRoFormerForMaskedLM.from_pretrained(
    "junnyu/roformer_v2_chinese_char_base", from_pt=True
)
pt_inputs = tokenizer(text, return_tensors="pt")
tf_inputs = tokenizer(text, return_tensors="tf")

# PyTorch inference
with torch.no_grad():
    pt_outputs = pt_model(**pt_inputs).logits[0]
pt_outputs_sentence = "pytorch: "
for i, id in enumerate(tokenizer.encode(text)):
    if id == tokenizer.mask_token_id:
        # show the top-5 candidate tokens at each [MASK] position
        tokens = tokenizer.convert_ids_to_tokens(pt_outputs[i].topk(k=5)[1])
        pt_outputs_sentence += "[" + "||".join(tokens) + "]"
    else:
        pt_outputs_sentence += "".join(
            tokenizer.convert_ids_to_tokens([id], skip_special_tokens=True)
        )
print(pt_outputs_sentence)

# TF2.0 inference
tf_outputs = tf_model(**tf_inputs, training=False).logits[0]
tf_outputs_sentence = "tf: "
for i, id in enumerate(tokenizer.encode(text)):
    if id == tokenizer.mask_token_id:
        tokens = tokenizer.convert_ids_to_tokens(tf.math.top_k(tf_outputs[i], k=5)[1])
        tf_outputs_sentence += "[" + "||".join(tokens) + "]"
    else:
        tf_outputs_sentence += "".join(
            tokenizer.convert_ids_to_tokens([id], skip_special_tokens=True)
        )
print(tf_outputs_sentence)

# Expected predictions per checkpoint size:
# small
# pytorch: 今天[的||,||是||很||也]很好,我[要||会||是||想||在]去公园玩。
# tf: 今天[的||,||是||很||也]很好,我[要||会||是||想||在]去公园玩。
# base
# pytorch: 今天[我||天||晴||园||玩]很好,我[想||要||会||就||带]去公园玩。
# tf: 今天[我||天||晴||园||玩]很好,我[想||要||会||就||带]去公园玩。
# large
# pytorch: 今天[天||气||我||空||阳]很好,我[又||想||会||就||爱]去公园玩。
# tf: 今天[天||气||我||空||阳]很好,我[又||想||会||就||爱]去公园玩。
```

## Citation

Bibtex:

```tex
@misc{su2021roformer,
    title={RoFormer: Enhanced Transformer with Rotary Position Embedding},
    author={Jianlin Su and Yu Lu and Shengfeng Pan and Bo Wen and Yunfeng Liu},
    year={2021},
    eprint={2104.09864},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

```tex
@techreport{roformerv2,
    title={RoFormerV2: A Faster and Better RoFormer - ZhuiyiAI},
    author={Jianlin Su and Shengfeng Pan and Bo Wen and Yunfeng Liu},
    year={2022},
    url={https://github.com/ZhuiyiTechnology/roformer-v2},
}
```
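
## Feature extraction (sketch)

Beyond masked-LM inference, the same checkpoint can be loaded as a plain encoder to get sentence embeddings. The sketch below is illustrative rather than official: it assumes the `roformer` package also exports `RoFormerModel` (as in the JunnYu/RoFormer_pytorch repository) and that the encoder output exposes `last_hidden_state` in the usual `transformers` style.

```python
# Minimal sketch, not part of the official example.
# Assumption: RoFormerModel is exported by the roformer package and returns
# a transformers-style output with last_hidden_state.
import torch
from transformers import BertTokenizer
from roformer import RoFormerModel

tokenizer = BertTokenizer.from_pretrained("junnyu/roformer_v2_chinese_char_base")
encoder = RoFormerModel.from_pretrained("junnyu/roformer_v2_chinese_char_base")

inputs = tokenizer("今天天气很好", return_tensors="pt")
with torch.no_grad():
    hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, hidden_size)

# Mask-aware mean pooling over tokens -> one fixed-size sentence vector
mask = inputs["attention_mask"].unsqueeze(-1).float()
sentence_vec = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(sentence_vec.shape)  # torch.Size([1, hidden_size])
```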