---
language:
  - en
tags:
  - text generation
  - pytorch
  - causal-lm
license: apache-2.0
---

# ReGPT-125M-200G

This model is based on GPT-Neo-125M and was trained with Mengzi Retrieval LM.

For more details, please refer to this document.

## How to use

You have to use a forked version of transformers: https://github.com/Langboat/transformers

```python
# Re_gptForCausalLM is only available in the Langboat fork of transformers.
from transformers import Re_gptForCausalLM

model = Re_gptForCausalLM.from_pretrained('Langboat/ReGPT-125M-200G')
```
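
The snippet below is a minimal, hedged usage sketch rather than an official example from this card. It assumes the forked library has been installed (for instance with `pip install git+https://github.com/Langboat/transformers`; the required branch is not specified here), that the model repo ships tokenizer files loadable via `AutoTokenizer`, and that generation works through the standard `generate()` API without extra retrieval-specific inputs. The prompt and decoding settings are illustrative.

```python
# Hedged usage sketch; see the assumptions stated above.
from transformers import AutoTokenizer, Re_gptForCausalLM

# Assumption: the repo provides tokenizer files compatible with AutoTokenizer.
tokenizer = AutoTokenizer.from_pretrained('Langboat/ReGPT-125M-200G')
model = Re_gptForCausalLM.from_pretrained('Langboat/ReGPT-125M-200G')

# Assumption: the forked model supports the standard generate() interface.
inputs = tokenizer("Retrieval-augmented language models", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```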