
ReGPT-125M-200G

This model was trained from GPT-Neo-125M using the Mengzi Retrieval LM approach.

For more details, please refer to this document.

How to use

This model requires a forked version of transformers: https://github.com/Langboat/transformers

```python
from transformers import Re_gptForCausalLM

model = Re_gptForCausalLM.from_pretrained('Langboat/ReGPT-125M-200G')
```