Salesforce CodeGen
Ported the Salesforce CodeGen models to work with Hugging Face Transformers without any extra code (the model-specific code is bundled with the checkpoint).
How to use:

`trust_remote_code=True` is needed because the PyTorch modules for the custom CodeGen model are bundled with the checkpoint rather than shipped inside the transformers library.
```python
from transformers import AutoModelForCausalLM, GPT2Tokenizer

# CodeGen uses a GPT-2 style BPE tokenizer
tokenizer = GPT2Tokenizer.from_pretrained(model_folder, local_files_only=True)

# trust_remote_code=True allows loading the bundled model code
model = AutoModelForCausalLM.from_pretrained(model_folder, local_files_only=True, trust_remote_code=True)
```
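Once loaded, the model can be used like any other causal LM in transformers. A minimal sketch of generating a completion (the helper name `complete` is ours, not part of the checkpoint; it assumes `model` and `tokenizer` were loaded as above):

```python
def complete(prompt, model, tokenizer, max_new_tokens=64):
    # Tokenize the prompt into a batch of input ids
    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy generation by default; sampling arguments (do_sample, temperature,
    # top_p, ...) can be passed through to generate() as well
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode the first (and only) sequence back into text
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For example, `complete("def fibonacci(n):", model, tokenizer)` returns the prompt plus the model's continuation as a single string.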