Loading model via local config file

#5
by dparmar16 - opened

Due to a firewall issue, I cannot load the model from the Hugging Face website. Currently it's loaded like this:

from transformers import pipeline
summarizer = pipeline("summarization", model="philschmid/bart-large-cnn-samsum")

Is it possible to load it like this?

summarizer = pipeline("summarization", model="path_to_local_config_file.json")

You have to save the model repository into a local directory, e.g. distilbart/, and then you can do

summarizer = pipeline("summarization", model="distilbart/")
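For anyone behind a firewall, a minimal sketch of this approach: download the model files by hand (or on a machine with access), put them in a local directory, and point `pipeline` at that directory. The filename list below is an assumption for a typical BART-style checkpoint; the exact files vary per model, so check the repo's file listing on the Hub.

```python
import os
from typing import List

def missing_model_files(model_dir: str, required: List[str]) -> List[str]:
    """Return which of the required files are absent from model_dir."""
    return [f for f in required if not os.path.isfile(os.path.join(model_dir, f))]

# Files a BART summarization checkpoint typically ships with
# (assumption: exact filenames differ per model; verify against the repo).
REQUIRED = [
    "config.json",
    "pytorch_model.bin",
    "tokenizer_config.json",
    "vocab.json",
    "merges.txt",
]

if __name__ == "__main__":
    gaps = missing_model_files("distilbart/", REQUIRED)
    if gaps:
        # Copy the missing files into the directory before loading.
        print("Download these into distilbart/ first:", gaps)
    else:
        from transformers import pipeline
        # pipeline accepts a local directory path in place of a Hub model id.
        summarizer = pipeline("summarization", model="distilbart/")
```

The key point is that `model=` accepts a local path containing the config, weights, and tokenizer files, not just a Hub model id; a single config JSON on its own is not enough.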

This is great, thank you! Can I just take the distilbart portion of the transformers repo? Or do I need to download the whole transformers repo?

Here's what I'm looking at:
https://github.com/huggingface/transformers/tree/main/src/transformers/models/distilbert
