Mixins & serialization methods
Mixins
The huggingface_hub library offers a range of mixins that can be used as parent classes for your own objects, providing simple uploading and downloading functions.
Generic
huggingface_hub.ModelHubMixin
A generic base Model Hub Mixin. Define your own mixin for anything by
inheriting from this class and overriding _from_pretrained and
_save_pretrained to implement custom logic for saving and loading your classes.
See huggingface_hub.PyTorchModelHubMixin for an example.
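As a sketch of the intended usage, a custom PyTorch module can inherit from both nn.Module and PyTorchModelHubMixin to gain from_pretrained, save_pretrained, and push_to_hub. The class name, hyperparameter, and layer below are illustrative, not part of the library:

```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin


class MyModel(nn.Module, PyTorchModelHubMixin):
    """Illustrative model: a single linear layer stands in for a real architecture."""

    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.layer = nn.Linear(hidden_size, hidden_size)

    def forward(self, x):
        return self.layer(x)
```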
from_pretrained
< source >( pretrained_model_name_or_path: str force_download: bool = False resume_download: bool = False proxies: typing.Dict = None use_auth_token: typing.Optional[str] = None cache_dir: typing.Optional[str] = None local_files_only: bool = False **model_kwargs )
Parameters

- pretrained_model_name_or_path (str or os.PathLike) — Can be either:
  - A string, the model id of a pretrained model hosted inside a model repo on huggingface.co. Valid model ids can be located at the root-level, like bert-base-uncased, or namespaced under a user or organization name, like dbmdz/bert-base-german-cased.
  - You can add a revision by appending @ at the end of the model id, like this: dbmdz/bert-base-german-cased@main. Revision is the specific model version to use. It can be a branch name, a tag name, or a commit id, since we use a git-based system for storing models and other artifacts on huggingface.co, so revision can be any identifier allowed by git.
  - A path to a directory containing model weights saved using save_pretrained, e.g., ./my_model_directory/.
  - None if you are providing both the configuration and state dictionary (with the keyword arguments config and state_dict, respectively).
- force_download (bool, optional, defaults to False) — Whether to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist.
- resume_download (bool, optional, defaults to False) — Whether to delete incompletely received files. Will attempt to resume the download if such a file exists.
- proxies (Dict[str, str], optional) — A dictionary of proxy servers to use by protocol or endpoint, e.g., {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.
- use_auth_token (str or bool, optional) — The token to use as HTTP bearer authorization for remote files. If True, will use the token generated when running huggingface-cli login (stored in ~/.huggingface).
- cache_dir (Union[str, os.PathLike], optional) — Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used.
- local_files_only (bool, optional, defaults to False) — Whether to only look at local files (i.e., do not try to download the model).
- model_kwargs (Dict, optional) — Additional keyword arguments passed to the model during initialization.
Instantiate a pretrained PyTorch model from a pretrained model
configuration hosted on the Hugging Face Hub. The model is set in
evaluation mode by default using model.eval() (Dropout modules
are deactivated). To train the model, you should first set it
back in training mode with model.train().
Passing use_auth_token=True is required when you want to use a
private model.
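For instance, a model defined with the hypothetical MyModel sketch above could be loaded like this (the repository id is made up):

```python
# Hypothetical repo id; replace it with a repository that actually contains
# weights saved with this mixin. For a private repo, also pass use_auth_token=True.
model = MyModel.from_pretrained("your-username/my-awesome-model")

model.eval()   # evaluation mode is already the default after loading
model.train()  # switch back to training mode before fine-tuning
```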
push_to_hub
< source >( repo_path_or_name: typing.Optional[str] = None repo_url: typing.Optional[str] = None commit_message: typing.Optional[str] = 'Add model' organization: typing.Optional[str] = None private: bool = False api_endpoint: typing.Optional[str] = None use_auth_token: typing.Union[bool, str, NoneType] = None git_user: typing.Optional[str] = None git_email: typing.Optional[str] = None config: typing.Optional[dict] = None skip_lfs_files: bool = False repo_id: typing.Optional[str] = None token: typing.Optional[str] = None branch: typing.Optional[str] = None create_pr: typing.Optional[bool] = None allow_patterns: typing.Union[typing.List[str], str, NoneType] = None ignore_patterns: typing.Union[typing.List[str], str, NoneType] = None )
Parameters

- repo_id (str, optional) — Repository name to push to.
- commit_message (str, optional) — Message to commit while pushing.
- private (bool, optional, defaults to False) — Whether the repository created should be private.
- api_endpoint (str, optional) — The API endpoint to use when pushing the model to the hub.
- token (str, optional) — The token to use as HTTP bearer authorization for remote files. If not set, will use the token set when logging in with huggingface-cli login (stored in ~/.huggingface).
- branch (str, optional) — The git branch on which to push the model. This defaults to the default branch as specified in your repository, which defaults to "main".
- create_pr (boolean, optional) — Whether or not to create a Pull Request from branch with that commit. Defaults to False.
- config (dict, optional) — Configuration object to be saved alongside the model weights.
- allow_patterns (List[str] or str, optional) — If provided, only files matching at least one pattern are pushed.
- ignore_patterns (List[str] or str, optional) — If provided, files matching any of the patterns are not pushed.
Upload model checkpoint to the Hub.
Use allow_patterns and ignore_patterns to precisely filter which files
should be pushed to the hub. See upload_folder() reference for more details.
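Continuing the hypothetical MyModel example from above, a push might look like this (the repository id and commit message are illustrative):

```python
# Pushes the current weights to a (hypothetical) repo under your namespace.
# Requires being logged in (huggingface-cli login) or passing token=...
model.push_to_hub(
    repo_id="your-username/my-awesome-model",
    commit_message="Add model",
    private=False,
)
```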
save_pretrained
< source >( save_directory: str config: typing.Optional[dict] = None push_to_hub: bool = False **kwargs )
Parameters

- save_directory (str) — Directory in which to save the weights.
- config (dict, optional) — Configuration to save alongside the weights (must be a dict).
- push_to_hub (bool, optional, defaults to False) — Whether or not to push your model to the Hugging Face Hub after saving it. You can specify the repository you want to push to with repo_id (will default to the name of save_directory in your namespace).
- kwargs — Additional keyword arguments passed along to the push_to_hub() method.
Save weights to a local directory.
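As a small illustration with the hypothetical MyModel from above (the directory name and config values are made up):

```python
# Save weights (and an optional config dict) locally.
model.save_pretrained("./my_model_directory", config={"hidden_size": 64})

# Or save and push in one step; repo_id is forwarded to push_to_hub().
model.save_pretrained(
    "./my_model_directory",
    config={"hidden_size": 64},
    push_to_hub=True,
    repo_id="your-username/my-awesome-model",
)
```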
Keras
huggingface_hub.from_pretrained_keras
< source >( *args **kwargs )
Parameters

- pretrained_model_name_or_path (str or os.PathLike) — Can be either:
  - A string, the model id of a pretrained model hosted inside a model repo on huggingface.co. Valid model ids can be located at the root-level, like bert-base-uncased, or namespaced under a user or organization name, like dbmdz/bert-base-german-cased.
  - You can add a revision by appending @ at the end of the model id, like this: dbmdz/bert-base-german-cased@main. Revision is the specific model version to use. It can be a branch name, a tag name, or a commit id, since we use a git-based system for storing models and other artifacts on huggingface.co, so revision can be any identifier allowed by git.
  - A path to a directory containing model weights saved using save_pretrained, e.g., ./my_model_directory/.
  - None if you are providing both the configuration and state dictionary (with the keyword arguments config and state_dict, respectively).
- force_download (bool, optional, defaults to False) — Whether to force the (re-)download of the model weights and configuration files, overriding the cached versions if they exist.
- resume_download (bool, optional, defaults to False) — Whether to delete incompletely received files. Will attempt to resume the download if such a file exists.
- proxies (Dict[str, str], optional) — A dictionary of proxy servers to use by protocol or endpoint, e.g., {'http': 'foo.bar:3128', 'http://hostname': 'foo.bar:4012'}. The proxies are used on each request.
- use_auth_token (str or bool, optional) — The token to use as HTTP bearer authorization for remote files. If True, will use the token generated when running huggingface-cli login (stored in ~/.huggingface).
- cache_dir (Union[str, os.PathLike], optional) — Path to a directory in which a downloaded pretrained model configuration should be cached if the standard cache should not be used.
- local_files_only (bool, optional, defaults to False) — Whether to only look at local files (i.e., do not try to download the model).
- model_kwargs (Dict, optional) — Additional keyword arguments passed to the model during initialization.
Instantiate a pretrained Keras model from a model repository on the Hub. The model is expected to be in SavedModel format.
Passing use_auth_token=True is required when you want to use a private
model.
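A minimal usage sketch; the repository id is hypothetical and must point to a repo that stores a model in SavedModel format:

```python
from huggingface_hub import from_pretrained_keras

# Hypothetical repo id; pass use_auth_token=True for a private repository.
model = from_pretrained_keras("your-username/my-keras-model")
```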
huggingface_hub.push_to_hub_keras
< source >( model repo_path_or_name: typing.Optional[str] = None repo_url: typing.Optional[str] = None log_dir: typing.Optional[str] = None commit_message: typing.Optional[str] = 'Add model' organization: typing.Optional[str] = None private: bool = False api_endpoint: typing.Optional[str] = None use_auth_token: typing.Union[bool, str, NoneType] = True git_user: typing.Optional[str] = None git_email: typing.Optional[str] = None config: typing.Optional[dict] = None include_optimizer: typing.Optional[bool] = False tags: typing.Union[list, str, NoneType] = None plot_model: typing.Optional[bool] = True token: typing.Optional[str] = True repo_id: typing.Optional[str] = None branch: typing.Optional[str] = None create_pr: typing.Optional[bool] = None allow_patterns: typing.Union[typing.List[str], str, NoneType] = None ignore_patterns: typing.Union[typing.List[str], str, NoneType] = None **model_save_kwargs )
Parameters

- model (Keras.Model) — The Keras model you’d like to push to the Hub. The model must be compiled and built.
- repo_id (str) — Repository name to push to.
- commit_message (str, optional, defaults to "Add model") — Message to commit while pushing.
- private (bool, optional, defaults to False) — Whether the repository created should be private.
- api_endpoint (str, optional) — The API endpoint to use when pushing the model to the hub.
- token (str, optional) — The token to use as HTTP bearer authorization for remote files. If not set, will use the token set when logging in with huggingface-cli login (stored in ~/.huggingface).
- branch (str, optional) — The git branch on which to push the model. This defaults to the default branch as specified in your repository, which defaults to "main".
- create_pr (boolean, optional) — Whether or not to create a Pull Request from branch with that commit. Defaults to False.
- config (dict, optional) — Configuration object to be saved alongside the model weights.
- allow_patterns (List[str] or str, optional) — If provided, only files matching at least one pattern are pushed.
- ignore_patterns (List[str] or str, optional) — If provided, files matching any of the patterns are not pushed.
- log_dir (str, optional) — TensorBoard logging directory to be pushed. The Hub automatically hosts and displays a TensorBoard instance if log files are included in the repository.
- include_optimizer (bool, optional, defaults to False) — Whether or not to include the optimizer during serialization.
- tags (Union[list, str], optional) — List of tags that are related to the model, or a string of a single tag. See example tags here.
- plot_model (bool, optional, defaults to True) — Setting this to True will plot the model and put it in the model card. Requires graphviz and pydot to be installed.
- model_save_kwargs (dict, optional) — Additional keyword arguments passed to tf.keras.models.save_model().
Upload model checkpoint or tokenizer files to the Hub while synchronizing a
local clone of the repo in repo_path_or_name.
Use allow_patterns and ignore_patterns to precisely filter which files should be
pushed to the hub. See upload_folder() reference for more details.
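A brief sketch of a push; the tiny model and the repository id are illustrative, and any compiled, built tf.keras model would work the same way:

```python
import tensorflow as tf
from huggingface_hub import push_to_hub_keras

# A tiny stand-in model; it must be built (and compiled if you want to
# include the optimizer in the serialized files).
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(3,))])
model.compile(optimizer="adam", loss="mse")

# Hypothetical repo id; requires huggingface-cli login or an explicit token.
push_to_hub_keras(
    model,
    repo_id="your-username/my-keras-model",
    commit_message="Add model",
    include_optimizer=False,
)
```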
huggingface_hub.save_pretrained_keras
< source >( model save_directory: str config: typing.Union[typing.Dict[str, typing.Any], NoneType] = None include_optimizer: typing.Optional[bool] = False plot_model: typing.Optional[bool] = True tags: typing.Union[list, str, NoneType] = None **model_save_kwargs )
Parameters

- model (Keras.Model) — The Keras model you’d like to save. The model must be compiled and built.
- save_directory (str) — Directory in which to save the Keras model.
- config (dict, optional) — Configuration object to be saved alongside the model weights.
- include_optimizer (bool, optional, defaults to False) — Whether or not to include the optimizer in serialization.
- plot_model (bool, optional, defaults to True) — Setting this to True will plot the model and put it in the model card. Requires graphviz and pydot to be installed.
- tags (Union[str, list], optional) — List of tags that are related to the model, or a string of a single tag. See example tags here.
- model_save_kwargs (dict, optional) — Additional keyword arguments passed to tf.keras.models.save_model().
Saves a Keras model to save_directory in SavedModel format. Use this if you’re using the Functional or Sequential APIs.
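A short, self-contained sketch; the Functional-API model, the directory name, and the config dict are all illustrative:

```python
import tensorflow as tf
from huggingface_hub import save_pretrained_keras

# A tiny Functional-API model standing in for a real one.
inputs = tf.keras.Input(shape=(3,))
outputs = tf.keras.layers.Dense(2)(inputs)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")

# Serialize to a local directory in SavedModel format.
save_pretrained_keras(model, "./my_keras_model", config={"num_classes": 2})
```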
huggingface_hub.KerasModelHubMixin
Mixin to provide model Hub upload/download capabilities to Keras models. Override this class to obtain the following internal methods:
- _from_pretrained, to load a model from the Hub or from local files.
- _save_pretrained, to save a model in the SavedModel format.
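As a sketch of how the mixin is typically combined with a subclassed Keras model (the class name and architecture below are illustrative):

```python
import tensorflow as tf
from huggingface_hub import KerasModelHubMixin


class MyKerasModel(tf.keras.Model, KerasModelHubMixin):
    """Illustrative subclassed model that inherits save/load/push helpers."""

    def __init__(self, hidden_units: int = 8, **kwargs):
        super().__init__(**kwargs)
        self.dense = tf.keras.layers.Dense(hidden_units)

    def call(self, inputs):
        return self.dense(inputs)
```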
Fastai
huggingface_hub.from_pretrained_fastai
< source >( repo_id: str revision: typing.Optional[str] = None )
Parameters

- repo_id (str) — The location where the pickled fastai.Learner is. It can be either of the two:
  - Hosted on the Hugging Face Hub. E.g.: ‘espejelomar/fatai-pet-breeds-classification’ or ‘distilgpt2’. You can add a revision by appending @ at the end of repo_id. E.g.: dbmdz/bert-base-german-cased@main. Revision is the specific model version to use. Since we use a git-based system for storing models and other artifacts on the Hugging Face Hub, it can be a branch name, a tag name, or a commit id.
  - Hosted locally. repo_id would be a directory containing the pickle and a pyproject.toml indicating the fastai and fastcore versions used to build the fastai.Learner. E.g.: ./my_model_directory/.
- revision (str, optional) — Revision at which the repo’s files are downloaded. See documentation of snapshot_download.
Load a pretrained fastai model from the Hub or from a local directory.
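A usage sketch, reusing the repository id from the parameter description above; a local directory path works the same way:

```python
from huggingface_hub import from_pretrained_fastai

# From the Hub (repo id taken from the example above)...
learner = from_pretrained_fastai("espejelomar/fatai-pet-breeds-classification")

# ...or from a local directory containing the pickle and its pyproject.toml.
# learner = from_pretrained_fastai("./my_model_directory/")
```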
huggingface_hub.push_to_hub_fastai
< source >( learner repo_id: str commit_message: typing.Optional[str] = 'Add model' private: bool = False token: typing.Optional[str] = None config: typing.Optional[dict] = None branch: typing.Optional[str] = None create_pr: typing.Optional[bool] = None allow_patterns: typing.Union[typing.List[str], str, NoneType] = None ignore_patterns: typing.Union[typing.List[str], str, NoneType] = None api_endpoint: typing.Optional[str] = None git_user: typing.Optional[str] = None git_email: typing.Optional[str] = None )
Parameters

- learner (Learner) — The fastai.Learner you’d like to push to the Hub.
- repo_id (str) — The repository id for your model in the Hub in the format "namespace/repo_name". The namespace can be your individual account or an organization to which you have write access (for example, ‘stanfordnlp/stanza-de’).
- commit_message (str, optional) — Message to commit while pushing. Will default to "add model".
- private (bool, optional, defaults to False) — Whether or not the repository created should be private.
- token (str, optional) — The Hugging Face account token to use as HTTP bearer authorization for remote files. If None, the token will be asked for via a prompt.
- config (dict, optional) — Configuration object to be saved alongside the model weights.
- branch (str, optional) — The git branch on which to push the model. This defaults to the default branch as specified in your repository, which defaults to "main".
- create_pr (boolean, optional) — Whether or not to create a Pull Request from branch with that commit. Defaults to False.
- api_endpoint (str, optional) — The API endpoint to use when pushing the model to the hub.
- allow_patterns (List[str] or str, optional) — If provided, only files matching at least one pattern are pushed.
- ignore_patterns (List[str] or str, optional) — If provided, files matching any of the patterns are not pushed.
Upload learner checkpoint files to the Hub.
Use allow_patterns and ignore_patterns to precisely filter which files should be pushed to the hub. See upload_folder() reference for more details.
Raises the following error:
- ValueError if the user is not logged in to the Hugging Face Hub.
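A final sketch combining the two fastai helpers; the target repository id is hypothetical, and you must be logged in first:

```python
from huggingface_hub import from_pretrained_fastai, push_to_hub_fastai

# Load an existing Learner (repo id from the example above)...
learner = from_pretrained_fastai("espejelomar/fatai-pet-breeds-classification")

# ...and push it to a hypothetical repository under your namespace.
# Run huggingface-cli login first, or pass token=... explicitly.
push_to_hub_fastai(learner=learner, repo_id="your-username/my-fastai-model")
```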