Enterprise Hub Cookbook

The Enterprise Hub Cookbook is designed for power users and enterprises who want to go beyond the standard free features of the Hugging Face Hub and integrate machine learning more deeply into their production workflows. The cookbook guides you through a selection of recipes (Jupyter Notebooks) with copy-pastable code to help you get started with the Hub’s advanced features.

Interactive Development in HF Spaces

With JupyterLab Spaces you can spin up a personal Jupyter Notebook, much like in Google Colab, but with a wider selection of more reliable CPUs and GPUs (e.g. H100 or 4xA10G) that you can select and switch on the fly. Moreover, by activating Spaces Dev Mode you can also use this cloud hardware from your local IDE (e.g. VS Code). Read this recipe to learn how to spin up a GPU and connect to it via your local IDE.

For more details, read also the JupyterLab Spaces and the Dev Mode documentation.
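
If you just want to sanity-check which hardware your Space is running on, a snippet like the following does the job from inside the notebook or a Dev Mode session. This is a minimal sketch and assumes PyTorch is installed in the Space environment.

```python
# Minimal sketch: check which accelerator a JupyterLab Space (or Dev Mode
# session) is running on. Assumes PyTorch is installed in the environment.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"GPU {i}: {torch.cuda.get_device_name(i)}")
else:
    print("No GPU detected, running on CPU.")
```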

Inference API (Serverless)

With our serverless Inference API, you can test a range of open-source models with simple API calls (e.g. generative LLMs, efficient embedding models, or image generators). The serverless Inference API is rate-limited and mostly intended for initial testing or low-volume use. Read this recipe to learn how to query the serverless Inference API.

For more details, read also the serverless API documentation.
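
As a quick preview, a serverless call with the huggingface_hub Python client can look like the minimal sketch below; the model IDs, prompt, and token are placeholders, and you can swap in any model you have access to.

```python
# Minimal sketch: query the serverless Inference API via huggingface_hub.
# Model IDs and prompt are placeholders; use any models you have access to.
from huggingface_hub import InferenceClient

client = InferenceClient(token="hf_...")  # your Hugging Face access token

# Chat with a generative LLM
response = client.chat_completion(
    model="HuggingFaceH4/zephyr-7b-beta",
    messages=[{"role": "user", "content": "What is the Hugging Face Hub?"}],
    max_tokens=128,
)
print(response.choices[0].message.content)

# Embed a sentence with an efficient embedding model
embedding = client.feature_extraction(
    "The Enterprise Hub Cookbook is a useful resource.",
    model="sentence-transformers/all-MiniLM-L6-v2",
)
print(embedding.shape)  # a single embedding vector
```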

Inference Endpoints (dedicated) (coming soon)

With our dedicated Inference Endpoints, you can easily deploy any model on a wide range of hardware, essentially creating your personal production-ready API in a few clicks. Read this recipe to learn how to create and configure your own dedicated Endpoint.

For more details, read also the dedicated Endpoint documentation.
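
To give a rough idea of the programmatic route, the sketch below uses huggingface_hub to create an Endpoint; every value (model, cloud vendor, region, instance type and size) is a placeholder you would adapt to the hardware you actually need.

```python
# Minimal sketch: create a dedicated Inference Endpoint with huggingface_hub.
# All values below are placeholder examples, not recommendations.
from huggingface_hub import create_inference_endpoint

endpoint = create_inference_endpoint(
    name="my-llm-endpoint",
    repository="HuggingFaceH4/zephyr-7b-beta",
    framework="pytorch",
    task="text-generation",
    accelerator="gpu",
    vendor="aws",
    region="us-east-1",
    instance_size="x1",
    instance_type="nvidia-a10g",
    type="protected",
)

endpoint.wait()  # block until the endpoint is running
print(endpoint.url)

# The deployed endpoint exposes an InferenceClient-compatible interface
print(endpoint.client.text_generation("Hello, my name is", max_new_tokens=20))

endpoint.pause()  # pause when you are done to stop billing
```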

Data Annotation with Argilla Spaces

Whether you’re zero-shot testing an LLM or training your own model, creating good test or training data is perhaps the highest-value investment you can make at the beginning of your machine learning journey. Argilla is a free, open-source data annotation tool that enables you to create high-quality data for text, image, or audio tasks. Read this recipe to learn how to create a data annotation workflow (alone or in a larger team) in your browser.

See also the Argilla documentation and the HF Argilla Spaces integration for more details.
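
To give a flavour of the workflow, here is a minimal sketch using the Argilla Python SDK (2.x API); the Space URL, API key, dataset name, fields, and labels are all placeholders, and the recipe walks through the real setup.

```python
# Minimal sketch with the Argilla 2.x Python SDK. URL, API key, dataset name,
# fields, and labels are placeholders for your own annotation task.
import argilla as rg

client = rg.Argilla(
    api_url="https://<your-argilla-space>.hf.space",  # your Argilla Space URL
    api_key="<your-api-key>",
)

settings = rg.Settings(
    fields=[rg.TextField(name="text")],
    questions=[rg.LabelQuestion(name="sentiment", labels=["positive", "negative"])],
)

dataset = rg.Dataset(name="demo-annotation-task", settings=settings, client=client)
dataset.create()

# Push a few records for your team to annotate in the browser
dataset.records.log([{"text": "The onboarding was smooth and fast."}])
```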

AutoTrain Spaces (coming soon)

With AutoTrain Spaces, you can train your own machine learning models in a simple interface without writing any code. Read this recipe to learn how to fine-tune your own LLM on a wide range of GPUs in an AutoTrain Space on the Hub.

See also the AutoTrain documentation to learn more.

Creating private demos with Spaces and Gradio (coming soon)

Visual demos speak louder than words. Demos are particularly important if you want to convince stakeholders of a machine learning minimum viable product (MVP). Read this recipe to learn how to create a private machine learning demo on Spaces with Gradio.

See also the Spaces and Gradio Spaces documentation to learn more.
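
To illustrate, a Gradio app for a Space can be as small as the following sketch (an app.py); the sentiment-analysis pipeline simply stands in for whatever your MVP actually does.

```python
# Minimal sketch of a Gradio app (app.py) you could push to a private Space.
# The model and prediction function are placeholders for your own MVP.
import gradio as gr
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # any model your demo needs

def predict(text: str) -> dict:
    result = classifier(text)[0]
    return {result["label"]: result["score"]}

demo = gr.Interface(
    fn=predict,
    inputs=gr.Textbox(label="Input text"),
    outputs=gr.Label(label="Prediction"),
    title="Private MVP demo",
)

if __name__ == "__main__":
    demo.launch()
```

Pushing this file to a Space created as private (for example via create_repo(..., repo_type="space", space_sdk="gradio", private=True), or by switching the visibility in the Space settings) keeps the demo visible only to you or your organization’s members.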

Advanced collaboration on the Hub (coming soon)

As your team and use cases grow, managing datasets, models, and team members becomes more complex. Read this recipe to learn about advanced collaboration features such as private datasets for specific resource groups, git-based versioning, and YAML tags in model cards.

Take a look at the Hub and Hub Python Library documentation for more information.
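
As a small taste of the programmatic side, the sketch below uses huggingface_hub to create a private model repo and push a model card with YAML metadata tags; the org and repo names, license, and tags are placeholders.

```python
# Minimal sketch: create a private model repo and push a model card with YAML
# metadata tags via huggingface_hub. Org/repo names and metadata are placeholders.
from huggingface_hub import HfApi, ModelCard, ModelCardData

api = HfApi()  # assumes you are logged in, e.g. via `huggingface-cli login`

repo_id = "my-org/my-private-model"
api.create_repo(repo_id=repo_id, repo_type="model", private=True, exist_ok=True)

# These fields become the YAML block at the top of the model card (README.md)
card_data = ModelCardData(
    license="apache-2.0",
    language="en",
    tags=["text-classification", "internal"],
)
card = ModelCard.from_template(card_data, model_id=repo_id)
card.push_to_hub(repo_id)
```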
