The AI community building the future.

Build, train, and deploy state-of-the-art models powered by the reference open source libraries in machine learning.


Home of Machine Learning

Create, discover and collaborate on ML better.
Join the community to start your ML journey.

Sign Up

Open Source


Transformers is our natural language processing library and our hub is now open to all ML models, with support from libraries like Flair, Asteroid, ESPnet, Pyannote, and more to come.

Read documentation
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Download the pretrained BERT tokenizer and masked-language-model weights
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
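As a sketch of what the loaded checkpoint can do, the high-level `pipeline` API fills in masked tokens with the same model name (the example sentence and the exact predicted tokens are illustrative, not guaranteed):

```python
from transformers import pipeline

# Build a fill-mask pipeline around the same pretrained BERT checkpoint
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# Predict the most likely tokens for the [MASK] position
predictions = unmasker("Paris is the [MASK] of France.")
for p in predictions[:3]:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict carrying the filled-in token, its score, and the completed sequence.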

On demand

Inference API

Serve your models directly from Hugging Face infrastructure and run large-scale NLP models in milliseconds with just a few lines of code.
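A minimal sketch of calling the hosted Inference API over HTTP: the model endpoint follows the `api-inference.huggingface.co/models/<model_id>` pattern, and the `API_TOKEN` placeholder below is hypothetical and must be replaced with your own token:

```python
import requests

API_TOKEN = "hf_xxx"  # placeholder, not a real token
model_id = "bert-base-uncased"
url = f"https://api-inference.huggingface.co/models/{model_id}"
payload = {"inputs": "Paris is the [MASK] of France."}


def query(url, payload, token):
    """POST the payload to the hosted model and return the JSON response."""
    headers = {"Authorization": f"Bearer {token}"}
    response = requests.post(url, headers=headers, json=payload)
    return response.json()


# query(url, payload, API_TOKEN)  # returns the model's predictions as JSON
```

The same request shape works across tasks; only the model id and the `inputs` payload change.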

Learn more

Our Research contributions

We’re on a journey to advance and democratize NLP for everyone. Along the way, we contribute to the development of technology for the better.



Hierarchical Multi-Task Learning

Our paper has been accepted to AAAI 2019. We have open-sourced the code and a demo.

Read more


Thomas Wolf et al.

Meta-learning for language modeling

Our workshop paper on Meta-Learning a Dynamical Language Model was accepted to ICLR 2018. We use our implementation to power 🤗.

Read more


Auto-complete your thoughts

Write with Transformers

This web app, built by the Hugging Face team, is the official demo of the Transformers repository's text generation capabilities.

Start writing


State of the art


Our coreference resolution module is now the top open source library for coreference. You can train it on your own dataset and language.

Read more


Victor Sanh et al., 2019


Distillation. A smaller, faster, lighter, cheaper version of BERT. Code and weights are available through Transformers.
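As a sketch, the distilled checkpoint loads through the same Auto classes as BERT, using the `distilbert-base-uncased` checkpoint name; the size comparison in the comment is approximate:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# The distilled checkpoint loads exactly like any other model on the hub
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("distilbert-base-uncased")

# DistilBERT keeps most of BERT's accuracy with roughly 40% fewer parameters
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")
```

Because it shares BERT's interface, it is a drop-in replacement wherever a `bert-base-uncased` checkpoint name appears above.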

Read more