
Allen Institute for AI
non-profit • 47 models
Build, train and deploy state-of-the-art models powered by the reference open-source library for natural language processing.
Model Hub
Browse the model hub to discover, experiment with, and contribute to new state-of-the-art models.
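As an illustration, here is a minimal sketch of browsing the hub programmatically rather than through the web UI. It assumes the huggingface_hub client library is installed; the "allenai" author filter and the limit of five results are illustrative choices, and attribute names can vary between library versions.

```python
# Sketch: list models from the hub with the huggingface_hub client library.
# The "allenai" author filter and the limit of 5 are illustrative, not prescribed.
from huggingface_hub import list_models

for model in list_models(author="allenai", limit=5):
    print(model.id)
```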
Explore models
On demand
Serve your models directly from Hugging Face infrastructure and run large-scale NLP models in milliseconds with just a few lines of code.
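To make the "few lines of code" claim concrete, here is a minimal sketch of calling the hosted Inference API over plain HTTP. It assumes you have an API token; the model name and the fill-mask payload are arbitrary examples, not an endpoint prescribed by this page.

```python
# Sketch: query a model on the hosted Inference API with a simple HTTP request.
import requests

API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased"
headers = {"Authorization": "Bearer YOUR_API_TOKEN"}  # replace with your own token

# Ask the fill-mask model to complete the sentence and print its predictions.
response = requests.post(API_URL, headers=headers,
                         json={"inputs": "The goal of life is [MASK]."})
print(response.json())
```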
See pricing
[Inference widget output — top predictions: happiness 0.036 · survival 0.031 · salvation 0.017 · freedom 0.017 · unity 0.015]
Open Source
Transformers is our natural language processing library and our hub is now open to all ML models, with support from libraries like Flair, Asteroid, ESPnet, Pyannote, and more to come.
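To show the hub workflow in practice, here is a minimal sketch of loading a hosted checkpoint with the Transformers Auto classes. The checkpoint name allenai/scibert_scivocab_uncased is just an illustrative example of a model on the hub.

```python
# Sketch: download a checkpoint from the hub and run a forward pass.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
model = AutoModel.from_pretrained("allenai/scibert_scivocab_uncased")

inputs = tokenizer("Hugging Face hosts models from many libraries.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence length, hidden size)
```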
Check documentation
We’re on a journey to advance and democratize NLP for everyone. Along the way, we contribute to the development of technology for the better.
📚
HMTL
Our paper has been accepted to AAAI 2019. We have open-sourced the code and a demo.
Read more
🐸
Thomas Wolf et al.
Our workshop paper on Meta-Learning a Dynamical Language Model was accepted to ICLR 2018. We use our implementation to power 🤗.
Read more
🦄
Auto-complete your thoughts
This web app, built by the Hugging Face team, is the official demo of the Transformers repository's text generation capabilities.
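For a rough idea of what the demo does, here is a minimal sketch using the Transformers text-generation pipeline; "gpt2" is an assumed checkpoint, and the web app may be configured differently.

```python
# Sketch: auto-complete a prompt with the text-generation pipeline.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

# Generate a continuation of the prompt, bounded by max_length tokens in total.
completion = generator("I enjoy walking with my cute dog", max_length=40)
print(completion[0]["generated_text"])
```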
Start writing
🤖
State of the art
Our coreference resolution module is now the top open-source library for coreference. You can train it on your own dataset and language.
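A minimal sketch of running the coreference module (the neuralcoref package) as a spaCy pipeline extension; it assumes neuralcoref and an English spaCy model are installed, and version compatibility between the two is not guaranteed.

```python
# Sketch: add the neuralcoref component to a spaCy pipeline and resolve clusters.
import spacy
import neuralcoref

nlp = spacy.load("en_core_web_sm")
neuralcoref.add_to_pipe(nlp)  # register the coreference component in the pipeline

doc = nlp("My sister has a dog. She loves him.")
print(doc._.has_coref)       # True when at least one coreference cluster is found
print(doc._.coref_clusters)  # e.g. [My sister: [My sister, She], a dog: [a dog, him]]
```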
Read more
🐎
Victor Sanh et al. 2019
Distillation. A smaller, faster, lighter, cheaper version of BERT. Code and weights are available through Transformers.
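A minimal sketch of loading the distilled model through Transformers; "distilbert-base-uncased" is the standard released checkpoint.

```python
# Sketch: load the DistilBERT weights and tokenizer released with the paper.
from transformers import DistilBertModel, DistilBertTokenizer

tokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-uncased")
model = DistilBertModel.from_pretrained("distilbert-base-uncased")

# Encode a sentence and inspect the shape of the contextual embeddings.
inputs = tokenizer("DistilBERT is a smaller, faster BERT.", return_tensors="pt")
print(model(**inputs).last_hidden_state.shape)
```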
Read more