[ANNOUNCEMENT] πŸ“£ HuggingChat is closing for now

#747
by julien-c - opened
Hugging Chat org
β€’
edited 11 days ago

I have bittersweet news to share. 😒

We are closing down HuggingChat for now.


HuggingChat launched in April 2023, when ChatGPT was just 5 months old and it was still quite hard to deploy a similar service. The goal of the app was to showcase that it was now possible to build an open-source alternative to ChatGPT, built on top of the community's models and building blocks. πŸ’ͺ

We also built HuggingChat as a testbed for optimizing inference, which led to the creation of text-generation-inference, which in turn led to HF contributions to SGLang, vLLM, llama.cpp, and other LLM runtime engines.

In the past 27 months we've launched:

  • πŸ§™ Assistants, an easy way to bundle a prompt, a model, and some tools into an easily-shareable URL you can send your friends
  • πŸ› οΈ HuggingChat Tools, exposing the full ecosystem of Gradio Spaces straight from within the chat interface
  • πŸ’¬ Many model launches supported on HuggingChat: from OpenAssistant and all the Llamas to Phi-4, Qwen, DeepSeek, Gemma… πŸ”₯

The app has always been free and experimental. Today we are closing it to make room for something new, more integrated with the HF ecosystem. You can export your past conversations as a Zip file to load them into another interface.

If you are looking for a cool alternative interface you should take a look at:

  • LibreChat – a clean and simple yet feature-packed chat interface
  • openwebui – does it even need an intro anymore?
  • Scira MCP – an up-and-coming MCP-centric chat interface

Finally, chat-ui (the codebase powering HuggingChat) is still maintained (GitHub repo).
You can 1-click deploy your own instance using the Chat UI Spaces Docker template.
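If you'd rather run chat-ui on your own machine than use the Spaces template, a minimal local-setup sketch follows. The `.env.local` variable names (`MONGODB_URL`, `HF_TOKEN`) reflect chat-ui's documented configuration; the MongoDB URL and port are assumptions for a default local install.

```shell
# Clone chat-ui and run it locally (requires Node.js and a running MongoDB)
git clone https://github.com/huggingface/chat-ui
cd chat-ui

# chat-ui reads its configuration from .env.local;
# MONGODB_URL and HF_TOKEN are the two commonly required settings
cat > .env.local <<'EOF'
MONGODB_URL=mongodb://localhost:27017
HF_TOKEN=hf_xxx
EOF

npm install
npm run dev   # serves the dev UI (Vite defaults to http://localhost:5173)
```

The Spaces Docker template wraps essentially these steps in a container, which is why the 1-click deploy needs no local Node.js or MongoDB setup.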

What's happening to HuggingChat 😒?


I hope it's not too long.

no 😭

Also, how do I export my past conversations?

Hi everyone! πŸ‘‹

This is also my last week at Hugging Face and I'm currently working on a simple page to let you export your personal data & conversation history from HuggingChat so you can reuse it on other platforms in the future. Let me know if you have any questions, I'm always happy to answer them.

It was a pleasure building this app for y'all 🫑


That's sad.

I wish this never happened.


Hi all,

I previously set up an instance of chat-ui with several open-source LLMs. I recently switched to a self-hosted instance of Mistral Small 3.2 (FP8) on vLLM with no message logging – would love any feedback or suggestions (Discord: @realmrfakename)!

https://chat.mrfake.name/

What API are you using, or are you using HuggingChat's API?

For the default model I am self-hosting the Mistral model (so there is no logging); for the other models I use either the Mistral API or the Groq API on their free tiers, so those may log. For no logging, use the default model, which is Mistral Small 3.2. Happy to consider hosting another model if people would find that more useful (though I can realistically only host one model at a time).
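For reference, a setup like the one described above (Mistral Small 3.2 in FP8 on vLLM, without request logging) can be sketched with vLLM's OpenAI-compatible server. The exact model ID, quantization flag, and port here are assumptions, not the poster's actual configuration:

```shell
# Launch an OpenAI-compatible endpoint with vLLM
# (FP8 quantized weights, request logging disabled)
vllm serve mistralai/Mistral-Small-3.2-24B-Instruct-2506 \
  --quantization fp8 \
  --disable-log-requests \
  --port 8000

# A chat front-end such as chat-ui (or any OpenAI-compatible client)
# can then point at http://localhost:8000/v1 as its inference endpoint
```

Disabling request logging on the server is what makes the "no message logging" claim possible for the self-hosted model, in contrast to third-party free-tier APIs where logging is outside your control.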

someone gave me a very valuable service for free and now wants to get some money for it having proven it's actually valuable.
i'm shocked. how dare they treat me like an adult!


I'm more than happy to pay for a service like HC. Mistral Pro is $15/mo. and I would pay that much for HC, it was that good. What made HC unlike anything out there is not that they weren't charging, it's that you could select any open-source model you want, along with really good UI, chat history, reasoning, Web-search, etc. I'm happy even to pay for a purely private instance of HC with pay-per-compute, as long as the setup is "one-click". I don't want to be configuring my own VPS, etc. too much headache. I just want to click-and-run on selectable, frontier, open source models. That was the unique selling-point of HC, and this is absolutely a business-model that HF could implement to generate some revenue on the side. Also, HF has brand-value that I trust and that made HC all the more valuable to me. I don't trust the proprietary players, they keep getting caught with their hand in the cookie-jar. I'm happy to pay for a service, like a grown-up, but the service has to exist in order for it to be paid for!


Ohh okay thanks pretty cool ngl

I feel like I'm going mad. Where's the export data button everyone mentioned? It was there yesterday but now it's gone :/ I thought I had two weeks to download.



Self-hosting a model for public use must cost a lot of resources. I really appreciate your generosity and kindness, though I'm able to host mine for personal usage.
