Is this model down?
Can't get any response from this model. All others seem to be working. No "model is overloaded" message. What's up?
I've also been getting no response when I try to use the command-r+ model on HuggingChat for the past two and a half hours or so. It may be for the same reason.
I've since noticed these things:
Text Generation
Inference API (serverless) has been turned off for this model.
And then in history, I see that user saurabhdash changed this single line of code:
inference: false
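For reference, that flag lives in the YAML front matter at the top of the model repo's README.md. A minimal sketch of what that header looks like (the `inference` field is the one from the edit; the other fields are illustrative placeholders, not taken from this repo):

```yaml
---
# Model card front matter on the Hugging Face Hub (illustrative sketch)
license: cc-by-nc-4.0   # placeholder value, not confirmed from this thread
pipeline_tag: text-generation
inference: false        # the edited line: disables the serverless Inference API widget
---
```

Flipping that single value back to `true` (or removing the line) is what would re-enable the hosted widget.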
Hey, saurabhdash, can you turn it back on?
I have also not been able to use it; no matter what I test or try with the model, it always gets stuck on the loading bars, ever since earlier this morning.
It was edited about 30 minutes ago, but it seems the settings haven't changed at all, just the formatting...
This is so disappointing. Every time we get a quality model, it disappears after a month or two. I don't know what they're planning next, but this sort of thing sucks.
I thought I was the only one; I tried everything. The sad part is they're going to fix this model and change what made it good.
Cohere's official free trial has no usage restrictions, but there seems to be a rate limit, and you can't opt out of having your data used for training.
I would like to be able to use it again on huggingchat.
> Cohere's official free trial has no usage restrictions, but there seems to be a rate limit, and you can't opt out of learning.
What do you mean? Like on their website or something, or through API? And I was under the impression that conversations here were private and not used for learning or anything else. To quote from HuggingChat's privacy policy, "your conversations are private to you and will not be shared with anyone, including model authors, for any purpose, including for research or model training purposes."
> To quote from HuggingChat's privacy policy, "your conversations are private to you and will not be shared with anyone, including model authors, for any purpose, including for research or model training purposes."
That's right, so I wrote that I don't want to use it in places other than huggingface (including the cohere platform).
> That's right, so I wrote that I don't want to use it in places other than huggingface (including the cohere platform).
But how could it be used in other places if HF isn't sharing it with them?
I don't know why, but Command R+ is now working again in HuggingChat…
Looks like it’s back online!? 🙌
Amazing.
If it was saurabhdash or sarahooker, thank you. If it was anyone else, thank you too!
We're so back.
Every model today has started either saying "Model Overloaded" or just showing three loading dots and never starting a response to any prompt. :(
Hey @TheAGames10
Will get this fixed in a bit. Thanks for reporting.
Started seeing the same three dots again, never generating a response to a prompt.
The issue still persists; I am unable to use Cohere no matter what prompt I try or what I try to do. :(
Hey @TheAGames10
Sorry that you're facing issues on HF chat using R+.
Did you try out using R+ via our HF space?
The UI experience is the same as HF chat. Let me know if this works as an alternative for you. Thanks!
I am unable to use the Command R+ space; it always forces a restart and deletes every chat I have, unlike Cohere on default HuggingChat, which is STILL giving the same issue to this day.
Ah yeah, saving chats might be an issue with the HF space. I can check on this, though. I would recommend using Coral, offered by Cohere, if you want more mature UI features while using the R+ model as an alternative to HuggingChat.
Also, for issues specific to HuggingChat, I would recommend reporting them here. That way HF staff can pick them up directly.