Command R Plus more often than not loads forever and never gives an answer

#499
by ElvisM - opened

It's pretty annoying behavior, since it's the only model I use and the only one that allows file uploads. Sometimes starting a new chat or restarting the interface makes it work again, but it tends to get stuck loading forever.

I noticed this bug today too, about an hour ago, after waking up. There is no visible error when it happens, but the console shows "Uncaught (in promise) TypeError: network error"
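For what it's worth, that console message is the browser reporting a rejected promise that nothing handled, typically a failed `fetch()` whose rejection was never caught. A minimal sketch of the pattern (the URL is hypothetical, just any unreachable endpoint):

```javascript
// An unawaited fetch to an unreachable host rejects with a TypeError.
// With no .catch(), the browser logs it as:
//   Uncaught (in promise) TypeError: network error
fetch("https://chat.example.invalid/api/generate")
  .then((res) => res.json())
  // Adding a rejection handler turns the uncaught console error
  // into something the page could surface to the user instead.
  .catch((err) => console.error("request failed:", err.message));
```

So the error itself just means the request to the backend never completed, which is consistent with the model being overloaded or the service being down rather than a client-side bug.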


Other models, like Llama-3-70b, are working fine to my knowledge.

c4ai-command-r-plus is loading forever as well

Daily overload

Obviously, its quantized version or the 35B version should be used as the default model.

The issue seems to be fixed (at least on my end, using Command-R+) as of 00:20 Pacific Time.

Edit: Never mind, now the website shows an error "Service Unavailable"
Edit 2: Back to uncaught error.

It seems to have stopped working again 😭 Anyone facing the same issue?

yep


Same here. It really makes me want to buy whatever card has the most VRAM to be able to use the model offline.

Hi all, just wanted to share that going forward you can also access the Command-R models through our HF Space. We've recently upgraded it, and the UI experience is the same as Hugging Chat, so feel free to try it out if you're facing issues using R/R+ on Hugging Chat.
