Model works fine on A100 (80GB) GPU

#541
opened by aigeek0x0

Hi, I have run the tests as suggested on the "About" page of the HF Open LLM Leaderboard, and they worked perfectly fine. Could you please help me understand why the evaluation might have failed on your end? Thank you!

https://huggingface.co/datasets/open-llm-leaderboard/requests/blob/main/AIGeekLabs/radiantloom-mixtral-8x7b-fusion_eval_request_False_float16_Original.json
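For reference, the check suggested on the leaderboard's "About" page boils down to making sure the model loads with the `transformers` AutoClasses. A minimal sketch, assuming the model ID and float16 precision from the request file linked above:

```python
# Minimal local sanity check, along the lines of what the leaderboard's
# "About" page suggests: verify that config, tokenizer, and model all load
# via the AutoClasses. The model ID and float16 dtype are taken from the
# request file above; loading the full weights needs enough memory
# (e.g. the A100 80GB mentioned in the title, with offload as needed).
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "AIGeekLabs/radiantloom-mixtral-8x7b-fusion"

config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # spread across available devices / offload if needed
)
print(model.config.architectures)
```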

Hugging Face H4 org

Hi!
Your model actually ran properly; we don't know why it was flagged as failed. cc @SaylorTwift
Your results are here; I'll update your request file accordingly :)

clefourrier changed discussion status to closed
Hugging Face H4 org

Hi, sorry for the inconvenience. It was a bug on our end when checking whether evals had completed successfully; it should be fixed now :)

Thank you, it shows up now.
