This is the DeepHermes-3-Llama-3-8B-Preview model converted to MLC format with q4f16_1 quantization.
The model can be used with the MLC-LLM and WebLLM projects; a minimal usage sketch is shown below.
Base model: DeepHermes-3-Llama-3-8B-Preview
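
The sketch below shows one way to run these weights through the MLC-LLM Python API, which exposes an OpenAI-compatible chat completions interface. The `HF://` repository path is an assumption used for illustration; replace it with this repository's actual ID.

```python
# Minimal sketch of chatting with this model via MLC-LLM.
from mlc_llm import MLCEngine

# Assumed repository ID -- substitute the actual MLC weight repo for this model.
model = "HF://mlc-ai/DeepHermes-3-Llama-3-8B-Preview-q4f16_1-MLC"

engine = MLCEngine(model)

# Stream a chat completion and print tokens as they arrive.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content, end="", flush=True)
print()

engine.terminate()
```

The same weights can also be used from the command line with `mlc_llm chat <model>`, or loaded in the browser through WebLLM.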