Active filters: DPO

NousResearch/Hermes-2-Theta-Llama-3-8B • Text Generation • 7.19k downloads • 111 likes
NousResearch/Hermes-2-Pro-Llama-3-8B • Text Generation • 65.7k downloads • 351 likes
aaditya/Llama3-OpenBioLLM-70B • Text Generation • 52.1k downloads • 275 likes
NousResearch/Hermes-2-Theta-Llama-3-8B-GGUF • 11.8k downloads • 59 likes
NousResearch/Hermes-2-Pro-Mistral-7B • Text Generation • 23.3k downloads • 467 likes
NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF • 42.2k downloads • 136 likes
aaditya/Llama3-OpenBioLLM-8B • Text Generation • 22.5k downloads • 109 likes
ruslanmv/Medical-Llama3-8B • Text Generation • 3.02k downloads • 25 likes
NousResearch/Hermes-2-Pro-Mistral-7B-GGUF • 25.8k downloads • 211 likes
TheBloke/Nous-Hermes-2-Mixtral-8x7B-DPO-GPTQ • Text Generation • 39.5k downloads • 25 likes
mradermacher/OpenBioLLM-Llama3-70B-GGUF • 693 downloads • 3 likes
LiteLLMs/Llama3-OpenBioLLM-70B-GGUF • 497 downloads • 2 likes
solidrust/Hermes-2-Pro-Llama-3-8B-AWQ • Text Generation • 6.26k downloads • 10 likes
bartowski/Hermes-2-Pro-Llama-3-8B-GGUF • Text Generation • 66.6k downloads • 8 likes
Writer/Palmyra-Med-70B • Text Generation • 129 downloads • 1 like
bartowski/Hermes-2-Theta-Llama-3-8B-exl2 • Text Generation • 10 downloads • 1 like
jtatman/tinymistral-248-DPO-lora • Text Generation
xDAN-AI/xDAN-L1-Chat-RL-v1 • Text Generation • 3.38k downloads • 63 likes
xDAN-AI/xDAN-L1-Chat-RL-v1-awq • Text Generation
LoneStriker/xDAN-L1-Chat-RL-v1-3.0bpw-h6-exl2 • Text Generation • 3 downloads
LoneStriker/xDAN-L1-Chat-RL-v1-4.0bpw-h6-exl2 • Text Generation • 2 downloads
LoneStriker/xDAN-L1-Chat-RL-v1-5.0bpw-h6-exl2 • Text Generation • 2 downloads
LoneStriker/xDAN-L1-Chat-RL-v1-6.0bpw-h6-exl2 • Text Generation • 2 downloads • 1 like
LoneStriker/xDAN-L1-Chat-RL-v1-8.0bpw-h8-exl2 • Text Generation • 5 downloads
TheBloke/xDAN-L1-Chat-RL-v1-GGUF • 376 downloads • 12 likes
TheBloke/xDAN-L1-Chat-RL-v1-AWQ • Text Generation • 3 downloads • 1 like
TheBloke/xDAN-L1-Chat-RL-v1-GPTQ • Text Generation • 2 downloads • 3 likes
bartowski/xDAN-L1-Chat-RL-v1-exl2 • Text Generation
NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO-adapter
NousResearch/Nous-Hermes-2-Mixtral-8x7B-DPO • Text Generation • 28k downloads • 372 likes