Tags: Text Generation · Transformers · PyTorch · mistral · openchat · C-RLFT · conversational · Inference Endpoints · text-generation-inference
imone committed
Commit 0be788e • 1 Parent(s): b6c8863

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -22,8 +22,8 @@ license: apache-2.0
 **🤖 #1 Open-source model on MT-bench scoring 7.81, outperforming 70B models 🤖**
 
 <div style="display: flex; justify-content: center; align-items: center">
-<img src="https://raw.githubusercontent.com/imoneoi/openchat/imoneoi-add-grok-baseline/assets/openchat.png" style="width: 45%;">
-<img src="https://raw.githubusercontent.com/imoneoi/openchat/imoneoi-add-grok-baseline/assets/openchat_grok.png" style="width: 45%;">
+<img src="https://raw.githubusercontent.com/imoneoi/openchat/master/assets/openchat.png" style="width: 45%;">
+<img src="https://raw.githubusercontent.com/imoneoi/openchat/master/assets/openchat_grok.png" style="width: 45%;">
 </div>
 
 OpenChat is an innovative library of open-source language models, fine-tuned with [C-RLFT](https://arxiv.org/pdf/2309.11235.pdf) - a strategy inspired by offline reinforcement learning. Our models learn from mixed-quality data without preference labels, delivering exceptional performance on par with ChatGPT, even with a 7B model. Despite our simple approach, we are committed to developing a high-performance, commercially viable, open-source large language model, and we continue to make significant strides toward this vision.
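Given the Transformers, PyTorch, and Text Generation tags above, the checkpoint should load through the standard transformers API. Below is a minimal sketch; the repo ID `openchat/openchat_3.5` is an assumption (this commit page does not name the repository), as is the presence of a chat template in the tokenizer config.

```python
# Minimal sketch of loading and querying this checkpoint with transformers.
# The repo ID is an assumption; substitute the actual model ID for this repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openchat/openchat_3.5"  # assumption: replace with this repo's ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Assumes the tokenizer ships a chat template, so the conversational
# prompt format does not need to be hard-coded here.
messages = [{"role": "user", "content": "Explain C-RLFT in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```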