teknium committed on
Commit 51b7f98
1 Parent(s): 9c16135

Update README.md

Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -30,6 +30,8 @@ The model was trained on over 1,000,000 entries of primarily GPT-4 generated dat
 
 This is the SFT + DPO version of Mixtral Hermes 2, we will also be providing an SFT only version, for people to find which works best for them.
 
+## Huge shout out to Together.ai for sponsoring our compute during the many experiments both training Mixtral and working on DPO!
+
 # Table of Contents
 1. [Example Outputs](#example-outputs)
 2. [Benchmark Results](#benchmark-results)