mipo57 committed
Commit 9e1afa1
Parent: b4d3ef5

Update README.md

Files changed (1): README.md +2 -0
README.md CHANGED
@@ -7,6 +7,8 @@ datasets:
 ---
 # Jaskier 7b DPO
 
+**This is a work-in-progress model and may not be ready for production use.**
+
 Model based on `mindy-labs/mindy-7b-v2` (downstream version of Mistral7B) finetuned using Direct Preference Optimization on Intel/orca_dpo_pairs.
 
 ## How to use
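The body of the "How to use" section is not shown in this diff. As an illustrative sketch only, a typical `transformers` loading pattern for a Mistral-lineage causal LM might look like the following. The repository id is a placeholder (the actual id is not stated on this page), and the `[INST]` prompt format is an assumption based on the model's Mistral7B ancestry, not documented behavior of this checkpoint.

```python
# Hypothetical usage sketch -- MODEL_ID and the prompt format are assumptions,
# not taken from the model card shown in this diff.
MODEL_ID = "REPLACE/with-actual-repo-id"  # placeholder, substitute the real repository id


def build_prompt(user_message: str) -> str:
    """Wrap a user message in a Mistral-style [INST] chat prompt
    (assumed from the base model's Mistral7B lineage)."""
    return f"<s>[INST] {user_message} [/INST]"


def generate(user_message: str, max_new_tokens: int = 256) -> str:
    """Load the model and generate a completion for a single user message."""
    # Imported here so the prompt helper above stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Since the card flags the model as work-in-progress, outputs should be evaluated before any production use.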