FPHam committed on
Commit 59a0188
1 Parent(s): 0e4c279

Update README.md

Files changed (1)
  1. README.md +5 -4
README.md CHANGED
@@ -23,16 +23,17 @@ tags:

 Autolycus is a son of Hermes.

- Autolycus-Mistral is a language/content refinement of Open Hermes 2.5 Mistral, with the aim of taking it's responses from stilted, gpt-4 robotic gibberish into something approaching actual English, but at the expense of a slight increase in lying, fabrication and general BS.
+ Autolycus-Mistral is a language/content refinement of OpenHermes 2.5 Mistral, intended to take its output from stilted GPT-4 robotic gobbledygook into something approaching natural English - at the cost of only a very slight increase in prevarication, exaggeration and downright BS.

 7-billion models are not known for their complete honesty.

- The most blatant instances of "making stuff up", of course, are those times when Autolycus actually cites some reference (usually a book title or name, or date), but which you find to be nothing more than a load of hogwash when you check it out for yourself.
+ The most brazen examples of 'making things up' were those occasions where Autolycus actually quoted a source (usually a book title or author, sometimes a date) that turned out to be nothing more than a load of hogwash when you checked it for yourself.

 ## Example

- After you examine the following two examples (LLama-Precise with low top_p) you can see how much better the response got after the Autolycus improved it by adding more content and making it more relevant and personal ("Visit Japan!") and also giving the whole shebang an Earthy, almost humanoid touch.
- In return, the OpenHermes Mistral response comes back impersonally GPT-4 dry.
+ Compare this example (LLaMA-Precise preset, with low top_p), where Autolycus (bottom image) improves on the response by adding extra material - making it more informative, more relevant and personal ('Visit Japan') - and at the same time gives the whole thing an earthy, almost human touch.
+
+ The OpenHermes Mistral (top image) responds, rather impersonally, in the dry tones of GPT-4.

 - Original model: [OpenHermes 2.5 Mistral 7B](https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B)
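
For readers who want to reproduce a comparison like the one described above, here is a minimal sketch using the Hugging Face transformers library with the linked original model. It is an illustration, not part of the model card: the prompt is invented, the use of the tokenizer's built-in chat template is assumed, and the sampling numbers only approximate the LLaMA-Precise preset (notably its low top_p).

```python
# Minimal sketch: sample a response from the original OpenHermes 2.5 Mistral 7B
# with low-top_p settings (values approximating the LLaMA-Precise preset; assumed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "teknium/OpenHermes-2.5-Mistral-7B"  # original model linked above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Hypothetical prompt; the chat template handles the model's prompt formatting.
messages = [{"role": "user", "content": "I am planning a trip. Tell me about visiting Japan."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    top_p=0.1,                # the "low top_p" mentioned above
    temperature=0.7,          # assumed preset value
    top_k=40,                 # assumed preset value
    repetition_penalty=1.18,  # assumed preset value
)

# Print only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```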