BramVanroy committed
Commit 7c68121
Parent(s): dc966c6

Update README.md

Files changed (1): README.md (+4, -1)
README.md CHANGED

```diff
@@ -27,13 +27,16 @@ inference: false
 <em>An open and efficient LLM for Dutch</em>
 </div>
 
-<blockquote class="tip">
+<blockquote class="tip" style="padding: 1.5em; border: 0">
 <p align="center">
 <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b">👱‍♀️ Base version</a> (this one) -
 <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-instruct">🤖 Instruct version</a> -
 <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-chat">💬 Chat version</a> -
 <a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-GGUF">🚀 GGUF of base model</a>
 </p>
+<p align="center" style="text-align: center; margin: 0">
+<a href="https://huggingface.co/spaces/BramVanroy/fietje-2b"><strong>Chat with Fietje here!</strong></a>
+</p>
 </blockquote>
 
 Fietje is an adapted version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2), tailored to Dutch text generation by training on 28B tokens. It is small and efficient, with a size of 2.7 billion parameters, while performing almost on par with more powerful Dutch LLMs of twice its size, such as [GEITje 7B Ultra](https://huggingface.co/BramVanroy/GEITje-7B-ultra).
```
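
The base model described in this README can be loaded like any other causal language model on the Hub. A minimal sketch using the `transformers` text-generation pipeline, with the `BramVanroy/fietje-2b` model id taken from the links above; the Dutch prompt and generation settings are illustrative assumptions, not from the README:

```python
# Minimal sketch: Dutch text generation with the Fietje 2B base model.
# Note: this is a base (non-instruct) model, so it continues text rather
# than answering questions. Downloads ~2.7B parameters on first run.
from transformers import pipeline

generator = pipeline("text-generation", model="BramVanroy/fietje-2b")

# Example Dutch prompt (an assumption for illustration).
result = generator("Nederland is een land dat", max_new_tokens=30)
print(result[0]["generated_text"])
```

For conversational use, the linked instruct and chat variants are the more appropriate choice.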