Afrizal Hasbi Azizy committed
Commit 92195e6
1 parent: 931ce8d

Update README.md

Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -16,7 +16,7 @@ language:
 <p><em><a href="https://colab.research.google.com/drive/1526QJYfk32X1CqYKX7IA_FFcIHLXbOkx?usp=sharing" style="color: blue;">Go straight to the colab demo</a></em></p>
 </center>
 
-### Introducing the Kancil family of open models
+#### Introducing the Kancil family of open models
 
 Selamat datang!
 
@@ -32,13 +32,13 @@ This is the very first working prototype, Kancil V0. It supports basic QA functi
 
 This model was fine-tuned with QLoRA using the amazing Unsloth framework! It was built on top of [unsloth/llama-3-8b-bnb-4bit](https://huggingface.co/unsloth/llama-3-8b-bnb-4bit) and subsequently merged with the adapter back to 4 bit (no visible difference with merging back to fp 16).
 
-## Uses
+### Uses
 
-### Direct Use
+## Direct Use
 
 This model is developed with research purposes for researchers or general AI hobbyists. However, it has one big application: You can have lots of fun with it!
 
-### Out-of-Scope Use
+## Out-of-Scope Use
 
 This is a minimally-functional research preview model with no safety curation. Do not use this model for commercial or practical applications.
 
@@ -89,7 +89,7 @@ FastLanguageModel.for_inference(model)
 inputs = tokenizer(
 [
 prompt_template.format(
-prompt="Bagaimana canting dan malam digunakan untuk menggambar pola batik?",
+prompt="Bagaimana canting dan lilin digunakan untuk menggambar pola batik?",
 response="",)
 ], return_tensors = "pt").to("cuda")
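For readers looking only at this diff: the tokenizer call above is a fragment, since the model loading code and prompt_template are defined earlier in the README. Below is a minimal, self-contained sketch of the same inference flow. The repo id, the template wording, and the generation settings are illustrative assumptions, not values taken from this commit.

```python
# Sketch of running the Kancil V0 snippet end to end with Unsloth.
# The model_name and prompt_template below are placeholders; use the
# actual values from the model card / Colab demo.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="afrizalha/Kancil-V0-llama3",  # assumed repo id, not confirmed by this diff
    max_seq_length=2048,
    load_in_4bit=True,  # the card notes the adapter was merged back to 4 bit
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference path

# Assumed QA-style template; the README defines its own prompt_template.
prompt_template = "User: {prompt}\nAsisten: {response}"

inputs = tokenizer(
    [
        prompt_template.format(
            prompt="Bagaimana canting dan lilin digunakan untuk menggambar pola batik?",
            response="",
        )
    ],
    return_tensors="pt",
).to("cuda")

# Generation settings are arbitrary defaults for the sketch.
outputs = model.generate(**inputs, max_new_tokens=256, use_cache=True)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```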