asyafiqe committed
Commit fd42157
Parent: 27bcff2

Update README.md

Files changed (1): README.md (+4 -2)
README.md CHANGED
@@ -7,12 +7,12 @@ language:
 - id
 ---
 # 🦚Merak-7B-v3-Mini-Orca🐳
-
+<p align="center">
 <img src="https://i.imgur.com/39sQd3h.png" alt="Merak Orca" width="300" height="300"/>
+</p>
 
 **Merak-7B-v3-Mini-Orca** is Ichsan2895's [Merak-7B-v3](https://huggingface.co/Ichsan2895/Merak-7B-v3) fine-tuned on a Bahasa Indonesia translation of psmathur's [orca_mini_v1_dataset](https://huggingface.co/datasets/psmathur/orca_mini_v1_dataset).
 
-[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
 
 ## Usage
 This model fits on a 16GB VRAM GPU (a Google Colab T4 will do); with BitsandBytes it can run on a 6GB VRAM GPU.
@@ -42,6 +42,8 @@ USER: <prompt> (without the <>)
 ASSISTANT:
 ```
 ## Training details
+[<img src="https://raw.githubusercontent.com/OpenAccess-AI-Collective/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="100" height="16"/>](https://github.com/OpenAccess-AI-Collective/axolotl)
+
 Merak-7B-v3-Mini-Orca was instruction fine-tuned on 2 x 3090-24GB for 6 hours. [LoRA](https://github.com/microsoft/LoRA), [DeepSpeed ZeRO-2](https://github.com/microsoft/DeepSpeed), and [FlashAttention](https://github.com/Dao-AILab/flash-attention) were implemented during training using [Axolotl](https://github.com/OpenAccess-AI-Collective/axolotl).
 Hyperparameter | value |
 | ------ | ------ |
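Only the tail of the README's usage snippet (the USER:/ASSISTANT: prompt template) is visible in this diff. As a minimal sketch of what the quoted claims imply, assuming the standard transformers + bitsandbytes API: the repo id, the example prompt, and the generation settings below are illustrative guesses, not taken from the README.

```python
# Sketch only: the README's full usage code is outside this diff's context.
# The repo id below is a guess from the commit author and model name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "asyafiqe/Merak-7B-v3-Mini-Orca"  # hypothetical repo id

# 4-bit NF4 quantization is what lets a 7B model fit within ~6GB of VRAM.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Prompt template from the diff hunk: "USER: <prompt>" then "ASSISTANT:".
prompt = "USER: Apa ibu kota Indonesia?\nASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Loaded in plain float16 instead, a 7B model's weights alone come to roughly 14GB, which is consistent with the 16GB figure above.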
 
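The Axolotl YAML that drove this run is not part of the diff, and the hyperparameter table is cut off in this view. Purely as an illustration of the LoRA component, here is a minimal sketch using PEFT (a stand-in for Axolotl's internals); every value below is a placeholder, not the commit's actual settings.

```python
# Illustrative LoRA setup for a Llama-style 7B base model.
# PEFT is used here only to show the idea; the actual run used Axolotl.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("Ichsan2895/Merak-7B-v3")

lora_config = LoraConfig(
    r=16,           # placeholder rank
    lora_alpha=32,  # placeholder scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # typical Llama attention projections
    task_type="CAUSAL_LM",
)

# Wrap the frozen base model; only the low-rank adapter weights train.
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
```

DeepSpeed ZeRO-2 and FlashAttention sit outside this snippet; in Axolotl they are enabled through the training config rather than model code.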