gvij committed
Commit b808f98
1 Parent(s): ad016bd

Update README.md

Files changed (1): README.md (+32 -1)
README.md CHANGED (updated file shown below)
---
datasets:
- nampdn-ai/tiny-codes
library_name: peft
tags:
- llama2
- llama2-7b
- code generation
- code-generation
- code
- instruct
- instruct-code
- code-alpaca
- alpaca-instruct
- alpaca
- llama7b
- gpt2
license: apache-2.0
---
## Training procedure
We finetuned the [Llama 2 7B model](https://huggingface.co/meta-llama/Llama-2-7b-hf) from Meta on [nampdn-ai/tiny-codes](https://huggingface.co/datasets/nampdn-ai/tiny-codes) for ~10,000 steps using [MonsterAPI](https://monsterapi.ai)'s no-code [LLM finetuner](https://docs.monsterapi.ai/fine-tune-a-large-language-model-llm).

The dataset contains **1.63 million rows** of short, clear code snippets designed to help LLMs learn to reason with both natural and programming languages. It covers a wide range of programming languages, such as Python, TypeScript, JavaScript, Ruby, Julia, Rust, C++, Bash, Java, C#, and Go, and also includes two database languages, Cypher (for graph databases) and SQL (for relational databases), to cover reasoning about relationships between entities.
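
As a quick reference, the dataset can be pulled straight from the Hub with the `datasets` library. This is a minimal sketch; it assumes you have access to the dataset on Hugging Face, and it only inspects whatever fields the dataset actually provides:

```python
from datasets import load_dataset

# nampdn-ai/tiny-codes: ~1.63M short prompt/response style code snippets.
ds = load_dataset("nampdn-ai/tiny-codes", split="train")

print(ds.num_rows)       # total number of rows
print(ds.column_names)   # inspect the available fields
print(ds[0])             # peek at a single example
```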

The finetuning run completed in 53 hours and cost us roughly `$125` for the entire run!

#### Hyperparameters & Run details:
- Model Path: meta-llama/Llama-2-7b-hf
- Dataset: nampdn-ai/tiny-codes
- Learning rate: 0.0002
- Number of epochs: 1 (10k steps)
- Data split: Training: 90% / Validation: 10%
- Gradient accumulation steps: 1
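
For readers who want to reproduce something similar outside the no-code UI, the run details above map roughly onto a plain `transformers` + `peft` LoRA setup like the sketch below. This is not MonsterAPI's actual pipeline; in particular the LoRA rank, alpha, dropout, and precision settings are assumptions.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments

base_model = "meta-llama/Llama-2-7b-hf"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model, torch_dtype=torch.float16)

# Attach LoRA adapters to the base model (rank/alpha/dropout are illustrative).
lora_config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(model, lora_config)

# 90% / 10% train/validation split of nampdn-ai/tiny-codes.
dataset = load_dataset("nampdn-ai/tiny-codes", split="train")
splits = dataset.train_test_split(test_size=0.1, seed=42)

# Hyperparameters listed in the run details above.
training_args = TrainingArguments(
    output_dir="llama2-7b-tiny-codes-lora",
    learning_rate=2e-4,
    num_train_epochs=1,
    max_steps=10_000,              # ~10k optimizer steps
    gradient_accumulation_steps=1,
    logging_steps=50,
)
# From here, tokenize `splits["train"]` / `splits["test"]` and pass everything
# to a Trainer (or TRL's SFTTrainer) to launch the run.
```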
### Framework versions

- PEFT 0.4.0

### Loss metrics:
![training loss](train-loss.png "Training loss")
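
As a usage sketch, the resulting LoRA adapter can be applied on top of the base Llama 2 7B model with `peft`. The adapter repo id below is a placeholder; substitute this model's actual Hugging Face id:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_model = "meta-llama/Llama-2-7b-hf"
adapter_repo = "<this-adapter-repo-id>"  # placeholder: use this model card's repo id

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model, torch_dtype=torch.float16, device_map="auto"
)
# Load the finetuned LoRA weights on top of the base model.
model = PeftModel.from_pretrained(model, adapter_repo)

prompt = "Write a Python function that checks whether a number is prime."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```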