Mxode committed
Commit 139a5db
1 Parent(s): 71fb641

Update README.md

Files changed (1)
  1. README.md +12 -1
README.md CHANGED
@@ -7,11 +7,20 @@ tags:
  - knowledge extraction
  - tiny
  - small
+ - C
 ---
+ ## Model info
+
 A model that can **extract the knowledge points** from the given **C language code**.

 The base model is [pythia-70m](https://huggingface.co/EleutherAI/pythia-70m). This model was fine-tuned for 10 epochs using the [Q-Lora](https://github.com/artidoro/qlora) method on my own training set.

+
+
+ ## How to use
+
+ ### quick start
+
 A usage example is as follows. First, import the model and prepare the code:

 ```python
@@ -66,6 +75,8 @@ response = tokenizer.decode(tokens[0]).split('```')[-1].split('<')[0]



+ ### and more
+
 However, in practical use, it's recommended to run multiple inferences in order to achieve more diverse representations. Don't worry, it's really small, so the inferences don't take much time, as follows:

 ```python
@@ -95,5 +106,5 @@ print(ans_dict)
 ### 'Quick sort': 25,
 ### 'Recurrence': 2,
 ### 'Queue': 1
- ###}
+ ### }
 ```
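The hunks above cut off before the README's actual usage code. As a rough illustration of the "quick start" flow it describes, here is a minimal sketch assuming the standard `transformers` API; the repo id, C snippet, prompt layout, and generation settings are placeholders, and only the final `response = ...` extraction line is taken from the diff context.

```python
# Minimal sketch, not the README's actual code (which this diff does not show).
from transformers import AutoModelForCausalLM, AutoTokenizer

# hypothetical: substitute the fine-tuned checkpoint's repo id
model_id = "EleutherAI/pythia-70m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# the C code whose knowledge points we want to extract
code = r"""
#include <stdio.h>
#include <stdlib.h>

int cmp(const void *a, const void *b) {
    return *(const int *)a - *(const int *)b;
}

int main(void) {
    int arr[] = {5, 2, 9, 1, 7};
    qsort(arr, 5, sizeof(int), cmp);
    for (int i = 0; i < 5; i++)
        printf("%d\n", arr[i]);
    return 0;
}
"""

# assumed prompt layout: the code in a fenced block, with the answer expected after it
prompt = f"```c\n{code}\n```\n"
inputs = tokenizer(prompt, return_tensors="pt")
tokens = model.generate(**inputs, max_new_tokens=64)

# this extraction line appears verbatim as diff context in the README
response = tokenizer.decode(tokens[0]).split('```')[-1].split('<')[0]
print(response)
```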
 
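The "and more" section then aggregates repeated inferences into `ans_dict`, counting how often each knowledge point shows up. Continuing from the sketch above, a hedged version of that loop (the run count, sampling parameters, and comma-based parsing of the response are assumptions, not the README's actual code):

```python
from collections import Counter

N_INFER = 30          # hypothetical number of repeated inferences
ans_dict = Counter()  # knowledge point -> number of runs that mentioned it

for _ in range(N_INFER):
    tokens = model.generate(
        **inputs,
        do_sample=True,     # sample so that repeated runs give diverse answers
        top_p=0.9,
        temperature=0.8,
        max_new_tokens=64,
    )
    response = tokenizer.decode(tokens[0]).split('```')[-1].split('<')[0]
    # count each distinct knowledge point at most once per run
    for point in {p.strip() for p in response.split(',') if p.strip()}:
        ans_dict[point] += 1

# higher counts are the knowledge points the model extracts most consistently,
# e.g. {'Quick sort': 25, 'Recurrence': 2, 'Queue': 1}
print(dict(ans_dict))
```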