
aakashba committed
Commit deb0afb (parent 2242ba3)

Update README.md

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -5,14 +5,14 @@ datasets:
 - apcl/jm52m
 ---
 
-# Jam-sojm
-Jam-sojm is a GPT2-like model for research in fine-grained Java analysis. It is intended for fine-grained analysis of Java source code at the level of methods, statements, and variables, as a foundation for downstream tasks like code completion, comment generation, and automated bug repair.
+# Jam_sojm
+Jam_sojm is a GPT2-like model for research in fine-grained Java analysis. It is intended for fine-grained analysis of Java source code at the level of methods, statements, and variables, as a foundation for downstream tasks like code completion, comment generation, and automated bug repair.
 
 ---
 
-## Jam-sojm Training Details
+## Jam_sojm Training Details
 
-- We trained the jam-sojm model using the training procedures from Daniel Grittner's [NanoGPT-LoRA](https://github.com/danielgrittner/nanoGPT-LoRA)
+- We trained the jam_sojm model using the training procedures from Daniel Grittner's [NanoGPT-LoRA](https://github.com/danielgrittner/nanoGPT-LoRA)
 
 - The datasets used to train our model are our own datasets [so13m dataset](https://huggingface.co/datasets/apcl/so13m) and [jm52m dataset](https://huggingface.co/datasets/apcl/jm52m).
 
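Since the README points readers at nanoGPT-style training code, a loading sketch may help orient them. This is a minimal sketch under stated assumptions: that the released checkpoint follows nanoGPT's `ckpt.pt` layout (a dict with `"model_args"` and `"model"` entries) and that nanoGPT's `model.py` (defining `GPT` and `GPTConfig`) is importable. Neither the file name nor the keys are confirmed by this commit.

```python
# Minimal loading sketch, assuming jam_sojm ships a nanoGPT-style ckpt.pt.
# The file name "ckpt.pt" and the checkpoint keys below are assumptions
# carried over from nanoGPT, not confirmed by this repository.
import torch
from model import GPT, GPTConfig  # model.py from nanoGPT / NanoGPT-LoRA

checkpoint = torch.load("ckpt.pt", map_location="cpu")
config = GPTConfig(**checkpoint["model_args"])
model = GPT(config)

# nanoGPT prefixes state-dict keys with "_orig_mod." when the model was
# saved after torch.compile; strip that prefix before loading.
state_dict = {k.removeprefix("_orig_mod."): v
              for k, v in checkpoint["model"].items()}
model.load_state_dict(state_dict)
model.eval()
```

If the checkpoint instead uses a different serialization, the same idea applies: restore the architecture from the saved hyperparameters, then load the weights into it.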