Transformers
Safetensors
atomformer
custom_code
Inference Endpoints
akore committed · verified
Commit 3ba9b79 · 1 Parent(s): 4fec5d2

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -75,6 +75,6 @@ The model is trained for 300 epochs with a batch size of 512, learning rate of 1
  Comparison between leveraging the pre-trained base model and training from scratch on the SMP task:
 
  | | base | scratch |
- |:----:|:----:|:----:|:----:|
+ |:----:|:----:|:----:|
  | val | 0.2304 | 0.1766 |
  | test | 1.077 | 1.13 |