Commit 4b6addd committed by cygu
Parent: 92d9d30

Update README.md

Files changed (1): README.md (+4 -26)
README.md CHANGED
@@ -1,31 +1,13 @@
 ---
 tags:
 - generated_from_trainer
-model-index:
-- name: kgw_gamma0.25_delta2_640k_pythia-1.4b-lr1e-5
-  results: []
+- pythia
+license: apache-2.0
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
-
-# kgw_gamma0.25_delta2_640k_pythia-1.4b-lr1e-5
-
-This model is a fine-tuned version of [/scr-ssd/cygu/weights/pythia-1.4b](https://huggingface.co//scr-ssd/cygu/weights/pythia-1.4b) on an unknown dataset.
-
 ## Model description
 
-More information needed
-
-## Intended uses & limitations
-
-More information needed
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
+Sampling-based watermark distilled [Pythia 1.4B](https://huggingface.co/EleutherAI/pythia-1.4b) using the KGW \\(\gamma=0.25, \delta=2\\) watermarking strategy in the paper [On the Learnability of Watermarks for Language Models](https://arxiv.org/abs/2312.04469).
 
 ### Training hyperparameters
 
@@ -40,13 +22,9 @@ The following hyperparameters were used during training:
 - lr_scheduler_warmup_steps: 500
 - num_epochs: 1.0
 
-### Training results
-
-
-
 ### Framework versions
 
 - Transformers 4.29.2
 - Pytorch 2.0.1+cu117
 - Datasets 2.13.1
-- Tokenizers 0.13.3
+- Tokenizers 0.13.3
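
The one-line model description added in this commit is the whole card, so a short usage sketch may help. This is a minimal sketch under stated assumptions, not the authors' documented interface: the repo id below is a guess from the committer name and model name, and the prompt is arbitrary. For background, in the KGW scheme (Kirchenbauer et al.), each decoding step biases a pseudorandom "green list" covering a fraction \\(\gamma\\) of the vocabulary by adding \\(\delta\\) to those tokens' logits; distillation transfers that behavior into the model weights, so ordinary sampling produces watermarked text without a watermarking logits processor.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id, guessed from the committer and model name;
# substitute the actual checkpoint path.
model_id = "cygu/kgw_gamma0.25_delta2_640k_pythia-1.4b-lr1e-5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# The watermark is distilled into the weights, so plain sampling
# (no extra logits processor) should already yield watermarked text.
inputs = tokenizer("The theory of special relativity states that", return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that detecting the watermark in generated text still requires a KGW detector configured with the same \\(\gamma\\) and hashing scheme used during training.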