ChakuChidiya committed on
Commit 678b7c3
1 Parent(s): a0c8175

Upload TFDistilBertForTokenClassification

Files changed (3)
  1. README.md +49 -0
  2. config.json +45 -0
  3. tf_model.h5 +3 -0
README.md ADDED
@@ -0,0 +1,49 @@
+ ---
+ license: apache-2.0
+ base_model: ChakuChidiya/distilbert-base-uncased-G1
+ tags:
+ - generated_from_keras_callback
+ model-index:
+ - name: distilbert-base-uncased-G2
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information Keras had access to. You should
+ probably proofread and complete it, then remove this comment. -->
+
+ # distilbert-base-uncased-G2
+
+ This model is a fine-tuned version of [ChakuChidiya/distilbert-base-uncased-G1](https://huggingface.co/ChakuChidiya/distilbert-base-uncased-G1) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 3285, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
+ - training_precision: float32
+
+ ### Training results
+
+
+
+ ### Framework versions
+
+ - Transformers 4.37.0
+ - TensorFlow 2.15.0
+ - Datasets 2.14.5
+ - Tokenizers 0.15.1
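The optimizer config in the README specifies a `PolynomialDecay` learning-rate schedule with `power: 1.0` and `cycle: False`, i.e. a straight linear decay from 2e-05 to 0 over 3285 steps. As a rough sketch (a plain-Python re-derivation of the schedule formula, not the Keras implementation itself; the function name is made up here):

```python
def polynomial_decay(step, initial_lr=2e-05, decay_steps=3285,
                     end_lr=0.0, power=1.0):
    """Approximates keras PolynomialDecay with cycle=False: the
    learning rate falls from initial_lr to end_lr over decay_steps,
    then stays at end_lr."""
    step = min(step, decay_steps)
    fraction = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * fraction ** power + end_lr

# With power=1.0 this is a linear ramp down to zero.
print(polynomial_decay(0))     # → 2e-05
print(polynomial_decay(3285))  # → 0.0
```

With these values, halfway through training (step 1642.5 of 3285) the learning rate is 1e-05, and it is clamped at 0.0 for any step past `decay_steps`.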
config.json ADDED
@@ -0,0 +1,45 @@
+ {
+   "_name_or_path": "ChakuChidiya/distilbert-base-uncased-G1",
+   "activation": "gelu",
+   "architectures": [
+     "DistilBertForTokenClassification"
+   ],
+   "attention_dropout": 0.1,
+   "dim": 768,
+   "dropout": 0.1,
+   "hidden_dim": 3072,
+   "id2label": {
+     "0": "O",
+     "1": "B-treatment",
+     "2": "I-treatment",
+     "3": "B-cancer",
+     "4": "I-cancer",
+     "5": "B-allergy_name",
+     "6": "I-allergy_name",
+     "7": "B-chronic_disease",
+     "8": "I-chronic_disease"
+   },
+   "initializer_range": 0.02,
+   "label2id": {
+     "B-allergy_name": 5,
+     "B-cancer": 3,
+     "B-chronic_disease": 7,
+     "B-treatment": 1,
+     "I-allergy_name": 6,
+     "I-cancer": 4,
+     "I-chronic_disease": 8,
+     "I-treatment": 2,
+     "O": 0
+   },
+   "max_position_embeddings": 512,
+   "model_type": "distilbert",
+   "n_heads": 12,
+   "n_layers": 6,
+   "pad_token_id": 0,
+   "qa_dropout": 0.1,
+   "seq_classif_dropout": 0.2,
+   "sinusoidal_pos_embds": false,
+   "tie_weights_": true,
+   "transformers_version": "4.37.0",
+   "vocab_size": 30522
+ }
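The `id2label` / `label2id` maps above define a 9-tag BIO scheme over four clinical entity types (treatment, cancer, allergy_name, chronic_disease). A minimal sketch of how predicted tag ids could be grouped back into entity spans (the `bio_spans` helper is hypothetical, not part of the model or Transformers):

```python
# Label maps copied from the config above (ids as ints).
id2label = {0: "O", 1: "B-treatment", 2: "I-treatment", 3: "B-cancer",
            4: "I-cancer", 5: "B-allergy_name", 6: "I-allergy_name",
            7: "B-chronic_disease", 8: "I-chronic_disease"}
label2id = {v: k for k, v in id2label.items()}

def bio_spans(tag_ids):
    """Group a sequence of predicted ids into (entity_type, start, end)
    spans, with end exclusive. A stray I- tag starts a new span."""
    spans, start, kind = [], None, None
    for i, t in enumerate(tag_ids + [0]):        # trailing "O" sentinel
        label = id2label.get(t, "O")
        if label.startswith("B-") or label == "O":
            if kind is not None:                 # close the open span
                spans.append((kind, start, i))
                kind = None
            if label.startswith("B-"):           # open a new span
                start, kind = i, label[2:]
        elif kind != label[2:]:                  # I- without matching B-
            start, kind = i, label[2:]
    return spans

print(bio_spans([1, 2, 0, 3, 4, 4]))
# → [('treatment', 0, 2), ('cancer', 3, 6)]
```

Note the config's two maps are exact inverses of each other, which is what `TFDistilBertForTokenClassification` expects when converting logits to labels.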
tf_model.h5 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:44e77bc1148dec4a552444e6ac1ec6e449721efa570fc1ff217bd40d86829e9c
+ size 265606416
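The three lines above are not the model weights themselves but a Git LFS pointer file: the ~266 MB `tf_model.h5` lives in LFS storage, and the pointer records its spec version, SHA-256 object id, and byte size. Each line is a `key value` pair, so a pointer can be parsed with a one-line split (the `parse_lfs_pointer` helper is a sketch, not part of any library):

```python
# Pointer contents copied from the diff above.
pointer_text = """\
version https://git-lfs.github.com/spec/v1
oid sha256:44e77bc1148dec4a552444e6ac1ec6e449721efa570fc1ff217bd40d86829e9c
size 265606416
"""

def parse_lfs_pointer(text):
    """Split each non-empty line on the first space into key/value."""
    return dict(line.split(" ", 1) for line in text.strip().splitlines())

pointer = parse_lfs_pointer(pointer_text)
print(pointer["size"])  # → 265606416
```

When cloning without `git lfs`, this pointer is exactly what ends up on disk in place of the `.h5` file, which is a common cause of "file too small" loading errors.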