BroLaurens committed on
Commit 0296e8f
1 Parent(s): f867e3f

Update README.md

Files changed (1)
  1. README.md +13 -2
README.md CHANGED
@@ -38,7 +38,6 @@ int2str = {
 str2int = {v:k for k,v in int2str.items()}

 # Load model dependencies
-
 model = AutoModelForTokenClassification.from_pretrained(
     "brolaurens/finer-distilbert", num_labels=len(int2str), id2label=int2str, label2id=str2int
 )
@@ -57,6 +56,18 @@ model_input = tokenizer(texts, return_tensors='pt')
 predictions = model(**model_input).logits
 predictions = predictions.argmax(axis=2)
 predicted_labels = [[int2str[x] for x in t] for t in predictions.tolist()]
+```
+
+## Training parameters

+The model was trained using the following hyperparameters:

-```
+base_model: distilbert/distilbert-base-uncased
+learning_rate: 2e-5
+batch_size: 32
+epochs: 3
+optimizer: adamw
+adam_beta1: 0.9
+adam_beta2: 0.999
+adam_epsilon: 1e-08
+loss_function: cross entropy loss
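
For reference (not part of this commit), the hyperparameters listed above map naturally onto the standard `transformers` `TrainingArguments`/`Trainer` API. The sketch below assumes that API: `output_dir` and `train_dataset` are placeholders, `int2str`/`str2int` are the label maps defined earlier in the README, and cross-entropy is the loss `AutoModelForTokenClassification` computes internally when labels are provided.

```python
# Hypothetical training sketch based on the hyperparameters listed above.
# train_dataset and output_dir are placeholders, not taken from this commit.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "distilbert/distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForTokenClassification.from_pretrained(
    base_model, num_labels=len(int2str), id2label=int2str, label2id=str2int
)  # int2str / str2int: label maps from the README snippet above

args = TrainingArguments(
    output_dir="finer-distilbert",   # placeholder output path
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    num_train_epochs=3,
    optim="adamw_torch",             # AdamW optimizer
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,     # tokenized, label-aligned dataset (placeholder)
    tokenizer=tokenizer,
)
trainer.train()
# Token-classification models apply cross-entropy loss internally when labels
# are passed, which corresponds to the loss_function entry above.
```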