DragosGorduza committed
Commit 3bf17f7
1 Parent(s): 5a22e11

Pushing fiqa sent bert gpl 60000 steps trained model

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -86,7 +86,7 @@ The model was trained with the parameters:
 
 **DataLoader**:
 
-`torch.utils.data.dataloader.DataLoader` of length 35872 with parameters:
+`torch.utils.data.dataloader.DataLoader` of length 48151 with parameters:
 ```
 {'batch_size': 48, 'sampler': 'torch.utils.data.sampler.SequentialSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
 ```
@@ -107,7 +107,7 @@ Parameters of the fit()-Method:
         "lr": 2e-05
     },
     "scheduler": "WarmupLinear",
-    "steps_per_epoch": 60000,
+    "steps_per_epoch": 90000,
     "warmup_steps": 1000,
     "weight_decay": 0.01
 }
@@ -118,7 +118,7 @@ Parameters of the fit()-Method:
 ```
 SentenceTransformer(
   (0): Transformer({'max_seq_length': 350, 'do_lower_case': False}) with Transformer model: RobertaModel
-  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False})
+  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
 )
 ```
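
For context, the first two hunks describe the training configuration recorded in the model card. Below is a minimal sketch of how such a run is typically set up with sentence-transformers; the `roberta-base` stand-in checkpoint, the toy training sample, and the `MarginMSELoss` choice are illustrative assumptions and are not taken from this commit.

```python
# Minimal sketch of the training setup described above (assumptions noted inline).
from torch.utils.data import DataLoader
from sentence_transformers import InputExample, SentenceTransformer, losses

# Assumption: plain roberta-base stands in for the actual base checkpoint,
# which this diff does not name (it only shows a RobertaModel backbone).
model = SentenceTransformer("roberta-base")

# Assumption: a single toy example; the real run used a dataset large enough
# for a DataLoader of length 48151 at batch_size=48.
train_samples = [
    InputExample(texts=["example query", "relevant passage", "hard negative"], label=1.0)
]
train_dataloader = DataLoader(train_samples, batch_size=48, shuffle=False)  # sequential sampling, as above

# Assumption: MarginMSELoss, a loss commonly used for GPL-style training.
train_loss = losses.MarginMSELoss(model)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    steps_per_epoch=90000,           # value after this commit (previously 60000)
    scheduler="WarmupLinear",
    warmup_steps=1000,
    optimizer_params={"lr": 2e-05},
    weight_decay=0.01,
)
```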
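The final hunk prints the module stack of the saved model. A rough sketch of assembling an equivalent stack by hand, assuming the standard `sentence_transformers.models` building blocks and again using `roberta-base` as a placeholder backbone:

```python
# Sketch of the Transformer + mean-pooling stack shown in the architecture printout.
from sentence_transformers import SentenceTransformer, models

# Placeholder backbone: the diff only reveals a RobertaModel with 768-dim embeddings.
word_embedding_model = models.Transformer("roberta-base", max_seq_length=350)

pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(),  # 768 for roberta-base
    pooling_mode_mean_tokens=True,   # mean pooling, as in the Pooling config above
    pooling_mode_cls_token=False,
    pooling_mode_max_tokens=False,
)

model = SentenceTransformer(modules=[word_embedding_model, pooling_model])
print(model)  # prints a stack matching the (0) Transformer / (1) Pooling layout above
```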