SaulLu committed
Commit e9159ab
1 Parent(s): 2d4e9db

add animations to model card

Files changed (1): README.md (+12 -2)
README.md CHANGED
@@ -28,10 +28,14 @@ pipeline_tag: fill-mask
 
 ---
 
-<!-- TODO: change widget text -->
-
 # sahajBERT
 
+
+<iframe width="100%" height="1100" frameborder="0"
+src="https://observablehq.com/embed/@huggingface/participants-bubbles-chart?cells=c_noaws%2Ct_noaws%2Cviewof+currentDate"></iframe>
+
+
+
 Collaboratively pre-trained model on Bengali language using masked language modeling (MLM) and Sentence Order Prediction (SOP) objectives.
 
 ## Model description
@@ -157,6 +161,12 @@ This model was trained in a collaborative manner by volunteer participants.
 |39|[Rounak](https://huggingface.co/Rounak)|0 days 00:26:10|
 |40|[kshmax](https://huggingface.co/kshmax)|0 days 00:06:38|
 
+
+### Hardware used
+
+<iframe width="100%" height="251" frameborder="0"
+src="https://observablehq.com/embed/@huggingface/sahajbert-hardware?cells=c1_noaws"></iframe>
+
 ## Eval results
 
 We evaluate sahajBERT model quality and 2 other model benchmarks ([XLM-R-large](https://huggingface.co/xlm-roberta-large) and [IndicBert](https://huggingface.co/ai4bharat/indic-bert)) by fine-tuning 3 times their pre-trained models on two downstream tasks in Bengali:
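The front matter visible in the first hunk header declares `pipeline_tag: fill-mask`, so the checkpoint is meant to be queried through the fill-mask pipeline. Below is a minimal usage sketch, assuming the standard `transformers` pipeline API; the repository id `neuropark/sahajBERT` is an assumption, since the id does not appear in this diff.

```python
# Minimal sketch: querying the model via the fill-mask pipeline.
# The repo id "neuropark/sahajBERT" is assumed; it is not part of this diff.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="neuropark/sahajBERT")

# Read the mask token from the tokenizer instead of hard-coding it.
mask = fill_mask.tokenizer.mask_token
for prediction in fill_mask(f"আমি বাংলায় {mask} গাই।"):
    print(prediction["token_str"], prediction["score"])
```

Each returned prediction also carries the fully substituted `sequence`, which is what the model card's inference widget displays.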
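The "Eval results" paragraph touched by the second hunk describes fine-tuning each pre-trained checkpoint 3 times on two downstream Bengali tasks. Here is a hedged sketch of what one such run could look like with `transformers.Trainer`; the dataset files, column names, label count, and hyperparameters are placeholders, since the actual evaluation setup is not part of this commit.

```python
# Sketch of one downstream fine-tuning run as described under "Eval results".
# Dataset files, column names, num_labels, and hyperparameters are
# illustrative placeholders; the real setup is not specified in this diff.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "neuropark/sahajBERT"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Placeholder CSVs with "text" and "label" columns.
dataset = load_dataset("csv", data_files={"train": "train.csv", "validation": "dev.csv"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="sahajbert-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    seed=0,  # the card mentions 3 fine-tuning runs; vary the seed per run
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,  # enables dynamic padding during batching
)
trainer.train()
print(trainer.evaluate())
```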