vasudevgupta committed on
Commit
39dbc92
1 Parent(s): 94d5a26
Files changed (3)
  1. .gitattributes +1 -0
  2. README.md +2 -0
  3. bigbird.mov +3 -0
.gitattributes CHANGED
@@ -14,3 +14,4 @@
 *.pb filter=lfs diff=lfs merge=lfs -text
 *.pt filter=lfs diff=lfs merge=lfs -text
 *.pth filter=lfs diff=lfs merge=lfs -text
+*.mov filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -9,6 +9,8 @@ widget:
 
 This checkpoint was obtained by training `FlaxBigBirdForQuestionAnswering` (with an extra pooler head) on the [`natural_questions`](https://huggingface.co/datasets/natural_questions) dataset on a TPU v3-8. The dataset takes around 100 GB on disk, but thanks to Cloud TPUs and JAX, each epoch took just 4.5 hours. The training script can be found here: https://github.com/vasudevgupta7/bigbird
 
+![bigbird](bigbird.mov)
+
 **Use this model just like any other model from 🤗Transformers**
 
 ```python
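The README's ```python snippet is cut off in this diff view. A minimal usage sketch, assuming the stock 🤗 Transformers Flax API; `<repo-id>` is a placeholder (the actual Hub repository id is not shown in this diff), and this checkpoint's extra pooler head may in practice require the custom class from the author's training repo rather than the plain class below:

```python
# Hedged sketch: load the checkpoint with the stock Transformers Flax QA class.
# "<repo-id>" is a placeholder, not the real repository id.
from transformers import AutoTokenizer, FlaxBigBirdForQuestionAnswering


def answer(model, tokenizer, question, context):
    """Return the answer span predicted by the QA head."""
    inputs = tokenizer(question, context, return_tensors="jax")
    outputs = model(**inputs)
    # Pick the most likely start/end token positions for the answer span.
    start = int(outputs.start_logits.argmax(-1)[0])
    end = int(outputs.end_logits.argmax(-1)[0])
    return tokenizer.decode(inputs["input_ids"][0, start : end + 1])


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("<repo-id>")
    model = FlaxBigBirdForQuestionAnswering.from_pretrained("<repo-id>")
    print(answer(model, tokenizer,
                 "How long did one epoch take?",
                 "Thanks to Cloud TPUs and JAX, each epoch took just 4.5 hours."))
```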
bigbird.mov ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:bd095bed9b48849f3c7fe8d8303451cb560b10eafb81e12b7fee373d7d4eeee4
+size 73463498
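Git LFS replaces the tracked file in the repository with a small text pointer like the three lines above; the real 73 MB video is stored separately. A minimal sketch of parsing that pointer format back into its fields (`parse_lfs_pointer` is a hypothetical helper, not part of any library):

```python
# Parse a Git LFS pointer file (key/value pairs, one per line) into a dict.
def parse_lfs_pointer(text):
    """Split each non-empty line at the first space into a key/value pair."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields


# The exact pointer contents committed above for bigbird.mov.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:bd095bed9b48849f3c7fe8d8303451cb560b10eafb81e12b7fee373d7d4eeee4
size 73463498
"""

info = parse_lfs_pointer(pointer)
print(info["size"])  # size of the real file in bytes
```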