JaMe76 committed
Commit 1cc840a
1 Parent(s): 1be880a

Create README.md


Add tags, model description, how to use, how it was trained, how to fine tune

Files changed (1)
  1. README.md +50 -0
README.md ADDED
---
tags:
- Tensorflow
license: apache-2.0
datasets:
- Publaynet
---

# Tensorpack's Cascade-RCNN with FPN and Group Normalization on ResNext32xd4-50, trained on PubLayNet for Document Layout Analysis

The model and its training code have been taken mainly from [Tensorpack](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN).
13
+
14
+ Please check: [Xu Zhong et. all. - PubLayNet: largest dataset ever for document layout analysis](https://arxiv.org/abs/1908.07836).
15
+
16
+ This model is different from the model used the paper.
17
+
18
+ The code has been adapted so that it can be used in a **deep**doctection pipeline.
19
+
20
+ ## How this model can be used
21
+
22
+ This model can be used with the **deep**doctection in a full pipeline, along with table recognition and OCR. Check the general instruction following this [Get_started](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Get_Started.ipynb) tutorial.
23
+
24
+ ## How this model was trained.
25
+
26
+ To recreate the model run on the **deep**doctection framework, run:
27
+
28
+ ```python
29
+ >>> import os
30
+ >>> from deep_doctection.datasets import DatasetRegistry
31
+ >>> from deep_doctection.eval import MetricRegistry
32
+ >>> from deep_doctection.utils import get_configs_dir_path
33
+ >>> from deep_doctection.train import train_faster_rcnn
34
+ publaynet = DatasetRegistry.get_dataset("publaynet")
35
+ path_config_yaml=os.path.join(get_configs_dir_path(),"tp/layout/conf_frcnn_layout.yaml")
36
+ path_weights = ""
37
+ dataset_train = publaynet
38
+ config_overwrite=["TRAIN.STEPS_PER_EPOCH=500","TRAIN.EVAL_PERIOD=200","TRAIN.STARTING_EPOCH=1",
39
+ "PREPROC.TRAIN_SHORT_EDGE_SIZE=[800,1200]","TRAIN.CHECKPOINT_PERIOD=50",
40
+ "BACKBONE.FREEZE_AT=0"]
41
+ build_train_config=["max_datapoints=335703"]
42
+ dataset_val = publaynet
43
+ build_val_config = ["max_datapoints=2000"]
44
+
45
+ coco_metric = MetricRegistry.get_metric("coco")
46
+ ```
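The `config_overwrite` entries follow Tensorpack's `KEY.SUBKEY=value` convention for overriding values in the YAML config. As a self-contained illustration (the helper below is hypothetical and not part of deepdoctection or Tensorpack), such strings can be folded into a nested config roughly like this:

```python
import ast

def apply_overwrites(config, overwrites):
    """Apply 'A.B=value' style override strings to a nested dict, in place."""
    for item in overwrites:
        keys, _, raw = item.partition("=")
        *path, last = keys.split(".")
        node = config
        for key in path:
            node = node.setdefault(key, {})  # create intermediate sections
        try:
            node[last] = ast.literal_eval(raw)  # numbers, lists, tuples, ...
        except (ValueError, SyntaxError):
            node[last] = raw  # keep plain strings as-is
    return config

# Toy config with defaults, then the overrides from the training snippet above
cfg = {"TRAIN": {"STEPS_PER_EPOCH": 100}, "BACKBONE": {"FREEZE_AT": 2}}
apply_overwrites(cfg, ["TRAIN.STEPS_PER_EPOCH=500",
                       "PREPROC.TRAIN_SHORT_EDGE_SIZE=[800,1200]",
                       "BACKBONE.FREEZE_AT=0"])
```

This is only meant to show why the overrides are written as flat `KEY.SUBKEY=value` strings: they address individual leaves of the nested training configuration without editing the YAML file.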

## How to fine-tune this model

To fine-tune this model, check the [Fine-tune](https://github.com/deepdoctection/deepdoctection/blob/master/notebooks/Fine_Tune.ipynb) tutorial.