2024-05-28 09:05:19,044 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.9.19 (main, May 6 2024, 19:43:03) [GCC 11.2.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
CUDA_HOME: /mnt/petrelfs/share/cuda-11.7/
NVCC: Cuda compilation tools, release 11.7, V11.7.99
GCC: gcc (GCC) 7.3.0
PyTorch: 1.12.0+cu113
PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
TorchVision: 0.13.0+cu113
OpenCV: 4.9.0
MMCV: 1.7.0
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.7
MMDetection: 2.25.3+da84357
------------------------------------------------------------
2024-05-28 09:05:20,912 - mmdet - INFO - Distributed training: True
2024-05-28 09:05:22,520 - mmdet - INFO - Config:
model = dict(
    type='MaskRCNN',
    backbone=dict(
        type='PIIPFourBranch',
        n_points=4, deform_num_heads=16, cffn_ratio=0.25, deform_ratio=0.5,
        with_cffn=True, interact_attn_type='deform', interaction_drop_path_rate=0.4,
        branch1=dict(
            real_size=448, pretrain_img_size=224, patch_size=16, pretrain_patch_size=16,
            depth=24, embed_dim=1024, num_heads=16, mlp_ratio=4, qkv_bias=True,
            drop_path_rate=0.4, init_scale=1, with_fpn=False,
            interaction_indexes=[[0, 1], [2, 3], [4, 5], [6, 7], [8, 9], [10, 11],
                                 [12, 13], [14, 15], [16, 17], [18, 19], [20, 21], [22, 23]],
            pretrained='./pretrained/deit_3_large_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True,
                         True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28,
                         28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)),
        branch2=dict(
            real_size=672, pretrain_img_size=224, patch_size=16, pretrain_patch_size=16,
            depth=12, embed_dim=768, num_heads=12, mlp_ratio=4, qkv_bias=True,
            drop_path_rate=0.15, init_scale=1, with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5],
                                 [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_base_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)),
        branch3=dict(
            real_size=896, pretrain_img_size=224, patch_size=16, pretrain_patch_size=16,
            depth=12, embed_dim=384, num_heads=6, mlp_ratio=4, qkv_bias=True,
            drop_path_rate=0.05, init_scale=1, with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5],
                                 [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_small_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)),
        branch4=dict(
            real_size=1344, pretrain_img_size=224, patch_size=16, pretrain_patch_size=16,
            depth=12, embed_dim=192, num_heads=3, mlp_ratio=4, qkv_bias=True,
            drop_path_rate=0.05, init_scale=1, with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5],
                                 [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_tiny_patch16_224-a1311bcf.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True))),
    neck=dict(type='FPN', in_channels=[1024, 1024, 1024, 1024], out_channels=256, num_outs=5),
    rpn_head=dict(
        type='RPNHead', in_channels=256, feat_channels=256,
        anchor_generator=dict(type='AnchorGenerator', scales=[8], ratios=[0.5, 1.0, 2.0],
                              strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(type='DeltaXYWHBBoxCoder', target_means=[0.0, 0.0, 0.0, 0.0],
                        target_stds=[1.0, 1.0, 1.0, 1.0]),
        loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
    roi_head=dict(
        type='StandardRoIHead',
        bbox_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0),
            out_channels=256, featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='Shared2FCBBoxHead', in_channels=256, fc_out_channels=1024,
            roi_feat_size=7, num_classes=80,
            bbox_coder=dict(type='DeltaXYWHBBoxCoder', target_means=[0.0, 0.0, 0.0, 0.0],
                            target_stds=[0.1, 0.1, 0.2, 0.2]),
            reg_class_agnostic=False,
            loss_cls=dict(type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
        mask_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=14, sampling_ratio=0),
            out_channels=256, featmap_strides=[4, 8, 16, 32]),
        mask_head=dict(
            type='FCNMaskHead', num_convs=4, in_channels=256, conv_out_channels=256,
            num_classes=80,
            loss_mask=dict(type='CrossEntropyLoss', use_mask=True, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(type='MaxIoUAssigner', pos_iou_thr=0.7, neg_iou_thr=0.3,
                          min_pos_iou=0.3, match_low_quality=True, ignore_iof_thr=-1),
            sampler=dict(type='RandomSampler', num=256, pos_fraction=0.5,
                         neg_pos_ub=-1, add_gt_as_proposals=False),
            allowed_border=-1, pos_weight=-1, debug=False),
        rpn_proposal=dict(nms_pre=2000, max_per_img=1000,
                          nms=dict(type='nms', iou_threshold=0.7), min_bbox_size=0),
        rcnn=dict(
            assigner=dict(type='MaxIoUAssigner', pos_iou_thr=0.5, neg_iou_thr=0.5,
                          min_pos_iou=0.5, match_low_quality=True, ignore_iof_thr=-1),
            sampler=dict(type='RandomSampler', num=512, pos_fraction=0.25,
                         neg_pos_ub=-1, add_gt_as_proposals=True),
            mask_size=28, pos_weight=-1, debug=False)),
    test_cfg=dict(
        rpn=dict(nms_pre=1000, max_per_img=1000,
                 nms=dict(type='nms', iou_threshold=0.7), min_bbox_size=0),
        rcnn=dict(score_thr=0.05, nms=dict(type='nms', iou_threshold=0.5),
                  max_per_img=100, mask_thr_binary=0.5)))
dataset_type = 'CocoDataset'
data_root = 'data/coco/'
img_norm_cfg = dict(mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
    dict(type='Resize', img_scale=(1344, 806), keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
    dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
    dict(type='Pad', size_divisor=224),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1344, 806),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_train2017.json',
        img_prefix='data/coco/train2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
            dict(type='Resize', img_scale=(1344, 806), keep_ratio=True),
            dict(type='RandomFlip', flip_ratio=0.5),
            dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
        ]),
    val=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1344, 806),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1344, 806),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(type='Normalize', mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
evaluation = dict(metric=['bbox', 'segm'], interval=1, save_best=None)
optimizer = dict(
    type='AdamW', lr=0.0001, betas=(0.9, 0.999), weight_decay=0.05,
    constructor='CustomLayerDecayOptimizerConstructorMMDet',
    paramwise_cfg=dict(num_layers=24, layer_decay_rate=0.85, skip_stride=[2, 2, 2]))
optimizer_config = dict(grad_clip=None)
lr_config = dict(policy='step', warmup='linear', warmup_iters=500, warmup_ratio=0.001, step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1, deepspeed=True, max_keep_ckpts=1)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
custom_hooks = [dict(type='ToBFloat16HookMMDet', priority=49)]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
auto_scale_lr = dict(enable=False, base_batch_size=16)
deepspeed = True
deepspeed_config = 'zero_configs/adam_zero1_bf16.json'
custom_imports = dict(imports=['mmdet.mmcv_custom'], allow_failed_imports=False)
work_dir = './work_dirs/mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16'
auto_resume = True
gpu_ids = range(0, 8)
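
Editor's note: for reference, a dumped config like the one above can be re-instantiated with the standard mmcv 1.x / mmdet 2.x APIs. The sketch below is illustrative only: the config file path is hypothetical, and it assumes the custom PIIP modules declared under custom_imports (PIIPFourBranch, ToBFloat16HookMMDet, the custom optimizer constructor) are importable from mmdet.mmcv_custom. It roughly mirrors what mmdet's tools/train.py does before the log lines that follow.

# Minimal sketch (mmcv 1.x / mmdet 2.x); the config path below is hypothetical.
from mmcv import Config
from mmcv.utils import import_modules_from_strings
from mmdet.models import build_detector

cfg = Config.fromfile('configs/mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py')

# Register the custom PIIP backbone, hooks and optimizer constructor listed in
# custom_imports before building anything from the registry.
if cfg.get('custom_imports', None):
    import_modules_from_strings(**cfg['custom_imports'])

# Build the detector; in this dump train_cfg/test_cfg are nested inside cfg.model,
# so the extra arguments simply resolve to None, as in mmdet's tools/train.py.
model = build_detector(cfg.model,
                       train_cfg=cfg.get('train_cfg'),
                       test_cfg=cfg.get('test_cfg'))
model.init_weights()  # this call is what the initialization report below refers to
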
2024-05-28 09:05:27,580 - mmdet - INFO - Set random seed to 1300719596, deterministic: False
2024-05-28 09:05:35,527 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-28 09:05:37,326 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-28 09:05:38,612 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-28 09:05:38,932 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-28 09:07:18,461 - mmdet - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2024-05-28 09:07:19,072 - mmdet - INFO - initialize RPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2024-05-28 09:07:19,145 - mmdet - INFO - initialize Shared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'distribution': 'uniform', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
Name of parameter - Initialization information
(Per-parameter initialization report, abridged: backbone.w1 through backbone.w4 and every listed branch parameter, i.e. pos_embed, patch_embed.proj weight/bias, and each block's gamma_1, gamma_2, norm1/norm2, attn.qkv, attn.proj, mlp.fc1 and mlp.fc2 weights and biases, is printed with its shape followed by the note "The value is the same before and after calling `init_weights` of MaskRCNN", meaning `init_weights` leaves these values unchanged.)
backbone.branch2.blocks.3.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.5.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.7.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.9.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch2.blocks.11.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.0.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.2.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.4.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.6.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.8.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.9.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.11.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.1.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.3.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.5.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.7.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.8.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
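The block shapes logged for backbone.branch4.blocks.* are the usual ones for a DeiT-style block with a 384-channel embedding: attn.qkv stacks the query, key and value projections into 3 x 384 output rows, mlp.fc1 and mlp.fc2 expand and reduce by a factor of 4, and gamma_1 / gamma_2 are per-channel LayerScale vectors. A minimal sketch that reproduces just those parameter shapes (hyper-parameters assumed, not the project's actual block class):

import torch
import torch.nn as nn

class BlockShapes(nn.Module):
    # Illustrative stand-in for backbone.branch4.blocks.N (assumed hyper-parameters).
    def __init__(self, dim=384, mlp_ratio=4, init_scale=1.0):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.qkv = nn.Linear(dim, 3 * dim)            # weight [1152, 384], bias [1152]
        self.proj = nn.Linear(dim, dim)               # weight [384, 384]
        self.norm2 = nn.LayerNorm(dim)
        self.fc1 = nn.Linear(dim, mlp_ratio * dim)    # weight [1536, 384]
        self.fc2 = nn.Linear(mlp_ratio * dim, dim)    # weight [384, 1536]
        self.gamma_1 = nn.Parameter(init_scale * torch.ones(dim))   # LayerScale, attention branch
        self.gamma_2 = nn.Parameter(init_scale * torch.ones(dim))   # LayerScale, MLP branch

for name, p in BlockShapes().named_parameters():
    print(name, tuple(p.shape))

Running it prints, among others, qkv.weight (1152, 384) and fc1.weight (1536, 384), matching the sizes reported above.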
backbone.branch4.blocks.10.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN 
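The ffn.* entries follow a ConvFFN layout: a linear layer down to a quarter of the branch width (768 -> 192 here, 1024 -> 256 for the wider branch), a 3x3 depthwise convolution applied to the tokens reshaped as a feature map (hence the [hidden, 1, 3, 3] weight), and a linear layer back up. A small sketch under those assumptions, not the project's implementation:

import torch
import torch.nn as nn

class ConvFFN(nn.Module):
    # Illustrative ConvFFN matching the logged shapes for dim=768, ratio=0.25.
    def __init__(self, dim=768, ratio=0.25):
        super().__init__()
        hidden = int(dim * ratio)                                  # 192
        self.fc1 = nn.Linear(dim, hidden)                          # weight [192, 768]
        self.dwconv = nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden)  # weight [192, 1, 3, 3]
        self.act = nn.GELU()
        self.fc2 = nn.Linear(hidden, dim)                          # weight [768, 192]

    def forward(self, x, H, W):
        # x: [B, N, dim] token sequence with N == H * W
        x = self.fc1(x)
        B, N, C = x.shape
        x = x.transpose(1, 2).reshape(B, C, H, W)                  # tokens -> feature map
        x = self.dwconv(x).flatten(2).transpose(1, 2)              # back to [B, N, hidden]
        return self.fc2(self.act(x))

x = torch.randn(2, 14 * 14, 768)
print(ConvFFN()(x, 14, 14).shape)    # torch.Size([2, 196, 768])

The groups=hidden argument is what makes the convolution depthwise and gives the [192, 1, 3, 3] weight shape seen in the log.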
backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.weight - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
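Before the dump moves on to interactions.1, which repeats the same structure, note that the attn.* shapes of every injector are consistent with deformable attention using 16 heads, 4 sampling points and one feature level, a value/output projection shrunk to half the branch width, and a ConvFFN hidden width of a quarter of it; these numbers are inferred from the logged sizes rather than read out of the code. A quick arithmetic check:

# Shape bookkeeping only; the head/point/level counts below are assumptions that
# reproduce the logged sizes, not values read out of the implementation.
heads, points, levels = 16, 4, 1
deform_ratio, cffn_ratio = 0.5, 0.25

for dim in (1024, 768, 384):
    offsets = heads * levels * points * 2        # 128 -> sampling_offsets [128, dim]
    weights = heads * levels * points            # 64  -> attention_weights [64, dim]
    value = int(dim * deform_ratio)              # 512 / 384 / 192 -> value_proj [value, dim]
    hidden = int(dim * cffn_ratio)               # 256 / 192 / 96  -> ffn.fc1 [hidden, dim]
    print(dim, offsets, weights, value, hidden)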
backbone.interactions.1.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
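The ca_gamma and cffn_gamma vectors logged for each injector look like per-channel gates on the two residual additions (the cross-attention output and the ConvFFN output). The sketch below shows that gating pattern with a plain multi-head attention standing in for the deformable attention; it is an assumption about how these parameters are used, not the project's code.

import torch
import torch.nn as nn

class GatedInjector(nn.Module):
    # Hypothetical injector: ca_gamma / cffn_gamma scale the injected features.
    def __init__(self, dim=768, heads=8):
        super().__init__()
        self.query_norm = nn.LayerNorm(dim)
        self.feat_norm = nn.LayerNorm(dim)
        self.ffn_norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)  # stand-in for deformable attention
        self.ffn = nn.Sequential(nn.Linear(dim, dim // 4), nn.GELU(), nn.Linear(dim // 4, dim))
        self.ca_gamma = nn.Parameter(torch.zeros(dim))
        self.cffn_gamma = nn.Parameter(torch.zeros(dim))

    def forward(self, query, feat):
        attn_out, _ = self.attn(self.query_norm(query), self.feat_norm(feat), self.feat_norm(feat))
        query = query + self.ca_gamma * attn_out
        return query + self.cffn_gamma * self.ffn(self.ffn_norm(query))

q, f = torch.randn(2, 196, 768), torch.randn(2, 49, 768)
print(GatedInjector()(q, f).shape)   # torch.Size([2, 196, 768])

Initializing the gates at zero would make each interaction start as an identity mapping, a common choice for adapter-style modules; whether this run does so cannot be read from the log alone.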
backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` 
of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ca_gamma - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
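Taken together, the *_proj shapes imply branch widths of 1024, 768, 384 and 384 for branches 1 through 4, with each interaction unit coupling one adjacent pair of branches through two linear projections (named branch2to1_proj / branch1to2_proj relative to that pair). A small sketch that reproduces the projection shapes, with the widths inferred from the log:

import torch.nn as nn

# Branch widths inferred from the projection shapes in this log (branches 1-4).
widths = {1: 1024, 2: 768, 3: 384, 4: 384}

for a, b in [(1, 2), (2, 3), (3, 4)]:
    to_a = nn.Linear(widths[b], widths[a])   # e.g. units_12 branch2to1_proj.weight: [1024, 768]
    to_b = nn.Linear(widths[a], widths[b])   # e.g. units_12 branch1to2_proj.weight: [768, 1024]
    print(f'units_{a}{b}:', tuple(to_a.weight.shape), tuple(to_b.weight.shape))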
backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - 
torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 
768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and 
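The shape pattern above can be reproduced with a small stand-alone sketch. This is not the PIIP/mmdet implementation, only a minimal torch module (hypothetical class name InjectorShapeSketch) whose parameter layout mirrors the logged names and sizes; the head count (16), sampling-point count (4), value-projection ratio (0.5) and CFFN ratio (0.25) used below are inferred from the logged shapes (128 = 16 * 4 * 2 offset coordinates, 64 = 16 * 4 attention weights, value_proj = D/2, fc1 = D/4), so treat them as assumptions.

import torch
import torch.nn as nn

# Hypothetical sketch: mirrors the parameter layout reported in the log for one
# cross-branch injector so the shapes can be checked. Not the actual PIIP code;
# the ratios below are inferred from the logged sizes.
class InjectorShapeSketch(nn.Module):
    def __init__(self, dim, num_heads=16, num_points=4, deform_ratio=0.5, cffn_ratio=0.25):
        super().__init__()
        value_dim = int(dim * deform_ratio)    # e.g. 192 for dim=384, 512 for dim=1024
        hidden_dim = int(dim * cffn_ratio)     # e.g. 96 for dim=384, 256 for dim=1024
        self.ca_gamma = nn.Parameter(torch.ones(dim))     # residual scale for cross-attention
        self.cffn_gamma = nn.Parameter(torch.ones(dim))   # residual scale for the conv-FFN
        self.query_norm = nn.LayerNorm(dim)
        self.feat_norm = nn.LayerNorm(dim)
        # Deformable cross-attention projections (shapes only, no sampling logic):
        # 128 = num_heads * num_points * 2 offsets, 64 = num_heads * num_points weights.
        self.attn = nn.ModuleDict(dict(
            sampling_offsets=nn.Linear(dim, num_heads * num_points * 2),
            attention_weights=nn.Linear(dim, num_heads * num_points),
            value_proj=nn.Linear(dim, value_dim),
            output_proj=nn.Linear(value_dim, dim),
        ))
        # Convolutional FFN: fc1 -> depth-wise 3x3 conv -> fc2 (the log nests the
        # conv one level deeper, as ffn.dwconv.dwconv).
        self.ffn = nn.ModuleDict(dict(
            fc1=nn.Linear(dim, hidden_dim),
            dwconv=nn.Conv2d(hidden_dim, hidden_dim, 3, padding=1, groups=hidden_dim),
            fc2=nn.Linear(hidden_dim, dim),
        ))
        self.ffn_norm = nn.LayerNorm(dim)

for dim in (1024, 768, 384):  # the three branch widths seen in the log
    for name, param in InjectorShapeSketch(dim).named_parameters():
        print(f"{name} - {tuple(param.shape)}")

For dim=384 this prints, for example, attn.sampling_offsets.weight - (128, 384) and ffn.fc1.weight - (96, 384), matching the entries above.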
backbone.interactions.3.interaction_units_12, interaction_units_23 and interaction_units_34: all parameters of branch2to1_proj, branch2to1_injector, branch1to2_proj and branch1to2_injector (same layout and shapes as the corresponding units of interactions.2): The value is the same before and after calling `init_weights` of MaskRCNN
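The repeated message itself records that the `init_weights` call left these parameter values unchanged, typically because the interaction modules are initialized in code rather than loaded from a pretrained checkpoint. A generic, stand-alone way to perform the same kind of before/after comparison on any module is sketched below; report_init_changes and init_fn are placeholder names for illustration, not part of mmcv.

import torch
import torch.nn as nn

# Generic sketch (not mmcv's internal mechanism): snapshot a module's parameters,
# run an arbitrary init routine, and report which values actually changed.
def report_init_changes(model: nn.Module, init_fn) -> None:
    before = {name: p.detach().clone() for name, p in model.named_parameters()}
    init_fn(model)
    for name, p in model.named_parameters():
        status = "the value is the same" if torch.equal(before[name], p.detach()) else "re-initialised"
        print(f"{name} - {tuple(p.shape)}: {status}")

# Example: zero-init only the Linear biases of a toy module.
def init_fn(m: nn.Module) -> None:
    for mod in m.modules():
        if isinstance(mod, nn.Linear):
            nn.init.zeros_(mod.bias)

report_init_changes(nn.Sequential(nn.Linear(8, 4), nn.LayerNorm(4)), init_fn)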
backbone.interactions.4.interaction_units_12: all parameters of branch2to1_proj, branch2to1_injector, branch1to2_proj and branch1to2_injector: The value is the same before and after calling `init_weights` of MaskRCNN
backbone.interactions.4.interaction_units_23: branch2to1_proj, branch2to1_injector (all parameters), branch1to2_proj, and branch1to2_injector (ca_gamma, cffn_gamma, query_norm, feat_norm.weight): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling
`init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
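The attention tensors logged for each branch*to*_injector.attn follow one pattern across branches: for an embedding width d, value_proj maps d to d/2 and output_proj maps back, while sampling_offsets and attention_weights always have fan-out 128 and 64, consistent with n_heads * n_points * 2 and n_heads * n_points over a single feature level (16 heads and 4 points is one factorization that fits these sizes; this is inferred from the shapes, not read from the code). A shape-only sketch, not the actual deformable-attention implementation:

import torch.nn as nn

class InjectorAttnShapes(nn.Module):
    # Reproduces only the parameter shapes seen in the log (e.g. d=1024: [128, 1024],
    # [64, 1024], [512, 1024], [1024, 512]); the forward pass is intentionally omitted.
    def __init__(self, d, n_heads=16, n_points=4, ratio=0.5):
        super().__init__()
        self.sampling_offsets = nn.Linear(d, n_heads * n_points * 2)   # weight [128, d]
        self.attention_weights = nn.Linear(d, n_heads * n_points)      # weight [64, d]
        self.value_proj = nn.Linear(d, int(d * ratio))                 # weight [d/2, d]
        self.output_proj = nn.Linear(int(d * ratio), d)                # weight [d, d/2]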
backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 
512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight 
- torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
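The ffn tensors repeat a second pattern: fc1 compresses the width d by a factor of four (1024 -> 256, 768 -> 192, 384 -> 96), a 3x3 depthwise convolution (weight [hidden, 1, 3, 3]) mixes tokens spatially, and fc2 expands back to d; query_norm, feat_norm and ffn_norm each carry a weight and bias vector of width d, i.e. LayerNorm-style affine parameters. A sketch inferred from the logged sizes only, not the project's code:

import torch.nn as nn

class ConvFFNShapes(nn.Module):
    # Matches the ffn.* shapes in the log; hidden = d // 4 for every branch width.
    def __init__(self, d, cffn_ratio=0.25):
        super().__init__()
        hidden = int(d * cffn_ratio)
        self.fc1 = nn.Linear(d, hidden)                                      # [hidden, d]
        self.dwconv = nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden) # [hidden, 1, 3, 3]
        self.fc2 = nn.Linear(hidden, d)                                      # [d, hidden]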
backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN 
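Across the three interaction units the projection shapes pair up as 1024 <-> 768 (units_12), 768 <-> 384 (units_23) and 384 <-> 384 (units_34), which suggests branch widths of 1024, 768, 384 and 384 with a pair of linear maps between each adjacent pair of branches; this reading is taken from the logged sizes alone. A sketch of that wiring, with hypothetical names:

import torch.nn as nn

branch_dims = [1024, 768, 384, 384]            # widths read off the logged weight shapes
proj_pairs = nn.ModuleDict()
for i, (d_a, d_b) in enumerate(zip(branch_dims, branch_dims[1:]), start=1):
    proj_pairs[f"units_{i}{i + 1}"] = nn.ModuleDict({
        "branch2to1_proj": nn.Linear(d_b, d_a),  # e.g. 768 -> 1024, weight [1024, 768]
        "branch1to2_proj": nn.Linear(d_a, d_b),  # e.g. 1024 -> 768, weight [768, 1024]
    })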
backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
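The ca_gamma and cffn_gamma vectors (one value per channel) look like layer-scale gates on the injector's two residual updates, the cross-attention output and the ConvFFN output; the forward below is only a guess implied by the names and shapes, not the project's code, and the zero initialization is an assumption:

import torch
import torch.nn as nn

class GammaGatedResidual(nn.Module):
    def __init__(self, d, init_value=0.0):
        super().__init__()
        self.ca_gamma = nn.Parameter(init_value * torch.ones(d))
        self.cffn_gamma = nn.Parameter(init_value * torch.ones(d))

    def forward(self, x, attn_out, ffn_out):
        x = x + self.ca_gamma * attn_out       # gate the cross-attention update
        return x + self.cffn_gamma * ffn_out   # gate the ConvFFN update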
backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN 
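Listings like this run to hundreds of near-identical lines per interaction block. A small helper (a convenience sketch, not part of mmdet or mmcv; the function name is made up) can condense such output by grouping parameters under a common prefix:

from collections import defaultdict

def summarize_named_parameters(model, depth=4):
    # e.g. depth=4 groups by "backbone.interactions.6.interaction_units_23"-style prefixes
    groups = defaultdict(lambda: [0, 0])       # prefix -> [tensor count, element count]
    for name, param in model.named_parameters():
        prefix = ".".join(name.split(".")[:depth])
        groups[prefix][0] += 1
        groups[prefix][1] += param.numel()
    for prefix, (count, numel) in sorted(groups.items()):
        print(f"{prefix}: {count} tensors, {numel / 1e6:.2f}M params")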
backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]) through backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): every parameter in this block carries the same message, "The value is the same before and after calling `init_weights` of MaskRCNN", i.e. none of these parameters were modified by `init_weights`. The block covers the remainder of backbone.interactions.6, all of backbone.interactions.7 and backbone.interactions.8, and the start of backbone.interactions.9. Each interaction contains interaction_units_12, interaction_units_23 and interaction_units_34 with an identical parameter layout; only the query dimension d of the two injectors differs:

    interaction_units_12 (d = 1024 and d = 768):
        branch2to1_proj.weight - torch.Size([1024, 768]), branch2to1_proj.bias - torch.Size([1024]); branch2to1_injector with d = 1024
        branch1to2_proj.weight - torch.Size([768, 1024]), branch1to2_proj.bias - torch.Size([768]); branch1to2_injector with d = 768
    interaction_units_23 (d = 768 and d = 384):
        branch2to1_proj.weight - torch.Size([768, 384]), branch2to1_proj.bias - torch.Size([768]); branch2to1_injector with d = 768
        branch1to2_proj.weight - torch.Size([384, 768]), branch1to2_proj.bias - torch.Size([384]); branch1to2_injector with d = 384
    interaction_units_34 (d = 384 on both sides):
        branch2to1_proj.weight - torch.Size([384, 384]), branch2to1_proj.bias - torch.Size([384]); branch2to1_injector with d = 384
        branch1to2_proj.weight - torch.Size([384, 384]), branch1to2_proj.bias - torch.Size([384]); branch1to2_injector with d = 384

    Each injector with query dimension d holds:
        ca_gamma, cffn_gamma, query_norm.weight/bias, feat_norm.weight/bias, ffn_norm.weight/bias - torch.Size([d])
        attn.sampling_offsets.weight - torch.Size([128, d]), attn.sampling_offsets.bias - torch.Size([128])
        attn.attention_weights.weight - torch.Size([64, d]), attn.attention_weights.bias - torch.Size([64])
        attn.value_proj.weight - torch.Size([d/2, d]), attn.value_proj.bias - torch.Size([d/2])
        attn.output_proj.weight - torch.Size([d, d/2]), attn.output_proj.bias - torch.Size([d])
        ffn.fc1.weight - torch.Size([d/4, d]), ffn.fc1.bias - torch.Size([d/4])
        ffn.dwconv.dwconv.weight - torch.Size([d/4, 1, 3, 3]), ffn.dwconv.dwconv.bias - torch.Size([d/4])
        ffn.fc2.weight - torch.Size([d, d/4]), ffn.fc2.bias - torch.Size([d])
    with d = 1024, 768 or 384 as listed above.

    Within this block, backbone.interactions.6.interaction_units_23.branch2to1_injector resumes at ffn.dwconv.dwconv.bias and runs through ffn_norm.bias; backbone.interactions.6.interaction_units_34 and all units of backbone.interactions.7 and backbone.interactions.8 are reported in full; backbone.interactions.9.interaction_units_12 is reported through branch2to1_proj, branch2to1_injector, branch1to2_proj, and then branch1to2_injector.ca_gamma, cffn_gamma and query_norm.weight, with the per-parameter listing continuing below.
backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 
384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - 
torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_proj.weight - 
torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN 
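The injector shapes recorded in these entries can be reproduced from the deformable-attention hyper-parameters. The sketch below is a reader-side sanity check, not code from the PIIP repository: it assumes 16 attention heads, 4 sampling points, a single feature level, a value-projection ratio of 0.5 and an FFN ratio of 0.25, and all variable names are illustrative.

num_heads, num_points, num_levels = 16, 4, 1       # assumed deformable-attention settings
deform_ratio, cffn_ratio = 0.5, 0.25                # assumed value-projection / FFN ratios

for embed_dim in (1024, 768, 384):                  # branch widths that appear in the log above
    sampling_offsets_out = num_heads * num_levels * num_points * 2   # an (x, y) offset per point
    attention_weights_out = num_heads * num_levels * num_points
    value_dim = int(embed_dim * deform_ratio)       # value_proj / output_proj width
    ffn_hidden = int(embed_dim * cffn_ratio)        # ffn.fc1 / dwconv / ffn.fc2 hidden width
    print(embed_dim, sampling_offsets_out, attention_weights_out, value_dim, ffn_hidden)

Running this prints "1024 128 64 512 256", "768 128 64 384 192" and "384 128 64 192 96", matching the torch.Size values logged above for sampling_offsets, attention_weights, value_proj/output_proj and the injector FFN layers.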
backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
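The ffn.* sizes in these injectors (fc1 of shape [256, 1024], [192, 768] or [96, 384], a 3x3 depthwise conv over the hidden channels, then fc2 back to the branch width) are consistent with a convolutional FFN whose hidden width is the branch dimension times a 0.25 ratio. The sketch below is a minimal illustration of that layout under that assumption; make_cffn and the module names are invented for the example and are not the repository's own ConvFFN class.

# Hedged illustration only: a ConvFFN-style block whose parameter shapes match
# the log (fc1: [dim*0.25, dim], dwconv: [hidden, 1, 3, 3], fc2: [dim, dim*0.25]).
import torch.nn as nn

def make_cffn(dim, cffn_ratio=0.25):
    hidden = int(dim * cffn_ratio)
    return nn.ModuleDict({
        "fc1": nn.Linear(dim, hidden),
        # depthwise 3x3 conv over the hidden channels
        "dwconv": nn.Conv2d(hidden, hidden, 3, padding=1, groups=hidden),
        "fc2": nn.Linear(hidden, dim),
    })

ffn = make_cffn(768)
assert tuple(ffn["fc1"].weight.shape) == (192, 768)
assert tuple(ffn["dwconv"].weight.shape) == (192, 1, 3, 3)
assert tuple(ffn["fc2"].weight.shape) == (768, 192)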
backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
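The branch2to1_proj and branch1to2_proj weights are plain linear maps between the embedding widths of the two interacting branches: [1024, 768] and [768, 1024] for interaction_units_12, [768, 384] and [384, 768] for interaction_units_23, and [384, 384] for interaction_units_34, which implies branch widths of 1024, 768, 384 and 384 here. The sketch below only restates that inference; branch_dims and the dict layout are illustrative, not code from the repository.

# Illustrative only: cross-branch channel projections implied by the logged shapes.
import torch.nn as nn

branch_dims = [1024, 768, 384, 384]  # widths inferred from the proj shapes above
projs = {
    # maps branch (i+2) features into the width of branch (i+1)
    f"units_{i + 1}{i + 2}": nn.Linear(branch_dims[i + 1], branch_dims[i])
    for i in range(3)
}
assert tuple(projs["units_12"].weight.shape) == (1024, 768)
assert tuple(projs["units_34"].weight.shape) == (384, 384)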
backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.0.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.0.weight - torch.Size([1024, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.bias - torch.Size([1024]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.0.weight - torch.Size([1024, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.0.weight - torch.Size([1024, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.0.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.1.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.2.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.3.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling 
`init_weights` of MaskRCNN neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_conv.bias - torch.Size([256]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.bias - torch.Size([3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.bias - torch.Size([12]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.weight - torch.Size([81, 1024]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.bias - torch.Size([81]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_reg.weight - torch.Size([320, 1024]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.fc_reg.bias - torch.Size([320]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.weight - torch.Size([1024, 1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.mask_head.convs.0.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.1.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.2.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.3.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.upsample.weight - torch.Size([256, 256, 2, 2]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.upsample.bias - torch.Size([256]): Initialized by user-defined `init_weights` 
in FCNMaskHead roi_head.mask_head.conv_logits.weight - torch.Size([80, 256, 1, 1]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.bias - torch.Size([80]): Initialized by user-defined `init_weights` in FCNMaskHead 2024-05-28 09:07:40,581 - mmdet - INFO - {'num_layers': 24, 'layer_decay_rate': 0.85, 'skip_stride': [2, 2, 2]} 2024-05-28 09:07:40,581 - mmdet - INFO - Build LayerDecayOptimizerConstructor 0.850000 - 26 2024-05-28 09:07:40,598 - mmdet - INFO - Param groups = { "layer_25_decay": { "param_names": [ "backbone.w1", "backbone.w2", "backbone.w3", "backbone.w4", "backbone.interactions.0.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_34.branch2to1_proj.weight", 
"backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch1to2_proj.weight", 
"backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch2to1_proj.weight", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch1to2_proj.weight", 
"backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch2to1_proj.weight", 
"backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_34.branch1to2_proj.weight", 
"backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_34.branch2to1_proj.weight", 
"backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch1to2_proj.weight", 
"backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch2to1_proj.weight", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch1to2_proj.weight", 
"backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch2to1_proj.weight", 
"backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_34.branch1to2_proj.weight", 
"backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight", 
"backbone.interactions.10.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", 
"backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.merge_branch1.0.weight", "backbone.merge_branch1.3.weight", "backbone.merge_branch2.0.weight", "backbone.merge_branch2.3.weight", "backbone.merge_branch3.0.weight", "backbone.merge_branch3.3.weight", "backbone.merge_branch4.0.weight", "backbone.merge_branch4.3.weight", "backbone.fpn1.0.weight", "backbone.fpn1.3.weight", "backbone.fpn2.0.weight", "neck.lateral_convs.0.conv.weight", "neck.lateral_convs.1.conv.weight", "neck.lateral_convs.2.conv.weight", "neck.lateral_convs.3.conv.weight", "neck.fpn_convs.0.conv.weight", "neck.fpn_convs.1.conv.weight", "neck.fpn_convs.2.conv.weight", "neck.fpn_convs.3.conv.weight", "rpn_head.rpn_conv.weight", "rpn_head.rpn_cls.weight", "rpn_head.rpn_reg.weight", "roi_head.bbox_head.fc_cls.weight", "roi_head.bbox_head.fc_reg.weight", "roi_head.bbox_head.shared_fcs.0.weight", "roi_head.bbox_head.shared_fcs.1.weight", "roi_head.mask_head.convs.0.conv.weight", "roi_head.mask_head.convs.1.conv.weight", "roi_head.mask_head.convs.2.conv.weight", "roi_head.mask_head.convs.3.conv.weight", "roi_head.mask_head.upsample.weight", "roi_head.mask_head.conv_logits.weight" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.05 }, "layer_0_decay": { "param_names": [ "backbone.branch1.pos_embed", "backbone.branch1.patch_embed.proj.weight", "backbone.branch2.pos_embed", 
"backbone.branch2.patch_embed.proj.weight", "backbone.branch3.pos_embed", "backbone.branch3.patch_embed.proj.weight", "backbone.branch4.pos_embed", "backbone.branch4.patch_embed.proj.weight" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.05 }, "layer_0_no_decay": { "param_names": [ "backbone.branch1.patch_embed.proj.bias", "backbone.branch2.patch_embed.proj.bias", "backbone.branch3.patch_embed.proj.bias", "backbone.branch4.patch_embed.proj.bias" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.0 }, "layer_1_no_decay": { "param_names": [ "backbone.branch1.blocks.0.gamma_1", "backbone.branch1.blocks.0.gamma_2", "backbone.branch1.blocks.0.norm1.weight", "backbone.branch1.blocks.0.norm1.bias", "backbone.branch1.blocks.0.attn.qkv.bias", "backbone.branch1.blocks.0.attn.proj.bias", "backbone.branch1.blocks.0.norm2.weight", "backbone.branch1.blocks.0.norm2.bias", "backbone.branch1.blocks.0.mlp.fc1.bias", "backbone.branch1.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.0 }, "layer_1_decay": { "param_names": [ "backbone.branch1.blocks.0.attn.qkv.weight", "backbone.branch1.blocks.0.attn.proj.weight", "backbone.branch1.blocks.0.mlp.fc1.weight", "backbone.branch1.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.05 }, "layer_2_no_decay": { "param_names": [ "backbone.branch1.blocks.1.gamma_1", "backbone.branch1.blocks.1.gamma_2", "backbone.branch1.blocks.1.norm1.weight", "backbone.branch1.blocks.1.norm1.bias", "backbone.branch1.blocks.1.attn.qkv.bias", "backbone.branch1.blocks.1.attn.proj.bias", "backbone.branch1.blocks.1.norm2.weight", "backbone.branch1.blocks.1.norm2.bias", "backbone.branch1.blocks.1.mlp.fc1.bias", "backbone.branch1.blocks.1.mlp.fc2.bias", "backbone.branch2.blocks.0.gamma_1", "backbone.branch2.blocks.0.gamma_2", "backbone.branch2.blocks.0.norm1.weight", "backbone.branch2.blocks.0.norm1.bias", "backbone.branch2.blocks.0.attn.qkv.bias", "backbone.branch2.blocks.0.attn.proj.bias", "backbone.branch2.blocks.0.norm2.weight", "backbone.branch2.blocks.0.norm2.bias", "backbone.branch2.blocks.0.mlp.fc1.bias", "backbone.branch2.blocks.0.mlp.fc2.bias", "backbone.branch3.blocks.0.gamma_1", "backbone.branch3.blocks.0.gamma_2", "backbone.branch3.blocks.0.norm1.weight", "backbone.branch3.blocks.0.norm1.bias", "backbone.branch3.blocks.0.attn.qkv.bias", "backbone.branch3.blocks.0.attn.proj.bias", "backbone.branch3.blocks.0.norm2.weight", "backbone.branch3.blocks.0.norm2.bias", "backbone.branch3.blocks.0.mlp.fc1.bias", "backbone.branch3.blocks.0.mlp.fc2.bias", "backbone.branch4.blocks.0.gamma_1", "backbone.branch4.blocks.0.gamma_2", "backbone.branch4.blocks.0.norm1.weight", "backbone.branch4.blocks.0.norm1.bias", "backbone.branch4.blocks.0.attn.qkv.bias", "backbone.branch4.blocks.0.attn.proj.bias", "backbone.branch4.blocks.0.norm2.weight", "backbone.branch4.blocks.0.norm2.bias", "backbone.branch4.blocks.0.mlp.fc1.bias", "backbone.branch4.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.0 }, "layer_2_decay": { "param_names": [ "backbone.branch1.blocks.1.attn.qkv.weight", "backbone.branch1.blocks.1.attn.proj.weight", "backbone.branch1.blocks.1.mlp.fc1.weight", "backbone.branch1.blocks.1.mlp.fc2.weight", "backbone.branch2.blocks.0.attn.qkv.weight", "backbone.branch2.blocks.0.attn.proj.weight", "backbone.branch2.blocks.0.mlp.fc1.weight", 
"backbone.branch2.blocks.0.mlp.fc2.weight", "backbone.branch3.blocks.0.attn.qkv.weight", "backbone.branch3.blocks.0.attn.proj.weight", "backbone.branch3.blocks.0.mlp.fc1.weight", "backbone.branch3.blocks.0.mlp.fc2.weight", "backbone.branch4.blocks.0.attn.qkv.weight", "backbone.branch4.blocks.0.attn.proj.weight", "backbone.branch4.blocks.0.mlp.fc1.weight", "backbone.branch4.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.05 }, "layer_3_no_decay": { "param_names": [ "backbone.branch1.blocks.2.gamma_1", "backbone.branch1.blocks.2.gamma_2", "backbone.branch1.blocks.2.norm1.weight", "backbone.branch1.blocks.2.norm1.bias", "backbone.branch1.blocks.2.attn.qkv.bias", "backbone.branch1.blocks.2.attn.proj.bias", "backbone.branch1.blocks.2.norm2.weight", "backbone.branch1.blocks.2.norm2.bias", "backbone.branch1.blocks.2.mlp.fc1.bias", "backbone.branch1.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.0 }, "layer_3_decay": { "param_names": [ "backbone.branch1.blocks.2.attn.qkv.weight", "backbone.branch1.blocks.2.attn.proj.weight", "backbone.branch1.blocks.2.mlp.fc1.weight", "backbone.branch1.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.05 }, "layer_4_no_decay": { "param_names": [ "backbone.branch1.blocks.3.gamma_1", "backbone.branch1.blocks.3.gamma_2", "backbone.branch1.blocks.3.norm1.weight", "backbone.branch1.blocks.3.norm1.bias", "backbone.branch1.blocks.3.attn.qkv.bias", "backbone.branch1.blocks.3.attn.proj.bias", "backbone.branch1.blocks.3.norm2.weight", "backbone.branch1.blocks.3.norm2.bias", "backbone.branch1.blocks.3.mlp.fc1.bias", "backbone.branch1.blocks.3.mlp.fc2.bias", "backbone.branch2.blocks.1.gamma_1", "backbone.branch2.blocks.1.gamma_2", "backbone.branch2.blocks.1.norm1.weight", "backbone.branch2.blocks.1.norm1.bias", "backbone.branch2.blocks.1.attn.qkv.bias", "backbone.branch2.blocks.1.attn.proj.bias", "backbone.branch2.blocks.1.norm2.weight", "backbone.branch2.blocks.1.norm2.bias", "backbone.branch2.blocks.1.mlp.fc1.bias", "backbone.branch2.blocks.1.mlp.fc2.bias", "backbone.branch3.blocks.1.gamma_1", "backbone.branch3.blocks.1.gamma_2", "backbone.branch3.blocks.1.norm1.weight", "backbone.branch3.blocks.1.norm1.bias", "backbone.branch3.blocks.1.attn.qkv.bias", "backbone.branch3.blocks.1.attn.proj.bias", "backbone.branch3.blocks.1.norm2.weight", "backbone.branch3.blocks.1.norm2.bias", "backbone.branch3.blocks.1.mlp.fc1.bias", "backbone.branch3.blocks.1.mlp.fc2.bias", "backbone.branch4.blocks.1.gamma_1", "backbone.branch4.blocks.1.gamma_2", "backbone.branch4.blocks.1.norm1.weight", "backbone.branch4.blocks.1.norm1.bias", "backbone.branch4.blocks.1.attn.qkv.bias", "backbone.branch4.blocks.1.attn.proj.bias", "backbone.branch4.blocks.1.norm2.weight", "backbone.branch4.blocks.1.norm2.bias", "backbone.branch4.blocks.1.mlp.fc1.bias", "backbone.branch4.blocks.1.mlp.fc2.bias" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.0 }, "layer_4_decay": { "param_names": [ "backbone.branch1.blocks.3.attn.qkv.weight", "backbone.branch1.blocks.3.attn.proj.weight", "backbone.branch1.blocks.3.mlp.fc1.weight", "backbone.branch1.blocks.3.mlp.fc2.weight", "backbone.branch2.blocks.1.attn.qkv.weight", "backbone.branch2.blocks.1.attn.proj.weight", "backbone.branch2.blocks.1.mlp.fc1.weight", "backbone.branch2.blocks.1.mlp.fc2.weight", "backbone.branch3.blocks.1.attn.qkv.weight", 
"backbone.branch3.blocks.1.attn.proj.weight", "backbone.branch3.blocks.1.mlp.fc1.weight", "backbone.branch3.blocks.1.mlp.fc2.weight", "backbone.branch4.blocks.1.attn.qkv.weight", "backbone.branch4.blocks.1.attn.proj.weight", "backbone.branch4.blocks.1.mlp.fc1.weight", "backbone.branch4.blocks.1.mlp.fc2.weight" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.05 }, "layer_5_no_decay": { "param_names": [ "backbone.branch1.blocks.4.gamma_1", "backbone.branch1.blocks.4.gamma_2", "backbone.branch1.blocks.4.norm1.weight", "backbone.branch1.blocks.4.norm1.bias", "backbone.branch1.blocks.4.attn.qkv.bias", "backbone.branch1.blocks.4.attn.proj.bias", "backbone.branch1.blocks.4.norm2.weight", "backbone.branch1.blocks.4.norm2.bias", "backbone.branch1.blocks.4.mlp.fc1.bias", "backbone.branch1.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.0 }, "layer_5_decay": { "param_names": [ "backbone.branch1.blocks.4.attn.qkv.weight", "backbone.branch1.blocks.4.attn.proj.weight", "backbone.branch1.blocks.4.mlp.fc1.weight", "backbone.branch1.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.05 }, "layer_6_no_decay": { "param_names": [ "backbone.branch1.blocks.5.gamma_1", "backbone.branch1.blocks.5.gamma_2", "backbone.branch1.blocks.5.norm1.weight", "backbone.branch1.blocks.5.norm1.bias", "backbone.branch1.blocks.5.attn.qkv.bias", "backbone.branch1.blocks.5.attn.proj.bias", "backbone.branch1.blocks.5.norm2.weight", "backbone.branch1.blocks.5.norm2.bias", "backbone.branch1.blocks.5.mlp.fc1.bias", "backbone.branch1.blocks.5.mlp.fc2.bias", "backbone.branch2.blocks.2.gamma_1", "backbone.branch2.blocks.2.gamma_2", "backbone.branch2.blocks.2.norm1.weight", "backbone.branch2.blocks.2.norm1.bias", "backbone.branch2.blocks.2.attn.qkv.bias", "backbone.branch2.blocks.2.attn.proj.bias", "backbone.branch2.blocks.2.norm2.weight", "backbone.branch2.blocks.2.norm2.bias", "backbone.branch2.blocks.2.mlp.fc1.bias", "backbone.branch2.blocks.2.mlp.fc2.bias", "backbone.branch3.blocks.2.gamma_1", "backbone.branch3.blocks.2.gamma_2", "backbone.branch3.blocks.2.norm1.weight", "backbone.branch3.blocks.2.norm1.bias", "backbone.branch3.blocks.2.attn.qkv.bias", "backbone.branch3.blocks.2.attn.proj.bias", "backbone.branch3.blocks.2.norm2.weight", "backbone.branch3.blocks.2.norm2.bias", "backbone.branch3.blocks.2.mlp.fc1.bias", "backbone.branch3.blocks.2.mlp.fc2.bias", "backbone.branch4.blocks.2.gamma_1", "backbone.branch4.blocks.2.gamma_2", "backbone.branch4.blocks.2.norm1.weight", "backbone.branch4.blocks.2.norm1.bias", "backbone.branch4.blocks.2.attn.qkv.bias", "backbone.branch4.blocks.2.attn.proj.bias", "backbone.branch4.blocks.2.norm2.weight", "backbone.branch4.blocks.2.norm2.bias", "backbone.branch4.blocks.2.mlp.fc1.bias", "backbone.branch4.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.0 }, "layer_6_decay": { "param_names": [ "backbone.branch1.blocks.5.attn.qkv.weight", "backbone.branch1.blocks.5.attn.proj.weight", "backbone.branch1.blocks.5.mlp.fc1.weight", "backbone.branch1.blocks.5.mlp.fc2.weight", "backbone.branch2.blocks.2.attn.qkv.weight", "backbone.branch2.blocks.2.attn.proj.weight", "backbone.branch2.blocks.2.mlp.fc1.weight", "backbone.branch2.blocks.2.mlp.fc2.weight", "backbone.branch3.blocks.2.attn.qkv.weight", "backbone.branch3.blocks.2.attn.proj.weight", "backbone.branch3.blocks.2.mlp.fc1.weight", 
"backbone.branch3.blocks.2.mlp.fc2.weight", "backbone.branch4.blocks.2.attn.qkv.weight", "backbone.branch4.blocks.2.attn.proj.weight", "backbone.branch4.blocks.2.mlp.fc1.weight", "backbone.branch4.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.05 }, "layer_7_no_decay": { "param_names": [ "backbone.branch1.blocks.6.gamma_1", "backbone.branch1.blocks.6.gamma_2", "backbone.branch1.blocks.6.norm1.weight", "backbone.branch1.blocks.6.norm1.bias", "backbone.branch1.blocks.6.attn.qkv.bias", "backbone.branch1.blocks.6.attn.proj.bias", "backbone.branch1.blocks.6.norm2.weight", "backbone.branch1.blocks.6.norm2.bias", "backbone.branch1.blocks.6.mlp.fc1.bias", "backbone.branch1.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.0 }, "layer_7_decay": { "param_names": [ "backbone.branch1.blocks.6.attn.qkv.weight", "backbone.branch1.blocks.6.attn.proj.weight", "backbone.branch1.blocks.6.mlp.fc1.weight", "backbone.branch1.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.05 }, "layer_8_no_decay": { "param_names": [ "backbone.branch1.blocks.7.gamma_1", "backbone.branch1.blocks.7.gamma_2", "backbone.branch1.blocks.7.norm1.weight", "backbone.branch1.blocks.7.norm1.bias", "backbone.branch1.blocks.7.attn.qkv.bias", "backbone.branch1.blocks.7.attn.proj.bias", "backbone.branch1.blocks.7.norm2.weight", "backbone.branch1.blocks.7.norm2.bias", "backbone.branch1.blocks.7.mlp.fc1.bias", "backbone.branch1.blocks.7.mlp.fc2.bias", "backbone.branch2.blocks.3.gamma_1", "backbone.branch2.blocks.3.gamma_2", "backbone.branch2.blocks.3.norm1.weight", "backbone.branch2.blocks.3.norm1.bias", "backbone.branch2.blocks.3.attn.qkv.bias", "backbone.branch2.blocks.3.attn.proj.bias", "backbone.branch2.blocks.3.norm2.weight", "backbone.branch2.blocks.3.norm2.bias", "backbone.branch2.blocks.3.mlp.fc1.bias", "backbone.branch2.blocks.3.mlp.fc2.bias", "backbone.branch3.blocks.3.gamma_1", "backbone.branch3.blocks.3.gamma_2", "backbone.branch3.blocks.3.norm1.weight", "backbone.branch3.blocks.3.norm1.bias", "backbone.branch3.blocks.3.attn.qkv.bias", "backbone.branch3.blocks.3.attn.proj.bias", "backbone.branch3.blocks.3.norm2.weight", "backbone.branch3.blocks.3.norm2.bias", "backbone.branch3.blocks.3.mlp.fc1.bias", "backbone.branch3.blocks.3.mlp.fc2.bias", "backbone.branch4.blocks.3.gamma_1", "backbone.branch4.blocks.3.gamma_2", "backbone.branch4.blocks.3.norm1.weight", "backbone.branch4.blocks.3.norm1.bias", "backbone.branch4.blocks.3.attn.qkv.bias", "backbone.branch4.blocks.3.attn.proj.bias", "backbone.branch4.blocks.3.norm2.weight", "backbone.branch4.blocks.3.norm2.bias", "backbone.branch4.blocks.3.mlp.fc1.bias", "backbone.branch4.blocks.3.mlp.fc2.bias" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.0 }, "layer_8_decay": { "param_names": [ "backbone.branch1.blocks.7.attn.qkv.weight", "backbone.branch1.blocks.7.attn.proj.weight", "backbone.branch1.blocks.7.mlp.fc1.weight", "backbone.branch1.blocks.7.mlp.fc2.weight", "backbone.branch2.blocks.3.attn.qkv.weight", "backbone.branch2.blocks.3.attn.proj.weight", "backbone.branch2.blocks.3.mlp.fc1.weight", "backbone.branch2.blocks.3.mlp.fc2.weight", "backbone.branch3.blocks.3.attn.qkv.weight", "backbone.branch3.blocks.3.attn.proj.weight", "backbone.branch3.blocks.3.mlp.fc1.weight", "backbone.branch3.blocks.3.mlp.fc2.weight", "backbone.branch4.blocks.3.attn.qkv.weight", 
"backbone.branch4.blocks.3.attn.proj.weight", "backbone.branch4.blocks.3.mlp.fc1.weight", "backbone.branch4.blocks.3.mlp.fc2.weight" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.05 }, "layer_9_no_decay": { "param_names": [ "backbone.branch1.blocks.8.gamma_1", "backbone.branch1.blocks.8.gamma_2", "backbone.branch1.blocks.8.norm1.weight", "backbone.branch1.blocks.8.norm1.bias", "backbone.branch1.blocks.8.attn.qkv.bias", "backbone.branch1.blocks.8.attn.proj.bias", "backbone.branch1.blocks.8.norm2.weight", "backbone.branch1.blocks.8.norm2.bias", "backbone.branch1.blocks.8.mlp.fc1.bias", "backbone.branch1.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.0 }, "layer_9_decay": { "param_names": [ "backbone.branch1.blocks.8.attn.qkv.weight", "backbone.branch1.blocks.8.attn.proj.weight", "backbone.branch1.blocks.8.mlp.fc1.weight", "backbone.branch1.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.05 }, "layer_10_no_decay": { "param_names": [ "backbone.branch1.blocks.9.gamma_1", "backbone.branch1.blocks.9.gamma_2", "backbone.branch1.blocks.9.norm1.weight", "backbone.branch1.blocks.9.norm1.bias", "backbone.branch1.blocks.9.attn.qkv.bias", "backbone.branch1.blocks.9.attn.proj.bias", "backbone.branch1.blocks.9.norm2.weight", "backbone.branch1.blocks.9.norm2.bias", "backbone.branch1.blocks.9.mlp.fc1.bias", "backbone.branch1.blocks.9.mlp.fc2.bias", "backbone.branch2.blocks.4.gamma_1", "backbone.branch2.blocks.4.gamma_2", "backbone.branch2.blocks.4.norm1.weight", "backbone.branch2.blocks.4.norm1.bias", "backbone.branch2.blocks.4.attn.qkv.bias", "backbone.branch2.blocks.4.attn.proj.bias", "backbone.branch2.blocks.4.norm2.weight", "backbone.branch2.blocks.4.norm2.bias", "backbone.branch2.blocks.4.mlp.fc1.bias", "backbone.branch2.blocks.4.mlp.fc2.bias", "backbone.branch3.blocks.4.gamma_1", "backbone.branch3.blocks.4.gamma_2", "backbone.branch3.blocks.4.norm1.weight", "backbone.branch3.blocks.4.norm1.bias", "backbone.branch3.blocks.4.attn.qkv.bias", "backbone.branch3.blocks.4.attn.proj.bias", "backbone.branch3.blocks.4.norm2.weight", "backbone.branch3.blocks.4.norm2.bias", "backbone.branch3.blocks.4.mlp.fc1.bias", "backbone.branch3.blocks.4.mlp.fc2.bias", "backbone.branch4.blocks.4.gamma_1", "backbone.branch4.blocks.4.gamma_2", "backbone.branch4.blocks.4.norm1.weight", "backbone.branch4.blocks.4.norm1.bias", "backbone.branch4.blocks.4.attn.qkv.bias", "backbone.branch4.blocks.4.attn.proj.bias", "backbone.branch4.blocks.4.norm2.weight", "backbone.branch4.blocks.4.norm2.bias", "backbone.branch4.blocks.4.mlp.fc1.bias", "backbone.branch4.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.0 }, "layer_10_decay": { "param_names": [ "backbone.branch1.blocks.9.attn.qkv.weight", "backbone.branch1.blocks.9.attn.proj.weight", "backbone.branch1.blocks.9.mlp.fc1.weight", "backbone.branch1.blocks.9.mlp.fc2.weight", "backbone.branch2.blocks.4.attn.qkv.weight", "backbone.branch2.blocks.4.attn.proj.weight", "backbone.branch2.blocks.4.mlp.fc1.weight", "backbone.branch2.blocks.4.mlp.fc2.weight", "backbone.branch3.blocks.4.attn.qkv.weight", "backbone.branch3.blocks.4.attn.proj.weight", "backbone.branch3.blocks.4.mlp.fc1.weight", "backbone.branch3.blocks.4.mlp.fc2.weight", "backbone.branch4.blocks.4.attn.qkv.weight", "backbone.branch4.blocks.4.attn.proj.weight", "backbone.branch4.blocks.4.mlp.fc1.weight", 
"backbone.branch4.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.05 }, "layer_11_no_decay": { "param_names": [ "backbone.branch1.blocks.10.gamma_1", "backbone.branch1.blocks.10.gamma_2", "backbone.branch1.blocks.10.norm1.weight", "backbone.branch1.blocks.10.norm1.bias", "backbone.branch1.blocks.10.attn.qkv.bias", "backbone.branch1.blocks.10.attn.proj.bias", "backbone.branch1.blocks.10.norm2.weight", "backbone.branch1.blocks.10.norm2.bias", "backbone.branch1.blocks.10.mlp.fc1.bias", "backbone.branch1.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.0 }, "layer_11_decay": { "param_names": [ "backbone.branch1.blocks.10.attn.qkv.weight", "backbone.branch1.blocks.10.attn.proj.weight", "backbone.branch1.blocks.10.mlp.fc1.weight", "backbone.branch1.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.05 }, "layer_12_no_decay": { "param_names": [ "backbone.branch1.blocks.11.gamma_1", "backbone.branch1.blocks.11.gamma_2", "backbone.branch1.blocks.11.norm1.weight", "backbone.branch1.blocks.11.norm1.bias", "backbone.branch1.blocks.11.attn.qkv.bias", "backbone.branch1.blocks.11.attn.proj.bias", "backbone.branch1.blocks.11.norm2.weight", "backbone.branch1.blocks.11.norm2.bias", "backbone.branch1.blocks.11.mlp.fc1.bias", "backbone.branch1.blocks.11.mlp.fc2.bias", "backbone.branch2.blocks.5.gamma_1", "backbone.branch2.blocks.5.gamma_2", "backbone.branch2.blocks.5.norm1.weight", "backbone.branch2.blocks.5.norm1.bias", "backbone.branch2.blocks.5.attn.qkv.bias", "backbone.branch2.blocks.5.attn.proj.bias", "backbone.branch2.blocks.5.norm2.weight", "backbone.branch2.blocks.5.norm2.bias", "backbone.branch2.blocks.5.mlp.fc1.bias", "backbone.branch2.blocks.5.mlp.fc2.bias", "backbone.branch3.blocks.5.gamma_1", "backbone.branch3.blocks.5.gamma_2", "backbone.branch3.blocks.5.norm1.weight", "backbone.branch3.blocks.5.norm1.bias", "backbone.branch3.blocks.5.attn.qkv.bias", "backbone.branch3.blocks.5.attn.proj.bias", "backbone.branch3.blocks.5.norm2.weight", "backbone.branch3.blocks.5.norm2.bias", "backbone.branch3.blocks.5.mlp.fc1.bias", "backbone.branch3.blocks.5.mlp.fc2.bias", "backbone.branch4.blocks.5.gamma_1", "backbone.branch4.blocks.5.gamma_2", "backbone.branch4.blocks.5.norm1.weight", "backbone.branch4.blocks.5.norm1.bias", "backbone.branch4.blocks.5.attn.qkv.bias", "backbone.branch4.blocks.5.attn.proj.bias", "backbone.branch4.blocks.5.norm2.weight", "backbone.branch4.blocks.5.norm2.bias", "backbone.branch4.blocks.5.mlp.fc1.bias", "backbone.branch4.blocks.5.mlp.fc2.bias" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.0 }, "layer_12_decay": { "param_names": [ "backbone.branch1.blocks.11.attn.qkv.weight", "backbone.branch1.blocks.11.attn.proj.weight", "backbone.branch1.blocks.11.mlp.fc1.weight", "backbone.branch1.blocks.11.mlp.fc2.weight", "backbone.branch2.blocks.5.attn.qkv.weight", "backbone.branch2.blocks.5.attn.proj.weight", "backbone.branch2.blocks.5.mlp.fc1.weight", "backbone.branch2.blocks.5.mlp.fc2.weight", "backbone.branch3.blocks.5.attn.qkv.weight", "backbone.branch3.blocks.5.attn.proj.weight", "backbone.branch3.blocks.5.mlp.fc1.weight", "backbone.branch3.blocks.5.mlp.fc2.weight", "backbone.branch4.blocks.5.attn.qkv.weight", "backbone.branch4.blocks.5.attn.proj.weight", "backbone.branch4.blocks.5.mlp.fc1.weight", "backbone.branch4.blocks.5.mlp.fc2.weight" ], "lr_scale": 0.12090549356574626, 
"lr": 1.2090549356574626e-05, "weight_decay": 0.05 }, "layer_13_no_decay": { "param_names": [ "backbone.branch1.blocks.12.gamma_1", "backbone.branch1.blocks.12.gamma_2", "backbone.branch1.blocks.12.norm1.weight", "backbone.branch1.blocks.12.norm1.bias", "backbone.branch1.blocks.12.attn.qkv.bias", "backbone.branch1.blocks.12.attn.proj.bias", "backbone.branch1.blocks.12.norm2.weight", "backbone.branch1.blocks.12.norm2.bias", "backbone.branch1.blocks.12.mlp.fc1.bias", "backbone.branch1.blocks.12.mlp.fc2.bias" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.0 }, "layer_13_decay": { "param_names": [ "backbone.branch1.blocks.12.attn.qkv.weight", "backbone.branch1.blocks.12.attn.proj.weight", "backbone.branch1.blocks.12.mlp.fc1.weight", "backbone.branch1.blocks.12.mlp.fc2.weight" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.05 }, "layer_14_no_decay": { "param_names": [ "backbone.branch1.blocks.13.gamma_1", "backbone.branch1.blocks.13.gamma_2", "backbone.branch1.blocks.13.norm1.weight", "backbone.branch1.blocks.13.norm1.bias", "backbone.branch1.blocks.13.attn.qkv.bias", "backbone.branch1.blocks.13.attn.proj.bias", "backbone.branch1.blocks.13.norm2.weight", "backbone.branch1.blocks.13.norm2.bias", "backbone.branch1.blocks.13.mlp.fc1.bias", "backbone.branch1.blocks.13.mlp.fc2.bias", "backbone.branch2.blocks.6.gamma_1", "backbone.branch2.blocks.6.gamma_2", "backbone.branch2.blocks.6.norm1.weight", "backbone.branch2.blocks.6.norm1.bias", "backbone.branch2.blocks.6.attn.qkv.bias", "backbone.branch2.blocks.6.attn.proj.bias", "backbone.branch2.blocks.6.norm2.weight", "backbone.branch2.blocks.6.norm2.bias", "backbone.branch2.blocks.6.mlp.fc1.bias", "backbone.branch2.blocks.6.mlp.fc2.bias", "backbone.branch3.blocks.6.gamma_1", "backbone.branch3.blocks.6.gamma_2", "backbone.branch3.blocks.6.norm1.weight", "backbone.branch3.blocks.6.norm1.bias", "backbone.branch3.blocks.6.attn.qkv.bias", "backbone.branch3.blocks.6.attn.proj.bias", "backbone.branch3.blocks.6.norm2.weight", "backbone.branch3.blocks.6.norm2.bias", "backbone.branch3.blocks.6.mlp.fc1.bias", "backbone.branch3.blocks.6.mlp.fc2.bias", "backbone.branch4.blocks.6.gamma_1", "backbone.branch4.blocks.6.gamma_2", "backbone.branch4.blocks.6.norm1.weight", "backbone.branch4.blocks.6.norm1.bias", "backbone.branch4.blocks.6.attn.qkv.bias", "backbone.branch4.blocks.6.attn.proj.bias", "backbone.branch4.blocks.6.norm2.weight", "backbone.branch4.blocks.6.norm2.bias", "backbone.branch4.blocks.6.mlp.fc1.bias", "backbone.branch4.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.0 }, "layer_14_decay": { "param_names": [ "backbone.branch1.blocks.13.attn.qkv.weight", "backbone.branch1.blocks.13.attn.proj.weight", "backbone.branch1.blocks.13.mlp.fc1.weight", "backbone.branch1.blocks.13.mlp.fc2.weight", "backbone.branch2.blocks.6.attn.qkv.weight", "backbone.branch2.blocks.6.attn.proj.weight", "backbone.branch2.blocks.6.mlp.fc1.weight", "backbone.branch2.blocks.6.mlp.fc2.weight", "backbone.branch3.blocks.6.attn.qkv.weight", "backbone.branch3.blocks.6.attn.proj.weight", "backbone.branch3.blocks.6.mlp.fc1.weight", "backbone.branch3.blocks.6.mlp.fc2.weight", "backbone.branch4.blocks.6.attn.qkv.weight", "backbone.branch4.blocks.6.attn.proj.weight", "backbone.branch4.blocks.6.mlp.fc1.weight", "backbone.branch4.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.05 }, "layer_15_no_decay": { 
"param_names": [ "backbone.branch1.blocks.14.gamma_1", "backbone.branch1.blocks.14.gamma_2", "backbone.branch1.blocks.14.norm1.weight", "backbone.branch1.blocks.14.norm1.bias", "backbone.branch1.blocks.14.attn.qkv.bias", "backbone.branch1.blocks.14.attn.proj.bias", "backbone.branch1.blocks.14.norm2.weight", "backbone.branch1.blocks.14.norm2.bias", "backbone.branch1.blocks.14.mlp.fc1.bias", "backbone.branch1.blocks.14.mlp.fc2.bias" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.0 }, "layer_15_decay": { "param_names": [ "backbone.branch1.blocks.14.attn.qkv.weight", "backbone.branch1.blocks.14.attn.proj.weight", "backbone.branch1.blocks.14.mlp.fc1.weight", "backbone.branch1.blocks.14.mlp.fc2.weight" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.05 }, "layer_16_no_decay": { "param_names": [ "backbone.branch1.blocks.15.gamma_1", "backbone.branch1.blocks.15.gamma_2", "backbone.branch1.blocks.15.norm1.weight", "backbone.branch1.blocks.15.norm1.bias", "backbone.branch1.blocks.15.attn.qkv.bias", "backbone.branch1.blocks.15.attn.proj.bias", "backbone.branch1.blocks.15.norm2.weight", "backbone.branch1.blocks.15.norm2.bias", "backbone.branch1.blocks.15.mlp.fc1.bias", "backbone.branch1.blocks.15.mlp.fc2.bias", "backbone.branch2.blocks.7.gamma_1", "backbone.branch2.blocks.7.gamma_2", "backbone.branch2.blocks.7.norm1.weight", "backbone.branch2.blocks.7.norm1.bias", "backbone.branch2.blocks.7.attn.qkv.bias", "backbone.branch2.blocks.7.attn.proj.bias", "backbone.branch2.blocks.7.norm2.weight", "backbone.branch2.blocks.7.norm2.bias", "backbone.branch2.blocks.7.mlp.fc1.bias", "backbone.branch2.blocks.7.mlp.fc2.bias", "backbone.branch3.blocks.7.gamma_1", "backbone.branch3.blocks.7.gamma_2", "backbone.branch3.blocks.7.norm1.weight", "backbone.branch3.blocks.7.norm1.bias", "backbone.branch3.blocks.7.attn.qkv.bias", "backbone.branch3.blocks.7.attn.proj.bias", "backbone.branch3.blocks.7.norm2.weight", "backbone.branch3.blocks.7.norm2.bias", "backbone.branch3.blocks.7.mlp.fc1.bias", "backbone.branch3.blocks.7.mlp.fc2.bias", "backbone.branch4.blocks.7.gamma_1", "backbone.branch4.blocks.7.gamma_2", "backbone.branch4.blocks.7.norm1.weight", "backbone.branch4.blocks.7.norm1.bias", "backbone.branch4.blocks.7.attn.qkv.bias", "backbone.branch4.blocks.7.attn.proj.bias", "backbone.branch4.blocks.7.norm2.weight", "backbone.branch4.blocks.7.norm2.bias", "backbone.branch4.blocks.7.mlp.fc1.bias", "backbone.branch4.blocks.7.mlp.fc2.bias" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.0 }, "layer_16_decay": { "param_names": [ "backbone.branch1.blocks.15.attn.qkv.weight", "backbone.branch1.blocks.15.attn.proj.weight", "backbone.branch1.blocks.15.mlp.fc1.weight", "backbone.branch1.blocks.15.mlp.fc2.weight", "backbone.branch2.blocks.7.attn.qkv.weight", "backbone.branch2.blocks.7.attn.proj.weight", "backbone.branch2.blocks.7.mlp.fc1.weight", "backbone.branch2.blocks.7.mlp.fc2.weight", "backbone.branch3.blocks.7.attn.qkv.weight", "backbone.branch3.blocks.7.attn.proj.weight", "backbone.branch3.blocks.7.mlp.fc1.weight", "backbone.branch3.blocks.7.mlp.fc2.weight", "backbone.branch4.blocks.7.attn.qkv.weight", "backbone.branch4.blocks.7.attn.proj.weight", "backbone.branch4.blocks.7.mlp.fc1.weight", "backbone.branch4.blocks.7.mlp.fc2.weight" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.05 }, "layer_17_no_decay": { "param_names": [ "backbone.branch1.blocks.16.gamma_1", 
"backbone.branch1.blocks.16.gamma_2", "backbone.branch1.blocks.16.norm1.weight", "backbone.branch1.blocks.16.norm1.bias", "backbone.branch1.blocks.16.attn.qkv.bias", "backbone.branch1.blocks.16.attn.proj.bias", "backbone.branch1.blocks.16.norm2.weight", "backbone.branch1.blocks.16.norm2.bias", "backbone.branch1.blocks.16.mlp.fc1.bias", "backbone.branch1.blocks.16.mlp.fc2.bias" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.0 }, "layer_17_decay": { "param_names": [ "backbone.branch1.blocks.16.attn.qkv.weight", "backbone.branch1.blocks.16.attn.proj.weight", "backbone.branch1.blocks.16.mlp.fc1.weight", "backbone.branch1.blocks.16.mlp.fc2.weight" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.05 }, "layer_18_no_decay": { "param_names": [ "backbone.branch1.blocks.17.gamma_1", "backbone.branch1.blocks.17.gamma_2", "backbone.branch1.blocks.17.norm1.weight", "backbone.branch1.blocks.17.norm1.bias", "backbone.branch1.blocks.17.attn.qkv.bias", "backbone.branch1.blocks.17.attn.proj.bias", "backbone.branch1.blocks.17.norm2.weight", "backbone.branch1.blocks.17.norm2.bias", "backbone.branch1.blocks.17.mlp.fc1.bias", "backbone.branch1.blocks.17.mlp.fc2.bias", "backbone.branch2.blocks.8.gamma_1", "backbone.branch2.blocks.8.gamma_2", "backbone.branch2.blocks.8.norm1.weight", "backbone.branch2.blocks.8.norm1.bias", "backbone.branch2.blocks.8.attn.qkv.bias", "backbone.branch2.blocks.8.attn.proj.bias", "backbone.branch2.blocks.8.norm2.weight", "backbone.branch2.blocks.8.norm2.bias", "backbone.branch2.blocks.8.mlp.fc1.bias", "backbone.branch2.blocks.8.mlp.fc2.bias", "backbone.branch3.blocks.8.gamma_1", "backbone.branch3.blocks.8.gamma_2", "backbone.branch3.blocks.8.norm1.weight", "backbone.branch3.blocks.8.norm1.bias", "backbone.branch3.blocks.8.attn.qkv.bias", "backbone.branch3.blocks.8.attn.proj.bias", "backbone.branch3.blocks.8.norm2.weight", "backbone.branch3.blocks.8.norm2.bias", "backbone.branch3.blocks.8.mlp.fc1.bias", "backbone.branch3.blocks.8.mlp.fc2.bias", "backbone.branch4.blocks.8.gamma_1", "backbone.branch4.blocks.8.gamma_2", "backbone.branch4.blocks.8.norm1.weight", "backbone.branch4.blocks.8.norm1.bias", "backbone.branch4.blocks.8.attn.qkv.bias", "backbone.branch4.blocks.8.attn.proj.bias", "backbone.branch4.blocks.8.norm2.weight", "backbone.branch4.blocks.8.norm2.bias", "backbone.branch4.blocks.8.mlp.fc1.bias", "backbone.branch4.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.0 }, "layer_18_decay": { "param_names": [ "backbone.branch1.blocks.17.attn.qkv.weight", "backbone.branch1.blocks.17.attn.proj.weight", "backbone.branch1.blocks.17.mlp.fc1.weight", "backbone.branch1.blocks.17.mlp.fc2.weight", "backbone.branch2.blocks.8.attn.qkv.weight", "backbone.branch2.blocks.8.attn.proj.weight", "backbone.branch2.blocks.8.mlp.fc1.weight", "backbone.branch2.blocks.8.mlp.fc2.weight", "backbone.branch3.blocks.8.attn.qkv.weight", "backbone.branch3.blocks.8.attn.proj.weight", "backbone.branch3.blocks.8.mlp.fc1.weight", "backbone.branch3.blocks.8.mlp.fc2.weight", "backbone.branch4.blocks.8.attn.qkv.weight", "backbone.branch4.blocks.8.attn.proj.weight", "backbone.branch4.blocks.8.mlp.fc1.weight", "backbone.branch4.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.05 }, "layer_19_no_decay": { "param_names": [ "backbone.branch1.blocks.18.gamma_1", "backbone.branch1.blocks.18.gamma_2", "backbone.branch1.blocks.18.norm1.weight", 
"backbone.branch1.blocks.18.norm1.bias", "backbone.branch1.blocks.18.attn.qkv.bias", "backbone.branch1.blocks.18.attn.proj.bias", "backbone.branch1.blocks.18.norm2.weight", "backbone.branch1.blocks.18.norm2.bias", "backbone.branch1.blocks.18.mlp.fc1.bias", "backbone.branch1.blocks.18.mlp.fc2.bias" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.0 }, "layer_19_decay": { "param_names": [ "backbone.branch1.blocks.18.attn.qkv.weight", "backbone.branch1.blocks.18.attn.proj.weight", "backbone.branch1.blocks.18.mlp.fc1.weight", "backbone.branch1.blocks.18.mlp.fc2.weight" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.05 }, "layer_20_no_decay": { "param_names": [ "backbone.branch1.blocks.19.gamma_1", "backbone.branch1.blocks.19.gamma_2", "backbone.branch1.blocks.19.norm1.weight", "backbone.branch1.blocks.19.norm1.bias", "backbone.branch1.blocks.19.attn.qkv.bias", "backbone.branch1.blocks.19.attn.proj.bias", "backbone.branch1.blocks.19.norm2.weight", "backbone.branch1.blocks.19.norm2.bias", "backbone.branch1.blocks.19.mlp.fc1.bias", "backbone.branch1.blocks.19.mlp.fc2.bias", "backbone.branch2.blocks.9.gamma_1", "backbone.branch2.blocks.9.gamma_2", "backbone.branch2.blocks.9.norm1.weight", "backbone.branch2.blocks.9.norm1.bias", "backbone.branch2.blocks.9.attn.qkv.bias", "backbone.branch2.blocks.9.attn.proj.bias", "backbone.branch2.blocks.9.norm2.weight", "backbone.branch2.blocks.9.norm2.bias", "backbone.branch2.blocks.9.mlp.fc1.bias", "backbone.branch2.blocks.9.mlp.fc2.bias", "backbone.branch3.blocks.9.gamma_1", "backbone.branch3.blocks.9.gamma_2", "backbone.branch3.blocks.9.norm1.weight", "backbone.branch3.blocks.9.norm1.bias", "backbone.branch3.blocks.9.attn.qkv.bias", "backbone.branch3.blocks.9.attn.proj.bias", "backbone.branch3.blocks.9.norm2.weight", "backbone.branch3.blocks.9.norm2.bias", "backbone.branch3.blocks.9.mlp.fc1.bias", "backbone.branch3.blocks.9.mlp.fc2.bias", "backbone.branch4.blocks.9.gamma_1", "backbone.branch4.blocks.9.gamma_2", "backbone.branch4.blocks.9.norm1.weight", "backbone.branch4.blocks.9.norm1.bias", "backbone.branch4.blocks.9.attn.qkv.bias", "backbone.branch4.blocks.9.attn.proj.bias", "backbone.branch4.blocks.9.norm2.weight", "backbone.branch4.blocks.9.norm2.bias", "backbone.branch4.blocks.9.mlp.fc1.bias", "backbone.branch4.blocks.9.mlp.fc2.bias" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.0 }, "layer_20_decay": { "param_names": [ "backbone.branch1.blocks.19.attn.qkv.weight", "backbone.branch1.blocks.19.attn.proj.weight", "backbone.branch1.blocks.19.mlp.fc1.weight", "backbone.branch1.blocks.19.mlp.fc2.weight", "backbone.branch2.blocks.9.attn.qkv.weight", "backbone.branch2.blocks.9.attn.proj.weight", "backbone.branch2.blocks.9.mlp.fc1.weight", "backbone.branch2.blocks.9.mlp.fc2.weight", "backbone.branch3.blocks.9.attn.qkv.weight", "backbone.branch3.blocks.9.attn.proj.weight", "backbone.branch3.blocks.9.mlp.fc1.weight", "backbone.branch3.blocks.9.mlp.fc2.weight", "backbone.branch4.blocks.9.attn.qkv.weight", "backbone.branch4.blocks.9.attn.proj.weight", "backbone.branch4.blocks.9.mlp.fc1.weight", "backbone.branch4.blocks.9.mlp.fc2.weight" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.05 }, "layer_21_no_decay": { "param_names": [ "backbone.branch1.blocks.20.gamma_1", "backbone.branch1.blocks.20.gamma_2", "backbone.branch1.blocks.20.norm1.weight", "backbone.branch1.blocks.20.norm1.bias", "backbone.branch1.blocks.20.attn.qkv.bias", 
"backbone.branch1.blocks.20.attn.proj.bias", "backbone.branch1.blocks.20.norm2.weight", "backbone.branch1.blocks.20.norm2.bias", "backbone.branch1.blocks.20.mlp.fc1.bias", "backbone.branch1.blocks.20.mlp.fc2.bias" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.0 }, "layer_21_decay": { "param_names": [ "backbone.branch1.blocks.20.attn.qkv.weight", "backbone.branch1.blocks.20.attn.proj.weight", "backbone.branch1.blocks.20.mlp.fc1.weight", "backbone.branch1.blocks.20.mlp.fc2.weight" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.05 }, "layer_22_no_decay": { "param_names": [ "backbone.branch1.blocks.21.gamma_1", "backbone.branch1.blocks.21.gamma_2", "backbone.branch1.blocks.21.norm1.weight", "backbone.branch1.blocks.21.norm1.bias", "backbone.branch1.blocks.21.attn.qkv.bias", "backbone.branch1.blocks.21.attn.proj.bias", "backbone.branch1.blocks.21.norm2.weight", "backbone.branch1.blocks.21.norm2.bias", "backbone.branch1.blocks.21.mlp.fc1.bias", "backbone.branch1.blocks.21.mlp.fc2.bias", "backbone.branch2.blocks.10.gamma_1", "backbone.branch2.blocks.10.gamma_2", "backbone.branch2.blocks.10.norm1.weight", "backbone.branch2.blocks.10.norm1.bias", "backbone.branch2.blocks.10.attn.qkv.bias", "backbone.branch2.blocks.10.attn.proj.bias", "backbone.branch2.blocks.10.norm2.weight", "backbone.branch2.blocks.10.norm2.bias", "backbone.branch2.blocks.10.mlp.fc1.bias", "backbone.branch2.blocks.10.mlp.fc2.bias", "backbone.branch3.blocks.10.gamma_1", "backbone.branch3.blocks.10.gamma_2", "backbone.branch3.blocks.10.norm1.weight", "backbone.branch3.blocks.10.norm1.bias", "backbone.branch3.blocks.10.attn.qkv.bias", "backbone.branch3.blocks.10.attn.proj.bias", "backbone.branch3.blocks.10.norm2.weight", "backbone.branch3.blocks.10.norm2.bias", "backbone.branch3.blocks.10.mlp.fc1.bias", "backbone.branch3.blocks.10.mlp.fc2.bias", "backbone.branch4.blocks.10.gamma_1", "backbone.branch4.blocks.10.gamma_2", "backbone.branch4.blocks.10.norm1.weight", "backbone.branch4.blocks.10.norm1.bias", "backbone.branch4.blocks.10.attn.qkv.bias", "backbone.branch4.blocks.10.attn.proj.bias", "backbone.branch4.blocks.10.norm2.weight", "backbone.branch4.blocks.10.norm2.bias", "backbone.branch4.blocks.10.mlp.fc1.bias", "backbone.branch4.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.0 }, "layer_22_decay": { "param_names": [ "backbone.branch1.blocks.21.attn.qkv.weight", "backbone.branch1.blocks.21.attn.proj.weight", "backbone.branch1.blocks.21.mlp.fc1.weight", "backbone.branch1.blocks.21.mlp.fc2.weight", "backbone.branch2.blocks.10.attn.qkv.weight", "backbone.branch2.blocks.10.attn.proj.weight", "backbone.branch2.blocks.10.mlp.fc1.weight", "backbone.branch2.blocks.10.mlp.fc2.weight", "backbone.branch3.blocks.10.attn.qkv.weight", "backbone.branch3.blocks.10.attn.proj.weight", "backbone.branch3.blocks.10.mlp.fc1.weight", "backbone.branch3.blocks.10.mlp.fc2.weight", "backbone.branch4.blocks.10.attn.qkv.weight", "backbone.branch4.blocks.10.attn.proj.weight", "backbone.branch4.blocks.10.mlp.fc1.weight", "backbone.branch4.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.05 }, "layer_23_no_decay": { "param_names": [ "backbone.branch1.blocks.22.gamma_1", "backbone.branch1.blocks.22.gamma_2", "backbone.branch1.blocks.22.norm1.weight", "backbone.branch1.blocks.22.norm1.bias", "backbone.branch1.blocks.22.attn.qkv.bias", "backbone.branch1.blocks.22.attn.proj.bias", 
"backbone.branch1.blocks.22.norm2.weight", "backbone.branch1.blocks.22.norm2.bias", "backbone.branch1.blocks.22.mlp.fc1.bias", "backbone.branch1.blocks.22.mlp.fc2.bias" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.0 }, "layer_23_decay": { "param_names": [ "backbone.branch1.blocks.22.attn.qkv.weight", "backbone.branch1.blocks.22.attn.proj.weight", "backbone.branch1.blocks.22.mlp.fc1.weight", "backbone.branch1.blocks.22.mlp.fc2.weight" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.05 }, "layer_24_no_decay": { "param_names": [ "backbone.branch1.blocks.23.gamma_1", "backbone.branch1.blocks.23.gamma_2", "backbone.branch1.blocks.23.norm1.weight", "backbone.branch1.blocks.23.norm1.bias", "backbone.branch1.blocks.23.attn.qkv.bias", "backbone.branch1.blocks.23.attn.proj.bias", "backbone.branch1.blocks.23.norm2.weight", "backbone.branch1.blocks.23.norm2.bias", "backbone.branch1.blocks.23.mlp.fc1.bias", "backbone.branch1.blocks.23.mlp.fc2.bias", "backbone.branch2.blocks.11.gamma_1", "backbone.branch2.blocks.11.gamma_2", "backbone.branch2.blocks.11.norm1.weight", "backbone.branch2.blocks.11.norm1.bias", "backbone.branch2.blocks.11.attn.qkv.bias", "backbone.branch2.blocks.11.attn.proj.bias", "backbone.branch2.blocks.11.norm2.weight", "backbone.branch2.blocks.11.norm2.bias", "backbone.branch2.blocks.11.mlp.fc1.bias", "backbone.branch2.blocks.11.mlp.fc2.bias", "backbone.branch3.blocks.11.gamma_1", "backbone.branch3.blocks.11.gamma_2", "backbone.branch3.blocks.11.norm1.weight", "backbone.branch3.blocks.11.norm1.bias", "backbone.branch3.blocks.11.attn.qkv.bias", "backbone.branch3.blocks.11.attn.proj.bias", "backbone.branch3.blocks.11.norm2.weight", "backbone.branch3.blocks.11.norm2.bias", "backbone.branch3.blocks.11.mlp.fc1.bias", "backbone.branch3.blocks.11.mlp.fc2.bias", "backbone.branch4.blocks.11.gamma_1", "backbone.branch4.blocks.11.gamma_2", "backbone.branch4.blocks.11.norm1.weight", "backbone.branch4.blocks.11.norm1.bias", "backbone.branch4.blocks.11.attn.qkv.bias", "backbone.branch4.blocks.11.attn.proj.bias", "backbone.branch4.blocks.11.norm2.weight", "backbone.branch4.blocks.11.norm2.bias", "backbone.branch4.blocks.11.mlp.fc1.bias", "backbone.branch4.blocks.11.mlp.fc2.bias" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.0 }, "layer_24_decay": { "param_names": [ "backbone.branch1.blocks.23.attn.qkv.weight", "backbone.branch1.blocks.23.attn.proj.weight", "backbone.branch1.blocks.23.mlp.fc1.weight", "backbone.branch1.blocks.23.mlp.fc2.weight", "backbone.branch2.blocks.11.attn.qkv.weight", "backbone.branch2.blocks.11.attn.proj.weight", "backbone.branch2.blocks.11.mlp.fc1.weight", "backbone.branch2.blocks.11.mlp.fc2.weight", "backbone.branch3.blocks.11.attn.qkv.weight", "backbone.branch3.blocks.11.attn.proj.weight", "backbone.branch3.blocks.11.mlp.fc1.weight", "backbone.branch3.blocks.11.mlp.fc2.weight", "backbone.branch4.blocks.11.attn.qkv.weight", "backbone.branch4.blocks.11.attn.proj.weight", "backbone.branch4.blocks.11.mlp.fc1.weight", "backbone.branch4.blocks.11.mlp.fc2.weight" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.05 }, "layer_25_no_decay": { "param_names": [ "backbone.interactions.0.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight", 
"backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.bias", 
"backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.1.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight", 
"backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias", 
"backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias", 
"backbone.interactions.3.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.value_proj.bias", 
"backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight", 
"backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias", 
"backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.5.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight", 
"backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn_norm.bias", 
"backbone.interactions.7.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias", 
"backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_34.branch2to1_injector.query_norm.weight", 
"backbone.interactions.8.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias", 
"backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.10.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", 
"backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ca_gamma", 
"backbone.interactions.10.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.weight", 
"backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.merge_branch1.1.weight", "backbone.merge_branch1.1.bias", "backbone.merge_branch1.4.weight", "backbone.merge_branch1.4.bias", "backbone.merge_branch2.1.weight", "backbone.merge_branch2.1.bias", "backbone.merge_branch2.4.weight", "backbone.merge_branch2.4.bias", "backbone.merge_branch3.1.weight", "backbone.merge_branch3.1.bias", "backbone.merge_branch3.4.weight", "backbone.merge_branch3.4.bias", "backbone.merge_branch4.1.weight", "backbone.merge_branch4.1.bias", "backbone.merge_branch4.4.weight", "backbone.merge_branch4.4.bias", "backbone.fpn1.0.bias", "backbone.fpn1.1.weight", "backbone.fpn1.1.bias", "backbone.fpn1.3.bias", "backbone.fpn2.0.bias", "neck.lateral_convs.0.conv.bias", "neck.lateral_convs.1.conv.bias", "neck.lateral_convs.2.conv.bias", "neck.lateral_convs.3.conv.bias", "neck.fpn_convs.0.conv.bias", "neck.fpn_convs.1.conv.bias", "neck.fpn_convs.2.conv.bias", "neck.fpn_convs.3.conv.bias", "rpn_head.rpn_conv.bias", "rpn_head.rpn_cls.bias", "rpn_head.rpn_reg.bias", "roi_head.bbox_head.fc_cls.bias", "roi_head.bbox_head.fc_reg.bias", "roi_head.bbox_head.shared_fcs.0.bias", 
"roi_head.bbox_head.shared_fcs.1.bias", "roi_head.mask_head.convs.0.conv.bias", "roi_head.mask_head.convs.1.conv.bias", "roi_head.mask_head.convs.2.conv.bias", "roi_head.mask_head.convs.3.conv.bias", "roi_head.mask_head.upsample.bias", "roi_head.mask_head.conv_logits.bias" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.0 } } 2024-05-28 09:08:20,487 - mmdet - INFO - Automatic scaling of learning rate (LR) has been disabled. 2024-05-28 09:08:20,898 - mmdet - INFO - Start running, work_dir: /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16 2024-05-28 09:08:20,898 - mmdet - INFO - Hooks will be executed in the following order: before_run: (VERY_HIGH ) StepLrUpdaterHook (49 ) ToBFloat16HookMMDet (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_epoch: (VERY_HIGH ) StepLrUpdaterHook (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_iter: (VERY_HIGH ) StepLrUpdaterHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook -------------------- after_train_iter: (ABOVE_NORMAL) OptimizerHook (NORMAL ) DeepspeedCheckpointHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- after_train_epoch: (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_val_epoch: (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (VERY_LOW ) TextLoggerHook -------------------- before_val_iter: (LOW ) IterTimerHook -------------------- after_val_iter: (LOW ) IterTimerHook -------------------- after_val_epoch: (VERY_LOW ) TextLoggerHook -------------------- after_run: (VERY_LOW ) TextLoggerHook -------------------- 2024-05-28 09:08:20,898 - mmdet - INFO - workflow: [('train', 1)], max: 12 epochs 2024-05-28 09:08:20,915 - mmdet - INFO - Checkpoints will be saved to /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16 by HardDiskBackend. 
2024-05-28 09:09:06,972 - mmdet - INFO - Epoch [1][50/7330] lr: 9.890e-06, eta: 22:29:18, time: 0.921, data_time: 0.119, memory: 13189, loss_rpn_cls: 0.6494, loss_rpn_bbox: 0.1246, loss_cls: 1.9425, acc: 64.7520, loss_bbox: 0.0461, loss_mask: 1.4733, loss: 4.2360 2024-05-28 09:09:44,347 - mmdet - INFO - Epoch [1][100/7330] lr: 1.988e-05, eta: 20:21:07, time: 0.747, data_time: 0.035, memory: 13260, loss_rpn_cls: 0.3204, loss_rpn_bbox: 0.1065, loss_cls: 0.3809, acc: 94.9309, loss_bbox: 0.1662, loss_mask: 0.7779, loss: 1.7520 2024-05-28 09:10:22,011 - mmdet - INFO - Epoch [1][150/7330] lr: 2.987e-05, eta: 19:41:22, time: 0.754, data_time: 0.031, memory: 13309, loss_rpn_cls: 0.2708, loss_rpn_bbox: 0.1058, loss_cls: 0.3520, acc: 94.7295, loss_bbox: 0.1705, loss_mask: 0.7043, loss: 1.6035 2024-05-28 09:10:58,988 - mmdet - INFO - Epoch [1][200/7330] lr: 3.986e-05, eta: 19:15:56, time: 0.740, data_time: 0.031, memory: 13457, loss_rpn_cls: 0.2463, loss_rpn_bbox: 0.1005, loss_cls: 0.3275, acc: 94.9211, loss_bbox: 0.1667, loss_mask: 0.6852, loss: 1.5262 2024-05-28 09:11:36,043 - mmdet - INFO - Epoch [1][250/7330] lr: 4.985e-05, eta: 19:00:49, time: 0.741, data_time: 0.033, memory: 13457, loss_rpn_cls: 0.2170, loss_rpn_bbox: 0.0986, loss_cls: 0.3400, acc: 94.6245, loss_bbox: 0.1807, loss_mask: 0.6707, loss: 1.5070 2024-05-28 09:12:13,131 - mmdet - INFO - Epoch [1][300/7330] lr: 5.984e-05, eta: 18:50:49, time: 0.742, data_time: 0.032, memory: 13457, loss_rpn_cls: 0.1804, loss_rpn_bbox: 0.0969, loss_cls: 0.4159, acc: 92.9976, loss_bbox: 0.2506, loss_mask: 0.6508, loss: 1.5945 2024-05-28 09:12:56,798 - mmdet - INFO - Epoch [1][350/7330] lr: 6.983e-05, eta: 19:10:53, time: 0.873, data_time: 0.034, memory: 13527, loss_rpn_cls: 0.1480, loss_rpn_bbox: 0.0957, loss_cls: 0.4320, acc: 92.5513, loss_bbox: 0.2676, loss_mask: 0.6309, loss: 1.5741 2024-05-28 09:13:39,712 - mmdet - INFO - Epoch [1][400/7330] lr: 7.982e-05, eta: 19:23:01, time: 0.858, data_time: 0.037, memory: 13532, loss_rpn_cls: 0.1328, loss_rpn_bbox: 0.0909, loss_cls: 0.4530, acc: 91.6326, loss_bbox: 0.3068, loss_mask: 0.5993, loss: 1.5827 2024-05-28 09:14:19,366 - mmdet - INFO - Epoch [1][450/7330] lr: 8.981e-05, eta: 19:21:43, time: 0.793, data_time: 0.031, memory: 13571, loss_rpn_cls: 0.1122, loss_rpn_bbox: 0.0895, loss_cls: 0.4292, acc: 91.7727, loss_bbox: 0.3022, loss_mask: 0.5792, loss: 1.5124 2024-05-28 09:14:56,735 - mmdet - INFO - Epoch [1][500/7330] lr: 9.980e-05, eta: 19:13:53, time: 0.747, data_time: 0.032, memory: 13571, loss_rpn_cls: 0.1196, loss_rpn_bbox: 0.0943, loss_cls: 0.4241, acc: 91.4263, loss_bbox: 0.3138, loss_mask: 0.5531, loss: 1.5048 2024-05-28 09:15:34,169 - mmdet - INFO - Epoch [1][550/7330] lr: 1.000e-04, eta: 19:07:32, time: 0.749, data_time: 0.037, memory: 13571, loss_rpn_cls: 0.1090, loss_rpn_bbox: 0.0914, loss_cls: 0.4127, acc: 91.4255, loss_bbox: 0.3087, loss_mask: 0.5401, loss: 1.4618 2024-05-28 09:16:11,599 - mmdet - INFO - Epoch [1][600/7330] lr: 1.000e-04, eta: 19:02:03, time: 0.748, data_time: 0.035, memory: 13571, loss_rpn_cls: 0.0913, loss_rpn_bbox: 0.0869, loss_cls: 0.3959, acc: 91.2305, loss_bbox: 0.3176, loss_mask: 0.5038, loss: 1.3956 2024-05-28 09:16:49,242 - mmdet - INFO - Epoch [1][650/7330] lr: 1.000e-04, eta: 18:57:57, time: 0.754, data_time: 0.034, memory: 13571, loss_rpn_cls: 0.0976, loss_rpn_bbox: 0.0935, loss_cls: 0.4188, acc: 90.3704, loss_bbox: 0.3452, loss_mask: 0.5036, loss: 1.4587 2024-05-28 09:17:26,874 - mmdet - INFO - Epoch [1][700/7330] lr: 1.000e-04, eta: 18:54:15, time: 0.753, 
data_time: 0.027, memory: 13571, loss_rpn_cls: 0.0945, loss_rpn_bbox: 0.0868, loss_cls: 0.4031, acc: 90.4033, loss_bbox: 0.3479, loss_mask: 0.4927, loss: 1.4251 2024-05-28 09:18:04,723 - mmdet - INFO - Epoch [1][750/7330] lr: 1.000e-04, eta: 18:51:22, time: 0.757, data_time: 0.037, memory: 13571, loss_rpn_cls: 0.0864, loss_rpn_bbox: 0.0886, loss_cls: 0.4099, acc: 90.0339, loss_bbox: 0.3553, loss_mask: 0.4836, loss: 1.4238 2024-05-28 09:18:44,768 - mmdet - INFO - Epoch [1][800/7330] lr: 1.000e-04, eta: 18:52:45, time: 0.801, data_time: 0.030, memory: 13572, loss_rpn_cls: 0.0901, loss_rpn_bbox: 0.0873, loss_cls: 0.3861, acc: 90.2983, loss_bbox: 0.3453, loss_mask: 0.4643, loss: 1.3731 2024-05-28 09:19:30,752 - mmdet - INFO - Epoch [1][850/7330] lr: 1.000e-04, eta: 19:04:04, time: 0.920, data_time: 0.032, memory: 13572, loss_rpn_cls: 0.0822, loss_rpn_bbox: 0.0846, loss_cls: 0.3686, acc: 90.6978, loss_bbox: 0.3330, loss_mask: 0.4582, loss: 1.3265 2024-05-28 09:20:12,494 - mmdet - INFO - Epoch [1][900/7330] lr: 1.000e-04, eta: 19:07:11, time: 0.835, data_time: 0.038, memory: 13572, loss_rpn_cls: 0.0850, loss_rpn_bbox: 0.0862, loss_cls: 0.3720, acc: 90.0920, loss_bbox: 0.3524, loss_mask: 0.4571, loss: 1.3526 2024-05-28 09:20:52,386 - mmdet - INFO - Epoch [1][950/7330] lr: 1.000e-04, eta: 19:07:04, time: 0.798, data_time: 0.031, memory: 13572, loss_rpn_cls: 0.0840, loss_rpn_bbox: 0.0874, loss_cls: 0.3833, acc: 89.6267, loss_bbox: 0.3734, loss_mask: 0.4462, loss: 1.3743 2024-05-28 09:21:30,265 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 09:21:30,265 - mmdet - INFO - Epoch [1][1000/7330] lr: 1.000e-04, eta: 19:03:59, time: 0.758, data_time: 0.033, memory: 13572, loss_rpn_cls: 0.0839, loss_rpn_bbox: 0.0880, loss_cls: 0.3811, acc: 89.6868, loss_bbox: 0.3740, loss_mask: 0.4473, loss: 1.3743 2024-05-28 09:22:08,105 - mmdet - INFO - Epoch [1][1050/7330] lr: 1.000e-04, eta: 19:01:05, time: 0.757, data_time: 0.036, memory: 13574, loss_rpn_cls: 0.0803, loss_rpn_bbox: 0.0832, loss_cls: 0.3672, acc: 89.9321, loss_bbox: 0.3597, loss_mask: 0.4399, loss: 1.3302 2024-05-28 09:22:45,697 - mmdet - INFO - Epoch [1][1100/7330] lr: 1.000e-04, eta: 18:58:04, time: 0.752, data_time: 0.033, memory: 13574, loss_rpn_cls: 0.0757, loss_rpn_bbox: 0.0817, loss_cls: 0.3595, acc: 90.2319, loss_bbox: 0.3550, loss_mask: 0.4291, loss: 1.3011 2024-05-28 09:23:23,474 - mmdet - INFO - Epoch [1][1150/7330] lr: 1.000e-04, eta: 18:55:29, time: 0.756, data_time: 0.034, memory: 13574, loss_rpn_cls: 0.0709, loss_rpn_bbox: 0.0811, loss_cls: 0.3461, acc: 90.3574, loss_bbox: 0.3520, loss_mask: 0.4304, loss: 1.2806 2024-05-28 09:24:05,753 - mmdet - INFO - Epoch [1][1200/7330] lr: 1.000e-04, eta: 18:58:29, time: 0.846, data_time: 0.123, memory: 13574, loss_rpn_cls: 0.0821, loss_rpn_bbox: 0.0841, loss_cls: 0.3574, acc: 90.1855, loss_bbox: 0.3542, loss_mask: 0.4347, loss: 1.3125 2024-05-28 09:24:44,220 - mmdet - INFO - Epoch [1][1250/7330] lr: 1.000e-04, eta: 18:56:48, time: 0.769, data_time: 0.039, memory: 13574, loss_rpn_cls: 0.0784, loss_rpn_bbox: 0.0845, loss_cls: 0.3690, acc: 89.7949, loss_bbox: 0.3634, loss_mask: 0.4273, loss: 1.3226 2024-05-28 09:25:27,891 - mmdet - INFO - Epoch [1][1300/7330] lr: 1.000e-04, eta: 19:00:57, time: 0.873, data_time: 0.030, memory: 13574, loss_rpn_cls: 0.0660, loss_rpn_bbox: 0.0750, loss_cls: 0.3512, acc: 90.0308, loss_bbox: 0.3582, loss_mask: 0.4229, loss: 1.2732 2024-05-28 09:26:09,182 - mmdet - INFO - Epoch [1][1350/7330] lr: 1.000e-04, eta: 19:02:13, time: 
0.826, data_time: 0.030, memory: 13574, loss_rpn_cls: 0.0700, loss_rpn_bbox: 0.0827, loss_cls: 0.3499, acc: 89.9988, loss_bbox: 0.3619, loss_mask: 0.4189, loss: 1.2835 2024-05-28 09:26:49,947 - mmdet - INFO - Epoch [1][1400/7330] lr: 1.000e-04, eta: 19:02:48, time: 0.815, data_time: 0.035, memory: 13574, loss_rpn_cls: 0.0712, loss_rpn_bbox: 0.0820, loss_cls: 0.3466, acc: 90.0439, loss_bbox: 0.3646, loss_mask: 0.4154, loss: 1.2799 2024-05-28 09:27:30,117 - mmdet - INFO - Epoch [1][1450/7330] lr: 1.000e-04, eta: 19:02:41, time: 0.803, data_time: 0.030, memory: 13574, loss_rpn_cls: 0.0690, loss_rpn_bbox: 0.0797, loss_cls: 0.3398, acc: 90.1526, loss_bbox: 0.3517, loss_mask: 0.4061, loss: 1.2463 2024-05-28 09:28:08,191 - mmdet - INFO - Epoch [1][1500/7330] lr: 1.000e-04, eta: 19:00:32, time: 0.761, data_time: 0.027, memory: 13574, loss_rpn_cls: 0.0630, loss_rpn_bbox: 0.0755, loss_cls: 0.3407, acc: 89.8928, loss_bbox: 0.3640, loss_mask: 0.3958, loss: 1.2390 2024-05-28 09:28:46,646 - mmdet - INFO - Epoch [1][1550/7330] lr: 1.000e-04, eta: 18:58:50, time: 0.769, data_time: 0.040, memory: 13574, loss_rpn_cls: 0.0617, loss_rpn_bbox: 0.0723, loss_cls: 0.3275, acc: 90.2681, loss_bbox: 0.3539, loss_mask: 0.3914, loss: 1.2068 2024-05-28 09:29:24,777 - mmdet - INFO - Epoch [1][1600/7330] lr: 1.000e-04, eta: 18:56:55, time: 0.763, data_time: 0.035, memory: 13574, loss_rpn_cls: 0.0665, loss_rpn_bbox: 0.0758, loss_cls: 0.3533, acc: 89.4170, loss_bbox: 0.3801, loss_mask: 0.4001, loss: 1.2758 2024-05-28 09:30:02,664 - mmdet - INFO - Epoch [1][1650/7330] lr: 1.000e-04, eta: 18:54:51, time: 0.758, data_time: 0.037, memory: 13574, loss_rpn_cls: 0.0671, loss_rpn_bbox: 0.0781, loss_cls: 0.3283, acc: 90.5286, loss_bbox: 0.3408, loss_mask: 0.3923, loss: 1.2066 2024-05-28 09:30:40,586 - mmdet - INFO - Epoch [1][1700/7330] lr: 1.000e-04, eta: 18:52:55, time: 0.758, data_time: 0.029, memory: 13574, loss_rpn_cls: 0.0615, loss_rpn_bbox: 0.0744, loss_cls: 0.3354, acc: 89.8831, loss_bbox: 0.3626, loss_mask: 0.3919, loss: 1.2258 2024-05-28 09:31:24,287 - mmdet - INFO - Epoch [1][1750/7330] lr: 1.000e-04, eta: 18:55:47, time: 0.874, data_time: 0.034, memory: 13574, loss_rpn_cls: 0.0618, loss_rpn_bbox: 0.0766, loss_cls: 0.3384, acc: 89.7371, loss_bbox: 0.3711, loss_mask: 0.3944, loss: 1.2422 2024-05-28 09:32:02,745 - mmdet - INFO - Epoch [1][1800/7330] lr: 1.000e-04, eta: 18:54:16, time: 0.769, data_time: 0.035, memory: 13574, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0711, loss_cls: 0.3295, acc: 89.9695, loss_bbox: 0.3614, loss_mask: 0.3927, loss: 1.2127 2024-05-28 09:32:46,029 - mmdet - INFO - Epoch [1][1850/7330] lr: 1.000e-04, eta: 18:56:33, time: 0.866, data_time: 0.033, memory: 13574, loss_rpn_cls: 0.0632, loss_rpn_bbox: 0.0767, loss_cls: 0.3271, acc: 90.2122, loss_bbox: 0.3543, loss_mask: 0.3862, loss: 1.2075 2024-05-28 09:33:28,221 - mmdet - INFO - Epoch [1][1900/7330] lr: 1.000e-04, eta: 18:57:51, time: 0.844, data_time: 0.037, memory: 13574, loss_rpn_cls: 0.0670, loss_rpn_bbox: 0.0797, loss_cls: 0.3326, acc: 90.0413, loss_bbox: 0.3579, loss_mask: 0.3853, loss: 1.2226 2024-05-28 09:34:06,535 - mmdet - INFO - Epoch [1][1950/7330] lr: 1.000e-04, eta: 18:56:12, time: 0.766, data_time: 0.027, memory: 13574, loss_rpn_cls: 0.0657, loss_rpn_bbox: 0.0752, loss_cls: 0.3391, acc: 89.7700, loss_bbox: 0.3653, loss_mask: 0.3871, loss: 1.2325 2024-05-28 09:34:45,034 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 09:34:45,034 - mmdet - INFO - Epoch [1][2000/7330] lr: 1.000e-04, eta: 
18:54:43, time: 0.770, data_time: 0.032, memory: 13574, loss_rpn_cls: 0.0634, loss_rpn_bbox: 0.0772, loss_cls: 0.3292, acc: 89.9814, loss_bbox: 0.3600, loss_mask: 0.3799, loss: 1.2096 2024-05-28 09:35:23,122 - mmdet - INFO - Epoch [1][2050/7330] lr: 1.000e-04, eta: 18:53:00, time: 0.762, data_time: 0.033, memory: 13574, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0740, loss_cls: 0.3172, acc: 90.3621, loss_bbox: 0.3476, loss_mask: 0.3782, loss: 1.1750 2024-05-28 09:36:01,885 - mmdet - INFO - Epoch [1][2100/7330] lr: 1.000e-04, eta: 18:51:48, time: 0.775, data_time: 0.034, memory: 13574, loss_rpn_cls: 0.0641, loss_rpn_bbox: 0.0769, loss_cls: 0.3289, acc: 89.7634, loss_bbox: 0.3651, loss_mask: 0.3874, loss: 1.2223 2024-05-28 09:36:39,962 - mmdet - INFO - Epoch [1][2150/7330] lr: 1.000e-04, eta: 18:50:10, time: 0.762, data_time: 0.029, memory: 13574, loss_rpn_cls: 0.0610, loss_rpn_bbox: 0.0710, loss_cls: 0.3171, acc: 90.3481, loss_bbox: 0.3432, loss_mask: 0.3709, loss: 1.1633 2024-05-28 09:37:20,584 - mmdet - INFO - Epoch [1][2200/7330] lr: 1.000e-04, eta: 18:50:14, time: 0.812, data_time: 0.034, memory: 13574, loss_rpn_cls: 0.0628, loss_rpn_bbox: 0.0770, loss_cls: 0.3286, acc: 89.7869, loss_bbox: 0.3640, loss_mask: 0.3751, loss: 1.2074 2024-05-28 09:37:58,801 - mmdet - INFO - Epoch [1][2250/7330] lr: 1.000e-04, eta: 18:48:44, time: 0.764, data_time: 0.029, memory: 13574, loss_rpn_cls: 0.0543, loss_rpn_bbox: 0.0736, loss_cls: 0.3336, acc: 89.7229, loss_bbox: 0.3656, loss_mask: 0.3789, loss: 1.2059 2024-05-28 09:38:38,623 - mmdet - INFO - Epoch [1][2300/7330] lr: 1.000e-04, eta: 18:48:16, time: 0.796, data_time: 0.043, memory: 13574, loss_rpn_cls: 0.0566, loss_rpn_bbox: 0.0729, loss_cls: 0.3113, acc: 90.3672, loss_bbox: 0.3477, loss_mask: 0.3679, loss: 1.1564 2024-05-28 09:39:20,175 - mmdet - INFO - Epoch [1][2350/7330] lr: 1.000e-04, eta: 18:48:51, time: 0.831, data_time: 0.034, memory: 13575, loss_rpn_cls: 0.0572, loss_rpn_bbox: 0.0748, loss_cls: 0.3179, acc: 90.1912, loss_bbox: 0.3520, loss_mask: 0.3650, loss: 1.1670 2024-05-28 09:40:04,377 - mmdet - INFO - Epoch [1][2400/7330] lr: 1.000e-04, eta: 18:50:56, time: 0.884, data_time: 0.034, memory: 13575, loss_rpn_cls: 0.0538, loss_rpn_bbox: 0.0711, loss_cls: 0.3178, acc: 90.1760, loss_bbox: 0.3589, loss_mask: 0.3693, loss: 1.1709 2024-05-28 09:40:42,562 - mmdet - INFO - Epoch [1][2450/7330] lr: 1.000e-04, eta: 18:49:26, time: 0.764, data_time: 0.037, memory: 13575, loss_rpn_cls: 0.0543, loss_rpn_bbox: 0.0674, loss_cls: 0.3139, acc: 90.0923, loss_bbox: 0.3521, loss_mask: 0.3709, loss: 1.1586 2024-05-28 09:41:20,101 - mmdet - INFO - Epoch [1][2500/7330] lr: 1.000e-04, eta: 18:47:35, time: 0.751, data_time: 0.026, memory: 13575, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0678, loss_cls: 0.2973, acc: 90.8003, loss_bbox: 0.3289, loss_mask: 0.3498, loss: 1.1005 2024-05-28 09:41:58,428 - mmdet - INFO - Epoch [1][2550/7330] lr: 1.000e-04, eta: 18:46:13, time: 0.767, data_time: 0.033, memory: 13575, loss_rpn_cls: 0.0579, loss_rpn_bbox: 0.0715, loss_cls: 0.3086, acc: 90.2271, loss_bbox: 0.3486, loss_mask: 0.3609, loss: 1.1475 2024-05-28 09:42:37,215 - mmdet - INFO - Epoch [1][2600/7330] lr: 1.000e-04, eta: 18:45:08, time: 0.776, data_time: 0.039, memory: 13575, loss_rpn_cls: 0.0565, loss_rpn_bbox: 0.0729, loss_cls: 0.3022, acc: 90.5564, loss_bbox: 0.3406, loss_mask: 0.3554, loss: 1.1277 2024-05-28 09:43:17,301 - mmdet - INFO - Epoch [1][2650/7330] lr: 1.000e-04, eta: 18:44:46, time: 0.802, data_time: 0.031, memory: 13575, loss_rpn_cls: 0.0541, loss_rpn_bbox: 
0.0698, loss_cls: 0.3064, acc: 90.2615, loss_bbox: 0.3529, loss_mask: 0.3608, loss: 1.1439 2024-05-28 09:43:55,469 - mmdet - INFO - Epoch [1][2700/7330] lr: 1.000e-04, eta: 18:43:23, time: 0.763, data_time: 0.041, memory: 13575, loss_rpn_cls: 0.0541, loss_rpn_bbox: 0.0721, loss_cls: 0.3084, acc: 90.2036, loss_bbox: 0.3507, loss_mask: 0.3582, loss: 1.1435 2024-05-28 09:44:37,601 - mmdet - INFO - Epoch [1][2750/7330] lr: 1.000e-04, eta: 18:44:04, time: 0.843, data_time: 0.032, memory: 13575, loss_rpn_cls: 0.0548, loss_rpn_bbox: 0.0709, loss_cls: 0.3088, acc: 90.1948, loss_bbox: 0.3534, loss_mask: 0.3518, loss: 1.1397 2024-05-28 09:45:17,794 - mmdet - INFO - Epoch [1][2800/7330] lr: 1.000e-04, eta: 18:43:43, time: 0.804, data_time: 0.028, memory: 13575, loss_rpn_cls: 0.0597, loss_rpn_bbox: 0.0744, loss_cls: 0.3095, acc: 90.3401, loss_bbox: 0.3495, loss_mask: 0.3538, loss: 1.1470 2024-05-28 09:45:59,878 - mmdet - INFO - Epoch [1][2850/7330] lr: 1.000e-04, eta: 18:44:18, time: 0.842, data_time: 0.032, memory: 13575, loss_rpn_cls: 0.0548, loss_rpn_bbox: 0.0704, loss_cls: 0.3058, acc: 90.6455, loss_bbox: 0.3377, loss_mask: 0.3558, loss: 1.1245 2024-05-28 09:46:41,398 - mmdet - INFO - Epoch [1][2900/7330] lr: 1.000e-04, eta: 18:44:34, time: 0.830, data_time: 0.028, memory: 13575, loss_rpn_cls: 0.0542, loss_rpn_bbox: 0.0708, loss_cls: 0.3031, acc: 90.4438, loss_bbox: 0.3382, loss_mask: 0.3479, loss: 1.1142 2024-05-28 09:47:19,623 - mmdet - INFO - Epoch [1][2950/7330] lr: 1.000e-04, eta: 18:43:13, time: 0.765, data_time: 0.037, memory: 13575, loss_rpn_cls: 0.0550, loss_rpn_bbox: 0.0738, loss_cls: 0.3029, acc: 90.4861, loss_bbox: 0.3389, loss_mask: 0.3588, loss: 1.1294 2024-05-28 09:47:57,786 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 09:47:57,786 - mmdet - INFO - Epoch [1][3000/7330] lr: 1.000e-04, eta: 18:41:51, time: 0.763, data_time: 0.036, memory: 13575, loss_rpn_cls: 0.0581, loss_rpn_bbox: 0.0735, loss_cls: 0.3088, acc: 90.4141, loss_bbox: 0.3483, loss_mask: 0.3508, loss: 1.1394 2024-05-28 09:48:36,108 - mmdet - INFO - Epoch [1][3050/7330] lr: 1.000e-04, eta: 18:40:36, time: 0.766, data_time: 0.031, memory: 13575, loss_rpn_cls: 0.0520, loss_rpn_bbox: 0.0671, loss_cls: 0.2938, acc: 90.7620, loss_bbox: 0.3351, loss_mask: 0.3460, loss: 1.0940 2024-05-28 09:49:16,074 - mmdet - INFO - Epoch [1][3100/7330] lr: 1.000e-04, eta: 18:40:06, time: 0.799, data_time: 0.031, memory: 13575, loss_rpn_cls: 0.0501, loss_rpn_bbox: 0.0655, loss_cls: 0.2877, acc: 90.9666, loss_bbox: 0.3227, loss_mask: 0.3433, loss: 1.0692 2024-05-28 09:49:54,637 - mmdet - INFO - Epoch [1][3150/7330] lr: 1.000e-04, eta: 18:38:59, time: 0.771, data_time: 0.030, memory: 13575, loss_rpn_cls: 0.0509, loss_rpn_bbox: 0.0696, loss_cls: 0.3176, acc: 90.0256, loss_bbox: 0.3545, loss_mask: 0.3453, loss: 1.1380 2024-05-28 09:50:32,744 - mmdet - INFO - Epoch [1][3200/7330] lr: 1.000e-04, eta: 18:37:40, time: 0.762, data_time: 0.033, memory: 13575, loss_rpn_cls: 0.0481, loss_rpn_bbox: 0.0653, loss_cls: 0.3025, acc: 90.4297, loss_bbox: 0.3348, loss_mask: 0.3472, loss: 1.0979 2024-05-28 09:51:14,634 - mmdet - INFO - Epoch [1][3250/7330] lr: 1.000e-04, eta: 18:38:01, time: 0.838, data_time: 0.031, memory: 13575, loss_rpn_cls: 0.0530, loss_rpn_bbox: 0.0721, loss_cls: 0.3105, acc: 90.0923, loss_bbox: 0.3514, loss_mask: 0.3509, loss: 1.1379 2024-05-28 09:51:55,742 - mmdet - INFO - Epoch [1][3300/7330] lr: 1.000e-04, eta: 18:38:00, time: 0.822, data_time: 0.035, memory: 13575, loss_rpn_cls: 0.0507, 
loss_rpn_bbox: 0.0663, loss_cls: 0.3020, acc: 90.4155, loss_bbox: 0.3444, loss_mask: 0.3359, loss: 1.0994 2024-05-28 09:52:37,931 - mmdet - INFO - Epoch [1][3350/7330] lr: 1.000e-04, eta: 18:38:26, time: 0.844, data_time: 0.034, memory: 13575, loss_rpn_cls: 0.0486, loss_rpn_bbox: 0.0673, loss_cls: 0.2944, acc: 90.5718, loss_bbox: 0.3378, loss_mask: 0.3375, loss: 1.0856 2024-05-28 09:53:16,237 - mmdet - INFO - Epoch [1][3400/7330] lr: 1.000e-04, eta: 18:37:13, time: 0.766, data_time: 0.031, memory: 13575, loss_rpn_cls: 0.0527, loss_rpn_bbox: 0.0685, loss_cls: 0.2954, acc: 90.5278, loss_bbox: 0.3427, loss_mask: 0.3500, loss: 1.1092 2024-05-28 09:53:56,725 - mmdet - INFO - Epoch [1][3450/7330] lr: 1.000e-04, eta: 18:36:54, time: 0.810, data_time: 0.029, memory: 13576, loss_rpn_cls: 0.0485, loss_rpn_bbox: 0.0674, loss_cls: 0.2922, acc: 90.4919, loss_bbox: 0.3402, loss_mask: 0.3413, loss: 1.0896 2024-05-28 09:54:35,224 - mmdet - INFO - Epoch [1][3500/7330] lr: 1.000e-04, eta: 18:35:46, time: 0.770, data_time: 0.041, memory: 13576, loss_rpn_cls: 0.0530, loss_rpn_bbox: 0.0658, loss_cls: 0.3085, acc: 90.0457, loss_bbox: 0.3543, loss_mask: 0.3415, loss: 1.1232 2024-05-28 09:55:15,100 - mmdet - INFO - Epoch [1][3550/7330] lr: 1.000e-04, eta: 18:35:13, time: 0.797, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0504, loss_rpn_bbox: 0.0646, loss_cls: 0.3059, acc: 90.2510, loss_bbox: 0.3459, loss_mask: 0.3396, loss: 1.1064 2024-05-28 09:55:52,926 - mmdet - INFO - Epoch [1][3600/7330] lr: 1.000e-04, eta: 18:33:50, time: 0.756, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0525, loss_rpn_bbox: 0.0715, loss_cls: 0.3007, acc: 90.4890, loss_bbox: 0.3404, loss_mask: 0.3429, loss: 1.1080 2024-05-28 09:56:30,985 - mmdet - INFO - Epoch [1][3650/7330] lr: 1.000e-04, eta: 18:32:35, time: 0.761, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0521, loss_rpn_bbox: 0.0710, loss_cls: 0.3021, acc: 90.2705, loss_bbox: 0.3524, loss_mask: 0.3537, loss: 1.1313 2024-05-28 09:57:13,090 - mmdet - INFO - Epoch [1][3700/7330] lr: 1.000e-04, eta: 18:32:53, time: 0.842, data_time: 0.028, memory: 13576, loss_rpn_cls: 0.0521, loss_rpn_bbox: 0.0677, loss_cls: 0.3119, acc: 90.2188, loss_bbox: 0.3491, loss_mask: 0.3391, loss: 1.1200 2024-05-28 09:57:53,755 - mmdet - INFO - Epoch [1][3750/7330] lr: 1.000e-04, eta: 18:32:37, time: 0.813, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0501, loss_rpn_bbox: 0.0630, loss_cls: 0.2860, acc: 90.9268, loss_bbox: 0.3284, loss_mask: 0.3252, loss: 1.0527 2024-05-28 09:58:37,424 - mmdet - INFO - Epoch [1][3800/7330] lr: 1.000e-04, eta: 18:33:26, time: 0.873, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0464, loss_rpn_bbox: 0.0649, loss_cls: 0.2906, acc: 90.7017, loss_bbox: 0.3331, loss_mask: 0.3405, loss: 1.0754 2024-05-28 09:59:15,780 - mmdet - INFO - Epoch [1][3850/7330] lr: 1.000e-04, eta: 18:32:17, time: 0.767, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0467, loss_rpn_bbox: 0.0670, loss_cls: 0.2924, acc: 90.6609, loss_bbox: 0.3364, loss_mask: 0.3417, loss: 1.0842 2024-05-28 09:59:56,818 - mmdet - INFO - Epoch [1][3900/7330] lr: 1.000e-04, eta: 18:32:07, time: 0.821, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0470, loss_rpn_bbox: 0.0643, loss_cls: 0.2854, acc: 90.8882, loss_bbox: 0.3297, loss_mask: 0.3337, loss: 1.0600 2024-05-28 10:00:37,037 - mmdet - INFO - Epoch [1][3950/7330] lr: 1.000e-04, eta: 18:31:39, time: 0.804, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0491, loss_rpn_bbox: 0.0683, loss_cls: 0.2867, acc: 90.6863, loss_bbox: 0.3291, loss_mask: 0.3321, 
loss: 1.0653 2024-05-28 10:01:15,439 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 10:01:15,440 - mmdet - INFO - Epoch [1][4000/7330] lr: 1.000e-04, eta: 18:30:32, time: 0.768, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0504, loss_rpn_bbox: 0.0623, loss_cls: 0.2933, acc: 90.6628, loss_bbox: 0.3305, loss_mask: 0.3382, loss: 1.0746 2024-05-28 10:01:54,104 - mmdet - INFO - Epoch [1][4050/7330] lr: 1.000e-04, eta: 18:29:31, time: 0.773, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0547, loss_rpn_bbox: 0.0713, loss_cls: 0.3080, acc: 89.9231, loss_bbox: 0.3547, loss_mask: 0.3399, loss: 1.1285 2024-05-28 10:02:34,583 - mmdet - INFO - Epoch [1][4100/7330] lr: 1.000e-04, eta: 18:29:08, time: 0.810, data_time: 0.029, memory: 13576, loss_rpn_cls: 0.0521, loss_rpn_bbox: 0.0679, loss_cls: 0.2943, acc: 90.5510, loss_bbox: 0.3379, loss_mask: 0.3332, loss: 1.0854 2024-05-28 10:03:14,457 - mmdet - INFO - Epoch [1][4150/7330] lr: 1.000e-04, eta: 18:28:32, time: 0.797, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0461, loss_rpn_bbox: 0.0620, loss_cls: 0.2822, acc: 90.8865, loss_bbox: 0.3327, loss_mask: 0.3278, loss: 1.0507 2024-05-28 10:03:52,544 - mmdet - INFO - Epoch [1][4200/7330] lr: 1.000e-04, eta: 18:27:21, time: 0.762, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0672, loss_cls: 0.2960, acc: 90.5044, loss_bbox: 0.3391, loss_mask: 0.3376, loss: 1.0895 2024-05-28 10:04:36,830 - mmdet - INFO - Epoch [1][4250/7330] lr: 1.000e-04, eta: 18:28:12, time: 0.886, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0485, loss_rpn_bbox: 0.0655, loss_cls: 0.2868, acc: 90.6477, loss_bbox: 0.3257, loss_mask: 0.3275, loss: 1.0540 2024-05-28 10:05:15,420 - mmdet - INFO - Epoch [1][4300/7330] lr: 1.000e-04, eta: 18:27:11, time: 0.772, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0487, loss_rpn_bbox: 0.0705, loss_cls: 0.3093, acc: 90.1780, loss_bbox: 0.3494, loss_mask: 0.3444, loss: 1.1223 2024-05-28 10:05:56,148 - mmdet - INFO - Epoch [1][4350/7330] lr: 1.000e-04, eta: 18:26:51, time: 0.815, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0439, loss_rpn_bbox: 0.0602, loss_cls: 0.2860, acc: 90.8416, loss_bbox: 0.3234, loss_mask: 0.3248, loss: 1.0382 2024-05-28 10:06:36,601 - mmdet - INFO - Epoch [1][4400/7330] lr: 1.000e-04, eta: 18:26:25, time: 0.809, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0491, loss_rpn_bbox: 0.0636, loss_cls: 0.2908, acc: 90.6362, loss_bbox: 0.3344, loss_mask: 0.3299, loss: 1.0678 2024-05-28 10:07:14,643 - mmdet - INFO - Epoch [1][4450/7330] lr: 1.000e-04, eta: 18:25:13, time: 0.761, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0471, loss_rpn_bbox: 0.0659, loss_cls: 0.2845, acc: 90.6587, loss_bbox: 0.3356, loss_mask: 0.3306, loss: 1.0637 2024-05-28 10:07:54,520 - mmdet - INFO - Epoch [1][4500/7330] lr: 1.000e-04, eta: 18:24:37, time: 0.798, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0464, loss_rpn_bbox: 0.0636, loss_cls: 0.2670, acc: 91.2075, loss_bbox: 0.3195, loss_mask: 0.3180, loss: 1.0146 2024-05-28 10:08:32,878 - mmdet - INFO - Epoch [1][4550/7330] lr: 1.000e-04, eta: 18:23:33, time: 0.767, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0497, loss_rpn_bbox: 0.0653, loss_cls: 0.2850, acc: 90.7092, loss_bbox: 0.3335, loss_mask: 0.3271, loss: 1.0606 2024-05-28 10:09:15,412 - mmdet - INFO - Epoch [1][4600/7330] lr: 1.000e-04, eta: 18:23:44, time: 0.851, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0628, loss_cls: 0.2817, acc: 90.7681, loss_bbox: 0.3294, 
loss_mask: 0.3247, loss: 1.0437 2024-05-28 10:09:54,068 - mmdet - INFO - Epoch [1][4650/7330] lr: 1.000e-04, eta: 18:22:45, time: 0.773, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0480, loss_rpn_bbox: 0.0652, loss_cls: 0.2918, acc: 90.7124, loss_bbox: 0.3353, loss_mask: 0.3297, loss: 1.0700 2024-05-28 10:10:34,687 - mmdet - INFO - Epoch [1][4700/7330] lr: 1.000e-04, eta: 18:22:22, time: 0.812, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0614, loss_cls: 0.2847, acc: 90.7295, loss_bbox: 0.3338, loss_mask: 0.3265, loss: 1.0495 2024-05-28 10:11:15,396 - mmdet - INFO - Epoch [1][4750/7330] lr: 1.000e-04, eta: 18:21:59, time: 0.814, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0619, loss_cls: 0.2917, acc: 90.6794, loss_bbox: 0.3288, loss_mask: 0.3216, loss: 1.0498 2024-05-28 10:11:58,116 - mmdet - INFO - Epoch [1][4800/7330] lr: 1.000e-04, eta: 18:22:12, time: 0.854, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0459, loss_rpn_bbox: 0.0627, loss_cls: 0.2885, acc: 90.5325, loss_bbox: 0.3373, loss_mask: 0.3265, loss: 1.0609 2024-05-28 10:12:36,426 - mmdet - INFO - Epoch [1][4850/7330] lr: 1.000e-04, eta: 18:21:07, time: 0.766, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0484, loss_rpn_bbox: 0.0655, loss_cls: 0.2895, acc: 90.5566, loss_bbox: 0.3358, loss_mask: 0.3299, loss: 1.0690 2024-05-28 10:13:14,521 - mmdet - INFO - Epoch [1][4900/7330] lr: 1.000e-04, eta: 18:19:59, time: 0.762, data_time: 0.027, memory: 13576, loss_rpn_cls: 0.0470, loss_rpn_bbox: 0.0617, loss_cls: 0.2841, acc: 90.8604, loss_bbox: 0.3281, loss_mask: 0.3290, loss: 1.0500 2024-05-28 10:13:57,081 - mmdet - INFO - Epoch [1][4950/7330] lr: 1.000e-04, eta: 18:20:07, time: 0.851, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0627, loss_cls: 0.2772, acc: 90.9209, loss_bbox: 0.3246, loss_mask: 0.3301, loss: 1.0404 2024-05-28 10:14:35,838 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 10:14:35,838 - mmdet - INFO - Epoch [1][5000/7330] lr: 1.000e-04, eta: 18:19:11, time: 0.775, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0622, loss_cls: 0.2843, acc: 90.7551, loss_bbox: 0.3292, loss_mask: 0.3249, loss: 1.0464 2024-05-28 10:15:16,251 - mmdet - INFO - Epoch [1][5050/7330] lr: 1.000e-04, eta: 18:18:42, time: 0.808, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0496, loss_rpn_bbox: 0.0655, loss_cls: 0.2789, acc: 90.8296, loss_bbox: 0.3289, loss_mask: 0.3225, loss: 1.0453 2024-05-28 10:15:54,837 - mmdet - INFO - Epoch [1][5100/7330] lr: 1.000e-04, eta: 18:17:43, time: 0.772, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0446, loss_rpn_bbox: 0.0674, loss_cls: 0.2849, acc: 90.6177, loss_bbox: 0.3375, loss_mask: 0.3206, loss: 1.0550 2024-05-28 10:16:36,367 - mmdet - INFO - Epoch [1][5150/7330] lr: 1.000e-04, eta: 18:17:32, time: 0.831, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0446, loss_rpn_bbox: 0.0660, loss_cls: 0.2864, acc: 90.7095, loss_bbox: 0.3274, loss_mask: 0.3253, loss: 1.0497 2024-05-28 10:17:16,661 - mmdet - INFO - Epoch [1][5200/7330] lr: 1.000e-04, eta: 18:17:01, time: 0.806, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0648, loss_cls: 0.2832, acc: 90.8652, loss_bbox: 0.3272, loss_mask: 0.3232, loss: 1.0428 2024-05-28 10:17:58,654 - mmdet - INFO - Epoch [1][5250/7330] lr: 1.000e-04, eta: 18:16:56, time: 0.840, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0439, loss_rpn_bbox: 0.0591, loss_cls: 0.2777, acc: 91.1411, 
loss_bbox: 0.3182, loss_mask: 0.3188, loss: 1.0177 2024-05-28 10:18:38,961 - mmdet - INFO - Epoch [1][5300/7330] lr: 1.000e-04, eta: 18:16:24, time: 0.806, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0498, loss_rpn_bbox: 0.0676, loss_cls: 0.2839, acc: 90.6965, loss_bbox: 0.3343, loss_mask: 0.3238, loss: 1.0595 2024-05-28 10:19:17,453 - mmdet - INFO - Epoch [1][5350/7330] lr: 1.000e-04, eta: 18:15:24, time: 0.770, data_time: 0.043, memory: 13576, loss_rpn_cls: 0.0502, loss_rpn_bbox: 0.0692, loss_cls: 0.2960, acc: 90.4148, loss_bbox: 0.3375, loss_mask: 0.3254, loss: 1.0784 2024-05-28 10:19:57,606 - mmdet - INFO - Epoch [1][5400/7330] lr: 1.000e-04, eta: 18:14:50, time: 0.803, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0457, loss_rpn_bbox: 0.0658, loss_cls: 0.2752, acc: 91.0146, loss_bbox: 0.3183, loss_mask: 0.3229, loss: 1.0279 2024-05-28 10:20:35,928 - mmdet - INFO - Epoch [1][5450/7330] lr: 1.000e-04, eta: 18:13:48, time: 0.766, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0438, loss_rpn_bbox: 0.0632, loss_cls: 0.2745, acc: 91.0183, loss_bbox: 0.3259, loss_mask: 0.3199, loss: 1.0273 2024-05-28 10:21:16,277 - mmdet - INFO - Epoch [1][5500/7330] lr: 1.000e-04, eta: 18:13:17, time: 0.807, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0420, loss_rpn_bbox: 0.0627, loss_cls: 0.2796, acc: 90.7244, loss_bbox: 0.3300, loss_mask: 0.3255, loss: 1.0398 2024-05-28 10:21:54,934 - mmdet - INFO - Epoch [1][5550/7330] lr: 1.000e-04, eta: 18:12:21, time: 0.773, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0640, loss_cls: 0.2840, acc: 90.5791, loss_bbox: 0.3294, loss_mask: 0.3216, loss: 1.0448 2024-05-28 10:22:34,790 - mmdet - INFO - Epoch [1][5600/7330] lr: 1.000e-04, eta: 18:11:42, time: 0.797, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0400, loss_rpn_bbox: 0.0599, loss_cls: 0.2639, acc: 91.2915, loss_bbox: 0.3138, loss_mask: 0.3145, loss: 0.9921 2024-05-28 10:23:16,717 - mmdet - INFO - Epoch [1][5650/7330] lr: 1.000e-04, eta: 18:11:34, time: 0.839, data_time: 0.026, memory: 13576, loss_rpn_cls: 0.0399, loss_rpn_bbox: 0.0582, loss_cls: 0.2619, acc: 91.3123, loss_bbox: 0.3126, loss_mask: 0.3202, loss: 0.9928 2024-05-28 10:23:58,933 - mmdet - INFO - Epoch [1][5700/7330] lr: 1.000e-04, eta: 18:11:29, time: 0.844, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0443, loss_rpn_bbox: 0.0621, loss_cls: 0.2784, acc: 90.7983, loss_bbox: 0.3330, loss_mask: 0.3214, loss: 1.0392 2024-05-28 10:24:39,237 - mmdet - INFO - Epoch [1][5750/7330] lr: 1.000e-04, eta: 18:10:57, time: 0.806, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0438, loss_rpn_bbox: 0.0651, loss_cls: 0.2896, acc: 90.5198, loss_bbox: 0.3380, loss_mask: 0.3223, loss: 1.0589 2024-05-28 10:25:17,483 - mmdet - INFO - Epoch [1][5800/7330] lr: 1.000e-04, eta: 18:09:55, time: 0.765, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0422, loss_rpn_bbox: 0.0594, loss_cls: 0.2718, acc: 91.1211, loss_bbox: 0.3173, loss_mask: 0.3200, loss: 1.0108 2024-05-28 10:25:55,980 - mmdet - INFO - Epoch [1][5850/7330] lr: 1.000e-04, eta: 18:08:57, time: 0.770, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0454, loss_rpn_bbox: 0.0622, loss_cls: 0.2755, acc: 91.0251, loss_bbox: 0.3255, loss_mask: 0.3234, loss: 1.0321 2024-05-28 10:26:36,311 - mmdet - INFO - Epoch [1][5900/7330] lr: 1.000e-04, eta: 18:08:24, time: 0.807, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0484, loss_rpn_bbox: 0.0669, loss_cls: 0.2742, acc: 90.9209, loss_bbox: 0.3290, loss_mask: 0.3237, loss: 1.0421 2024-05-28 10:27:14,824 - mmdet - INFO - 
Epoch [1][5950/7330] lr: 1.000e-04, eta: 18:07:27, time: 0.770, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0445, loss_rpn_bbox: 0.0628, loss_cls: 0.2700, acc: 90.9058, loss_bbox: 0.3271, loss_mask: 0.3173, loss: 1.0216 2024-05-28 10:27:53,159 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 10:27:53,159 - mmdet - INFO - Epoch [1][6000/7330] lr: 1.000e-04, eta: 18:06:27, time: 0.766, data_time: 0.028, memory: 13576, loss_rpn_cls: 0.0458, loss_rpn_bbox: 0.0617, loss_cls: 0.2776, acc: 90.9329, loss_bbox: 0.3243, loss_mask: 0.3174, loss: 1.0269 2024-05-28 10:28:34,900 - mmdet - INFO - Epoch [1][6050/7330] lr: 1.000e-04, eta: 18:06:14, time: 0.835, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0447, loss_rpn_bbox: 0.0605, loss_cls: 0.2897, acc: 90.5142, loss_bbox: 0.3378, loss_mask: 0.3164, loss: 1.0492 2024-05-28 10:29:13,296 - mmdet - INFO - Epoch [1][6100/7330] lr: 1.000e-04, eta: 18:05:16, time: 0.768, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0445, loss_rpn_bbox: 0.0601, loss_cls: 0.2747, acc: 90.9243, loss_bbox: 0.3179, loss_mask: 0.3148, loss: 1.0119 2024-05-28 10:29:53,994 - mmdet - INFO - Epoch [1][6150/7330] lr: 1.000e-04, eta: 18:04:49, time: 0.814, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0431, loss_rpn_bbox: 0.0603, loss_cls: 0.2708, acc: 91.1536, loss_bbox: 0.3149, loss_mask: 0.3146, loss: 1.0037 2024-05-28 10:30:38,279 - mmdet - INFO - Epoch [1][6200/7330] lr: 1.000e-04, eta: 18:05:08, time: 0.886, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0469, loss_rpn_bbox: 0.0639, loss_cls: 0.2723, acc: 91.0347, loss_bbox: 0.3219, loss_mask: 0.3153, loss: 1.0203 2024-05-28 10:31:16,597 - mmdet - INFO - Epoch [1][6250/7330] lr: 1.000e-04, eta: 18:04:09, time: 0.766, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0431, loss_rpn_bbox: 0.0579, loss_cls: 0.2683, acc: 91.3225, loss_bbox: 0.3094, loss_mask: 0.3105, loss: 0.9892 2024-05-28 10:31:57,047 - mmdet - INFO - Epoch [1][6300/7330] lr: 1.000e-04, eta: 18:03:37, time: 0.809, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0433, loss_rpn_bbox: 0.0628, loss_cls: 0.2790, acc: 90.9861, loss_bbox: 0.3229, loss_mask: 0.3228, loss: 1.0308 2024-05-28 10:32:35,288 - mmdet - INFO - Epoch [1][6350/7330] lr: 1.000e-04, eta: 18:02:37, time: 0.765, data_time: 0.027, memory: 13576, loss_rpn_cls: 0.0448, loss_rpn_bbox: 0.0608, loss_cls: 0.2801, acc: 90.7449, loss_bbox: 0.3268, loss_mask: 0.3283, loss: 1.0409 2024-05-28 10:33:13,473 - mmdet - INFO - Epoch [1][6400/7330] lr: 1.000e-04, eta: 18:01:37, time: 0.764, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0605, loss_cls: 0.2620, acc: 91.1865, loss_bbox: 0.3196, loss_mask: 0.3108, loss: 0.9981 2024-05-28 10:33:53,747 - mmdet - INFO - Epoch [1][6450/7330] lr: 1.000e-04, eta: 18:01:03, time: 0.805, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0425, loss_rpn_bbox: 0.0614, loss_cls: 0.2718, acc: 91.0977, loss_bbox: 0.3167, loss_mask: 0.3095, loss: 1.0019 2024-05-28 10:34:34,121 - mmdet - INFO - Epoch [1][6500/7330] lr: 1.000e-04, eta: 18:00:31, time: 0.807, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0613, loss_cls: 0.2747, acc: 90.8367, loss_bbox: 0.3245, loss_mask: 0.3142, loss: 1.0157 2024-05-28 10:35:14,352 - mmdet - INFO - Epoch [1][6550/7330] lr: 1.000e-04, eta: 17:59:57, time: 0.805, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0452, loss_rpn_bbox: 0.0656, loss_cls: 0.2732, acc: 90.8516, loss_bbox: 0.3306, loss_mask: 0.3210, loss: 1.0358 2024-05-28 10:35:52,632 - 
mmdet - INFO - Epoch [1][6600/7330] lr: 1.000e-04, eta: 17:58:58, time: 0.766, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0588, loss_cls: 0.2783, acc: 90.8162, loss_bbox: 0.3272, loss_mask: 0.3121, loss: 1.0165 2024-05-28 10:36:36,811 - mmdet - INFO - Epoch [1][6650/7330] lr: 1.000e-04, eta: 17:59:12, time: 0.884, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0459, loss_rpn_bbox: 0.0625, loss_cls: 0.2745, acc: 90.9492, loss_bbox: 0.3194, loss_mask: 0.3084, loss: 1.0107 2024-05-28 10:37:16,755 - mmdet - INFO - Epoch [1][6700/7330] lr: 1.000e-04, eta: 17:58:34, time: 0.799, data_time: 0.041, memory: 13576, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0585, loss_cls: 0.2734, acc: 91.0059, loss_bbox: 0.3183, loss_mask: 0.3123, loss: 1.0011 2024-05-28 10:37:56,863 - mmdet - INFO - Epoch [1][6750/7330] lr: 1.000e-04, eta: 17:57:57, time: 0.802, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0438, loss_rpn_bbox: 0.0641, loss_cls: 0.2676, acc: 91.2295, loss_bbox: 0.3162, loss_mask: 0.3136, loss: 1.0053 2024-05-28 10:38:35,190 - mmdet - INFO - Epoch [1][6800/7330] lr: 1.000e-04, eta: 17:57:00, time: 0.767, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0421, loss_rpn_bbox: 0.0616, loss_cls: 0.2750, acc: 90.8555, loss_bbox: 0.3251, loss_mask: 0.3161, loss: 1.0199 2024-05-28 10:39:13,315 - mmdet - INFO - Epoch [1][6850/7330] lr: 1.000e-04, eta: 17:56:00, time: 0.763, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0408, loss_rpn_bbox: 0.0595, loss_cls: 0.2594, acc: 91.3430, loss_bbox: 0.3153, loss_mask: 0.3059, loss: 0.9809 2024-05-28 10:39:53,593 - mmdet - INFO - Epoch [1][6900/7330] lr: 1.000e-04, eta: 17:55:26, time: 0.806, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0444, loss_rpn_bbox: 0.0597, loss_cls: 0.2837, acc: 90.9910, loss_bbox: 0.3141, loss_mask: 0.3100, loss: 1.0120 2024-05-28 10:40:33,759 - mmdet - INFO - Epoch [1][6950/7330] lr: 1.000e-04, eta: 17:54:50, time: 0.803, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0565, loss_cls: 0.2644, acc: 91.1597, loss_bbox: 0.3114, loss_mask: 0.3048, loss: 0.9757 2024-05-28 10:41:13,580 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 10:41:13,580 - mmdet - INFO - Epoch [1][7000/7330] lr: 1.000e-04, eta: 17:54:11, time: 0.796, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0461, loss_rpn_bbox: 0.0615, loss_cls: 0.2722, acc: 90.9519, loss_bbox: 0.3162, loss_mask: 0.3124, loss: 1.0084 2024-05-28 10:41:51,679 - mmdet - INFO - Epoch [1][7050/7330] lr: 1.000e-04, eta: 17:53:11, time: 0.762, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0415, loss_rpn_bbox: 0.0613, loss_cls: 0.2764, acc: 90.8457, loss_bbox: 0.3276, loss_mask: 0.3176, loss: 1.0244 2024-05-28 10:42:31,872 - mmdet - INFO - Epoch [1][7100/7330] lr: 1.000e-04, eta: 17:52:36, time: 0.804, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0428, loss_rpn_bbox: 0.0635, loss_cls: 0.2762, acc: 90.7693, loss_bbox: 0.3273, loss_mask: 0.3046, loss: 1.0143 2024-05-28 10:43:11,966 - mmdet - INFO - Epoch [1][7150/7330] lr: 1.000e-04, eta: 17:51:59, time: 0.801, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0418, loss_rpn_bbox: 0.0626, loss_cls: 0.2677, acc: 91.0232, loss_bbox: 0.3219, loss_mask: 0.3038, loss: 0.9977 2024-05-28 10:43:52,285 - mmdet - INFO - Epoch [1][7200/7330] lr: 1.000e-04, eta: 17:51:26, time: 0.807, data_time: 0.029, memory: 13576, loss_rpn_cls: 0.0434, loss_rpn_bbox: 0.0616, loss_cls: 0.2761, acc: 90.7595, loss_bbox: 0.3215, loss_mask: 0.3137, loss: 1.0163 2024-05-28 
10:44:34,719 - mmdet - INFO - Epoch [1][7250/7330] lr: 1.000e-04, eta: 17:51:15, time: 0.849, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0388, loss_rpn_bbox: 0.0586, loss_cls: 0.2662, acc: 91.2356, loss_bbox: 0.3122, loss_mask: 0.3055, loss: 0.9814
2024-05-28 10:45:13,025 - mmdet - INFO - Epoch [1][7300/7330] lr: 1.000e-04, eta: 17:50:19, time: 0.766, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0409, loss_rpn_bbox: 0.0568, loss_cls: 0.2582, acc: 91.4370, loss_bbox: 0.3101, loss_mask: 0.3031, loss: 0.9690
2024-05-28 10:45:37,321 - mmdet - INFO - Saving checkpoint at 1 epochs
2024-05-28 10:48:04,403 - mmdet - INFO - Evaluating bbox...
2024-05-28 10:48:34,573 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.303
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.558
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.298
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.159
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.331
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.449
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.431
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.431
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.431
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.230
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.470
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.607
2024-05-28 10:48:34,573 - mmdet - INFO - Evaluating segm...
2024-05-28 10:49:07,586 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.293
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.519
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.297
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.099
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.312
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.510
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.407
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.407
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.407
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.179
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.450
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.624
2024-05-28 10:49:08,098 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 10:49:08,099 - mmdet - INFO - Epoch(val) [1][625] bbox_mAP: 0.3030, bbox_mAP_50: 0.5580, bbox_mAP_75: 0.2980, bbox_mAP_s: 0.1590, bbox_mAP_m: 0.3310, bbox_mAP_l: 0.4490, bbox_mAP_copypaste: 0.303 0.558 0.298 0.159 0.331 0.449, segm_mAP: 0.2930, segm_mAP_50: 0.5190, segm_mAP_75: 0.2970, segm_mAP_s: 0.0990, segm_mAP_m: 0.3120, segm_mAP_l: 0.5100, segm_mAP_copypaste: 0.293 0.519 0.297 0.099 0.312 0.510
2024-05-28 10:49:54,063 - mmdet - INFO - Epoch [2][50/7330] lr: 1.000e-04, eta: 17:46:01, time: 0.919, data_time: 0.108, memory: 13576, loss_rpn_cls: 0.0389, loss_rpn_bbox: 0.0590, loss_cls: 0.2622, acc: 91.1572, loss_bbox: 0.3130, loss_mask: 0.2991, loss: 0.9721
2024-05-28 10:50:37,689 - mmdet - INFO - Epoch [2][100/7330] lr: 1.000e-04, eta: 17:46:04, time: 0.873, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0403,
loss_rpn_bbox: 0.0594, loss_cls: 0.2616, acc: 91.2036, loss_bbox: 0.3176, loss_mask: 0.3080, loss: 0.9869 2024-05-28 10:51:16,124 - mmdet - INFO - Epoch [2][150/7330] lr: 1.000e-04, eta: 17:45:10, time: 0.769, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0592, loss_cls: 0.2633, acc: 91.1023, loss_bbox: 0.3170, loss_mask: 0.3035, loss: 0.9815 2024-05-28 10:51:54,935 - mmdet - INFO - Epoch [2][200/7330] lr: 1.000e-04, eta: 17:44:21, time: 0.776, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0418, loss_rpn_bbox: 0.0620, loss_cls: 0.2590, acc: 91.1941, loss_bbox: 0.3118, loss_mask: 0.2983, loss: 0.9728 2024-05-28 10:52:33,235 - mmdet - INFO - Epoch [2][250/7330] lr: 1.000e-04, eta: 17:43:27, time: 0.766, data_time: 0.029, memory: 13576, loss_rpn_cls: 0.0380, loss_rpn_bbox: 0.0576, loss_cls: 0.2557, acc: 91.2993, loss_bbox: 0.3101, loss_mask: 0.2999, loss: 0.9613 2024-05-28 10:53:11,407 - mmdet - INFO - Epoch [2][300/7330] lr: 1.000e-04, eta: 17:42:31, time: 0.763, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0590, loss_cls: 0.2552, acc: 91.3286, loss_bbox: 0.3060, loss_mask: 0.3049, loss: 0.9644 2024-05-28 10:53:51,996 - mmdet - INFO - Epoch [2][350/7330] lr: 1.000e-04, eta: 17:42:01, time: 0.812, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0421, loss_rpn_bbox: 0.0604, loss_cls: 0.2608, acc: 91.1672, loss_bbox: 0.3135, loss_mask: 0.3045, loss: 0.9814 2024-05-28 10:54:30,665 - mmdet - INFO - Epoch [2][400/7330] lr: 1.000e-04, eta: 17:41:10, time: 0.773, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0559, loss_cls: 0.2636, acc: 91.1343, loss_bbox: 0.3106, loss_mask: 0.3007, loss: 0.9685 2024-05-28 10:55:09,137 - mmdet - INFO - Epoch [2][450/7330] lr: 1.000e-04, eta: 17:40:18, time: 0.769, data_time: 0.028, memory: 13576, loss_rpn_cls: 0.0389, loss_rpn_bbox: 0.0556, loss_cls: 0.2444, acc: 91.8015, loss_bbox: 0.2963, loss_mask: 0.2979, loss: 0.9331 2024-05-28 10:55:52,850 - mmdet - INFO - Epoch [2][500/7330] lr: 1.000e-04, eta: 17:40:20, time: 0.874, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0413, loss_rpn_bbox: 0.0608, loss_cls: 0.2611, acc: 91.1375, loss_bbox: 0.3142, loss_mask: 0.3045, loss: 0.9819 2024-05-28 10:56:31,013 - mmdet - INFO - Epoch [2][550/7330] lr: 1.000e-04, eta: 17:39:24, time: 0.763, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0382, loss_rpn_bbox: 0.0579, loss_cls: 0.2543, acc: 91.5085, loss_bbox: 0.3001, loss_mask: 0.2974, loss: 0.9478 2024-05-28 10:57:11,528 - mmdet - INFO - Epoch [2][600/7330] lr: 1.000e-04, eta: 17:38:53, time: 0.810, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0611, loss_cls: 0.2631, acc: 91.1729, loss_bbox: 0.3213, loss_mask: 0.3065, loss: 0.9912 2024-05-28 10:57:52,466 - mmdet - INFO - Epoch [2][650/7330] lr: 1.000e-04, eta: 17:38:26, time: 0.819, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0374, loss_rpn_bbox: 0.0577, loss_cls: 0.2524, acc: 91.4114, loss_bbox: 0.3056, loss_mask: 0.2947, loss: 0.9478 2024-05-28 10:58:30,961 - mmdet - INFO - Epoch [2][700/7330] lr: 1.000e-04, eta: 17:37:34, time: 0.770, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0373, loss_rpn_bbox: 0.0546, loss_cls: 0.2456, acc: 91.6907, loss_bbox: 0.2959, loss_mask: 0.2933, loss: 0.9267 2024-05-28 10:59:14,362 - mmdet - INFO - Epoch [2][750/7330] lr: 1.000e-04, eta: 17:37:31, time: 0.868, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0379, loss_rpn_bbox: 0.0603, loss_cls: 0.2717, acc: 90.8792, loss_bbox: 0.3201, loss_mask: 0.3039, loss: 0.9939 
2024-05-28 10:59:52,778 - mmdet - INFO - Epoch [2][800/7330] lr: 1.000e-04, eta: 17:36:39, time: 0.768, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0581, loss_cls: 0.2580, acc: 91.3655, loss_bbox: 0.3067, loss_mask: 0.3004, loss: 0.9595 2024-05-28 11:00:34,241 - mmdet - INFO - Epoch [2][850/7330] lr: 1.000e-04, eta: 17:36:16, time: 0.829, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0586, loss_cls: 0.2587, acc: 91.3027, loss_bbox: 0.3150, loss_mask: 0.2997, loss: 0.9687 2024-05-28 11:01:14,571 - mmdet - INFO - Epoch [2][900/7330] lr: 1.000e-04, eta: 17:35:42, time: 0.807, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0555, loss_cls: 0.2542, acc: 91.3757, loss_bbox: 0.3080, loss_mask: 0.3019, loss: 0.9566 2024-05-28 11:01:53,061 - mmdet - INFO - Epoch [2][950/7330] lr: 1.000e-04, eta: 17:34:51, time: 0.770, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0381, loss_rpn_bbox: 0.0578, loss_cls: 0.2541, acc: 91.4294, loss_bbox: 0.3089, loss_mask: 0.3012, loss: 0.9599 2024-05-28 11:02:31,341 - mmdet - INFO - Epoch [2][1000/7330] lr: 1.000e-04, eta: 17:33:57, time: 0.765, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0389, loss_rpn_bbox: 0.0574, loss_cls: 0.2572, acc: 91.2429, loss_bbox: 0.3121, loss_mask: 0.2984, loss: 0.9640 2024-05-28 11:03:13,914 - mmdet - INFO - Epoch [2][1050/7330] lr: 1.000e-04, eta: 17:33:45, time: 0.851, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0546, loss_cls: 0.2421, acc: 91.8354, loss_bbox: 0.2901, loss_mask: 0.2922, loss: 0.9168 2024-05-28 11:03:55,764 - mmdet - INFO - Epoch [2][1100/7330] lr: 1.000e-04, eta: 17:33:25, time: 0.837, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0390, loss_rpn_bbox: 0.0582, loss_cls: 0.2540, acc: 91.2944, loss_bbox: 0.3120, loss_mask: 0.3073, loss: 0.9705 2024-05-28 11:04:34,266 - mmdet - INFO - Epoch [2][1150/7330] lr: 1.000e-04, eta: 17:32:34, time: 0.770, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0380, loss_rpn_bbox: 0.0559, loss_cls: 0.2561, acc: 91.2854, loss_bbox: 0.3068, loss_mask: 0.3044, loss: 0.9611 2024-05-28 11:05:12,831 - mmdet - INFO - Epoch [2][1200/7330] lr: 1.000e-04, eta: 17:31:43, time: 0.771, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0361, loss_rpn_bbox: 0.0585, loss_cls: 0.2591, acc: 91.3760, loss_bbox: 0.3067, loss_mask: 0.3009, loss: 0.9612 2024-05-28 11:05:55,702 - mmdet - INFO - Epoch [2][1250/7330] lr: 1.000e-04, eta: 17:31:33, time: 0.857, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0593, loss_cls: 0.2671, acc: 91.0869, loss_bbox: 0.3142, loss_mask: 0.3098, loss: 0.9872 2024-05-28 11:06:39,483 - mmdet - INFO - Epoch [2][1300/7330] lr: 1.000e-04, eta: 17:31:30, time: 0.876, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0405, loss_rpn_bbox: 0.0606, loss_cls: 0.2542, acc: 91.3630, loss_bbox: 0.3096, loss_mask: 0.2977, loss: 0.9626 2024-05-28 11:07:20,712 - mmdet - INFO - Epoch [2][1350/7330] lr: 1.000e-04, eta: 17:31:04, time: 0.825, data_time: 0.029, memory: 13576, loss_rpn_cls: 0.0404, loss_rpn_bbox: 0.0605, loss_cls: 0.2470, acc: 91.6399, loss_bbox: 0.3039, loss_mask: 0.2978, loss: 0.9496 2024-05-28 11:07:59,328 - mmdet - INFO - Epoch [2][1400/7330] lr: 1.000e-04, eta: 17:30:13, time: 0.772, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0387, loss_rpn_bbox: 0.0577, loss_cls: 0.2590, acc: 91.2500, loss_bbox: 0.3090, loss_mask: 0.3000, loss: 0.9644 2024-05-28 11:08:38,225 - mmdet - INFO - Epoch [2][1450/7330] lr: 1.000e-04, eta: 17:29:26, time: 
0.778, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0405, loss_rpn_bbox: 0.0599, loss_cls: 0.2527, acc: 91.4890, loss_bbox: 0.3082, loss_mask: 0.3005, loss: 0.9619 2024-05-28 11:09:18,969 - mmdet - INFO - Epoch [2][1500/7330] lr: 1.000e-04, eta: 17:28:55, time: 0.815, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0577, loss_cls: 0.2589, acc: 91.1917, loss_bbox: 0.3097, loss_mask: 0.3040, loss: 0.9667 2024-05-28 11:09:57,331 - mmdet - INFO - Epoch [2][1550/7330] lr: 1.000e-04, eta: 17:28:02, time: 0.767, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0569, loss_cls: 0.2465, acc: 91.6243, loss_bbox: 0.3024, loss_mask: 0.2927, loss: 0.9361 2024-05-28 11:10:35,327 - mmdet - INFO - Epoch [2][1600/7330] lr: 1.000e-04, eta: 17:27:07, time: 0.760, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0353, loss_rpn_bbox: 0.0525, loss_cls: 0.2500, acc: 91.6045, loss_bbox: 0.3007, loss_mask: 0.2899, loss: 0.9283 2024-05-28 11:11:19,554 - mmdet - INFO - Epoch [2][1650/7330] lr: 1.000e-04, eta: 17:27:07, time: 0.884, data_time: 0.047, memory: 13576, loss_rpn_cls: 0.0394, loss_rpn_bbox: 0.0579, loss_cls: 0.2556, acc: 91.3306, loss_bbox: 0.3080, loss_mask: 0.3014, loss: 0.9622 2024-05-28 11:11:57,844 - mmdet - INFO - Epoch [2][1700/7330] lr: 1.000e-04, eta: 17:26:14, time: 0.766, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0550, loss_cls: 0.2550, acc: 91.3230, loss_bbox: 0.3043, loss_mask: 0.2938, loss: 0.9429 2024-05-28 11:12:35,861 - mmdet - INFO - Epoch [2][1750/7330] lr: 1.000e-04, eta: 17:25:19, time: 0.760, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0374, loss_rpn_bbox: 0.0561, loss_cls: 0.2405, acc: 91.7131, loss_bbox: 0.2909, loss_mask: 0.2945, loss: 0.9194 2024-05-28 11:13:18,866 - mmdet - INFO - Epoch [2][1800/7330] lr: 1.000e-04, eta: 17:25:07, time: 0.860, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0393, loss_rpn_bbox: 0.0565, loss_cls: 0.2476, acc: 91.4905, loss_bbox: 0.2989, loss_mask: 0.2954, loss: 0.9376 2024-05-28 11:13:59,261 - mmdet - INFO - Epoch [2][1850/7330] lr: 1.000e-04, eta: 17:24:33, time: 0.808, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0539, loss_cls: 0.2437, acc: 91.7810, loss_bbox: 0.2953, loss_mask: 0.2866, loss: 0.9123 2024-05-28 11:14:43,910 - mmdet - INFO - Epoch [2][1900/7330] lr: 1.000e-04, eta: 17:24:35, time: 0.893, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0393, loss_rpn_bbox: 0.0594, loss_cls: 0.2608, acc: 91.1814, loss_bbox: 0.3162, loss_mask: 0.3070, loss: 0.9826 2024-05-28 11:15:22,481 - mmdet - INFO - Epoch [2][1950/7330] lr: 1.000e-04, eta: 17:23:44, time: 0.771, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0547, loss_cls: 0.2473, acc: 91.6572, loss_bbox: 0.2945, loss_mask: 0.2989, loss: 0.9311 2024-05-28 11:16:01,107 - mmdet - INFO - Epoch [2][2000/7330] lr: 1.000e-04, eta: 17:22:55, time: 0.773, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0382, loss_rpn_bbox: 0.0577, loss_cls: 0.2588, acc: 91.3127, loss_bbox: 0.3022, loss_mask: 0.3028, loss: 0.9597 2024-05-28 11:16:39,670 - mmdet - INFO - Epoch [2][2050/7330] lr: 1.000e-04, eta: 17:22:05, time: 0.771, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0580, loss_cls: 0.2523, acc: 91.4124, loss_bbox: 0.3056, loss_mask: 0.3023, loss: 0.9544 2024-05-28 11:17:18,014 - mmdet - INFO - Epoch [2][2100/7330] lr: 1.000e-04, eta: 17:21:13, time: 0.767, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0383, loss_rpn_bbox: 0.0564, loss_cls: 
0.2467, acc: 91.6882, loss_bbox: 0.2998, loss_mask: 0.3018, loss: 0.9430 2024-05-28 11:17:58,234 - mmdet - INFO - Epoch [2][2150/7330] lr: 1.000e-04, eta: 17:20:37, time: 0.804, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0560, loss_cls: 0.2549, acc: 91.2976, loss_bbox: 0.3072, loss_mask: 0.2993, loss: 0.9526 2024-05-28 11:18:42,028 - mmdet - INFO - Epoch [2][2200/7330] lr: 1.000e-04, eta: 17:20:30, time: 0.876, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0558, loss_cls: 0.2600, acc: 91.3987, loss_bbox: 0.3086, loss_mask: 0.2999, loss: 0.9621 2024-05-28 11:19:20,506 - mmdet - INFO - Epoch [2][2250/7330] lr: 1.000e-04, eta: 17:19:39, time: 0.770, data_time: 0.044, memory: 13576, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0555, loss_cls: 0.2497, acc: 91.5579, loss_bbox: 0.3070, loss_mask: 0.3003, loss: 0.9484 2024-05-28 11:19:58,857 - mmdet - INFO - Epoch [2][2300/7330] lr: 1.000e-04, eta: 17:18:48, time: 0.767, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0364, loss_rpn_bbox: 0.0560, loss_cls: 0.2498, acc: 91.3635, loss_bbox: 0.3054, loss_mask: 0.2996, loss: 0.9472 2024-05-28 11:20:39,855 - mmdet - INFO - Epoch [2][2350/7330] lr: 1.000e-04, eta: 17:18:18, time: 0.820, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0553, loss_cls: 0.2512, acc: 91.5200, loss_bbox: 0.2998, loss_mask: 0.2914, loss: 0.9332 2024-05-28 11:21:22,569 - mmdet - INFO - Epoch [2][2400/7330] lr: 1.000e-04, eta: 17:18:02, time: 0.854, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0542, loss_cls: 0.2415, acc: 91.8293, loss_bbox: 0.2931, loss_mask: 0.2938, loss: 0.9193 2024-05-28 11:22:03,549 - mmdet - INFO - Epoch [2][2450/7330] lr: 1.000e-04, eta: 17:17:31, time: 0.820, data_time: 0.043, memory: 13576, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0574, loss_cls: 0.2503, acc: 91.4478, loss_bbox: 0.3023, loss_mask: 0.3006, loss: 0.9477 2024-05-28 11:22:44,314 - mmdet - INFO - Epoch [2][2500/7330] lr: 1.000e-04, eta: 17:16:59, time: 0.815, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0549, loss_cls: 0.2413, acc: 91.7842, loss_bbox: 0.2979, loss_mask: 0.2896, loss: 0.9177 2024-05-28 11:23:23,208 - mmdet - INFO - Epoch [2][2550/7330] lr: 1.000e-04, eta: 17:16:12, time: 0.778, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0558, loss_cls: 0.2450, acc: 91.6721, loss_bbox: 0.2958, loss_mask: 0.2906, loss: 0.9248 2024-05-28 11:24:01,456 - mmdet - INFO - Epoch [2][2600/7330] lr: 1.000e-04, eta: 17:15:19, time: 0.764, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0348, loss_rpn_bbox: 0.0523, loss_cls: 0.2432, acc: 91.7664, loss_bbox: 0.2952, loss_mask: 0.2920, loss: 0.9175 2024-05-28 11:24:39,833 - mmdet - INFO - Epoch [2][2650/7330] lr: 1.000e-04, eta: 17:14:29, time: 0.768, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0409, loss_rpn_bbox: 0.0595, loss_cls: 0.2537, acc: 91.4824, loss_bbox: 0.3075, loss_mask: 0.2975, loss: 0.9590 2024-05-28 11:25:18,180 - mmdet - INFO - Epoch [2][2700/7330] lr: 1.000e-04, eta: 17:13:38, time: 0.767, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0380, loss_rpn_bbox: 0.0572, loss_cls: 0.2533, acc: 91.3743, loss_bbox: 0.3079, loss_mask: 0.2981, loss: 0.9545 2024-05-28 11:25:58,622 - mmdet - INFO - Epoch [2][2750/7330] lr: 1.000e-04, eta: 17:13:03, time: 0.808, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0377, loss_rpn_bbox: 0.0610, loss_cls: 0.2606, acc: 90.9929, loss_bbox: 0.3140, loss_mask: 0.2952, loss: 0.9684 2024-05-28 11:26:39,960 
- mmdet - INFO - Epoch [2][2800/7330] lr: 1.000e-04, eta: 17:12:35, time: 0.827, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0564, loss_cls: 0.2483, acc: 91.6377, loss_bbox: 0.2963, loss_mask: 0.2894, loss: 0.9266 2024-05-28 11:27:20,506 - mmdet - INFO - Epoch [2][2850/7330] lr: 1.000e-04, eta: 17:12:01, time: 0.811, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0558, loss_cls: 0.2497, acc: 91.5271, loss_bbox: 0.3025, loss_mask: 0.2945, loss: 0.9380 2024-05-28 11:28:00,798 - mmdet - INFO - Epoch [2][2900/7330] lr: 1.000e-04, eta: 17:11:25, time: 0.806, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0562, loss_cls: 0.2421, acc: 91.6462, loss_bbox: 0.2984, loss_mask: 0.2953, loss: 0.9296 2024-05-28 11:28:43,963 - mmdet - INFO - Epoch [2][2950/7330] lr: 1.000e-04, eta: 17:11:10, time: 0.863, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0371, loss_rpn_bbox: 0.0560, loss_cls: 0.2414, acc: 91.6011, loss_bbox: 0.2954, loss_mask: 0.2984, loss: 0.9283 2024-05-28 11:29:22,687 - mmdet - INFO - Epoch [2][3000/7330] lr: 1.000e-04, eta: 17:10:22, time: 0.774, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0396, loss_rpn_bbox: 0.0596, loss_cls: 0.2488, acc: 91.4009, loss_bbox: 0.3063, loss_mask: 0.2879, loss: 0.9422 2024-05-28 11:30:07,572 - mmdet - INFO - Epoch [2][3050/7330] lr: 1.000e-04, eta: 17:10:20, time: 0.898, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0364, loss_rpn_bbox: 0.0556, loss_cls: 0.2423, acc: 91.7610, loss_bbox: 0.2908, loss_mask: 0.2955, loss: 0.9207 2024-05-28 11:30:45,802 - mmdet - INFO - Epoch [2][3100/7330] lr: 1.000e-04, eta: 17:09:28, time: 0.765, data_time: 0.042, memory: 13576, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0545, loss_cls: 0.2514, acc: 91.4260, loss_bbox: 0.3055, loss_mask: 0.2946, loss: 0.9417 2024-05-28 11:31:24,120 - mmdet - INFO - Epoch [2][3150/7330] lr: 1.000e-04, eta: 17:08:37, time: 0.766, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0553, loss_cls: 0.2511, acc: 91.3711, loss_bbox: 0.3067, loss_mask: 0.2933, loss: 0.9421 2024-05-28 11:32:02,208 - mmdet - INFO - Epoch [2][3200/7330] lr: 1.000e-04, eta: 17:07:45, time: 0.762, data_time: 0.041, memory: 13576, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0562, loss_cls: 0.2542, acc: 91.3608, loss_bbox: 0.3060, loss_mask: 0.2952, loss: 0.9501 2024-05-28 11:32:40,461 - mmdet - INFO - Epoch [2][3250/7330] lr: 1.000e-04, eta: 17:06:53, time: 0.765, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0541, loss_cls: 0.2507, acc: 91.4795, loss_bbox: 0.3049, loss_mask: 0.2944, loss: 0.9375 2024-05-28 11:33:21,031 - mmdet - INFO - Epoch [2][3300/7330] lr: 1.000e-04, eta: 17:06:19, time: 0.811, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0553, loss_cls: 0.2467, acc: 91.5439, loss_bbox: 0.3068, loss_mask: 0.2954, loss: 0.9408 2024-05-28 11:34:02,027 - mmdet - INFO - Epoch [2][3350/7330] lr: 1.000e-04, eta: 17:05:48, time: 0.820, data_time: 0.041, memory: 13576, loss_rpn_cls: 0.0382, loss_rpn_bbox: 0.0599, loss_cls: 0.2491, acc: 91.3914, loss_bbox: 0.3047, loss_mask: 0.2978, loss: 0.9497 2024-05-28 11:34:40,654 - mmdet - INFO - Epoch [2][3400/7330] lr: 1.000e-04, eta: 17:04:59, time: 0.773, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0363, loss_rpn_bbox: 0.0581, loss_cls: 0.2490, acc: 91.4514, loss_bbox: 0.3076, loss_mask: 0.2968, loss: 0.9478 2024-05-28 11:35:23,283 - mmdet - INFO - Epoch [2][3450/7330] lr: 1.000e-04, eta: 17:04:40, time: 0.853, data_time: 
0.033, memory: 13576, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0584, loss_cls: 0.2496, acc: 91.5625, loss_bbox: 0.2974, loss_mask: 0.2910, loss: 0.9302 2024-05-28 11:36:03,828 - mmdet - INFO - Epoch [2][3500/7330] lr: 1.000e-04, eta: 17:04:05, time: 0.811, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0571, loss_cls: 0.2460, acc: 91.5032, loss_bbox: 0.3014, loss_mask: 0.2961, loss: 0.9345 2024-05-28 11:36:44,423 - mmdet - INFO - Epoch [2][3550/7330] lr: 1.000e-04, eta: 17:03:30, time: 0.812, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0369, loss_rpn_bbox: 0.0559, loss_cls: 0.2458, acc: 91.4626, loss_bbox: 0.3008, loss_mask: 0.2887, loss: 0.9282 2024-05-28 11:37:27,540 - mmdet - INFO - Epoch [2][3600/7330] lr: 1.000e-04, eta: 17:03:14, time: 0.862, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0371, loss_rpn_bbox: 0.0545, loss_cls: 0.2538, acc: 91.4460, loss_bbox: 0.3005, loss_mask: 0.2900, loss: 0.9359 2024-05-28 11:38:05,740 - mmdet - INFO - Epoch [2][3650/7330] lr: 1.000e-04, eta: 17:02:22, time: 0.764, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0388, loss_rpn_bbox: 0.0551, loss_cls: 0.2470, acc: 91.5383, loss_bbox: 0.2992, loss_mask: 0.2927, loss: 0.9328 2024-05-28 11:38:44,173 - mmdet - INFO - Epoch [2][3700/7330] lr: 1.000e-04, eta: 17:01:33, time: 0.769, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0568, loss_cls: 0.2523, acc: 91.4729, loss_bbox: 0.3060, loss_mask: 0.2920, loss: 0.9438 2024-05-28 11:39:22,582 - mmdet - INFO - Epoch [2][3750/7330] lr: 1.000e-04, eta: 17:00:43, time: 0.768, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0390, loss_rpn_bbox: 0.0588, loss_cls: 0.2448, acc: 91.5566, loss_bbox: 0.3003, loss_mask: 0.2938, loss: 0.9366 2024-05-28 11:40:01,095 - mmdet - INFO - Epoch [2][3800/7330] lr: 1.000e-04, eta: 16:59:54, time: 0.770, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0518, loss_cls: 0.2391, acc: 91.9431, loss_bbox: 0.2856, loss_mask: 0.2857, loss: 0.8949 2024-05-28 11:40:41,615 - mmdet - INFO - Epoch [2][3850/7330] lr: 1.000e-04, eta: 16:59:19, time: 0.810, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0569, loss_cls: 0.2430, acc: 91.7710, loss_bbox: 0.2863, loss_mask: 0.2921, loss: 0.9144 2024-05-28 11:41:19,968 - mmdet - INFO - Epoch [2][3900/7330] lr: 1.000e-04, eta: 16:58:29, time: 0.767, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0543, loss_cls: 0.2470, acc: 91.6340, loss_bbox: 0.2932, loss_mask: 0.2941, loss: 0.9223 2024-05-28 11:41:58,367 - mmdet - INFO - Epoch [2][3950/7330] lr: 1.000e-04, eta: 16:57:40, time: 0.768, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0559, loss_cls: 0.2523, acc: 91.4341, loss_bbox: 0.3032, loss_mask: 0.2947, loss: 0.9392 2024-05-28 11:42:41,213 - mmdet - INFO - Epoch [2][4000/7330] lr: 1.000e-04, eta: 16:57:20, time: 0.857, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0516, loss_cls: 0.2440, acc: 91.7480, loss_bbox: 0.2894, loss_mask: 0.2959, loss: 0.9168 2024-05-28 11:43:22,077 - mmdet - INFO - Epoch [2][4050/7330] lr: 1.000e-04, eta: 16:56:47, time: 0.817, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0549, loss_cls: 0.2361, acc: 92.0027, loss_bbox: 0.2818, loss_mask: 0.2845, loss: 0.8911 2024-05-28 11:44:06,101 - mmdet - INFO - Epoch [2][4100/7330] lr: 1.000e-04, eta: 16:56:36, time: 0.880, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0366, loss_rpn_bbox: 0.0545, loss_cls: 0.2394, acc: 
91.6982, loss_bbox: 0.2959, loss_mask: 0.2888, loss: 0.9152 2024-05-28 11:44:48,942 - mmdet - INFO - Epoch [2][4150/7330] lr: 1.000e-04, eta: 16:56:16, time: 0.857, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0349, loss_rpn_bbox: 0.0577, loss_cls: 0.2408, acc: 91.6621, loss_bbox: 0.2934, loss_mask: 0.2891, loss: 0.9158 2024-05-28 11:45:27,111 - mmdet - INFO - Epoch [2][4200/7330] lr: 1.000e-04, eta: 16:55:25, time: 0.763, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0353, loss_rpn_bbox: 0.0530, loss_cls: 0.2539, acc: 91.4749, loss_bbox: 0.2989, loss_mask: 0.2883, loss: 0.9294 2024-05-28 11:46:05,865 - mmdet - INFO - Epoch [2][4250/7330] lr: 1.000e-04, eta: 16:54:37, time: 0.775, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0559, loss_cls: 0.2515, acc: 91.3677, loss_bbox: 0.3069, loss_mask: 0.2843, loss: 0.9320 2024-05-28 11:46:44,514 - mmdet - INFO - Epoch [2][4300/7330] lr: 1.000e-04, eta: 16:53:50, time: 0.773, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0530, loss_cls: 0.2417, acc: 91.6965, loss_bbox: 0.2961, loss_mask: 0.2841, loss: 0.9095 2024-05-28 11:47:23,039 - mmdet - INFO - Epoch [2][4350/7330] lr: 1.000e-04, eta: 16:53:01, time: 0.770, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0519, loss_cls: 0.2471, acc: 91.6519, loss_bbox: 0.2943, loss_mask: 0.2886, loss: 0.9170 2024-05-28 11:48:01,574 - mmdet - INFO - Epoch [2][4400/7330] lr: 1.000e-04, eta: 16:52:13, time: 0.771, data_time: 0.041, memory: 13576, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0529, loss_cls: 0.2460, acc: 91.7341, loss_bbox: 0.2903, loss_mask: 0.2871, loss: 0.9105 2024-05-28 11:48:41,724 - mmdet - INFO - Epoch [2][4450/7330] lr: 1.000e-04, eta: 16:51:35, time: 0.803, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0508, loss_cls: 0.2329, acc: 92.1711, loss_bbox: 0.2754, loss_mask: 0.2787, loss: 0.8707 2024-05-28 11:49:19,727 - mmdet - INFO - Epoch [2][4500/7330] lr: 1.000e-04, eta: 16:50:43, time: 0.760, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0543, loss_cls: 0.2355, acc: 92.0796, loss_bbox: 0.2846, loss_mask: 0.2901, loss: 0.8978 2024-05-28 11:49:58,364 - mmdet - INFO - Epoch [2][4550/7330] lr: 1.000e-04, eta: 16:49:56, time: 0.773, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0540, loss_cls: 0.2433, acc: 91.8171, loss_bbox: 0.2894, loss_mask: 0.2893, loss: 0.9097 2024-05-28 11:50:41,117 - mmdet - INFO - Epoch [2][4600/7330] lr: 1.000e-04, eta: 16:49:35, time: 0.855, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0561, loss_cls: 0.2417, acc: 91.7937, loss_bbox: 0.2966, loss_mask: 0.2940, loss: 0.9215 2024-05-28 11:51:23,095 - mmdet - INFO - Epoch [2][4650/7330] lr: 1.000e-04, eta: 16:49:09, time: 0.840, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0553, loss_cls: 0.2549, acc: 91.2947, loss_bbox: 0.3061, loss_mask: 0.2896, loss: 0.9421 2024-05-28 11:52:03,250 - mmdet - INFO - Epoch [2][4700/7330] lr: 1.000e-04, eta: 16:48:31, time: 0.803, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0517, loss_cls: 0.2423, acc: 91.7524, loss_bbox: 0.2924, loss_mask: 0.2852, loss: 0.9049 2024-05-28 11:52:48,593 - mmdet - INFO - Epoch [2][4750/7330] lr: 1.000e-04, eta: 16:48:25, time: 0.907, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0549, loss_cls: 0.2430, acc: 91.7761, loss_bbox: 0.2907, loss_mask: 0.2931, loss: 0.9196 2024-05-28 11:53:26,936 - mmdet - 
INFO - Epoch [2][4800/7330] lr: 1.000e-04, eta: 16:47:36, time: 0.767, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0551, loss_cls: 0.2380, acc: 91.8730, loss_bbox: 0.2882, loss_mask: 0.2855, loss: 0.9018 2024-05-28 11:54:05,374 - mmdet - INFO - Epoch [2][4850/7330] lr: 1.000e-04, eta: 16:46:47, time: 0.769, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0354, loss_rpn_bbox: 0.0549, loss_cls: 0.2521, acc: 91.4404, loss_bbox: 0.2949, loss_mask: 0.2865, loss: 0.9238 2024-05-28 11:54:43,588 - mmdet - INFO - Epoch [2][4900/7330] lr: 1.000e-04, eta: 16:45:57, time: 0.764, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0541, loss_cls: 0.2391, acc: 91.8767, loss_bbox: 0.2875, loss_mask: 0.2888, loss: 0.9065 2024-05-28 11:55:21,708 - mmdet - INFO - Epoch [2][4950/7330] lr: 1.000e-04, eta: 16:45:07, time: 0.763, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0523, loss_cls: 0.2428, acc: 91.7043, loss_bbox: 0.2925, loss_mask: 0.2837, loss: 0.9038 2024-05-28 11:56:00,046 - mmdet - INFO - Epoch [2][5000/7330] lr: 1.000e-04, eta: 16:44:18, time: 0.767, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0562, loss_cls: 0.2436, acc: 91.5327, loss_bbox: 0.3000, loss_mask: 0.2871, loss: 0.9204 2024-05-28 11:56:38,508 - mmdet - INFO - Epoch [2][5050/7330] lr: 1.000e-04, eta: 16:43:29, time: 0.769, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0314, loss_rpn_bbox: 0.0532, loss_cls: 0.2342, acc: 92.0481, loss_bbox: 0.2908, loss_mask: 0.2861, loss: 0.8957 2024-05-28 11:57:18,803 - mmdet - INFO - Epoch [2][5100/7330] lr: 1.000e-04, eta: 16:42:52, time: 0.806, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0538, loss_cls: 0.2362, acc: 92.0391, loss_bbox: 0.2806, loss_mask: 0.2848, loss: 0.8930 2024-05-28 11:57:59,461 - mmdet - INFO - Epoch [2][5150/7330] lr: 1.000e-04, eta: 16:42:17, time: 0.813, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0527, loss_cls: 0.2368, acc: 91.8650, loss_bbox: 0.2874, loss_mask: 0.2852, loss: 0.8946 2024-05-28 11:58:37,570 - mmdet - INFO - Epoch [2][5200/7330] lr: 1.000e-04, eta: 16:41:27, time: 0.762, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0557, loss_cls: 0.2325, acc: 92.0950, loss_bbox: 0.2826, loss_mask: 0.2836, loss: 0.8914 2024-05-28 11:59:23,558 - mmdet - INFO - Epoch [2][5250/7330] lr: 1.000e-04, eta: 16:41:24, time: 0.919, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0512, loss_cls: 0.2242, acc: 92.3047, loss_bbox: 0.2756, loss_mask: 0.2806, loss: 0.8627 2024-05-28 12:00:04,139 - mmdet - INFO - Epoch [2][5300/7330] lr: 1.000e-04, eta: 16:40:49, time: 0.812, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0531, loss_cls: 0.2422, acc: 91.7549, loss_bbox: 0.2927, loss_mask: 0.2851, loss: 0.9061 2024-05-28 12:00:44,766 - mmdet - INFO - Epoch [2][5350/7330] lr: 1.000e-04, eta: 16:40:13, time: 0.813, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0506, loss_cls: 0.2361, acc: 91.8860, loss_bbox: 0.2894, loss_mask: 0.2795, loss: 0.8865 2024-05-28 12:01:25,407 - mmdet - INFO - Epoch [2][5400/7330] lr: 1.000e-04, eta: 16:39:38, time: 0.813, data_time: 0.030, memory: 13576, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0529, loss_cls: 0.2397, acc: 91.7964, loss_bbox: 0.2934, loss_mask: 0.2891, loss: 0.9060 2024-05-28 12:02:04,036 - mmdet - INFO - Epoch [2][5450/7330] lr: 1.000e-04, eta: 16:38:51, time: 0.773, data_time: 0.034, memory: 
13576, loss_rpn_cls: 0.0383, loss_rpn_bbox: 0.0558, loss_cls: 0.2377, acc: 91.8796, loss_bbox: 0.2819, loss_mask: 0.2846, loss: 0.8983 2024-05-28 12:02:42,111 - mmdet - INFO - Epoch [2][5500/7330] lr: 1.000e-04, eta: 16:38:01, time: 0.761, data_time: 0.041, memory: 13576, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0542, loss_cls: 0.2421, acc: 91.8442, loss_bbox: 0.2835, loss_mask: 0.2839, loss: 0.8984 2024-05-28 12:03:20,467 - mmdet - INFO - Epoch [2][5550/7330] lr: 1.000e-04, eta: 16:37:12, time: 0.767, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0343, loss_rpn_bbox: 0.0594, loss_cls: 0.2459, acc: 91.6765, loss_bbox: 0.2968, loss_mask: 0.2952, loss: 0.9317 2024-05-28 12:03:59,190 - mmdet - INFO - Epoch [2][5600/7330] lr: 1.000e-04, eta: 16:36:26, time: 0.774, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0548, loss_cls: 0.2450, acc: 91.6953, loss_bbox: 0.2954, loss_mask: 0.2907, loss: 0.9231 2024-05-28 12:04:37,741 - mmdet - INFO - Epoch [2][5650/7330] lr: 1.000e-04, eta: 16:35:39, time: 0.771, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0516, loss_cls: 0.2395, acc: 91.7217, loss_bbox: 0.2944, loss_mask: 0.2813, loss: 0.8998 2024-05-28 12:05:18,595 - mmdet - INFO - Epoch [2][5700/7330] lr: 1.000e-04, eta: 16:35:05, time: 0.817, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0567, loss_cls: 0.2426, acc: 91.5942, loss_bbox: 0.2935, loss_mask: 0.2882, loss: 0.9157 2024-05-28 12:05:59,308 - mmdet - INFO - Epoch [2][5750/7330] lr: 1.000e-04, eta: 16:34:30, time: 0.814, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0562, loss_cls: 0.2424, acc: 91.6182, loss_bbox: 0.2952, loss_mask: 0.2846, loss: 0.9130 2024-05-28 12:06:40,290 - mmdet - INFO - Epoch [2][5800/7330] lr: 1.000e-04, eta: 16:33:56, time: 0.820, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0524, loss_cls: 0.2442, acc: 91.7661, loss_bbox: 0.2867, loss_mask: 0.2862, loss: 0.9034 2024-05-28 12:07:20,860 - mmdet - INFO - Epoch [2][5850/7330] lr: 1.000e-04, eta: 16:33:21, time: 0.811, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0528, loss_cls: 0.2385, acc: 91.7954, loss_bbox: 0.2927, loss_mask: 0.2881, loss: 0.9058 2024-05-28 12:08:04,333 - mmdet - INFO - Epoch [2][5900/7330] lr: 1.000e-04, eta: 16:33:01, time: 0.869, data_time: 0.040, memory: 13576, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0542, loss_cls: 0.2485, acc: 91.5874, loss_bbox: 0.2972, loss_mask: 0.2865, loss: 0.9216 2024-05-28 12:08:45,082 - mmdet - INFO - Epoch [2][5950/7330] lr: 1.000e-04, eta: 16:32:26, time: 0.816, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0542, loss_cls: 0.2401, acc: 91.6794, loss_bbox: 0.2946, loss_mask: 0.2872, loss: 0.9106 2024-05-28 12:09:25,783 - mmdet - INFO - Epoch [2][6000/7330] lr: 1.000e-04, eta: 16:31:51, time: 0.814, data_time: 0.048, memory: 13576, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0555, loss_cls: 0.2501, acc: 91.5337, loss_bbox: 0.2940, loss_mask: 0.2867, loss: 0.9211 2024-05-28 12:10:04,405 - mmdet - INFO - Epoch [2][6050/7330] lr: 1.000e-04, eta: 16:31:04, time: 0.772, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0496, loss_cls: 0.2303, acc: 92.0273, loss_bbox: 0.2798, loss_mask: 0.2822, loss: 0.8743 2024-05-28 12:10:42,925 - mmdet - INFO - Epoch [2][6100/7330] lr: 1.000e-04, eta: 16:30:17, time: 0.770, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0500, loss_cls: 0.2317, acc: 91.9807, loss_bbox: 
0.2863, loss_mask: 0.2790, loss: 0.8775 2024-05-28 12:11:21,287 - mmdet - INFO - Epoch [2][6150/7330] lr: 1.000e-04, eta: 16:29:29, time: 0.767, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0559, loss_cls: 0.2402, acc: 91.7119, loss_bbox: 0.2940, loss_mask: 0.2879, loss: 0.9126 2024-05-28 12:11:59,953 - mmdet - INFO - Epoch [2][6200/7330] lr: 1.000e-04, eta: 16:28:42, time: 0.773, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0555, loss_cls: 0.2321, acc: 91.9580, loss_bbox: 0.2857, loss_mask: 0.2842, loss: 0.8911 2024-05-28 12:12:38,751 - mmdet - INFO - Epoch [2][6250/7330] lr: 1.000e-04, eta: 16:27:57, time: 0.776, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0357, loss_rpn_bbox: 0.0557, loss_cls: 0.2471, acc: 91.5320, loss_bbox: 0.2963, loss_mask: 0.2768, loss: 0.9116 2024-05-28 12:13:17,130 - mmdet - INFO - Epoch [2][6300/7330] lr: 1.000e-04, eta: 16:27:09, time: 0.768, data_time: 0.033, memory: 13576, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0561, loss_cls: 0.2421, acc: 91.7463, loss_bbox: 0.2884, loss_mask: 0.2870, loss: 0.9091 2024-05-28 12:14:00,426 - mmdet - INFO - Epoch [2][6350/7330] lr: 1.000e-04, eta: 16:26:48, time: 0.866, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0533, loss_cls: 0.2349, acc: 91.9097, loss_bbox: 0.2899, loss_mask: 0.2811, loss: 0.8900 2024-05-28 12:14:45,538 - mmdet - INFO - Epoch [2][6400/7330] lr: 1.000e-04, eta: 16:26:36, time: 0.902, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0552, loss_cls: 0.2267, acc: 92.1531, loss_bbox: 0.2781, loss_mask: 0.2815, loss: 0.8728 2024-05-28 12:15:26,025 - mmdet - INFO - Epoch [2][6450/7330] lr: 1.000e-04, eta: 16:26:00, time: 0.810, data_time: 0.042, memory: 13576, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0537, loss_cls: 0.2308, acc: 92.0972, loss_bbox: 0.2799, loss_mask: 0.2880, loss: 0.8879 2024-05-28 12:16:04,468 - mmdet - INFO - Epoch [2][6500/7330] lr: 1.000e-04, eta: 16:25:12, time: 0.769, data_time: 0.032, memory: 13576, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0508, loss_cls: 0.2315, acc: 91.9070, loss_bbox: 0.2838, loss_mask: 0.2802, loss: 0.8793 2024-05-28 12:16:44,721 - mmdet - INFO - Epoch [2][6550/7330] lr: 1.000e-04, eta: 16:24:34, time: 0.805, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0534, loss_cls: 0.2346, acc: 91.9814, loss_bbox: 0.2822, loss_mask: 0.2843, loss: 0.8880 2024-05-28 12:17:23,184 - mmdet - INFO - Epoch [2][6600/7330] lr: 1.000e-04, eta: 16:23:47, time: 0.769, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0517, loss_cls: 0.2300, acc: 92.0645, loss_bbox: 0.2791, loss_mask: 0.2770, loss: 0.8698 2024-05-28 12:18:03,743 - mmdet - INFO - Epoch [2][6650/7330] lr: 1.000e-04, eta: 16:23:11, time: 0.811, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0542, loss_cls: 0.2371, acc: 91.8257, loss_bbox: 0.2846, loss_mask: 0.2789, loss: 0.8864 2024-05-28 12:18:42,197 - mmdet - INFO - Epoch [2][6700/7330] lr: 1.000e-04, eta: 16:22:23, time: 0.769, data_time: 0.034, memory: 13576, loss_rpn_cls: 0.0337, loss_rpn_bbox: 0.0513, loss_cls: 0.2399, acc: 91.7578, loss_bbox: 0.2942, loss_mask: 0.2855, loss: 0.9046 2024-05-28 12:19:20,442 - mmdet - INFO - Epoch [2][6750/7330] lr: 1.000e-04, eta: 16:21:35, time: 0.765, data_time: 0.041, memory: 13576, loss_rpn_cls: 0.0340, loss_rpn_bbox: 0.0535, loss_cls: 0.2315, acc: 91.9976, loss_bbox: 0.2843, loss_mask: 0.2836, loss: 0.8869 2024-05-28 12:19:58,992 - mmdet - INFO - Epoch 
[2][6800/7330] lr: 1.000e-04, eta: 16:20:48, time: 0.771, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0352, loss_rpn_bbox: 0.0557, loss_cls: 0.2367, acc: 91.9634, loss_bbox: 0.2823, loss_mask: 0.2797, loss: 0.8895 2024-05-28 12:20:40,152 - mmdet - INFO - Epoch [2][6850/7330] lr: 1.000e-04, eta: 16:20:15, time: 0.823, data_time: 0.036, memory: 13576, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0518, loss_cls: 0.2351, acc: 91.9888, loss_bbox: 0.2820, loss_mask: 0.2838, loss: 0.8860 2024-05-28 12:21:18,381 - mmdet - INFO - Epoch [2][6900/7330] lr: 1.000e-04, eta: 16:19:27, time: 0.765, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0535, loss_cls: 0.2507, acc: 91.5522, loss_bbox: 0.2965, loss_mask: 0.2842, loss: 0.9182 2024-05-28 12:22:02,386 - mmdet - INFO - Epoch [2][6950/7330] lr: 1.000e-04, eta: 16:19:09, time: 0.880, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0536, loss_cls: 0.2383, acc: 91.7642, loss_bbox: 0.2896, loss_mask: 0.2745, loss: 0.8881 2024-05-28 12:22:45,305 - mmdet - INFO - Epoch [2][7000/7330] lr: 1.000e-04, eta: 16:18:44, time: 0.858, data_time: 0.035, memory: 13576, loss_rpn_cls: 0.0338, loss_rpn_bbox: 0.0510, loss_cls: 0.2307, acc: 92.0750, loss_bbox: 0.2833, loss_mask: 0.2730, loss: 0.8717 2024-05-28 12:23:25,607 - mmdet - INFO - Epoch [2][7050/7330] lr: 1.000e-04, eta: 16:18:07, time: 0.806, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0489, loss_cls: 0.2316, acc: 92.0896, loss_bbox: 0.2775, loss_mask: 0.2771, loss: 0.8674 2024-05-28 12:24:03,827 - mmdet - INFO - Epoch [2][7100/7330] lr: 1.000e-04, eta: 16:17:18, time: 0.764, data_time: 0.038, memory: 13576, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0538, loss_cls: 0.2387, acc: 91.8027, loss_bbox: 0.2886, loss_mask: 0.2798, loss: 0.8930 2024-05-28 12:24:42,091 - mmdet - INFO - Epoch [2][7150/7330] lr: 1.000e-04, eta: 16:16:30, time: 0.765, data_time: 0.039, memory: 13576, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0504, loss_cls: 0.2363, acc: 91.7744, loss_bbox: 0.2868, loss_mask: 0.2825, loss: 0.8888 2024-05-28 12:25:22,746 - mmdet - INFO - Epoch [2][7200/7330] lr: 1.000e-04, eta: 16:15:54, time: 0.813, data_time: 0.029, memory: 13576, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0503, loss_cls: 0.2363, acc: 91.9167, loss_bbox: 0.2854, loss_mask: 0.2780, loss: 0.8790 2024-05-28 12:26:03,400 - mmdet - INFO - Epoch [2][7250/7330] lr: 1.000e-04, eta: 16:15:18, time: 0.813, data_time: 0.031, memory: 13576, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0487, loss_cls: 0.2267, acc: 92.2756, loss_bbox: 0.2732, loss_mask: 0.2823, loss: 0.8607 2024-05-28 12:26:41,803 - mmdet - INFO - Epoch [2][7300/7330] lr: 1.000e-04, eta: 16:14:31, time: 0.768, data_time: 0.037, memory: 13576, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0527, loss_cls: 0.2246, acc: 92.2200, loss_bbox: 0.2790, loss_mask: 0.2762, loss: 0.8646 2024-05-28 12:27:05,670 - mmdet - INFO - Saving checkpoint at 2 epochs 2024-05-28 12:29:33,658 - mmdet - INFO - Evaluating bbox... 
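Before the epoch-2 COCO summary below, one detail worth noting about the training entries above: the reported loss is the sum of the five component terms (loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask), while acc is the box-head classification accuracy and is not part of the sum. A minimal check (illustrative only, using values copied from the Epoch [2][7300/7330] entry above):

# Reconstruct the total loss from the logged components of Epoch [2][7300/7330].
components = {
    "loss_rpn_cls": 0.0321,
    "loss_rpn_bbox": 0.0527,
    "loss_cls": 0.2246,
    "loss_bbox": 0.2790,
    "loss_mask": 0.2762,
}
total = sum(components.values())
print(round(total, 4))  # 0.8646 -- matches the logged "loss: 0.8646"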
2024-05-28 12:30:00,930 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.364
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.614
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.385
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.195
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.398
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.531
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.491
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.491
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.491
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.277
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.542
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.661
2024-05-28 12:30:00,931 - mmdet - INFO - Evaluating segm...
2024-05-28 12:30:30,970 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.338
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.572
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.351
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.126
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.367
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.566
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.449
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.449
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.449
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.213
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.500
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.662
2024-05-28 12:30:31,371 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 12:30:31,373 - mmdet - INFO - Epoch(val) [2][625] bbox_mAP: 0.3640, bbox_mAP_50: 0.6140, bbox_mAP_75: 0.3850, bbox_mAP_s: 0.1950, bbox_mAP_m: 0.3980, bbox_mAP_l: 0.5310, bbox_mAP_copypaste: 0.364 0.614 0.385 0.195 0.398 0.531, segm_mAP: 0.3380, segm_mAP_50: 0.5720, segm_mAP_75: 0.3510, segm_mAP_s: 0.1260, segm_mAP_m: 0.3670, segm_mAP_l: 0.5660, segm_mAP_copypaste: 0.338 0.572 0.351 0.126 0.367 0.566
2024-05-28 12:31:19,950 - mmdet - INFO - Epoch [3][50/7330] lr: 1.000e-04, eta: 16:12:12, time: 0.971, data_time: 0.106, memory: 13582, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0517, loss_cls: 0.2306, acc: 92.1006, loss_bbox: 0.2800, loss_mask: 0.2741, loss: 0.8671
2024-05-28 12:32:00,680 - mmdet - INFO - Epoch [3][100/7330] lr: 1.000e-04, eta: 16:11:36, time: 0.815, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0510, loss_cls: 0.2299, acc: 91.9800, loss_bbox: 0.2833, loss_mask: 0.2760, loss: 0.8696
2024-05-28 12:32:38,862 - mmdet - INFO - Epoch [3][150/7330] lr: 1.000e-04, eta: 16:10:48, time: 0.764, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0520, loss_cls: 0.2276, acc: 92.0154, loss_bbox: 0.2816, loss_mask: 0.2767, loss: 0.8677
2024-05-28 12:33:20,063 - mmdet - INFO - Epoch [3][200/7330] lr: 1.000e-04, eta: 16:10:15, time: 0.824, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0457, loss_cls: 0.2114, acc: 92.5159, loss_bbox: 0.2746, loss_mask: 0.2727, loss: 0.8312
2024-05-28
12:33:58,541 - mmdet - INFO - Epoch [3][250/7330] lr: 1.000e-04, eta: 16:09:29, time: 0.770, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0505, loss_cls: 0.2242, acc: 92.0986, loss_bbox: 0.2779, loss_mask: 0.2818, loss: 0.8642 2024-05-28 12:34:39,339 - mmdet - INFO - Epoch [3][300/7330] lr: 1.000e-04, eta: 16:08:54, time: 0.816, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0529, loss_cls: 0.2320, acc: 91.9028, loss_bbox: 0.2894, loss_mask: 0.2795, loss: 0.8864 2024-05-28 12:35:18,350 - mmdet - INFO - Epoch [3][350/7330] lr: 1.000e-04, eta: 16:08:10, time: 0.780, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0523, loss_cls: 0.2273, acc: 91.8989, loss_bbox: 0.2891, loss_mask: 0.2788, loss: 0.8777 2024-05-28 12:35:59,069 - mmdet - INFO - Epoch [3][400/7330] lr: 1.000e-04, eta: 16:07:35, time: 0.814, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0479, loss_cls: 0.2251, acc: 92.3154, loss_bbox: 0.2773, loss_mask: 0.2723, loss: 0.8520 2024-05-28 12:36:37,864 - mmdet - INFO - Epoch [3][450/7330] lr: 1.000e-04, eta: 16:06:50, time: 0.776, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0519, loss_cls: 0.2332, acc: 91.7222, loss_bbox: 0.2867, loss_mask: 0.2767, loss: 0.8803 2024-05-28 12:37:16,800 - mmdet - INFO - Epoch [3][500/7330] lr: 1.000e-04, eta: 16:06:06, time: 0.779, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0508, loss_cls: 0.2227, acc: 92.2427, loss_bbox: 0.2772, loss_mask: 0.2766, loss: 0.8569 2024-05-28 12:37:57,671 - mmdet - INFO - Epoch [3][550/7330] lr: 1.000e-04, eta: 16:05:31, time: 0.817, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0520, loss_cls: 0.2273, acc: 91.9802, loss_bbox: 0.2895, loss_mask: 0.2786, loss: 0.8783 2024-05-28 12:38:36,005 - mmdet - INFO - Epoch [3][600/7330] lr: 1.000e-04, eta: 16:04:44, time: 0.767, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0498, loss_cls: 0.2213, acc: 92.3098, loss_bbox: 0.2756, loss_mask: 0.2753, loss: 0.8541 2024-05-28 12:39:16,699 - mmdet - INFO - Epoch [3][650/7330] lr: 1.000e-04, eta: 16:04:09, time: 0.814, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0501, loss_cls: 0.2251, acc: 92.2202, loss_bbox: 0.2761, loss_mask: 0.2674, loss: 0.8487 2024-05-28 12:39:57,658 - mmdet - INFO - Epoch [3][700/7330] lr: 1.000e-04, eta: 16:03:34, time: 0.819, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0512, loss_cls: 0.2218, acc: 92.3037, loss_bbox: 0.2725, loss_mask: 0.2742, loss: 0.8499 2024-05-28 12:40:39,486 - mmdet - INFO - Epoch [3][750/7330] lr: 1.000e-04, eta: 16:03:04, time: 0.837, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0482, loss_cls: 0.2176, acc: 92.3521, loss_bbox: 0.2709, loss_mask: 0.2721, loss: 0.8364 2024-05-28 12:41:18,217 - mmdet - INFO - Epoch [3][800/7330] lr: 1.000e-04, eta: 16:02:19, time: 0.774, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0529, loss_cls: 0.2262, acc: 92.1326, loss_bbox: 0.2808, loss_mask: 0.2771, loss: 0.8669 2024-05-28 12:42:02,283 - mmdet - INFO - Epoch [3][850/7330] lr: 1.000e-04, eta: 16:01:59, time: 0.881, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0504, loss_cls: 0.2298, acc: 92.1106, loss_bbox: 0.2777, loss_mask: 0.2729, loss: 0.8621 2024-05-28 12:42:40,558 - mmdet - INFO - Epoch [3][900/7330] lr: 1.000e-04, eta: 16:01:12, time: 0.766, data_time: 
0.031, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0503, loss_cls: 0.2278, acc: 92.0378, loss_bbox: 0.2801, loss_mask: 0.2722, loss: 0.8604 2024-05-28 12:43:23,953 - mmdet - INFO - Epoch [3][950/7330] lr: 1.000e-04, eta: 16:00:49, time: 0.868, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0514, loss_cls: 0.2295, acc: 92.0017, loss_bbox: 0.2791, loss_mask: 0.2749, loss: 0.8643 2024-05-28 12:44:02,567 - mmdet - INFO - Epoch [3][1000/7330] lr: 1.000e-04, eta: 16:00:03, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0491, loss_cls: 0.2260, acc: 92.0972, loss_bbox: 0.2836, loss_mask: 0.2671, loss: 0.8538 2024-05-28 12:44:41,621 - mmdet - INFO - Epoch [3][1050/7330] lr: 1.000e-04, eta: 15:59:20, time: 0.781, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0523, loss_cls: 0.2165, acc: 92.3582, loss_bbox: 0.2752, loss_mask: 0.2711, loss: 0.8448 2024-05-28 12:45:20,228 - mmdet - INFO - Epoch [3][1100/7330] lr: 1.000e-04, eta: 15:58:34, time: 0.772, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0501, loss_cls: 0.2274, acc: 91.9465, loss_bbox: 0.2867, loss_mask: 0.2721, loss: 0.8663 2024-05-28 12:46:03,717 - mmdet - INFO - Epoch [3][1150/7330] lr: 1.000e-04, eta: 15:58:11, time: 0.870, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0497, loss_cls: 0.2225, acc: 92.3135, loss_bbox: 0.2744, loss_mask: 0.2775, loss: 0.8540 2024-05-28 12:46:46,889 - mmdet - INFO - Epoch [3][1200/7330] lr: 1.000e-04, eta: 15:57:46, time: 0.863, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0503, loss_cls: 0.2220, acc: 92.1626, loss_bbox: 0.2811, loss_mask: 0.2714, loss: 0.8534 2024-05-28 12:47:25,591 - mmdet - INFO - Epoch [3][1250/7330] lr: 1.000e-04, eta: 15:57:01, time: 0.774, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0528, loss_cls: 0.2257, acc: 92.1030, loss_bbox: 0.2817, loss_mask: 0.2731, loss: 0.8652 2024-05-28 12:48:04,308 - mmdet - INFO - Epoch [3][1300/7330] lr: 1.000e-04, eta: 15:56:16, time: 0.774, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0513, loss_cls: 0.2244, acc: 92.2502, loss_bbox: 0.2776, loss_mask: 0.2778, loss: 0.8637 2024-05-28 12:48:43,022 - mmdet - INFO - Epoch [3][1350/7330] lr: 1.000e-04, eta: 15:55:31, time: 0.774, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0504, loss_cls: 0.2207, acc: 92.2395, loss_bbox: 0.2744, loss_mask: 0.2766, loss: 0.8517 2024-05-28 12:49:24,121 - mmdet - INFO - Epoch [3][1400/7330] lr: 1.000e-04, eta: 15:54:57, time: 0.822, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0493, loss_cls: 0.2197, acc: 92.3843, loss_bbox: 0.2704, loss_mask: 0.2694, loss: 0.8398 2024-05-28 12:50:05,222 - mmdet - INFO - Epoch [3][1450/7330] lr: 1.000e-04, eta: 15:54:23, time: 0.822, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0519, loss_cls: 0.2244, acc: 92.1592, loss_bbox: 0.2750, loss_mask: 0.2689, loss: 0.8495 2024-05-28 12:50:45,774 - mmdet - INFO - Epoch [3][1500/7330] lr: 1.000e-04, eta: 15:53:46, time: 0.811, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0517, loss_cls: 0.2242, acc: 92.0425, loss_bbox: 0.2783, loss_mask: 0.2748, loss: 0.8581 2024-05-28 12:51:28,664 - mmdet - INFO - Epoch [3][1550/7330] lr: 1.000e-04, eta: 15:53:20, time: 0.858, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0528, loss_cls: 0.2260, acc: 
92.0049, loss_bbox: 0.2777, loss_mask: 0.2715, loss: 0.8583 2024-05-28 12:52:07,220 - mmdet - INFO - Epoch [3][1600/7330] lr: 1.000e-04, eta: 15:52:34, time: 0.771, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0515, loss_cls: 0.2201, acc: 92.2400, loss_bbox: 0.2741, loss_mask: 0.2754, loss: 0.8518 2024-05-28 12:52:46,046 - mmdet - INFO - Epoch [3][1650/7330] lr: 1.000e-04, eta: 15:51:50, time: 0.777, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0497, loss_cls: 0.2204, acc: 92.3074, loss_bbox: 0.2743, loss_mask: 0.2804, loss: 0.8549 2024-05-28 12:53:24,794 - mmdet - INFO - Epoch [3][1700/7330] lr: 1.000e-04, eta: 15:51:05, time: 0.775, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0519, loss_cls: 0.2369, acc: 91.7764, loss_bbox: 0.2893, loss_mask: 0.2778, loss: 0.8854 2024-05-28 12:54:09,179 - mmdet - INFO - Epoch [3][1750/7330] lr: 1.000e-04, eta: 15:50:45, time: 0.888, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0524, loss_cls: 0.2266, acc: 92.0571, loss_bbox: 0.2878, loss_mask: 0.2790, loss: 0.8761 2024-05-28 12:54:51,732 - mmdet - INFO - Epoch [3][1800/7330] lr: 1.000e-04, eta: 15:50:17, time: 0.851, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0477, loss_cls: 0.2155, acc: 92.4934, loss_bbox: 0.2614, loss_mask: 0.2629, loss: 0.8142 2024-05-28 12:55:30,770 - mmdet - INFO - Epoch [3][1850/7330] lr: 1.000e-04, eta: 15:49:33, time: 0.781, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0527, loss_cls: 0.2292, acc: 92.0171, loss_bbox: 0.2876, loss_mask: 0.2788, loss: 0.8794 2024-05-28 12:56:09,381 - mmdet - INFO - Epoch [3][1900/7330] lr: 1.000e-04, eta: 15:48:48, time: 0.772, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0516, loss_cls: 0.2216, acc: 92.2520, loss_bbox: 0.2770, loss_mask: 0.2756, loss: 0.8556 2024-05-28 12:56:48,028 - mmdet - INFO - Epoch [3][1950/7330] lr: 1.000e-04, eta: 15:48:03, time: 0.773, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0489, loss_cls: 0.2258, acc: 92.1255, loss_bbox: 0.2743, loss_mask: 0.2711, loss: 0.8480 2024-05-28 12:57:28,973 - mmdet - INFO - Epoch [3][2000/7330] lr: 1.000e-04, eta: 15:47:28, time: 0.819, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0493, loss_cls: 0.2216, acc: 92.2378, loss_bbox: 0.2710, loss_mask: 0.2734, loss: 0.8450 2024-05-28 12:58:07,921 - mmdet - INFO - Epoch [3][2050/7330] lr: 1.000e-04, eta: 15:46:44, time: 0.779, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0512, loss_cls: 0.2229, acc: 92.2434, loss_bbox: 0.2757, loss_mask: 0.2709, loss: 0.8500 2024-05-28 12:58:52,917 - mmdet - INFO - Epoch [3][2100/7330] lr: 1.000e-04, eta: 15:46:26, time: 0.900, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0512, loss_cls: 0.2287, acc: 92.0122, loss_bbox: 0.2864, loss_mask: 0.2750, loss: 0.8712 2024-05-28 12:59:34,103 - mmdet - INFO - Epoch [3][2150/7330] lr: 1.000e-04, eta: 15:45:52, time: 0.824, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0510, loss_cls: 0.2283, acc: 92.1013, loss_bbox: 0.2789, loss_mask: 0.2673, loss: 0.8549 2024-05-28 13:00:12,784 - mmdet - INFO - Epoch [3][2200/7330] lr: 1.000e-04, eta: 15:45:07, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0508, loss_cls: 0.2158, acc: 92.3196, loss_bbox: 0.2712, loss_mask: 0.2701, loss: 0.8353 2024-05-28 13:00:51,667 - mmdet - 
INFO - Epoch [3][2250/7330] lr: 1.000e-04, eta: 15:44:23, time: 0.778, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0533, loss_cls: 0.2229, acc: 92.1592, loss_bbox: 0.2795, loss_mask: 0.2741, loss: 0.8576 2024-05-28 13:01:30,091 - mmdet - INFO - Epoch [3][2300/7330] lr: 1.000e-04, eta: 15:43:37, time: 0.768, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0303, loss_rpn_bbox: 0.0501, loss_cls: 0.2254, acc: 92.2241, loss_bbox: 0.2722, loss_mask: 0.2689, loss: 0.8470 2024-05-28 13:02:11,616 - mmdet - INFO - Epoch [3][2350/7330] lr: 1.000e-04, eta: 15:43:04, time: 0.830, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0535, loss_cls: 0.2329, acc: 91.8523, loss_bbox: 0.2897, loss_mask: 0.2782, loss: 0.8844 2024-05-28 13:02:56,109 - mmdet - INFO - Epoch [3][2400/7330] lr: 1.000e-04, eta: 15:42:43, time: 0.890, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0511, loss_cls: 0.2243, acc: 92.1689, loss_bbox: 0.2766, loss_mask: 0.2769, loss: 0.8592 2024-05-28 13:03:34,891 - mmdet - INFO - Epoch [3][2450/7330] lr: 1.000e-04, eta: 15:41:58, time: 0.776, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0516, loss_cls: 0.2274, acc: 92.0554, loss_bbox: 0.2818, loss_mask: 0.2772, loss: 0.8689 2024-05-28 13:04:16,062 - mmdet - INFO - Epoch [3][2500/7330] lr: 1.000e-04, eta: 15:41:24, time: 0.823, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0525, loss_cls: 0.2228, acc: 92.1648, loss_bbox: 0.2782, loss_mask: 0.2751, loss: 0.8586 2024-05-28 13:04:54,883 - mmdet - INFO - Epoch [3][2550/7330] lr: 1.000e-04, eta: 15:40:40, time: 0.776, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0550, loss_cls: 0.2302, acc: 91.8262, loss_bbox: 0.2877, loss_mask: 0.2749, loss: 0.8800 2024-05-28 13:05:33,600 - mmdet - INFO - Epoch [3][2600/7330] lr: 1.000e-04, eta: 15:39:55, time: 0.774, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0475, loss_cls: 0.2250, acc: 92.1445, loss_bbox: 0.2746, loss_mask: 0.2741, loss: 0.8513 2024-05-28 13:06:12,214 - mmdet - INFO - Epoch [3][2650/7330] lr: 1.000e-04, eta: 15:39:10, time: 0.772, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0504, loss_cls: 0.2256, acc: 92.1262, loss_bbox: 0.2758, loss_mask: 0.2736, loss: 0.8568 2024-05-28 13:07:00,055 - mmdet - INFO - Epoch [3][2700/7330] lr: 1.000e-04, eta: 15:39:02, time: 0.957, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0528, loss_cls: 0.2284, acc: 92.0408, loss_bbox: 0.2800, loss_mask: 0.2758, loss: 0.8680 2024-05-28 13:07:39,024 - mmdet - INFO - Epoch [3][2750/7330] lr: 1.000e-04, eta: 15:38:19, time: 0.779, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0498, loss_cls: 0.2166, acc: 92.3455, loss_bbox: 0.2713, loss_mask: 0.2713, loss: 0.8382 2024-05-28 13:08:17,972 - mmdet - INFO - Epoch [3][2800/7330] lr: 1.000e-04, eta: 15:37:35, time: 0.778, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0514, loss_cls: 0.2291, acc: 92.0308, loss_bbox: 0.2851, loss_mask: 0.2810, loss: 0.8765 2024-05-28 13:08:56,880 - mmdet - INFO - Epoch [3][2850/7330] lr: 1.000e-04, eta: 15:36:51, time: 0.779, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0495, loss_cls: 0.2182, acc: 92.4490, loss_bbox: 0.2677, loss_mask: 0.2663, loss: 0.8317 2024-05-28 13:09:38,629 - mmdet - INFO - Epoch [3][2900/7330] lr: 1.000e-04, eta: 15:36:19, time: 0.835, data_time: 0.032, memory: 
13582, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0471, loss_cls: 0.2141, acc: 92.5210, loss_bbox: 0.2688, loss_mask: 0.2668, loss: 0.8256 2024-05-28 13:10:20,035 - mmdet - INFO - Epoch [3][2950/7330] lr: 1.000e-04, eta: 15:35:45, time: 0.828, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0516, loss_cls: 0.2280, acc: 92.0215, loss_bbox: 0.2820, loss_mask: 0.2806, loss: 0.8713 2024-05-28 13:11:01,027 - mmdet - INFO - Epoch [3][3000/7330] lr: 1.000e-04, eta: 15:35:09, time: 0.820, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0478, loss_cls: 0.2188, acc: 92.3394, loss_bbox: 0.2733, loss_mask: 0.2719, loss: 0.8417 2024-05-28 13:11:41,721 - mmdet - INFO - Epoch [3][3050/7330] lr: 1.000e-04, eta: 15:34:32, time: 0.814, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0488, loss_cls: 0.2141, acc: 92.5586, loss_bbox: 0.2690, loss_mask: 0.2704, loss: 0.8310 2024-05-28 13:12:20,651 - mmdet - INFO - Epoch [3][3100/7330] lr: 1.000e-04, eta: 15:33:49, time: 0.779, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0532, loss_cls: 0.2242, acc: 92.0684, loss_bbox: 0.2824, loss_mask: 0.2785, loss: 0.8690 2024-05-28 13:12:59,392 - mmdet - INFO - Epoch [3][3150/7330] lr: 1.000e-04, eta: 15:33:04, time: 0.775, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0495, loss_cls: 0.2165, acc: 92.4438, loss_bbox: 0.2662, loss_mask: 0.2646, loss: 0.8252 2024-05-28 13:13:37,871 - mmdet - INFO - Epoch [3][3200/7330] lr: 1.000e-04, eta: 15:32:19, time: 0.770, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0458, loss_cls: 0.2085, acc: 92.7046, loss_bbox: 0.2584, loss_mask: 0.2683, loss: 0.8090 2024-05-28 13:14:16,535 - mmdet - INFO - Epoch [3][3250/7330] lr: 1.000e-04, eta: 15:31:34, time: 0.773, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0524, loss_cls: 0.2240, acc: 92.1641, loss_bbox: 0.2816, loss_mask: 0.2719, loss: 0.8575 2024-05-28 13:14:59,798 - mmdet - INFO - Epoch [3][3300/7330] lr: 1.000e-04, eta: 15:31:07, time: 0.865, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0500, loss_cls: 0.2178, acc: 92.4500, loss_bbox: 0.2715, loss_mask: 0.2721, loss: 0.8431 2024-05-28 13:15:43,309 - mmdet - INFO - Epoch [3][3350/7330] lr: 1.000e-04, eta: 15:30:41, time: 0.870, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0516, loss_cls: 0.2172, acc: 92.2385, loss_bbox: 0.2722, loss_mask: 0.2649, loss: 0.8368 2024-05-28 13:16:22,350 - mmdet - INFO - Epoch [3][3400/7330] lr: 1.000e-04, eta: 15:29:58, time: 0.781, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0495, loss_cls: 0.2213, acc: 92.3254, loss_bbox: 0.2726, loss_mask: 0.2657, loss: 0.8411 2024-05-28 13:17:03,937 - mmdet - INFO - Epoch [3][3450/7330] lr: 1.000e-04, eta: 15:29:24, time: 0.831, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0481, loss_cls: 0.2285, acc: 92.1741, loss_bbox: 0.2743, loss_mask: 0.2689, loss: 0.8482 2024-05-28 13:17:45,004 - mmdet - INFO - Epoch [3][3500/7330] lr: 1.000e-04, eta: 15:28:49, time: 0.822, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0507, loss_cls: 0.2274, acc: 92.2478, loss_bbox: 0.2760, loss_mask: 0.2735, loss: 0.8569 2024-05-28 13:18:26,533 - mmdet - INFO - Epoch [3][3550/7330] lr: 1.000e-04, eta: 15:28:15, time: 0.831, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0497, loss_cls: 0.2329, acc: 91.9546, loss_bbox: 
0.2824, loss_mask: 0.2728, loss: 0.8669 2024-05-28 13:19:05,399 - mmdet - INFO - Epoch [3][3600/7330] lr: 1.000e-04, eta: 15:27:31, time: 0.777, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0510, loss_cls: 0.2332, acc: 91.9246, loss_bbox: 0.2841, loss_mask: 0.2727, loss: 0.8718 2024-05-28 13:19:46,331 - mmdet - INFO - Epoch [3][3650/7330] lr: 1.000e-04, eta: 15:26:55, time: 0.819, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0470, loss_cls: 0.2155, acc: 92.4739, loss_bbox: 0.2630, loss_mask: 0.2611, loss: 0.8139 2024-05-28 13:20:25,143 - mmdet - INFO - Epoch [3][3700/7330] lr: 1.000e-04, eta: 15:26:11, time: 0.776, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0518, loss_cls: 0.2305, acc: 91.9429, loss_bbox: 0.2854, loss_mask: 0.2672, loss: 0.8658 2024-05-28 13:21:03,998 - mmdet - INFO - Epoch [3][3750/7330] lr: 1.000e-04, eta: 15:25:27, time: 0.777, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0493, loss_cls: 0.2220, acc: 92.2288, loss_bbox: 0.2726, loss_mask: 0.2679, loss: 0.8386 2024-05-28 13:21:42,799 - mmdet - INFO - Epoch [3][3800/7330] lr: 1.000e-04, eta: 15:24:43, time: 0.776, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0512, loss_cls: 0.2206, acc: 92.3042, loss_bbox: 0.2672, loss_mask: 0.2642, loss: 0.8326 2024-05-28 13:22:25,932 - mmdet - INFO - Epoch [3][3850/7330] lr: 1.000e-04, eta: 15:24:15, time: 0.863, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0525, loss_cls: 0.2311, acc: 91.7446, loss_bbox: 0.2849, loss_mask: 0.2733, loss: 0.8718 2024-05-28 13:23:04,829 - mmdet - INFO - Epoch [3][3900/7330] lr: 1.000e-04, eta: 15:23:31, time: 0.778, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0522, loss_cls: 0.2174, acc: 92.1523, loss_bbox: 0.2767, loss_mask: 0.2651, loss: 0.8400 2024-05-28 13:23:48,964 - mmdet - INFO - Epoch [3][3950/7330] lr: 1.000e-04, eta: 15:23:07, time: 0.883, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0509, loss_cls: 0.2369, acc: 91.8074, loss_bbox: 0.2848, loss_mask: 0.2678, loss: 0.8702 2024-05-28 13:24:27,746 - mmdet - INFO - Epoch [3][4000/7330] lr: 1.000e-04, eta: 15:22:23, time: 0.776, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0549, loss_cls: 0.2264, acc: 92.0366, loss_bbox: 0.2771, loss_mask: 0.2656, loss: 0.8567 2024-05-28 13:25:13,668 - mmdet - INFO - Epoch [3][4050/7330] lr: 1.000e-04, eta: 15:22:05, time: 0.918, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0500, loss_cls: 0.2266, acc: 92.0046, loss_bbox: 0.2793, loss_mask: 0.2749, loss: 0.8585 2024-05-28 13:25:52,356 - mmdet - INFO - Epoch [3][4100/7330] lr: 1.000e-04, eta: 15:21:20, time: 0.774, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0501, loss_cls: 0.2235, acc: 92.0422, loss_bbox: 0.2761, loss_mask: 0.2698, loss: 0.8474 2024-05-28 13:26:31,324 - mmdet - INFO - Epoch [3][4150/7330] lr: 1.000e-04, eta: 15:20:37, time: 0.779, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0518, loss_cls: 0.2233, acc: 92.0481, loss_bbox: 0.2811, loss_mask: 0.2672, loss: 0.8515 2024-05-28 13:27:10,103 - mmdet - INFO - Epoch [3][4200/7330] lr: 1.000e-04, eta: 15:19:53, time: 0.775, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0535, loss_cls: 0.2283, acc: 92.0134, loss_bbox: 0.2781, loss_mask: 0.2668, loss: 0.8595 2024-05-28 13:27:48,601 - mmdet - INFO - Epoch 
[3][4250/7330] lr: 1.000e-04, eta: 15:19:07, time: 0.770, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0300, loss_rpn_bbox: 0.0486, loss_cls: 0.2189, acc: 92.4109, loss_bbox: 0.2670, loss_mask: 0.2625, loss: 0.8270 2024-05-28 13:28:29,932 - mmdet - INFO - Epoch [3][4300/7330] lr: 1.000e-04, eta: 15:18:33, time: 0.827, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0531, loss_cls: 0.2286, acc: 91.9668, loss_bbox: 0.2799, loss_mask: 0.2734, loss: 0.8665 2024-05-28 13:29:09,062 - mmdet - INFO - Epoch [3][4350/7330] lr: 1.000e-04, eta: 15:17:50, time: 0.783, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0507, loss_cls: 0.2317, acc: 92.0044, loss_bbox: 0.2787, loss_mask: 0.2784, loss: 0.8692 2024-05-28 13:29:49,561 - mmdet - INFO - Epoch [3][4400/7330] lr: 1.000e-04, eta: 15:17:12, time: 0.810, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0522, loss_cls: 0.2264, acc: 92.1736, loss_bbox: 0.2734, loss_mask: 0.2710, loss: 0.8523 2024-05-28 13:30:30,326 - mmdet - INFO - Epoch [3][4450/7330] lr: 1.000e-04, eta: 15:16:35, time: 0.815, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0512, loss_cls: 0.2208, acc: 92.1675, loss_bbox: 0.2738, loss_mask: 0.2704, loss: 0.8457 2024-05-28 13:31:09,116 - mmdet - INFO - Epoch [3][4500/7330] lr: 1.000e-04, eta: 15:15:51, time: 0.776, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0497, loss_cls: 0.2175, acc: 92.4004, loss_bbox: 0.2729, loss_mask: 0.2699, loss: 0.8374 2024-05-28 13:31:52,653 - mmdet - INFO - Epoch [3][4550/7330] lr: 1.000e-04, eta: 15:15:24, time: 0.871, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0516, loss_cls: 0.2310, acc: 92.0818, loss_bbox: 0.2789, loss_mask: 0.2670, loss: 0.8570 2024-05-28 13:32:31,516 - mmdet - INFO - Epoch [3][4600/7330] lr: 1.000e-04, eta: 15:14:40, time: 0.777, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0493, loss_cls: 0.2279, acc: 92.1406, loss_bbox: 0.2768, loss_mask: 0.2653, loss: 0.8487 2024-05-28 13:33:14,855 - mmdet - INFO - Epoch [3][4650/7330] lr: 1.000e-04, eta: 15:14:12, time: 0.867, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0513, loss_cls: 0.2305, acc: 91.7627, loss_bbox: 0.2905, loss_mask: 0.2794, loss: 0.8811 2024-05-28 13:33:58,447 - mmdet - INFO - Epoch [3][4700/7330] lr: 1.000e-04, eta: 15:13:45, time: 0.872, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0478, loss_cls: 0.2258, acc: 92.1445, loss_bbox: 0.2768, loss_mask: 0.2727, loss: 0.8519 2024-05-28 13:34:37,091 - mmdet - INFO - Epoch [3][4750/7330] lr: 1.000e-04, eta: 15:13:00, time: 0.773, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0488, loss_cls: 0.2195, acc: 92.3972, loss_bbox: 0.2673, loss_mask: 0.2659, loss: 0.8303 2024-05-28 13:35:16,529 - mmdet - INFO - Epoch [3][4800/7330] lr: 1.000e-04, eta: 15:12:19, time: 0.789, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0501, loss_cls: 0.2246, acc: 91.9492, loss_bbox: 0.2795, loss_mask: 0.2671, loss: 0.8500 2024-05-28 13:35:55,368 - mmdet - INFO - Epoch [3][4850/7330] lr: 1.000e-04, eta: 15:11:35, time: 0.777, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0531, loss_cls: 0.2289, acc: 91.9758, loss_bbox: 0.2811, loss_mask: 0.2691, loss: 0.8629 2024-05-28 13:36:36,752 - mmdet - INFO - Epoch [3][4900/7330] lr: 1.000e-04, eta: 15:11:00, time: 0.828, data_time: 0.035, memory: 13582, 
loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0520, loss_cls: 0.2226, acc: 92.2856, loss_bbox: 0.2703, loss_mask: 0.2679, loss: 0.8423 2024-05-28 13:37:15,781 - mmdet - INFO - Epoch [3][4950/7330] lr: 1.000e-04, eta: 15:10:17, time: 0.781, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0481, loss_cls: 0.2228, acc: 92.1375, loss_bbox: 0.2726, loss_mask: 0.2630, loss: 0.8339 2024-05-28 13:37:56,401 - mmdet - INFO - Epoch [3][5000/7330] lr: 1.000e-04, eta: 15:09:39, time: 0.812, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0528, loss_cls: 0.2261, acc: 92.0010, loss_bbox: 0.2825, loss_mask: 0.2726, loss: 0.8670 2024-05-28 13:38:35,150 - mmdet - INFO - Epoch [3][5050/7330] lr: 1.000e-04, eta: 15:08:55, time: 0.775, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0501, loss_cls: 0.2274, acc: 92.0762, loss_bbox: 0.2759, loss_mask: 0.2643, loss: 0.8473 2024-05-28 13:39:20,328 - mmdet - INFO - Epoch [3][5100/7330] lr: 1.000e-04, eta: 15:08:33, time: 0.904, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0494, loss_cls: 0.2196, acc: 92.3281, loss_bbox: 0.2733, loss_mask: 0.2667, loss: 0.8375 2024-05-28 13:39:59,024 - mmdet - INFO - Epoch [3][5150/7330] lr: 1.000e-04, eta: 15:07:49, time: 0.774, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0470, loss_cls: 0.2094, acc: 92.6135, loss_bbox: 0.2616, loss_mask: 0.2615, loss: 0.8079 2024-05-28 13:40:40,191 - mmdet - INFO - Epoch [3][5200/7330] lr: 1.000e-04, eta: 15:07:13, time: 0.823, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0468, loss_cls: 0.2168, acc: 92.3657, loss_bbox: 0.2668, loss_mask: 0.2625, loss: 0.8196 2024-05-28 13:41:22,164 - mmdet - INFO - Epoch [3][5250/7330] lr: 1.000e-04, eta: 15:06:40, time: 0.839, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0529, loss_cls: 0.2266, acc: 92.0173, loss_bbox: 0.2810, loss_mask: 0.2743, loss: 0.8630 2024-05-28 13:42:02,774 - mmdet - INFO - Epoch [3][5300/7330] lr: 1.000e-04, eta: 15:06:02, time: 0.812, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0469, loss_cls: 0.2151, acc: 92.4697, loss_bbox: 0.2699, loss_mask: 0.2670, loss: 0.8269 2024-05-28 13:42:41,486 - mmdet - INFO - Epoch [3][5350/7330] lr: 1.000e-04, eta: 15:05:18, time: 0.774, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0488, loss_cls: 0.2087, acc: 92.5798, loss_bbox: 0.2628, loss_mask: 0.2634, loss: 0.8151 2024-05-28 13:43:20,360 - mmdet - INFO - Epoch [3][5400/7330] lr: 1.000e-04, eta: 15:04:34, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0501, loss_cls: 0.2271, acc: 92.0105, loss_bbox: 0.2767, loss_mask: 0.2689, loss: 0.8512 2024-05-28 13:44:01,712 - mmdet - INFO - Epoch [3][5450/7330] lr: 1.000e-04, eta: 15:03:59, time: 0.827, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0486, loss_cls: 0.2127, acc: 92.5730, loss_bbox: 0.2633, loss_mask: 0.2612, loss: 0.8122 2024-05-28 13:44:40,809 - mmdet - INFO - Epoch [3][5500/7330] lr: 1.000e-04, eta: 15:03:16, time: 0.781, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0539, loss_cls: 0.2340, acc: 91.7886, loss_bbox: 0.2853, loss_mask: 0.2762, loss: 0.8817 2024-05-28 13:45:19,634 - mmdet - INFO - Epoch [3][5550/7330] lr: 1.000e-04, eta: 15:02:32, time: 0.777, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0507, loss_cls: 0.2250, acc: 92.0273, loss_bbox: 0.2820, 
loss_mask: 0.2654, loss: 0.8543 2024-05-28 13:46:03,301 - mmdet - INFO - Epoch [3][5600/7330] lr: 1.000e-04, eta: 15:02:04, time: 0.873, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0510, loss_cls: 0.2203, acc: 92.0630, loss_bbox: 0.2758, loss_mask: 0.2664, loss: 0.8432 2024-05-28 13:46:41,599 - mmdet - INFO - Epoch [3][5650/7330] lr: 1.000e-04, eta: 15:01:19, time: 0.766, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0484, loss_cls: 0.2148, acc: 92.4365, loss_bbox: 0.2635, loss_mask: 0.2601, loss: 0.8144 2024-05-28 13:47:24,997 - mmdet - INFO - Epoch [3][5700/7330] lr: 1.000e-04, eta: 15:00:50, time: 0.868, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0489, loss_cls: 0.2231, acc: 92.0581, loss_bbox: 0.2780, loss_mask: 0.2748, loss: 0.8530 2024-05-28 13:48:03,864 - mmdet - INFO - Epoch [3][5750/7330] lr: 1.000e-04, eta: 15:00:07, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0501, loss_cls: 0.2229, acc: 92.2859, loss_bbox: 0.2726, loss_mask: 0.2689, loss: 0.8424 2024-05-28 13:48:42,566 - mmdet - INFO - Epoch [3][5800/7330] lr: 1.000e-04, eta: 14:59:22, time: 0.774, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0479, loss_cls: 0.2234, acc: 92.0659, loss_bbox: 0.2757, loss_mask: 0.2664, loss: 0.8412 2024-05-28 13:49:23,906 - mmdet - INFO - Epoch [3][5850/7330] lr: 1.000e-04, eta: 14:58:47, time: 0.827, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0484, loss_cls: 0.2263, acc: 92.1274, loss_bbox: 0.2752, loss_mask: 0.2699, loss: 0.8480 2024-05-28 13:50:07,858 - mmdet - INFO - Epoch [3][5900/7330] lr: 1.000e-04, eta: 14:58:20, time: 0.879, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0493, loss_cls: 0.2211, acc: 92.1912, loss_bbox: 0.2773, loss_mask: 0.2680, loss: 0.8436 2024-05-28 13:50:46,905 - mmdet - INFO - Epoch [3][5950/7330] lr: 1.000e-04, eta: 14:57:37, time: 0.781, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0484, loss_cls: 0.2134, acc: 92.5427, loss_bbox: 0.2695, loss_mask: 0.2617, loss: 0.8216 2024-05-28 13:51:25,500 - mmdet - INFO - Epoch [3][6000/7330] lr: 1.000e-04, eta: 14:56:52, time: 0.772, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0457, loss_cls: 0.2102, acc: 92.5459, loss_bbox: 0.2593, loss_mask: 0.2635, loss: 0.8038 2024-05-28 13:52:06,567 - mmdet - INFO - Epoch [3][6050/7330] lr: 1.000e-04, eta: 14:56:16, time: 0.821, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0505, loss_cls: 0.2149, acc: 92.4851, loss_bbox: 0.2652, loss_mask: 0.2644, loss: 0.8230 2024-05-28 13:52:45,457 - mmdet - INFO - Epoch [3][6100/7330] lr: 1.000e-04, eta: 14:55:32, time: 0.778, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0503, loss_cls: 0.2206, acc: 92.2356, loss_bbox: 0.2724, loss_mask: 0.2682, loss: 0.8412 2024-05-28 13:53:26,905 - mmdet - INFO - Epoch [3][6150/7330] lr: 1.000e-04, eta: 14:54:57, time: 0.829, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0503, loss_cls: 0.2177, acc: 92.3057, loss_bbox: 0.2696, loss_mask: 0.2659, loss: 0.8334 2024-05-28 13:54:06,080 - mmdet - INFO - Epoch [3][6200/7330] lr: 1.000e-04, eta: 14:54:15, time: 0.783, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0502, loss_cls: 0.2186, acc: 92.2632, loss_bbox: 0.2759, loss_mask: 0.2663, loss: 0.8411 2024-05-28 13:54:50,473 - mmdet - INFO - Epoch [3][6250/7330] 
lr: 1.000e-04, eta: 14:53:49, time: 0.888, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0491, loss_cls: 0.2182, acc: 92.2776, loss_bbox: 0.2682, loss_mask: 0.2702, loss: 0.8342 2024-05-28 13:55:32,265 - mmdet - INFO - Epoch [3][6300/7330] lr: 1.000e-04, eta: 14:53:15, time: 0.836, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0445, loss_cls: 0.2208, acc: 92.2639, loss_bbox: 0.2657, loss_mask: 0.2666, loss: 0.8234 2024-05-28 13:56:10,615 - mmdet - INFO - Epoch [3][6350/7330] lr: 1.000e-04, eta: 14:52:29, time: 0.767, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0468, loss_cls: 0.2127, acc: 92.5344, loss_bbox: 0.2625, loss_mask: 0.2625, loss: 0.8119 2024-05-28 13:56:49,405 - mmdet - INFO - Epoch [3][6400/7330] lr: 1.000e-04, eta: 14:51:45, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0491, loss_cls: 0.2218, acc: 92.1873, loss_bbox: 0.2732, loss_mask: 0.2655, loss: 0.8385 2024-05-28 13:57:32,922 - mmdet - INFO - Epoch [3][6450/7330] lr: 1.000e-04, eta: 14:51:17, time: 0.870, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0495, loss_cls: 0.2209, acc: 92.2876, loss_bbox: 0.2698, loss_mask: 0.2583, loss: 0.8284 2024-05-28 13:58:14,529 - mmdet - INFO - Epoch [3][6500/7330] lr: 1.000e-04, eta: 14:50:42, time: 0.832, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0511, loss_cls: 0.2166, acc: 92.3247, loss_bbox: 0.2698, loss_mask: 0.2629, loss: 0.8300 2024-05-28 13:58:53,307 - mmdet - INFO - Epoch [3][6550/7330] lr: 1.000e-04, eta: 14:49:58, time: 0.775, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0495, loss_cls: 0.2197, acc: 92.3706, loss_bbox: 0.2615, loss_mask: 0.2603, loss: 0.8187 2024-05-28 13:59:32,249 - mmdet - INFO - Epoch [3][6600/7330] lr: 1.000e-04, eta: 14:49:15, time: 0.779, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0509, loss_cls: 0.2245, acc: 92.1069, loss_bbox: 0.2767, loss_mask: 0.2690, loss: 0.8504 2024-05-28 14:00:13,755 - mmdet - INFO - Epoch [3][6650/7330] lr: 1.000e-04, eta: 14:48:39, time: 0.830, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0515, loss_cls: 0.2231, acc: 92.1924, loss_bbox: 0.2760, loss_mask: 0.2656, loss: 0.8449 2024-05-28 14:00:52,254 - mmdet - INFO - Epoch [3][6700/7330] lr: 1.000e-04, eta: 14:47:55, time: 0.770, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0477, loss_cls: 0.2109, acc: 92.3838, loss_bbox: 0.2684, loss_mask: 0.2645, loss: 0.8186 2024-05-28 14:01:35,693 - mmdet - INFO - Epoch [3][6750/7330] lr: 1.000e-04, eta: 14:47:25, time: 0.869, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0486, loss_cls: 0.2160, acc: 92.4133, loss_bbox: 0.2652, loss_mask: 0.2709, loss: 0.8286 2024-05-28 14:02:14,197 - mmdet - INFO - Epoch [3][6800/7330] lr: 1.000e-04, eta: 14:46:41, time: 0.770, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0495, loss_cls: 0.2181, acc: 92.4109, loss_bbox: 0.2665, loss_mask: 0.2707, loss: 0.8330 2024-05-28 14:02:57,210 - mmdet - INFO - Epoch [3][6850/7330] lr: 1.000e-04, eta: 14:46:10, time: 0.860, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0499, loss_cls: 0.2146, acc: 92.5215, loss_bbox: 0.2627, loss_mask: 0.2611, loss: 0.8164 2024-05-28 14:03:35,655 - mmdet - INFO - Epoch [3][6900/7330] lr: 1.000e-04, eta: 14:45:25, time: 0.769, data_time: 0.037, memory: 13582, loss_rpn_cls: 
0.0265, loss_rpn_bbox: 0.0487, loss_cls: 0.2143, acc: 92.4287, loss_bbox: 0.2594, loss_mask: 0.2671, loss: 0.8159
2024-05-28 14:04:14,511 - mmdet - INFO - Epoch [3][6950/7330] lr: 1.000e-04, eta: 14:44:42, time: 0.777, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0503, loss_cls: 0.2315, acc: 91.9155, loss_bbox: 0.2785, loss_mask: 0.2662, loss: 0.8547
2024-05-28 14:04:53,299 - mmdet - INFO - Epoch [3][7000/7330] lr: 1.000e-04, eta: 14:43:58, time: 0.776, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0494, loss_cls: 0.2218, acc: 92.3169, loss_bbox: 0.2645, loss_mask: 0.2558, loss: 0.8194
2024-05-28 14:05:40,418 - mmdet - INFO - Epoch [3][7050/7330] lr: 1.000e-04, eta: 14:43:40, time: 0.942, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0469, loss_cls: 0.2308, acc: 91.9915, loss_bbox: 0.2804, loss_mask: 0.2651, loss: 0.8520
2024-05-28 14:06:19,888 - mmdet - INFO - Epoch [3][7100/7330] lr: 1.000e-04, eta: 14:42:58, time: 0.789, data_time: 0.047, memory: 13582, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0471, loss_cls: 0.2119, acc: 92.4324, loss_bbox: 0.2607, loss_mask: 0.2665, loss: 0.8149
2024-05-28 14:06:58,680 - mmdet - INFO - Epoch [3][7150/7330] lr: 1.000e-04, eta: 14:42:14, time: 0.776, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0477, loss_cls: 0.2152, acc: 92.3899, loss_bbox: 0.2686, loss_mask: 0.2674, loss: 0.8268
2024-05-28 14:07:37,374 - mmdet - INFO - Epoch [3][7200/7330] lr: 1.000e-04, eta: 14:41:30, time: 0.774, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0481, loss_cls: 0.2161, acc: 92.4697, loss_bbox: 0.2640, loss_mask: 0.2649, loss: 0.8207
2024-05-28 14:08:16,129 - mmdet - INFO - Epoch [3][7250/7330] lr: 1.000e-04, eta: 14:40:46, time: 0.775, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0461, loss_cls: 0.2138, acc: 92.4790, loss_bbox: 0.2659, loss_mask: 0.2631, loss: 0.8166
2024-05-28 14:08:59,786 - mmdet - INFO - Epoch [3][7300/7330] lr: 1.000e-04, eta: 14:40:17, time: 0.873, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0484, loss_cls: 0.2097, acc: 92.6956, loss_bbox: 0.2587, loss_mask: 0.2606, loss: 0.8049
2024-05-28 14:09:23,844 - mmdet - INFO - Saving checkpoint at 3 epochs
2024-05-28 14:11:48,285 - mmdet - INFO - Evaluating bbox...
2024-05-28 14:12:11,965 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.398
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.643
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.431
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.225
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.436
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.574
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.515
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.515
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.515
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.306
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.565
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.699
2024-05-28 14:12:11,965 - mmdet - INFO - Evaluating segm...
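Box AP at epoch 3 (0.398 above) improves on the 0.364 reported after epoch 2, and the mask summary that follows shows a similar gain (0.338 to 0.366). To track these numbers across epochs, the Epoch(val) summary lines can be extracted from the log with a small script; a minimal sketch, where the file name train.log and the regular expression are assumptions based on the format shown in this log:

import re

# Pull per-epoch bbox/segm mAP out of the "Epoch(val)" summary lines.
pattern = re.compile(
    r"Epoch\(val\) \[(\d+)\]\[\d+\]\s*bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)"
)
with open("train.log") as f:  # assumed log file name
    for line in f:
        match = pattern.search(line)
        if match:
            epoch, bbox_map, segm_map = match.groups()
            print(f"epoch {epoch}: bbox_mAP={bbox_map}, segm_mAP={segm_map}")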
2024-05-28 14:12:36,767 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.366
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.601
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.382
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.152
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.395
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.591
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.474
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.474
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.474
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.244
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.526
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.686
2024-05-28 14:12:37,084 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 14:12:37,085 - mmdet - INFO - Epoch(val) [3][625] bbox_mAP: 0.3980, bbox_mAP_50: 0.6430, bbox_mAP_75: 0.4310, bbox_mAP_s: 0.2250, bbox_mAP_m: 0.4360, bbox_mAP_l: 0.5740, bbox_mAP_copypaste: 0.398 0.643 0.431 0.225 0.436 0.574, segm_mAP: 0.3660, segm_mAP_50: 0.6010, segm_mAP_75: 0.3820, segm_mAP_s: 0.1520, segm_mAP_m: 0.3950, segm_mAP_l: 0.5910, segm_mAP_copypaste: 0.366 0.601 0.382 0.152 0.395 0.591
2024-05-28 14:13:33,492 - mmdet - INFO - Epoch [4][50/7330] lr: 1.000e-04, eta: 14:38:51, time: 1.128, data_time: 0.098, memory: 13582, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0443, loss_cls: 0.2078, acc: 92.4160, loss_bbox: 0.2631, loss_mask: 0.2583, loss: 0.7972
2024-05-28 14:14:11,758 - mmdet - INFO - Epoch [4][100/7330] lr: 1.000e-04, eta: 14:38:05, time: 0.765, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0475, loss_cls: 0.2133, acc: 92.4458, loss_bbox: 0.2631, loss_mask: 0.2642, loss: 0.8147
2024-05-28 14:14:50,191 - mmdet - INFO - Epoch [4][150/7330] lr: 1.000e-04, eta: 14:37:21, time: 0.769, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0497, loss_cls: 0.2077, acc: 92.6104, loss_bbox: 0.2643, loss_mask: 0.2603, loss: 0.8069
2024-05-28 14:15:28,861 - mmdet - INFO - Epoch [4][200/7330] lr: 1.000e-04, eta: 14:36:37, time: 0.773, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0492, loss_cls: 0.2086, acc: 92.6155, loss_bbox: 0.2633, loss_mask: 0.2639, loss: 0.8108
2024-05-28 14:16:07,403 - mmdet - INFO - Epoch [4][250/7330] lr: 1.000e-04, eta: 14:35:53, time: 0.771, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0445, loss_cls: 0.2072, acc: 92.6226, loss_bbox: 0.2601, loss_mask: 0.2601, loss: 0.7951
2024-05-28 14:16:46,102 - mmdet - INFO - Epoch [4][300/7330] lr: 1.000e-04, eta: 14:35:09, time: 0.774, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0480, loss_cls: 0.2098, acc: 92.2986, loss_bbox: 0.2735, loss_mask: 0.2655, loss: 0.8227
2024-05-28 14:17:27,373 - mmdet - INFO - Epoch [4][350/7330] lr: 1.000e-04, eta: 14:34:33, time: 0.825, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0480, loss_cls: 0.2155, acc: 92.1721, loss_bbox: 0.2723, loss_mask: 0.2607, loss: 0.8229
2024-05-28 14:18:05,996 - mmdet - INFO - Epoch [4][400/7330] lr: 1.000e-04, eta: 14:33:49, time: 0.772, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0466, loss_cls: 0.2093, acc: 92.4690, loss_bbox: 0.2634,
loss_mask: 0.2601, loss: 0.8039 2024-05-28 14:18:47,286 - mmdet - INFO - Epoch [4][450/7330] lr: 1.000e-04, eta: 14:33:13, time: 0.826, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0499, loss_cls: 0.2041, acc: 92.6990, loss_bbox: 0.2627, loss_mask: 0.2584, loss: 0.8028 2024-05-28 14:19:28,441 - mmdet - INFO - Epoch [4][500/7330] lr: 1.000e-04, eta: 14:32:36, time: 0.823, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0281, loss_rpn_bbox: 0.0498, loss_cls: 0.2153, acc: 92.3181, loss_bbox: 0.2723, loss_mask: 0.2625, loss: 0.8280 2024-05-28 14:20:09,868 - mmdet - INFO - Epoch [4][550/7330] lr: 1.000e-04, eta: 14:32:00, time: 0.829, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0483, loss_cls: 0.2045, acc: 92.5552, loss_bbox: 0.2618, loss_mask: 0.2587, loss: 0.7994 2024-05-28 14:20:50,798 - mmdet - INFO - Epoch [4][600/7330] lr: 1.000e-04, eta: 14:31:23, time: 0.819, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0491, loss_cls: 0.2015, acc: 92.7690, loss_bbox: 0.2573, loss_mask: 0.2619, loss: 0.7959 2024-05-28 14:21:34,147 - mmdet - INFO - Epoch [4][650/7330] lr: 1.000e-04, eta: 14:30:53, time: 0.867, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0482, loss_cls: 0.2057, acc: 92.6802, loss_bbox: 0.2583, loss_mask: 0.2603, loss: 0.8000 2024-05-28 14:22:15,521 - mmdet - INFO - Epoch [4][700/7330] lr: 1.000e-04, eta: 14:30:17, time: 0.827, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0492, loss_cls: 0.2126, acc: 92.4531, loss_bbox: 0.2648, loss_mask: 0.2624, loss: 0.8143 2024-05-28 14:22:54,281 - mmdet - INFO - Epoch [4][750/7330] lr: 1.000e-04, eta: 14:29:33, time: 0.775, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0467, loss_cls: 0.2008, acc: 92.7988, loss_bbox: 0.2585, loss_mask: 0.2559, loss: 0.7859 2024-05-28 14:23:33,254 - mmdet - INFO - Epoch [4][800/7330] lr: 1.000e-04, eta: 14:28:50, time: 0.779, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0515, loss_cls: 0.2175, acc: 92.2249, loss_bbox: 0.2702, loss_mask: 0.2611, loss: 0.8286 2024-05-28 14:24:14,420 - mmdet - INFO - Epoch [4][850/7330] lr: 1.000e-04, eta: 14:28:13, time: 0.823, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0466, loss_cls: 0.2095, acc: 92.5752, loss_bbox: 0.2618, loss_mask: 0.2624, loss: 0.8066 2024-05-28 14:24:52,856 - mmdet - INFO - Epoch [4][900/7330] lr: 1.000e-04, eta: 14:27:29, time: 0.769, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0427, loss_cls: 0.2001, acc: 92.8867, loss_bbox: 0.2542, loss_mask: 0.2609, loss: 0.7803 2024-05-28 14:25:31,283 - mmdet - INFO - Epoch [4][950/7330] lr: 1.000e-04, eta: 14:26:45, time: 0.769, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0445, loss_cls: 0.2064, acc: 92.7397, loss_bbox: 0.2574, loss_mask: 0.2570, loss: 0.7891 2024-05-28 14:26:10,315 - mmdet - INFO - Epoch [4][1000/7330] lr: 1.000e-04, eta: 14:26:02, time: 0.781, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0463, loss_cls: 0.2126, acc: 92.5422, loss_bbox: 0.2624, loss_mask: 0.2641, loss: 0.8118 2024-05-28 14:26:52,250 - mmdet - INFO - Epoch [4][1050/7330] lr: 1.000e-04, eta: 14:25:27, time: 0.839, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0465, loss_cls: 0.2011, acc: 92.7954, loss_bbox: 0.2550, loss_mask: 0.2572, loss: 0.7848 2024-05-28 14:27:36,212 - mmdet - INFO - Epoch [4][1100/7330] lr: 
1.000e-04, eta: 14:24:58, time: 0.879, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0504, loss_cls: 0.2219, acc: 92.0564, loss_bbox: 0.2780, loss_mask: 0.2677, loss: 0.8474 2024-05-28 14:28:15,259 - mmdet - INFO - Epoch [4][1150/7330] lr: 1.000e-04, eta: 14:24:16, time: 0.781, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0491, loss_cls: 0.2071, acc: 92.5029, loss_bbox: 0.2646, loss_mask: 0.2568, loss: 0.8037 2024-05-28 14:28:58,557 - mmdet - INFO - Epoch [4][1200/7330] lr: 1.000e-04, eta: 14:23:45, time: 0.866, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0450, loss_cls: 0.1994, acc: 92.8867, loss_bbox: 0.2464, loss_mask: 0.2531, loss: 0.7697 2024-05-28 14:29:39,526 - mmdet - INFO - Epoch [4][1250/7330] lr: 1.000e-04, eta: 14:23:08, time: 0.819, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0457, loss_cls: 0.2065, acc: 92.6926, loss_bbox: 0.2617, loss_mask: 0.2585, loss: 0.7984 2024-05-28 14:30:20,603 - mmdet - INFO - Epoch [4][1300/7330] lr: 1.000e-04, eta: 14:22:31, time: 0.821, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0506, loss_cls: 0.2277, acc: 91.9307, loss_bbox: 0.2828, loss_mask: 0.2714, loss: 0.8609 2024-05-28 14:30:59,115 - mmdet - INFO - Epoch [4][1350/7330] lr: 1.000e-04, eta: 14:21:46, time: 0.770, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0455, loss_cls: 0.2065, acc: 92.7344, loss_bbox: 0.2561, loss_mask: 0.2562, loss: 0.7890 2024-05-28 14:31:37,794 - mmdet - INFO - Epoch [4][1400/7330] lr: 1.000e-04, eta: 14:21:03, time: 0.774, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0488, loss_cls: 0.2028, acc: 92.6689, loss_bbox: 0.2592, loss_mask: 0.2580, loss: 0.7957 2024-05-28 14:32:16,775 - mmdet - INFO - Epoch [4][1450/7330] lr: 1.000e-04, eta: 14:20:20, time: 0.780, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0480, loss_cls: 0.2068, acc: 92.6270, loss_bbox: 0.2653, loss_mask: 0.2604, loss: 0.8057 2024-05-28 14:32:59,613 - mmdet - INFO - Epoch [4][1500/7330] lr: 1.000e-04, eta: 14:19:48, time: 0.857, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0476, loss_cls: 0.2062, acc: 92.6492, loss_bbox: 0.2587, loss_mask: 0.2589, loss: 0.7980 2024-05-28 14:33:38,330 - mmdet - INFO - Epoch [4][1550/7330] lr: 1.000e-04, eta: 14:19:04, time: 0.774, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0452, loss_cls: 0.2085, acc: 92.6177, loss_bbox: 0.2622, loss_mask: 0.2549, loss: 0.7941 2024-05-28 14:34:19,872 - mmdet - INFO - Epoch [4][1600/7330] lr: 1.000e-04, eta: 14:18:28, time: 0.831, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0500, loss_cls: 0.2146, acc: 92.4031, loss_bbox: 0.2674, loss_mask: 0.2620, loss: 0.8229 2024-05-28 14:34:58,820 - mmdet - INFO - Epoch [4][1650/7330] lr: 1.000e-04, eta: 14:17:45, time: 0.779, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0503, loss_cls: 0.2125, acc: 92.4463, loss_bbox: 0.2666, loss_mask: 0.2585, loss: 0.8137 2024-05-28 14:35:44,240 - mmdet - INFO - Epoch [4][1700/7330] lr: 1.000e-04, eta: 14:17:20, time: 0.908, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0507, loss_cls: 0.2094, acc: 92.3911, loss_bbox: 0.2710, loss_mask: 0.2621, loss: 0.8208 2024-05-28 14:36:23,075 - mmdet - INFO - Epoch [4][1750/7330] lr: 1.000e-04, eta: 14:16:37, time: 0.777, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0245, 
loss_rpn_bbox: 0.0471, loss_cls: 0.2050, acc: 92.5918, loss_bbox: 0.2637, loss_mask: 0.2580, loss: 0.7983 2024-05-28 14:37:04,049 - mmdet - INFO - Epoch [4][1800/7330] lr: 1.000e-04, eta: 14:15:59, time: 0.820, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0500, loss_cls: 0.2211, acc: 92.2605, loss_bbox: 0.2746, loss_mask: 0.2629, loss: 0.8357 2024-05-28 14:37:46,902 - mmdet - INFO - Epoch [4][1850/7330] lr: 1.000e-04, eta: 14:15:27, time: 0.857, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0483, loss_cls: 0.2032, acc: 92.7715, loss_bbox: 0.2565, loss_mask: 0.2580, loss: 0.7920 2024-05-28 14:38:25,709 - mmdet - INFO - Epoch [4][1900/7330] lr: 1.000e-04, eta: 14:14:44, time: 0.776, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0430, loss_cls: 0.1977, acc: 92.8694, loss_bbox: 0.2504, loss_mask: 0.2464, loss: 0.7618 2024-05-28 14:39:04,407 - mmdet - INFO - Epoch [4][1950/7330] lr: 1.000e-04, eta: 14:14:00, time: 0.774, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0498, loss_cls: 0.2112, acc: 92.5605, loss_bbox: 0.2613, loss_mask: 0.2609, loss: 0.8110 2024-05-28 14:39:43,503 - mmdet - INFO - Epoch [4][2000/7330] lr: 1.000e-04, eta: 14:13:18, time: 0.782, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0462, loss_cls: 0.2091, acc: 92.5237, loss_bbox: 0.2605, loss_mask: 0.2541, loss: 0.7937 2024-05-28 14:40:22,232 - mmdet - INFO - Epoch [4][2050/7330] lr: 1.000e-04, eta: 14:12:34, time: 0.775, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0476, loss_cls: 0.2151, acc: 92.4287, loss_bbox: 0.2683, loss_mask: 0.2561, loss: 0.8107 2024-05-28 14:41:06,690 - mmdet - INFO - Epoch [4][2100/7330] lr: 1.000e-04, eta: 14:12:06, time: 0.889, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0464, loss_cls: 0.2073, acc: 92.6699, loss_bbox: 0.2584, loss_mask: 0.2555, loss: 0.7926 2024-05-28 14:41:45,424 - mmdet - INFO - Epoch [4][2150/7330] lr: 1.000e-04, eta: 14:11:23, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0476, loss_cls: 0.2104, acc: 92.4780, loss_bbox: 0.2682, loss_mask: 0.2605, loss: 0.8117 2024-05-28 14:42:24,362 - mmdet - INFO - Epoch [4][2200/7330] lr: 1.000e-04, eta: 14:10:40, time: 0.779, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0493, loss_cls: 0.2088, acc: 92.6157, loss_bbox: 0.2616, loss_mask: 0.2627, loss: 0.8091 2024-05-28 14:43:05,445 - mmdet - INFO - Epoch [4][2250/7330] lr: 1.000e-04, eta: 14:10:02, time: 0.822, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0479, loss_cls: 0.2129, acc: 92.3892, loss_bbox: 0.2675, loss_mask: 0.2658, loss: 0.8197 2024-05-28 14:43:48,993 - mmdet - INFO - Epoch [4][2300/7330] lr: 1.000e-04, eta: 14:09:32, time: 0.871, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0496, loss_cls: 0.2265, acc: 91.8442, loss_bbox: 0.2806, loss_mask: 0.2640, loss: 0.8470 2024-05-28 14:44:30,034 - mmdet - INFO - Epoch [4][2350/7330] lr: 1.000e-04, eta: 14:08:54, time: 0.821, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0460, loss_cls: 0.2012, acc: 92.8679, loss_bbox: 0.2480, loss_mask: 0.2509, loss: 0.7696 2024-05-28 14:45:08,959 - mmdet - INFO - Epoch [4][2400/7330] lr: 1.000e-04, eta: 14:08:11, time: 0.778, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0478, loss_cls: 0.2177, acc: 92.3381, loss_bbox: 0.2683, loss_mask: 0.2538, 
loss: 0.8133 2024-05-28 14:45:52,651 - mmdet - INFO - Epoch [4][2450/7330] lr: 1.000e-04, eta: 14:07:41, time: 0.874, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0507, loss_cls: 0.2173, acc: 92.2620, loss_bbox: 0.2655, loss_mask: 0.2585, loss: 0.8194 2024-05-28 14:46:31,214 - mmdet - INFO - Epoch [4][2500/7330] lr: 1.000e-04, eta: 14:06:57, time: 0.771, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0476, loss_cls: 0.2110, acc: 92.5923, loss_bbox: 0.2565, loss_mask: 0.2616, loss: 0.8022 2024-05-28 14:47:09,590 - mmdet - INFO - Epoch [4][2550/7330] lr: 1.000e-04, eta: 14:06:13, time: 0.768, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0464, loss_cls: 0.1995, acc: 92.9172, loss_bbox: 0.2502, loss_mask: 0.2537, loss: 0.7749 2024-05-28 14:47:48,194 - mmdet - INFO - Epoch [4][2600/7330] lr: 1.000e-04, eta: 14:05:29, time: 0.772, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0455, loss_cls: 0.2039, acc: 92.6855, loss_bbox: 0.2556, loss_mask: 0.2559, loss: 0.7845 2024-05-28 14:48:30,079 - mmdet - INFO - Epoch [4][2650/7330] lr: 1.000e-04, eta: 14:04:54, time: 0.838, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0468, loss_cls: 0.2076, acc: 92.6147, loss_bbox: 0.2573, loss_mask: 0.2538, loss: 0.7915 2024-05-28 14:49:10,994 - mmdet - INFO - Epoch [4][2700/7330] lr: 1.000e-04, eta: 14:04:16, time: 0.818, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0471, loss_cls: 0.2009, acc: 92.9507, loss_bbox: 0.2537, loss_mask: 0.2585, loss: 0.7852 2024-05-28 14:49:49,360 - mmdet - INFO - Epoch [4][2750/7330] lr: 1.000e-04, eta: 14:03:32, time: 0.767, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0463, loss_cls: 0.2049, acc: 92.6228, loss_bbox: 0.2639, loss_mask: 0.2593, loss: 0.7985 2024-05-28 14:50:28,187 - mmdet - INFO - Epoch [4][2800/7330] lr: 1.000e-04, eta: 14:02:49, time: 0.777, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0492, loss_cls: 0.2132, acc: 92.3567, loss_bbox: 0.2657, loss_mask: 0.2648, loss: 0.8195 2024-05-28 14:51:11,498 - mmdet - INFO - Epoch [4][2850/7330] lr: 1.000e-04, eta: 14:02:17, time: 0.866, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0485, loss_cls: 0.2177, acc: 92.2202, loss_bbox: 0.2745, loss_mask: 0.2630, loss: 0.8302 2024-05-28 14:51:52,497 - mmdet - INFO - Epoch [4][2900/7330] lr: 1.000e-04, eta: 14:01:39, time: 0.820, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0434, loss_cls: 0.1988, acc: 92.9304, loss_bbox: 0.2493, loss_mask: 0.2456, loss: 0.7596 2024-05-28 14:52:33,736 - mmdet - INFO - Epoch [4][2950/7330] lr: 1.000e-04, eta: 14:01:02, time: 0.825, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0489, loss_cls: 0.2080, acc: 92.6523, loss_bbox: 0.2595, loss_mask: 0.2603, loss: 0.8016 2024-05-28 14:53:12,762 - mmdet - INFO - Epoch [4][3000/7330] lr: 1.000e-04, eta: 14:00:20, time: 0.781, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0476, loss_cls: 0.2016, acc: 92.8206, loss_bbox: 0.2590, loss_mask: 0.2540, loss: 0.7858 2024-05-28 14:53:53,802 - mmdet - INFO - Epoch [4][3050/7330] lr: 1.000e-04, eta: 13:59:42, time: 0.821, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0468, loss_cls: 0.2087, acc: 92.4275, loss_bbox: 0.2675, loss_mask: 0.2568, loss: 0.8056 2024-05-28 14:54:35,041 - mmdet - INFO - Epoch [4][3100/7330] lr: 1.000e-04, eta: 
13:59:05, time: 0.825, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0506, loss_cls: 0.2115, acc: 92.3875, loss_bbox: 0.2662, loss_mask: 0.2600, loss: 0.8136 2024-05-28 14:55:16,037 - mmdet - INFO - Epoch [4][3150/7330] lr: 1.000e-04, eta: 13:58:28, time: 0.820, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0457, loss_cls: 0.2018, acc: 92.8191, loss_bbox: 0.2564, loss_mask: 0.2562, loss: 0.7831 2024-05-28 14:55:54,562 - mmdet - INFO - Epoch [4][3200/7330] lr: 1.000e-04, eta: 13:57:44, time: 0.770, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0481, loss_cls: 0.2139, acc: 92.4133, loss_bbox: 0.2673, loss_mask: 0.2611, loss: 0.8154 2024-05-28 14:56:36,002 - mmdet - INFO - Epoch [4][3250/7330] lr: 1.000e-04, eta: 13:57:07, time: 0.829, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0484, loss_cls: 0.2132, acc: 92.5098, loss_bbox: 0.2632, loss_mask: 0.2570, loss: 0.8082 2024-05-28 14:57:15,016 - mmdet - INFO - Epoch [4][3300/7330] lr: 1.000e-04, eta: 13:56:25, time: 0.780, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0490, loss_cls: 0.2095, acc: 92.4792, loss_bbox: 0.2611, loss_mask: 0.2618, loss: 0.8102 2024-05-28 14:57:53,597 - mmdet - INFO - Epoch [4][3350/7330] lr: 1.000e-04, eta: 13:55:41, time: 0.772, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0478, loss_cls: 0.2095, acc: 92.5068, loss_bbox: 0.2612, loss_mask: 0.2603, loss: 0.8063 2024-05-28 14:58:34,488 - mmdet - INFO - Epoch [4][3400/7330] lr: 1.000e-04, eta: 13:55:03, time: 0.818, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0472, loss_cls: 0.2070, acc: 92.6365, loss_bbox: 0.2530, loss_mask: 0.2524, loss: 0.7841 2024-05-28 14:59:20,127 - mmdet - INFO - Epoch [4][3450/7330] lr: 1.000e-04, eta: 13:54:37, time: 0.913, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0455, loss_cls: 0.2042, acc: 92.6492, loss_bbox: 0.2617, loss_mask: 0.2610, loss: 0.7971 2024-05-28 14:59:58,862 - mmdet - INFO - Epoch [4][3500/7330] lr: 1.000e-04, eta: 13:53:54, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0491, loss_cls: 0.2059, acc: 92.6436, loss_bbox: 0.2635, loss_mask: 0.2616, loss: 0.8056 2024-05-28 15:00:39,765 - mmdet - INFO - Epoch [4][3550/7330] lr: 1.000e-04, eta: 13:53:16, time: 0.818, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0454, loss_cls: 0.2005, acc: 92.8904, loss_bbox: 0.2554, loss_mask: 0.2531, loss: 0.7795 2024-05-28 15:01:20,975 - mmdet - INFO - Epoch [4][3600/7330] lr: 1.000e-04, eta: 13:52:38, time: 0.824, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0453, loss_cls: 0.2038, acc: 92.6584, loss_bbox: 0.2588, loss_mask: 0.2596, loss: 0.7929 2024-05-28 15:01:59,990 - mmdet - INFO - Epoch [4][3650/7330] lr: 1.000e-04, eta: 13:51:56, time: 0.780, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0492, loss_cls: 0.2083, acc: 92.5537, loss_bbox: 0.2640, loss_mask: 0.2615, loss: 0.8103 2024-05-28 15:02:38,813 - mmdet - INFO - Epoch [4][3700/7330] lr: 1.000e-04, eta: 13:51:13, time: 0.776, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0449, loss_cls: 0.2011, acc: 92.7102, loss_bbox: 0.2547, loss_mask: 0.2573, loss: 0.7816 2024-05-28 15:03:22,713 - mmdet - INFO - Epoch [4][3750/7330] lr: 1.000e-04, eta: 13:50:42, time: 0.878, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0265, loss_rpn_bbox: 
0.0455, loss_cls: 0.2074, acc: 92.5999, loss_bbox: 0.2642, loss_mask: 0.2628, loss: 0.8064 2024-05-28 15:04:01,390 - mmdet - INFO - Epoch [4][3800/7330] lr: 1.000e-04, eta: 13:49:59, time: 0.774, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0498, loss_cls: 0.2214, acc: 92.1030, loss_bbox: 0.2729, loss_mask: 0.2669, loss: 0.8372 2024-05-28 15:04:40,257 - mmdet - INFO - Epoch [4][3850/7330] lr: 1.000e-04, eta: 13:49:16, time: 0.777, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0479, loss_cls: 0.2107, acc: 92.5500, loss_bbox: 0.2587, loss_mask: 0.2597, loss: 0.8025 2024-05-28 15:05:18,805 - mmdet - INFO - Epoch [4][3900/7330] lr: 1.000e-04, eta: 13:48:32, time: 0.771, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0474, loss_cls: 0.2075, acc: 92.7234, loss_bbox: 0.2512, loss_mask: 0.2491, loss: 0.7811 2024-05-28 15:05:59,269 - mmdet - INFO - Epoch [4][3950/7330] lr: 1.000e-04, eta: 13:47:53, time: 0.809, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0435, loss_cls: 0.2009, acc: 92.9028, loss_bbox: 0.2511, loss_mask: 0.2594, loss: 0.7771 2024-05-28 15:06:40,105 - mmdet - INFO - Epoch [4][4000/7330] lr: 1.000e-04, eta: 13:47:15, time: 0.817, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0489, loss_cls: 0.2146, acc: 92.3865, loss_bbox: 0.2677, loss_mask: 0.2603, loss: 0.8158 2024-05-28 15:07:23,010 - mmdet - INFO - Epoch [4][4050/7330] lr: 1.000e-04, eta: 13:46:42, time: 0.858, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0447, loss_cls: 0.2041, acc: 92.6653, loss_bbox: 0.2563, loss_mask: 0.2636, loss: 0.7944 2024-05-28 15:08:03,440 - mmdet - INFO - Epoch [4][4100/7330] lr: 1.000e-04, eta: 13:46:02, time: 0.809, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0471, loss_cls: 0.2056, acc: 92.5925, loss_bbox: 0.2601, loss_mask: 0.2584, loss: 0.7967 2024-05-28 15:08:42,324 - mmdet - INFO - Epoch [4][4150/7330] lr: 1.000e-04, eta: 13:45:20, time: 0.778, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0475, loss_cls: 0.2102, acc: 92.4409, loss_bbox: 0.2661, loss_mask: 0.2566, loss: 0.8066 2024-05-28 15:09:22,845 - mmdet - INFO - Epoch [4][4200/7330] lr: 1.000e-04, eta: 13:44:41, time: 0.810, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0480, loss_cls: 0.2021, acc: 92.7280, loss_bbox: 0.2625, loss_mask: 0.2538, loss: 0.7917 2024-05-28 15:10:01,188 - mmdet - INFO - Epoch [4][4250/7330] lr: 1.000e-04, eta: 13:43:57, time: 0.767, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0469, loss_cls: 0.2019, acc: 92.6702, loss_bbox: 0.2568, loss_mask: 0.2549, loss: 0.7845 2024-05-28 15:10:41,995 - mmdet - INFO - Epoch [4][4300/7330] lr: 1.000e-04, eta: 13:43:18, time: 0.816, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0457, loss_cls: 0.2005, acc: 92.9373, loss_bbox: 0.2536, loss_mask: 0.2498, loss: 0.7742 2024-05-28 15:11:23,108 - mmdet - INFO - Epoch [4][4350/7330] lr: 1.000e-04, eta: 13:42:41, time: 0.822, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0459, loss_cls: 0.2048, acc: 92.6643, loss_bbox: 0.2601, loss_mask: 0.2565, loss: 0.7930 2024-05-28 15:12:01,751 - mmdet - INFO - Epoch [4][4400/7330] lr: 1.000e-04, eta: 13:41:57, time: 0.773, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0447, loss_cls: 0.2019, acc: 92.8735, loss_bbox: 0.2553, loss_mask: 0.2557, loss: 0.7853 
2024-05-28 15:12:40,441 - mmdet - INFO - Epoch [4][4450/7330] lr: 1.000e-04, eta: 13:41:14, time: 0.774, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0496, loss_cls: 0.2066, acc: 92.5847, loss_bbox: 0.2587, loss_mask: 0.2606, loss: 0.8022 2024-05-28 15:13:21,015 - mmdet - INFO - Epoch [4][4500/7330] lr: 1.000e-04, eta: 13:40:35, time: 0.812, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0483, loss_cls: 0.2099, acc: 92.5210, loss_bbox: 0.2608, loss_mask: 0.2602, loss: 0.8063 2024-05-28 15:13:59,708 - mmdet - INFO - Epoch [4][4550/7330] lr: 1.000e-04, eta: 13:39:52, time: 0.774, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0481, loss_cls: 0.2059, acc: 92.6414, loss_bbox: 0.2594, loss_mask: 0.2625, loss: 0.8013 2024-05-28 15:14:42,435 - mmdet - INFO - Epoch [4][4600/7330] lr: 1.000e-04, eta: 13:39:18, time: 0.855, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0469, loss_cls: 0.2018, acc: 92.9507, loss_bbox: 0.2531, loss_mask: 0.2531, loss: 0.7794 2024-05-28 15:15:23,433 - mmdet - INFO - Epoch [4][4650/7330] lr: 1.000e-04, eta: 13:38:40, time: 0.820, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0454, loss_cls: 0.2072, acc: 92.6619, loss_bbox: 0.2606, loss_mask: 0.2549, loss: 0.7940 2024-05-28 15:16:02,545 - mmdet - INFO - Epoch [4][4700/7330] lr: 1.000e-04, eta: 13:37:58, time: 0.782, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0472, loss_cls: 0.2165, acc: 92.2883, loss_bbox: 0.2662, loss_mask: 0.2571, loss: 0.8136 2024-05-28 15:16:43,603 - mmdet - INFO - Epoch [4][4750/7330] lr: 1.000e-04, eta: 13:37:21, time: 0.821, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0453, loss_cls: 0.1929, acc: 93.1594, loss_bbox: 0.2448, loss_mask: 0.2513, loss: 0.7593 2024-05-28 15:17:24,316 - mmdet - INFO - Epoch [4][4800/7330] lr: 1.000e-04, eta: 13:36:42, time: 0.814, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0482, loss_cls: 0.2146, acc: 92.2617, loss_bbox: 0.2649, loss_mask: 0.2555, loss: 0.8103 2024-05-28 15:18:03,028 - mmdet - INFO - Epoch [4][4850/7330] lr: 1.000e-04, eta: 13:35:59, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0446, loss_cls: 0.2013, acc: 92.8379, loss_bbox: 0.2549, loss_mask: 0.2518, loss: 0.7770 2024-05-28 15:18:43,945 - mmdet - INFO - Epoch [4][4900/7330] lr: 1.000e-04, eta: 13:35:21, time: 0.818, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0447, loss_cls: 0.2013, acc: 92.7595, loss_bbox: 0.2547, loss_mask: 0.2594, loss: 0.7837 2024-05-28 15:19:24,869 - mmdet - INFO - Epoch [4][4950/7330] lr: 1.000e-04, eta: 13:34:43, time: 0.818, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0264, loss_rpn_bbox: 0.0467, loss_cls: 0.2070, acc: 92.5022, loss_bbox: 0.2612, loss_mask: 0.2566, loss: 0.7979 2024-05-28 15:20:05,981 - mmdet - INFO - Epoch [4][5000/7330] lr: 1.000e-04, eta: 13:34:05, time: 0.822, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0283, loss_rpn_bbox: 0.0516, loss_cls: 0.2172, acc: 92.2058, loss_bbox: 0.2704, loss_mask: 0.2607, loss: 0.8282 2024-05-28 15:20:44,790 - mmdet - INFO - Epoch [4][5050/7330] lr: 1.000e-04, eta: 13:33:22, time: 0.776, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0493, loss_cls: 0.2105, acc: 92.3921, loss_bbox: 0.2663, loss_mask: 0.2556, loss: 0.8073 2024-05-28 15:21:25,557 - mmdet - INFO - Epoch [4][5100/7330] lr: 1.000e-04, eta: 13:32:44, 
time: 0.815, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0446, loss_cls: 0.1991, acc: 92.9414, loss_bbox: 0.2499, loss_mask: 0.2491, loss: 0.7665 2024-05-28 15:22:04,081 - mmdet - INFO - Epoch [4][5150/7330] lr: 1.000e-04, eta: 13:32:00, time: 0.770, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0473, loss_cls: 0.2088, acc: 92.5073, loss_bbox: 0.2624, loss_mask: 0.2586, loss: 0.8016 2024-05-28 15:22:47,542 - mmdet - INFO - Epoch [4][5200/7330] lr: 1.000e-04, eta: 13:31:28, time: 0.869, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0464, loss_cls: 0.2065, acc: 92.4294, loss_bbox: 0.2640, loss_mask: 0.2561, loss: 0.7969 2024-05-28 15:23:26,262 - mmdet - INFO - Epoch [4][5250/7330] lr: 1.000e-04, eta: 13:30:45, time: 0.774, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0461, loss_cls: 0.2092, acc: 92.4834, loss_bbox: 0.2656, loss_mask: 0.2638, loss: 0.8113 2024-05-28 15:24:05,214 - mmdet - INFO - Epoch [4][5300/7330] lr: 1.000e-04, eta: 13:30:02, time: 0.779, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0519, loss_cls: 0.2300, acc: 91.9741, loss_bbox: 0.2766, loss_mask: 0.2626, loss: 0.8498 2024-05-28 15:24:46,369 - mmdet - INFO - Epoch [4][5350/7330] lr: 1.000e-04, eta: 13:29:25, time: 0.823, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0466, loss_cls: 0.2034, acc: 92.6399, loss_bbox: 0.2597, loss_mask: 0.2513, loss: 0.7857 2024-05-28 15:25:27,414 - mmdet - INFO - Epoch [4][5400/7330] lr: 1.000e-04, eta: 13:28:47, time: 0.821, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0461, loss_cls: 0.2080, acc: 92.6777, loss_bbox: 0.2538, loss_mask: 0.2532, loss: 0.7874 2024-05-28 15:26:06,167 - mmdet - INFO - Epoch [4][5450/7330] lr: 1.000e-04, eta: 13:28:04, time: 0.775, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0471, loss_cls: 0.2063, acc: 92.6514, loss_bbox: 0.2603, loss_mask: 0.2594, loss: 0.8007 2024-05-28 15:26:49,069 - mmdet - INFO - Epoch [4][5500/7330] lr: 1.000e-04, eta: 13:27:30, time: 0.858, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0463, loss_cls: 0.2027, acc: 92.7439, loss_bbox: 0.2542, loss_mask: 0.2570, loss: 0.7872 2024-05-28 15:27:30,124 - mmdet - INFO - Epoch [4][5550/7330] lr: 1.000e-04, eta: 13:26:52, time: 0.821, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0474, loss_cls: 0.2086, acc: 92.5273, loss_bbox: 0.2666, loss_mask: 0.2571, loss: 0.8046 2024-05-28 15:28:08,860 - mmdet - INFO - Epoch [4][5600/7330] lr: 1.000e-04, eta: 13:26:09, time: 0.775, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0455, loss_cls: 0.2062, acc: 92.5254, loss_bbox: 0.2630, loss_mask: 0.2588, loss: 0.7987 2024-05-28 15:28:49,747 - mmdet - INFO - Epoch [4][5650/7330] lr: 1.000e-04, eta: 13:25:31, time: 0.818, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0477, loss_cls: 0.2076, acc: 92.5337, loss_bbox: 0.2588, loss_mask: 0.2533, loss: 0.7936 2024-05-28 15:29:31,070 - mmdet - INFO - Epoch [4][5700/7330] lr: 1.000e-04, eta: 13:24:54, time: 0.826, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0448, loss_cls: 0.2035, acc: 92.7449, loss_bbox: 0.2557, loss_mask: 0.2562, loss: 0.7842 2024-05-28 15:30:09,638 - mmdet - INFO - Epoch [4][5750/7330] lr: 1.000e-04, eta: 13:24:10, time: 0.771, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0451, 
loss_cls: 0.2043, acc: 92.6021, loss_bbox: 0.2665, loss_mask: 0.2552, loss: 0.7959 2024-05-28 15:30:50,075 - mmdet - INFO - Epoch [4][5800/7330] lr: 1.000e-04, eta: 13:23:31, time: 0.809, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0484, loss_cls: 0.1993, acc: 92.8296, loss_bbox: 0.2532, loss_mask: 0.2512, loss: 0.7768 2024-05-28 15:31:31,209 - mmdet - INFO - Epoch [4][5850/7330] lr: 1.000e-04, eta: 13:22:53, time: 0.823, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0270, loss_rpn_bbox: 0.0472, loss_cls: 0.2137, acc: 92.4521, loss_bbox: 0.2646, loss_mask: 0.2517, loss: 0.8042 2024-05-28 15:32:09,811 - mmdet - INFO - Epoch [4][5900/7330] lr: 1.000e-04, eta: 13:22:10, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0455, loss_cls: 0.2060, acc: 92.7810, loss_bbox: 0.2503, loss_mask: 0.2543, loss: 0.7828 2024-05-28 15:32:48,700 - mmdet - INFO - Epoch [4][5950/7330] lr: 1.000e-04, eta: 13:21:27, time: 0.778, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0459, loss_cls: 0.2097, acc: 92.4590, loss_bbox: 0.2663, loss_mask: 0.2585, loss: 0.8046 2024-05-28 15:33:31,950 - mmdet - INFO - Epoch [4][6000/7330] lr: 1.000e-04, eta: 13:20:54, time: 0.864, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0459, loss_cls: 0.2138, acc: 92.3030, loss_bbox: 0.2661, loss_mask: 0.2527, loss: 0.8028 2024-05-28 15:34:10,704 - mmdet - INFO - Epoch [4][6050/7330] lr: 1.000e-04, eta: 13:20:11, time: 0.776, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0455, loss_cls: 0.1995, acc: 92.9043, loss_bbox: 0.2479, loss_mask: 0.2512, loss: 0.7689 2024-05-28 15:34:51,626 - mmdet - INFO - Epoch [4][6100/7330] lr: 1.000e-04, eta: 13:19:33, time: 0.818, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0462, loss_cls: 0.2114, acc: 92.5649, loss_bbox: 0.2582, loss_mask: 0.2567, loss: 0.7991 2024-05-28 15:35:32,739 - mmdet - INFO - Epoch [4][6150/7330] lr: 1.000e-04, eta: 13:18:55, time: 0.822, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0456, loss_cls: 0.2018, acc: 92.8174, loss_bbox: 0.2525, loss_mask: 0.2567, loss: 0.7812 2024-05-28 15:36:13,991 - mmdet - INFO - Epoch [4][6200/7330] lr: 1.000e-04, eta: 13:18:18, time: 0.825, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0469, loss_cls: 0.2091, acc: 92.4878, loss_bbox: 0.2585, loss_mask: 0.2507, loss: 0.7931 2024-05-28 15:36:54,564 - mmdet - INFO - Epoch [4][6250/7330] lr: 1.000e-04, eta: 13:17:39, time: 0.812, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0468, loss_cls: 0.2042, acc: 92.8032, loss_bbox: 0.2520, loss_mask: 0.2575, loss: 0.7854 2024-05-28 15:37:33,277 - mmdet - INFO - Epoch [4][6300/7330] lr: 1.000e-04, eta: 13:16:56, time: 0.774, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0458, loss_cls: 0.1935, acc: 93.0657, loss_bbox: 0.2464, loss_mask: 0.2479, loss: 0.7574 2024-05-28 15:38:14,333 - mmdet - INFO - Epoch [4][6350/7330] lr: 1.000e-04, eta: 13:16:18, time: 0.821, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0470, loss_cls: 0.2065, acc: 92.5776, loss_bbox: 0.2590, loss_mask: 0.2574, loss: 0.7955 2024-05-28 15:38:52,808 - mmdet - INFO - Epoch [4][6400/7330] lr: 1.000e-04, eta: 13:15:34, time: 0.769, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0453, loss_cls: 0.2022, acc: 92.7651, loss_bbox: 0.2544, loss_mask: 0.2552, loss: 0.7820 2024-05-28 
15:39:31,661 - mmdet - INFO - Epoch [4][6450/7330] lr: 1.000e-04, eta: 13:14:52, time: 0.777, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0486, loss_cls: 0.2240, acc: 92.0068, loss_bbox: 0.2764, loss_mask: 0.2660, loss: 0.8406 2024-05-28 15:40:12,912 - mmdet - INFO - Epoch [4][6500/7330] lr: 1.000e-04, eta: 13:14:14, time: 0.825, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0426, loss_cls: 0.2034, acc: 92.6855, loss_bbox: 0.2542, loss_mask: 0.2555, loss: 0.7788 2024-05-28 15:40:55,687 - mmdet - INFO - Epoch [4][6550/7330] lr: 1.000e-04, eta: 13:13:40, time: 0.855, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0439, loss_cls: 0.1934, acc: 93.1306, loss_bbox: 0.2370, loss_mask: 0.2522, loss: 0.7503 2024-05-28 15:41:34,115 - mmdet - INFO - Epoch [4][6600/7330] lr: 1.000e-04, eta: 13:12:56, time: 0.769, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0429, loss_cls: 0.2014, acc: 92.8250, loss_bbox: 0.2516, loss_mask: 0.2504, loss: 0.7686 2024-05-28 15:42:15,400 - mmdet - INFO - Epoch [4][6650/7330] lr: 1.000e-04, eta: 13:12:19, time: 0.826, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0449, loss_cls: 0.2071, acc: 92.5908, loss_bbox: 0.2599, loss_mask: 0.2559, loss: 0.7918 2024-05-28 15:42:54,044 - mmdet - INFO - Epoch [4][6700/7330] lr: 1.000e-04, eta: 13:11:36, time: 0.773, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0475, loss_cls: 0.2044, acc: 92.7109, loss_bbox: 0.2567, loss_mask: 0.2578, loss: 0.7907 2024-05-28 15:43:35,158 - mmdet - INFO - Epoch [4][6750/7330] lr: 1.000e-04, eta: 13:10:58, time: 0.822, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0475, loss_cls: 0.2026, acc: 92.7668, loss_bbox: 0.2516, loss_mask: 0.2528, loss: 0.7792 2024-05-28 15:44:13,892 - mmdet - INFO - Epoch [4][6800/7330] lr: 1.000e-04, eta: 13:10:15, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0265, loss_rpn_bbox: 0.0501, loss_cls: 0.2141, acc: 92.4268, loss_bbox: 0.2649, loss_mask: 0.2649, loss: 0.8204 2024-05-28 15:45:00,053 - mmdet - INFO - Epoch [4][6850/7330] lr: 1.000e-04, eta: 13:09:47, time: 0.923, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0456, loss_cls: 0.2019, acc: 92.7205, loss_bbox: 0.2528, loss_mask: 0.2550, loss: 0.7809 2024-05-28 15:45:39,001 - mmdet - INFO - Epoch [4][6900/7330] lr: 1.000e-04, eta: 13:09:05, time: 0.779, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0465, loss_cls: 0.1987, acc: 92.9109, loss_bbox: 0.2458, loss_mask: 0.2481, loss: 0.7626 2024-05-28 15:46:17,624 - mmdet - INFO - Epoch [4][6950/7330] lr: 1.000e-04, eta: 13:08:22, time: 0.773, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0478, loss_cls: 0.2077, acc: 92.5623, loss_bbox: 0.2637, loss_mask: 0.2588, loss: 0.8028 2024-05-28 15:46:56,396 - mmdet - INFO - Epoch [4][7000/7330] lr: 1.000e-04, eta: 13:07:39, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0450, loss_cls: 0.2006, acc: 92.8755, loss_bbox: 0.2480, loss_mask: 0.2491, loss: 0.7669 2024-05-28 15:47:38,003 - mmdet - INFO - Epoch [4][7050/7330] lr: 1.000e-04, eta: 13:07:02, time: 0.833, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0460, loss_cls: 0.2087, acc: 92.5706, loss_bbox: 0.2621, loss_mask: 0.2547, loss: 0.7978 2024-05-28 15:48:18,847 - mmdet - INFO - Epoch [4][7100/7330] lr: 1.000e-04, eta: 13:06:24, time: 0.817, 
data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0472, loss_cls: 0.2030, acc: 92.8247, loss_bbox: 0.2522, loss_mask: 0.2532, loss: 0.7824 2024-05-28 15:48:59,886 - mmdet - INFO - Epoch [4][7150/7330] lr: 1.000e-04, eta: 13:05:46, time: 0.821, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0464, loss_cls: 0.2030, acc: 92.6951, loss_bbox: 0.2622, loss_mask: 0.2578, loss: 0.7937 2024-05-28 15:49:38,165 - mmdet - INFO - Epoch [4][7200/7330] lr: 1.000e-04, eta: 13:05:02, time: 0.766, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0464, loss_cls: 0.2017, acc: 92.7957, loss_bbox: 0.2540, loss_mask: 0.2551, loss: 0.7819 2024-05-28 15:50:18,471 - mmdet - INFO - Epoch [4][7250/7330] lr: 1.000e-04, eta: 13:04:22, time: 0.806, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0444, loss_cls: 0.2032, acc: 92.7913, loss_bbox: 0.2533, loss_mask: 0.2542, loss: 0.7797 2024-05-28 15:50:59,147 - mmdet - INFO - Epoch [4][7300/7330] lr: 1.000e-04, eta: 13:03:44, time: 0.814, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0445, loss_cls: 0.2022, acc: 92.8281, loss_bbox: 0.2531, loss_mask: 0.2494, loss: 0.7724
2024-05-28 15:51:23,328 - mmdet - INFO - Saving checkpoint at 4 epochs
2024-05-28 15:53:51,162 - mmdet - INFO - Evaluating bbox...
2024-05-28 15:54:19,239 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.412
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.662
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.447
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.234
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.450
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.585
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.536
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.332
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.586
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.703
2024-05-28 15:54:19,240 - mmdet - INFO - Evaluating segm...
2024-05-28 15:54:46,753 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.377
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.623
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.394
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.159
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.407
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.605
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.488
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.488
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.488
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.264
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.541
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.689
2024-05-28 15:54:47,151 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 15:54:47,153 - mmdet - INFO - Epoch(val) [4][625] bbox_mAP: 0.4120, bbox_mAP_50: 0.6620, bbox_mAP_75: 0.4470, bbox_mAP_s: 0.2340, bbox_mAP_m: 0.4500, bbox_mAP_l: 0.5850, bbox_mAP_copypaste: 0.412 0.662 0.447 0.234 0.450 0.585, segm_mAP: 0.3770, segm_mAP_50: 0.6230, segm_mAP_75: 0.3940, segm_mAP_s: 0.1590, segm_mAP_m: 0.4070, segm_mAP_l: 0.6050, segm_mAP_copypaste: 0.377 0.623 0.394 0.159 0.407 0.605
2024-05-28 15:55:36,327 - mmdet - INFO - Epoch [5][50/7330] lr: 1.000e-04, eta: 13:02:10, time: 0.983, data_time: 0.104, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0438, loss_cls: 0.1970, acc: 93.0095, loss_bbox: 0.2457, loss_mask: 0.2492, loss: 0.7576 2024-05-28 15:56:17,486 - mmdet - INFO - Epoch [5][100/7330] lr: 1.000e-04, eta: 13:01:32, time: 0.823, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0458, loss_cls: 0.1988, acc: 92.8750, loss_bbox: 0.2557, loss_mask: 0.2476, loss: 0.7714 2024-05-28 15:56:55,900 - mmdet - INFO - Epoch [5][150/7330] lr: 1.000e-04, eta: 13:00:48, time: 0.768, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0454, loss_cls: 0.1891, acc: 93.1016, loss_bbox: 0.2420, loss_mask: 0.2455, loss: 0.7445 2024-05-28 15:57:34,206 - mmdet - INFO - Epoch [5][200/7330] lr: 1.000e-04, eta: 13:00:05, time: 0.766, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0432, loss_cls: 0.1931, acc: 92.8931, loss_bbox: 0.2513, loss_mask: 0.2521, loss: 0.7620 2024-05-28 15:58:17,725 - mmdet - INFO - Epoch [5][250/7330] lr: 1.000e-04, eta: 12:59:32, time: 0.870, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0462, loss_cls: 0.1987, acc: 92.8010, loss_bbox: 0.2546, loss_mask: 0.2505, loss: 0.7742 2024-05-28 15:59:01,073 - mmdet - INFO - Epoch [5][300/7330] lr: 1.000e-04, eta: 12:58:58, time: 0.867, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0426, loss_cls: 0.1898, acc: 93.0327, loss_bbox: 0.2471, loss_mask: 0.2457, loss: 0.7463 2024-05-28 15:59:40,315 - mmdet - INFO - Epoch [5][350/7330] lr: 1.000e-04, eta: 12:58:16, time: 0.785, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0445, loss_cls: 0.1979, acc: 92.8247, loss_bbox: 0.2522, loss_mask: 0.2504, loss: 0.7662 2024-05-28 16:00:19,066 - mmdet - INFO - Epoch [5][400/7330] lr: 1.000e-04, eta: 12:57:34, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0469, loss_cls: 0.1999, acc: 92.7512, loss_bbox: 0.2541,
loss_mask: 0.2556, loss: 0.7805 2024-05-28 16:00:57,864 - mmdet - INFO - Epoch [5][450/7330] lr: 1.000e-04, eta: 12:56:51, time: 0.776, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0475, loss_cls: 0.2061, acc: 92.6272, loss_bbox: 0.2650, loss_mask: 0.2568, loss: 0.7985 2024-05-28 16:01:38,764 - mmdet - INFO - Epoch [5][500/7330] lr: 1.000e-04, eta: 12:56:13, time: 0.818, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0460, loss_cls: 0.2016, acc: 92.7031, loss_bbox: 0.2576, loss_mask: 0.2528, loss: 0.7785 2024-05-28 16:02:19,633 - mmdet - INFO - Epoch [5][550/7330] lr: 1.000e-04, eta: 12:55:34, time: 0.817, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0463, loss_cls: 0.2038, acc: 92.6355, loss_bbox: 0.2550, loss_mask: 0.2499, loss: 0.7804 2024-05-28 16:02:58,005 - mmdet - INFO - Epoch [5][600/7330] lr: 1.000e-04, eta: 12:54:51, time: 0.767, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0431, loss_cls: 0.1862, acc: 93.1875, loss_bbox: 0.2385, loss_mask: 0.2433, loss: 0.7327 2024-05-28 16:03:41,629 - mmdet - INFO - Epoch [5][650/7330] lr: 1.000e-04, eta: 12:54:18, time: 0.872, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0435, loss_cls: 0.1901, acc: 93.0874, loss_bbox: 0.2451, loss_mask: 0.2458, loss: 0.7461 2024-05-28 16:04:20,501 - mmdet - INFO - Epoch [5][700/7330] lr: 1.000e-04, eta: 12:53:36, time: 0.778, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0469, loss_cls: 0.1937, acc: 92.8545, loss_bbox: 0.2491, loss_mask: 0.2486, loss: 0.7626 2024-05-28 16:05:01,234 - mmdet - INFO - Epoch [5][750/7330] lr: 1.000e-04, eta: 12:52:57, time: 0.815, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0428, loss_cls: 0.1860, acc: 93.0930, loss_bbox: 0.2444, loss_mask: 0.2463, loss: 0.7414 2024-05-28 16:05:39,804 - mmdet - INFO - Epoch [5][800/7330] lr: 1.000e-04, eta: 12:52:14, time: 0.771, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0440, loss_cls: 0.1925, acc: 92.9866, loss_bbox: 0.2459, loss_mask: 0.2481, loss: 0.7527 2024-05-28 16:06:23,785 - mmdet - INFO - Epoch [5][850/7330] lr: 1.000e-04, eta: 12:51:41, time: 0.880, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0455, loss_cls: 0.2024, acc: 92.7588, loss_bbox: 0.2550, loss_mask: 0.2559, loss: 0.7838 2024-05-28 16:07:04,634 - mmdet - INFO - Epoch [5][900/7330] lr: 1.000e-04, eta: 12:51:03, time: 0.817, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0450, loss_cls: 0.2023, acc: 92.6575, loss_bbox: 0.2553, loss_mask: 0.2477, loss: 0.7745 2024-05-28 16:07:43,422 - mmdet - INFO - Epoch [5][950/7330] lr: 1.000e-04, eta: 12:50:20, time: 0.776, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0461, loss_cls: 0.1984, acc: 92.6975, loss_bbox: 0.2556, loss_mask: 0.2553, loss: 0.7797 2024-05-28 16:08:22,178 - mmdet - INFO - Epoch [5][1000/7330] lr: 1.000e-04, eta: 12:49:38, time: 0.775, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0420, loss_cls: 0.1893, acc: 93.1199, loss_bbox: 0.2479, loss_mask: 0.2475, loss: 0.7509 2024-05-28 16:09:01,160 - mmdet - INFO - Epoch [5][1050/7330] lr: 1.000e-04, eta: 12:48:56, time: 0.780, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0453, loss_cls: 0.1989, acc: 92.9192, loss_bbox: 0.2496, loss_mask: 0.2488, loss: 0.7654 2024-05-28 16:09:40,316 - mmdet - INFO - Epoch [5][1100/7330] lr: 
1.000e-04, eta: 12:48:14, time: 0.783, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0465, loss_cls: 0.2009, acc: 92.7156, loss_bbox: 0.2593, loss_mask: 0.2489, loss: 0.7801 2024-05-28 16:10:23,710 - mmdet - INFO - Epoch [5][1150/7330] lr: 1.000e-04, eta: 12:47:40, time: 0.868, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0463, loss_cls: 0.2031, acc: 92.6245, loss_bbox: 0.2550, loss_mask: 0.2483, loss: 0.7766 2024-05-28 16:11:02,138 - mmdet - INFO - Epoch [5][1200/7330] lr: 1.000e-04, eta: 12:46:57, time: 0.768, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0434, loss_cls: 0.1948, acc: 92.8618, loss_bbox: 0.2569, loss_mask: 0.2481, loss: 0.7639 2024-05-28 16:11:46,238 - mmdet - INFO - Epoch [5][1250/7330] lr: 1.000e-04, eta: 12:46:25, time: 0.882, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0461, loss_cls: 0.1963, acc: 92.9614, loss_bbox: 0.2475, loss_mask: 0.2473, loss: 0.7597 2024-05-28 16:12:24,480 - mmdet - INFO - Epoch [5][1300/7330] lr: 1.000e-04, eta: 12:45:41, time: 0.765, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0435, loss_cls: 0.1899, acc: 93.1309, loss_bbox: 0.2444, loss_mask: 0.2498, loss: 0.7486 2024-05-28 16:13:04,916 - mmdet - INFO - Epoch [5][1350/7330] lr: 1.000e-04, eta: 12:45:02, time: 0.809, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0422, loss_cls: 0.1912, acc: 93.0247, loss_bbox: 0.2410, loss_mask: 0.2427, loss: 0.7389 2024-05-28 16:13:44,046 - mmdet - INFO - Epoch [5][1400/7330] lr: 1.000e-04, eta: 12:44:20, time: 0.783, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0452, loss_cls: 0.2011, acc: 92.7942, loss_bbox: 0.2537, loss_mask: 0.2492, loss: 0.7712 2024-05-28 16:14:25,421 - mmdet - INFO - Epoch [5][1450/7330] lr: 1.000e-04, eta: 12:43:42, time: 0.827, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0458, loss_cls: 0.1964, acc: 92.9519, loss_bbox: 0.2485, loss_mask: 0.2457, loss: 0.7598 2024-05-28 16:15:09,930 - mmdet - INFO - Epoch [5][1500/7330] lr: 1.000e-04, eta: 12:43:11, time: 0.890, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0466, loss_cls: 0.2098, acc: 92.4333, loss_bbox: 0.2591, loss_mask: 0.2563, loss: 0.7960 2024-05-28 16:15:48,524 - mmdet - INFO - Epoch [5][1550/7330] lr: 1.000e-04, eta: 12:42:28, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0446, loss_cls: 0.1974, acc: 92.8047, loss_bbox: 0.2507, loss_mask: 0.2483, loss: 0.7646 2024-05-28 16:16:27,740 - mmdet - INFO - Epoch [5][1600/7330] lr: 1.000e-04, eta: 12:41:46, time: 0.785, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0470, loss_cls: 0.2002, acc: 92.8667, loss_bbox: 0.2480, loss_mask: 0.2477, loss: 0.7673 2024-05-28 16:17:06,410 - mmdet - INFO - Epoch [5][1650/7330] lr: 1.000e-04, eta: 12:41:04, time: 0.773, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0441, loss_cls: 0.1928, acc: 93.0247, loss_bbox: 0.2433, loss_mask: 0.2509, loss: 0.7540 2024-05-28 16:17:45,140 - mmdet - INFO - Epoch [5][1700/7330] lr: 1.000e-04, eta: 12:40:21, time: 0.775, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0470, loss_cls: 0.2043, acc: 92.6326, loss_bbox: 0.2574, loss_mask: 0.2507, loss: 0.7843 2024-05-28 16:18:28,769 - mmdet - INFO - Epoch [5][1750/7330] lr: 1.000e-04, eta: 12:39:47, time: 0.873, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0225, 
loss_rpn_bbox: 0.0466, loss_cls: 0.2014, acc: 92.6509, loss_bbox: 0.2581, loss_mask: 0.2471, loss: 0.7757 2024-05-28 16:19:07,886 - mmdet - INFO - Epoch [5][1800/7330] lr: 1.000e-04, eta: 12:39:06, time: 0.782, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0461, loss_cls: 0.1967, acc: 92.7312, loss_bbox: 0.2501, loss_mask: 0.2469, loss: 0.7632 2024-05-28 16:19:46,504 - mmdet - INFO - Epoch [5][1850/7330] lr: 1.000e-04, eta: 12:38:23, time: 0.772, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0451, loss_cls: 0.1893, acc: 93.0193, loss_bbox: 0.2441, loss_mask: 0.2492, loss: 0.7488 2024-05-28 16:20:30,222 - mmdet - INFO - Epoch [5][1900/7330] lr: 1.000e-04, eta: 12:37:50, time: 0.874, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0445, loss_cls: 0.1878, acc: 93.1311, loss_bbox: 0.2434, loss_mask: 0.2493, loss: 0.7455 2024-05-28 16:21:08,860 - mmdet - INFO - Epoch [5][1950/7330] lr: 1.000e-04, eta: 12:37:07, time: 0.773, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0422, loss_cls: 0.1918, acc: 92.9873, loss_bbox: 0.2449, loss_mask: 0.2528, loss: 0.7527 2024-05-28 16:21:52,697 - mmdet - INFO - Epoch [5][2000/7330] lr: 1.000e-04, eta: 12:36:34, time: 0.877, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0459, loss_cls: 0.2048, acc: 92.6550, loss_bbox: 0.2535, loss_mask: 0.2513, loss: 0.7782 2024-05-28 16:22:31,587 - mmdet - INFO - Epoch [5][2050/7330] lr: 1.000e-04, eta: 12:35:51, time: 0.778, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0418, loss_cls: 0.1913, acc: 93.1853, loss_bbox: 0.2375, loss_mask: 0.2420, loss: 0.7336 2024-05-28 16:23:16,488 - mmdet - INFO - Epoch [5][2100/7330] lr: 1.000e-04, eta: 12:35:20, time: 0.898, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0460, loss_cls: 0.1928, acc: 93.0193, loss_bbox: 0.2504, loss_mask: 0.2483, loss: 0.7603 2024-05-28 16:23:55,299 - mmdet - INFO - Epoch [5][2150/7330] lr: 1.000e-04, eta: 12:34:38, time: 0.776, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0455, loss_cls: 0.1986, acc: 92.7395, loss_bbox: 0.2499, loss_mask: 0.2529, loss: 0.7695 2024-05-28 16:24:34,262 - mmdet - INFO - Epoch [5][2200/7330] lr: 1.000e-04, eta: 12:33:56, time: 0.779, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0454, loss_cls: 0.2004, acc: 92.6572, loss_bbox: 0.2594, loss_mask: 0.2558, loss: 0.7831 2024-05-28 16:25:13,001 - mmdet - INFO - Epoch [5][2250/7330] lr: 1.000e-04, eta: 12:33:13, time: 0.775, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0414, loss_cls: 0.1981, acc: 92.8547, loss_bbox: 0.2458, loss_mask: 0.2435, loss: 0.7503 2024-05-28 16:25:52,403 - mmdet - INFO - Epoch [5][2300/7330] lr: 1.000e-04, eta: 12:32:32, time: 0.788, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0450, loss_cls: 0.1955, acc: 92.9526, loss_bbox: 0.2471, loss_mask: 0.2500, loss: 0.7612 2024-05-28 16:26:35,560 - mmdet - INFO - Epoch [5][2350/7330] lr: 1.000e-04, eta: 12:31:57, time: 0.863, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0482, loss_cls: 0.2018, acc: 92.7720, loss_bbox: 0.2517, loss_mask: 0.2509, loss: 0.7776 2024-05-28 16:27:16,370 - mmdet - INFO - Epoch [5][2400/7330] lr: 1.000e-04, eta: 12:31:19, time: 0.816, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0467, loss_cls: 0.2053, acc: 92.6819, loss_bbox: 0.2527, loss_mask: 0.2493, 
loss: 0.7767 2024-05-28 16:27:55,347 - mmdet - INFO - Epoch [5][2450/7330] lr: 1.000e-04, eta: 12:30:37, time: 0.780, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0485, loss_cls: 0.2062, acc: 92.5244, loss_bbox: 0.2548, loss_mask: 0.2552, loss: 0.7898 2024-05-28 16:28:40,286 - mmdet - INFO - Epoch [5][2500/7330] lr: 1.000e-04, eta: 12:30:05, time: 0.899, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0465, loss_cls: 0.2007, acc: 92.6863, loss_bbox: 0.2553, loss_mask: 0.2509, loss: 0.7770 2024-05-28 16:29:19,457 - mmdet - INFO - Epoch [5][2550/7330] lr: 1.000e-04, eta: 12:29:23, time: 0.783, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0449, loss_cls: 0.1992, acc: 92.8440, loss_bbox: 0.2467, loss_mask: 0.2517, loss: 0.7675 2024-05-28 16:30:03,390 - mmdet - INFO - Epoch [5][2600/7330] lr: 1.000e-04, eta: 12:28:50, time: 0.878, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0429, loss_cls: 0.1909, acc: 93.0632, loss_bbox: 0.2453, loss_mask: 0.2483, loss: 0.7483 2024-05-28 16:30:46,962 - mmdet - INFO - Epoch [5][2650/7330] lr: 1.000e-04, eta: 12:28:16, time: 0.872, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0470, loss_cls: 0.2055, acc: 92.5020, loss_bbox: 0.2638, loss_mask: 0.2561, loss: 0.7982 2024-05-28 16:31:25,523 - mmdet - INFO - Epoch [5][2700/7330] lr: 1.000e-04, eta: 12:27:33, time: 0.771, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0444, loss_cls: 0.1880, acc: 93.2251, loss_bbox: 0.2392, loss_mask: 0.2421, loss: 0.7350 2024-05-28 16:32:04,292 - mmdet - INFO - Epoch [5][2750/7330] lr: 1.000e-04, eta: 12:26:51, time: 0.775, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0424, loss_cls: 0.1936, acc: 93.0493, loss_bbox: 0.2467, loss_mask: 0.2504, loss: 0.7546 2024-05-28 16:32:43,094 - mmdet - INFO - Epoch [5][2800/7330] lr: 1.000e-04, eta: 12:26:09, time: 0.776, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0428, loss_cls: 0.1909, acc: 93.0615, loss_bbox: 0.2447, loss_mask: 0.2463, loss: 0.7470 2024-05-28 16:33:21,863 - mmdet - INFO - Epoch [5][2850/7330] lr: 1.000e-04, eta: 12:25:26, time: 0.775, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0469, loss_cls: 0.1943, acc: 92.9417, loss_bbox: 0.2500, loss_mask: 0.2514, loss: 0.7677 2024-05-28 16:34:00,846 - mmdet - INFO - Epoch [5][2900/7330] lr: 1.000e-04, eta: 12:24:44, time: 0.780, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0469, loss_cls: 0.1983, acc: 92.7820, loss_bbox: 0.2552, loss_mask: 0.2507, loss: 0.7762 2024-05-28 16:34:39,751 - mmdet - INFO - Epoch [5][2950/7330] lr: 1.000e-04, eta: 12:24:02, time: 0.778, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0460, loss_cls: 0.1993, acc: 92.7463, loss_bbox: 0.2519, loss_mask: 0.2560, loss: 0.7764 2024-05-28 16:35:23,502 - mmdet - INFO - Epoch [5][3000/7330] lr: 1.000e-04, eta: 12:23:28, time: 0.875, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0464, loss_cls: 0.2024, acc: 92.6499, loss_bbox: 0.2566, loss_mask: 0.2505, loss: 0.7794 2024-05-28 16:36:02,063 - mmdet - INFO - Epoch [5][3050/7330] lr: 1.000e-04, eta: 12:22:46, time: 0.771, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0429, loss_cls: 0.1926, acc: 93.0508, loss_bbox: 0.2429, loss_mask: 0.2490, loss: 0.7489 2024-05-28 16:36:47,381 - mmdet - INFO - Epoch [5][3100/7330] lr: 1.000e-04, eta: 
12:22:14, time: 0.906, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0439, loss_cls: 0.1931, acc: 93.0781, loss_bbox: 0.2476, loss_mask: 0.2512, loss: 0.7585 2024-05-28 16:37:28,596 - mmdet - INFO - Epoch [5][3150/7330] lr: 1.000e-04, eta: 12:21:36, time: 0.825, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0461, loss_cls: 0.1973, acc: 92.8718, loss_bbox: 0.2484, loss_mask: 0.2449, loss: 0.7588 2024-05-28 16:38:10,229 - mmdet - INFO - Epoch [5][3200/7330] lr: 1.000e-04, eta: 12:20:59, time: 0.832, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0455, loss_cls: 0.2000, acc: 92.8064, loss_bbox: 0.2557, loss_mask: 0.2543, loss: 0.7781 2024-05-28 16:38:55,017 - mmdet - INFO - Epoch [5][3250/7330] lr: 1.000e-04, eta: 12:20:27, time: 0.896, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0434, loss_cls: 0.1977, acc: 92.8906, loss_bbox: 0.2505, loss_mask: 0.2466, loss: 0.7615 2024-05-28 16:39:33,561 - mmdet - INFO - Epoch [5][3300/7330] lr: 1.000e-04, eta: 12:19:44, time: 0.771, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0447, loss_cls: 0.1871, acc: 93.1948, loss_bbox: 0.2486, loss_mask: 0.2475, loss: 0.7494 2024-05-28 16:40:12,467 - mmdet - INFO - Epoch [5][3350/7330] lr: 1.000e-04, eta: 12:19:02, time: 0.778, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0441, loss_cls: 0.1948, acc: 92.9312, loss_bbox: 0.2512, loss_mask: 0.2477, loss: 0.7616 2024-05-28 16:40:51,248 - mmdet - INFO - Epoch [5][3400/7330] lr: 1.000e-04, eta: 12:18:19, time: 0.776, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0448, loss_cls: 0.1931, acc: 93.0771, loss_bbox: 0.2423, loss_mask: 0.2454, loss: 0.7499 2024-05-28 16:41:29,943 - mmdet - INFO - Epoch [5][3450/7330] lr: 1.000e-04, eta: 12:17:37, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0471, loss_cls: 0.2009, acc: 92.7107, loss_bbox: 0.2609, loss_mask: 0.2544, loss: 0.7867 2024-05-28 16:42:08,443 - mmdet - INFO - Epoch [5][3500/7330] lr: 1.000e-04, eta: 12:16:54, time: 0.769, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0453, loss_cls: 0.1990, acc: 92.8132, loss_bbox: 0.2506, loss_mask: 0.2534, loss: 0.7734 2024-05-28 16:42:47,186 - mmdet - INFO - Epoch [5][3550/7330] lr: 1.000e-04, eta: 12:16:12, time: 0.776, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0450, loss_cls: 0.1986, acc: 92.7517, loss_bbox: 0.2548, loss_mask: 0.2570, loss: 0.7774 2024-05-28 16:43:31,033 - mmdet - INFO - Epoch [5][3600/7330] lr: 1.000e-04, eta: 12:15:38, time: 0.877, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0447, loss_cls: 0.1958, acc: 92.9902, loss_bbox: 0.2476, loss_mask: 0.2517, loss: 0.7626 2024-05-28 16:44:09,743 - mmdet - INFO - Epoch [5][3650/7330] lr: 1.000e-04, eta: 12:14:56, time: 0.774, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0467, loss_cls: 0.1988, acc: 92.7502, loss_bbox: 0.2528, loss_mask: 0.2473, loss: 0.7704 2024-05-28 16:44:50,655 - mmdet - INFO - Epoch [5][3700/7330] lr: 1.000e-04, eta: 12:14:17, time: 0.818, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0462, loss_cls: 0.1929, acc: 92.9780, loss_bbox: 0.2492, loss_mask: 0.2496, loss: 0.7611 2024-05-28 16:45:34,985 - mmdet - INFO - Epoch [5][3750/7330] lr: 1.000e-04, eta: 12:13:44, time: 0.887, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 
0.0425, loss_cls: 0.1995, acc: 92.7769, loss_bbox: 0.2529, loss_mask: 0.2560, loss: 0.7727 2024-05-28 16:46:13,527 - mmdet - INFO - Epoch [5][3800/7330] lr: 1.000e-04, eta: 12:13:01, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0458, loss_cls: 0.1960, acc: 92.7483, loss_bbox: 0.2519, loss_mask: 0.2488, loss: 0.7651 2024-05-28 16:46:56,859 - mmdet - INFO - Epoch [5][3850/7330] lr: 1.000e-04, eta: 12:12:26, time: 0.867, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0460, loss_cls: 0.1933, acc: 93.0364, loss_bbox: 0.2500, loss_mask: 0.2475, loss: 0.7595 2024-05-28 16:47:37,629 - mmdet - INFO - Epoch [5][3900/7330] lr: 1.000e-04, eta: 12:11:47, time: 0.815, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0480, loss_cls: 0.1976, acc: 92.7886, loss_bbox: 0.2519, loss_mask: 0.2572, loss: 0.7784 2024-05-28 16:48:16,438 - mmdet - INFO - Epoch [5][3950/7330] lr: 1.000e-04, eta: 12:11:05, time: 0.776, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0488, loss_cls: 0.2079, acc: 92.5066, loss_bbox: 0.2655, loss_mask: 0.2595, loss: 0.8073 2024-05-28 16:48:55,224 - mmdet - INFO - Epoch [5][4000/7330] lr: 1.000e-04, eta: 12:10:23, time: 0.776, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0466, loss_cls: 0.1938, acc: 92.9089, loss_bbox: 0.2480, loss_mask: 0.2470, loss: 0.7601 2024-05-28 16:49:33,772 - mmdet - INFO - Epoch [5][4050/7330] lr: 1.000e-04, eta: 12:09:40, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0437, loss_cls: 0.1866, acc: 93.2139, loss_bbox: 0.2416, loss_mask: 0.2482, loss: 0.7417 2024-05-28 16:50:12,791 - mmdet - INFO - Epoch [5][4100/7330] lr: 1.000e-04, eta: 12:08:58, time: 0.780, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0477, loss_cls: 0.2059, acc: 92.5730, loss_bbox: 0.2597, loss_mask: 0.2513, loss: 0.7905 2024-05-28 16:50:51,297 - mmdet - INFO - Epoch [5][4150/7330] lr: 1.000e-04, eta: 12:08:16, time: 0.770, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0462, loss_cls: 0.1998, acc: 92.6741, loss_bbox: 0.2573, loss_mask: 0.2498, loss: 0.7759 2024-05-28 16:51:32,548 - mmdet - INFO - Epoch [5][4200/7330] lr: 1.000e-04, eta: 12:07:37, time: 0.825, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0455, loss_cls: 0.1948, acc: 92.9495, loss_bbox: 0.2471, loss_mask: 0.2531, loss: 0.7631 2024-05-28 16:52:15,466 - mmdet - INFO - Epoch [5][4250/7330] lr: 1.000e-04, eta: 12:07:02, time: 0.858, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0410, loss_cls: 0.1903, acc: 93.1104, loss_bbox: 0.2384, loss_mask: 0.2362, loss: 0.7269 2024-05-28 16:52:54,092 - mmdet - INFO - Epoch [5][4300/7330] lr: 1.000e-04, eta: 12:06:19, time: 0.772, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0440, loss_cls: 0.2009, acc: 92.8232, loss_bbox: 0.2507, loss_mask: 0.2491, loss: 0.7698 2024-05-28 16:53:37,647 - mmdet - INFO - Epoch [5][4350/7330] lr: 1.000e-04, eta: 12:05:45, time: 0.871, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0438, loss_cls: 0.2020, acc: 92.9092, loss_bbox: 0.2499, loss_mask: 0.2504, loss: 0.7682 2024-05-28 16:54:16,404 - mmdet - INFO - Epoch [5][4400/7330] lr: 1.000e-04, eta: 12:05:03, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0443, loss_cls: 0.1968, acc: 92.8701, loss_bbox: 0.2489, loss_mask: 0.2486, loss: 0.7631 
2024-05-28 16:54:57,940 - mmdet - INFO - Epoch [5][4450/7330] lr: 1.000e-04, eta: 12:04:25, time: 0.831, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0407, loss_cls: 0.1929, acc: 93.0007, loss_bbox: 0.2422, loss_mask: 0.2472, loss: 0.7433 2024-05-28 16:55:41,012 - mmdet - INFO - Epoch [5][4500/7330] lr: 1.000e-04, eta: 12:03:50, time: 0.861, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0479, loss_cls: 0.1993, acc: 92.7339, loss_bbox: 0.2556, loss_mask: 0.2522, loss: 0.7823 2024-05-28 16:56:19,610 - mmdet - INFO - Epoch [5][4550/7330] lr: 1.000e-04, eta: 12:03:07, time: 0.772, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0440, loss_cls: 0.1976, acc: 92.7595, loss_bbox: 0.2512, loss_mask: 0.2453, loss: 0.7616 2024-05-28 16:56:58,559 - mmdet - INFO - Epoch [5][4600/7330] lr: 1.000e-04, eta: 12:02:25, time: 0.779, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0452, loss_cls: 0.2073, acc: 92.4814, loss_bbox: 0.2612, loss_mask: 0.2534, loss: 0.7906 2024-05-28 16:57:37,132 - mmdet - INFO - Epoch [5][4650/7330] lr: 1.000e-04, eta: 12:01:42, time: 0.771, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0441, loss_cls: 0.1956, acc: 92.9048, loss_bbox: 0.2473, loss_mask: 0.2462, loss: 0.7557 2024-05-28 16:58:15,788 - mmdet - INFO - Epoch [5][4700/7330] lr: 1.000e-04, eta: 12:01:00, time: 0.774, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0442, loss_cls: 0.1949, acc: 92.8623, loss_bbox: 0.2508, loss_mask: 0.2511, loss: 0.7658 2024-05-28 16:58:54,402 - mmdet - INFO - Epoch [5][4750/7330] lr: 1.000e-04, eta: 12:00:18, time: 0.772, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0455, loss_cls: 0.1955, acc: 92.8660, loss_bbox: 0.2483, loss_mask: 0.2472, loss: 0.7591 2024-05-28 16:59:35,658 - mmdet - INFO - Epoch [5][4800/7330] lr: 1.000e-04, eta: 11:59:39, time: 0.825, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0472, loss_cls: 0.2034, acc: 92.6843, loss_bbox: 0.2539, loss_mask: 0.2494, loss: 0.7769 2024-05-28 17:00:18,910 - mmdet - INFO - Epoch [5][4850/7330] lr: 1.000e-04, eta: 11:59:04, time: 0.865, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0446, loss_cls: 0.1969, acc: 92.7712, loss_bbox: 0.2541, loss_mask: 0.2511, loss: 0.7690 2024-05-28 17:00:57,876 - mmdet - INFO - Epoch [5][4900/7330] lr: 1.000e-04, eta: 11:58:22, time: 0.779, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0459, loss_cls: 0.1961, acc: 92.8027, loss_bbox: 0.2538, loss_mask: 0.2451, loss: 0.7635 2024-05-28 17:01:38,814 - mmdet - INFO - Epoch [5][4950/7330] lr: 1.000e-04, eta: 11:57:44, time: 0.819, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0437, loss_cls: 0.1919, acc: 93.0466, loss_bbox: 0.2412, loss_mask: 0.2446, loss: 0.7446 2024-05-28 17:02:19,784 - mmdet - INFO - Epoch [5][5000/7330] lr: 1.000e-04, eta: 11:57:05, time: 0.819, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0449, loss_cls: 0.1931, acc: 92.8850, loss_bbox: 0.2477, loss_mask: 0.2435, loss: 0.7521 2024-05-28 17:02:58,251 - mmdet - INFO - Epoch [5][5050/7330] lr: 1.000e-04, eta: 11:56:22, time: 0.769, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0431, loss_cls: 0.1912, acc: 93.1074, loss_bbox: 0.2422, loss_mask: 0.2453, loss: 0.7434 2024-05-28 17:03:43,557 - mmdet - INFO - Epoch [5][5100/7330] lr: 1.000e-04, eta: 11:55:50, 
time: 0.906, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0429, loss_cls: 0.1909, acc: 93.0300, loss_bbox: 0.2411, loss_mask: 0.2384, loss: 0.7360 2024-05-28 17:04:22,166 - mmdet - INFO - Epoch [5][5150/7330] lr: 1.000e-04, eta: 11:55:08, time: 0.772, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0427, loss_cls: 0.1971, acc: 92.8748, loss_bbox: 0.2471, loss_mask: 0.2470, loss: 0.7567 2024-05-28 17:05:00,950 - mmdet - INFO - Epoch [5][5200/7330] lr: 1.000e-04, eta: 11:54:25, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0439, loss_cls: 0.2014, acc: 92.7107, loss_bbox: 0.2544, loss_mask: 0.2523, loss: 0.7753 2024-05-28 17:05:39,557 - mmdet - INFO - Epoch [5][5250/7330] lr: 1.000e-04, eta: 11:53:43, time: 0.773, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0434, loss_cls: 0.1917, acc: 93.0601, loss_bbox: 0.2443, loss_mask: 0.2454, loss: 0.7469 2024-05-28 17:06:17,948 - mmdet - INFO - Epoch [5][5300/7330] lr: 1.000e-04, eta: 11:53:00, time: 0.768, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0438, loss_cls: 0.1962, acc: 92.8755, loss_bbox: 0.2475, loss_mask: 0.2406, loss: 0.7513 2024-05-28 17:06:56,559 - mmdet - INFO - Epoch [5][5350/7330] lr: 1.000e-04, eta: 11:52:18, time: 0.772, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0465, loss_cls: 0.2068, acc: 92.5603, loss_bbox: 0.2583, loss_mask: 0.2498, loss: 0.7849 2024-05-28 17:07:37,938 - mmdet - INFO - Epoch [5][5400/7330] lr: 1.000e-04, eta: 11:51:40, time: 0.828, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0430, loss_cls: 0.1995, acc: 92.9297, loss_bbox: 0.2448, loss_mask: 0.2481, loss: 0.7590 2024-05-28 17:08:21,082 - mmdet - INFO - Epoch [5][5450/7330] lr: 1.000e-04, eta: 11:51:04, time: 0.863, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0246, loss_rpn_bbox: 0.0467, loss_cls: 0.1978, acc: 92.7136, loss_bbox: 0.2576, loss_mask: 0.2488, loss: 0.7756 2024-05-28 17:09:02,055 - mmdet - INFO - Epoch [5][5500/7330] lr: 1.000e-04, eta: 11:50:26, time: 0.820, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0424, loss_cls: 0.1916, acc: 93.0034, loss_bbox: 0.2418, loss_mask: 0.2510, loss: 0.7498 2024-05-28 17:09:40,735 - mmdet - INFO - Epoch [5][5550/7330] lr: 1.000e-04, eta: 11:49:43, time: 0.774, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0425, loss_cls: 0.1945, acc: 92.9587, loss_bbox: 0.2466, loss_mask: 0.2491, loss: 0.7561 2024-05-28 17:10:21,708 - mmdet - INFO - Epoch [5][5600/7330] lr: 1.000e-04, eta: 11:49:05, time: 0.819, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0467, loss_cls: 0.2005, acc: 92.7554, loss_bbox: 0.2514, loss_mask: 0.2488, loss: 0.7703 2024-05-28 17:11:02,538 - mmdet - INFO - Epoch [5][5650/7330] lr: 1.000e-04, eta: 11:48:26, time: 0.817, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0456, loss_cls: 0.1935, acc: 92.9390, loss_bbox: 0.2445, loss_mask: 0.2435, loss: 0.7508 2024-05-28 17:11:43,654 - mmdet - INFO - Epoch [5][5700/7330] lr: 1.000e-04, eta: 11:47:47, time: 0.822, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0461, loss_cls: 0.1991, acc: 92.7080, loss_bbox: 0.2524, loss_mask: 0.2535, loss: 0.7746 2024-05-28 17:12:24,773 - mmdet - INFO - Epoch [5][5750/7330] lr: 1.000e-04, eta: 11:47:08, time: 0.822, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0447, 
loss_cls: 0.1929, acc: 92.9048, loss_bbox: 0.2471, loss_mask: 0.2480, loss: 0.7538 2024-05-28 17:13:03,509 - mmdet - INFO - Epoch [5][5800/7330] lr: 1.000e-04, eta: 11:46:26, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0446, loss_cls: 0.2000, acc: 92.7092, loss_bbox: 0.2535, loss_mask: 0.2498, loss: 0.7697 2024-05-28 17:13:42,292 - mmdet - INFO - Epoch [5][5850/7330] lr: 1.000e-04, eta: 11:45:44, time: 0.776, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0434, loss_cls: 0.1842, acc: 93.2883, loss_bbox: 0.2414, loss_mask: 0.2477, loss: 0.7378 2024-05-28 17:14:20,782 - mmdet - INFO - Epoch [5][5900/7330] lr: 1.000e-04, eta: 11:45:02, time: 0.770, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0459, loss_cls: 0.1982, acc: 92.7603, loss_bbox: 0.2544, loss_mask: 0.2520, loss: 0.7748 2024-05-28 17:15:01,995 - mmdet - INFO - Epoch [5][5950/7330] lr: 1.000e-04, eta: 11:44:23, time: 0.824, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0460, loss_cls: 0.2008, acc: 92.7048, loss_bbox: 0.2575, loss_mask: 0.2536, loss: 0.7814 2024-05-28 17:15:40,514 - mmdet - INFO - Epoch [5][6000/7330] lr: 1.000e-04, eta: 11:43:41, time: 0.771, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0423, loss_cls: 0.1902, acc: 93.0356, loss_bbox: 0.2465, loss_mask: 0.2520, loss: 0.7525 2024-05-28 17:16:21,225 - mmdet - INFO - Epoch [5][6050/7330] lr: 1.000e-04, eta: 11:43:01, time: 0.814, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0438, loss_cls: 0.1987, acc: 92.8716, loss_bbox: 0.2500, loss_mask: 0.2471, loss: 0.7620 2024-05-28 17:17:04,334 - mmdet - INFO - Epoch [5][6100/7330] lr: 1.000e-04, eta: 11:42:26, time: 0.862, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0472, loss_cls: 0.1968, acc: 92.8069, loss_bbox: 0.2502, loss_mask: 0.2465, loss: 0.7636 2024-05-28 17:17:44,887 - mmdet - INFO - Epoch [5][6150/7330] lr: 1.000e-04, eta: 11:41:46, time: 0.811, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0435, loss_cls: 0.1847, acc: 93.4006, loss_bbox: 0.2346, loss_mask: 0.2436, loss: 0.7275 2024-05-28 17:18:25,607 - mmdet - INFO - Epoch [5][6200/7330] lr: 1.000e-04, eta: 11:41:07, time: 0.814, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0448, loss_cls: 0.1955, acc: 93.0400, loss_bbox: 0.2448, loss_mask: 0.2466, loss: 0.7537 2024-05-28 17:19:06,249 - mmdet - INFO - Epoch [5][6250/7330] lr: 1.000e-04, eta: 11:40:28, time: 0.813, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0477, loss_cls: 0.2022, acc: 92.7271, loss_bbox: 0.2554, loss_mask: 0.2501, loss: 0.7796 2024-05-28 17:19:47,730 - mmdet - INFO - Epoch [5][6300/7330] lr: 1.000e-04, eta: 11:39:50, time: 0.830, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0443, loss_cls: 0.1985, acc: 92.8315, loss_bbox: 0.2464, loss_mask: 0.2447, loss: 0.7561 2024-05-28 17:20:26,320 - mmdet - INFO - Epoch [5][6350/7330] lr: 1.000e-04, eta: 11:39:07, time: 0.772, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0457, loss_cls: 0.1947, acc: 92.9507, loss_bbox: 0.2484, loss_mask: 0.2430, loss: 0.7553 2024-05-28 17:21:05,161 - mmdet - INFO - Epoch [5][6400/7330] lr: 1.000e-04, eta: 11:38:25, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0420, loss_cls: 0.1879, acc: 93.1069, loss_bbox: 0.2420, loss_mask: 0.2391, loss: 0.7330 2024-05-28 
17:21:43,959 - mmdet - INFO - Epoch [5][6450/7330] lr: 1.000e-04, eta: 11:37:43, time: 0.776, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0452, loss_cls: 0.2022, acc: 92.6641, loss_bbox: 0.2560, loss_mask: 0.2516, loss: 0.7778 2024-05-28 17:22:22,449 - mmdet - INFO - Epoch [5][6500/7330] lr: 1.000e-04, eta: 11:37:01, time: 0.770, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0471, loss_cls: 0.2024, acc: 92.7004, loss_bbox: 0.2532, loss_mask: 0.2512, loss: 0.7783 2024-05-28 17:23:03,610 - mmdet - INFO - Epoch [5][6550/7330] lr: 1.000e-04, eta: 11:36:22, time: 0.823, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0425, loss_cls: 0.1955, acc: 92.9585, loss_bbox: 0.2477, loss_mask: 0.2468, loss: 0.7548 2024-05-28 17:23:44,242 - mmdet - INFO - Epoch [5][6600/7330] lr: 1.000e-04, eta: 11:35:43, time: 0.813, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0451, loss_cls: 0.1913, acc: 93.0591, loss_bbox: 0.2447, loss_mask: 0.2486, loss: 0.7541 2024-05-28 17:24:23,003 - mmdet - INFO - Epoch [5][6650/7330] lr: 1.000e-04, eta: 11:35:01, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0475, loss_cls: 0.2047, acc: 92.4658, loss_bbox: 0.2635, loss_mask: 0.2535, loss: 0.7923 2024-05-28 17:25:05,636 - mmdet - INFO - Epoch [5][6700/7330] lr: 1.000e-04, eta: 11:34:24, time: 0.853, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0433, loss_cls: 0.1960, acc: 92.9529, loss_bbox: 0.2443, loss_mask: 0.2474, loss: 0.7521 2024-05-28 17:25:47,130 - mmdet - INFO - Epoch [5][6750/7330] lr: 1.000e-04, eta: 11:33:46, time: 0.830, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0444, loss_cls: 0.2016, acc: 92.7576, loss_bbox: 0.2514, loss_mask: 0.2566, loss: 0.7784 2024-05-28 17:26:25,460 - mmdet - INFO - Epoch [5][6800/7330] lr: 1.000e-04, eta: 11:33:04, time: 0.767, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0422, loss_cls: 0.1969, acc: 92.9551, loss_bbox: 0.2457, loss_mask: 0.2465, loss: 0.7521 2024-05-28 17:27:11,689 - mmdet - INFO - Epoch [5][6850/7330] lr: 1.000e-04, eta: 11:32:32, time: 0.924, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0434, loss_cls: 0.1967, acc: 92.9341, loss_bbox: 0.2494, loss_mask: 0.2505, loss: 0.7617 2024-05-28 17:27:50,444 - mmdet - INFO - Epoch [5][6900/7330] lr: 1.000e-04, eta: 11:31:50, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0465, loss_cls: 0.2049, acc: 92.6570, loss_bbox: 0.2525, loss_mask: 0.2518, loss: 0.7806 2024-05-28 17:28:29,413 - mmdet - INFO - Epoch [5][6950/7330] lr: 1.000e-04, eta: 11:31:09, time: 0.779, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0410, loss_cls: 0.1917, acc: 92.9016, loss_bbox: 0.2481, loss_mask: 0.2431, loss: 0.7460 2024-05-28 17:29:08,106 - mmdet - INFO - Epoch [5][7000/7330] lr: 1.000e-04, eta: 11:30:26, time: 0.774, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0447, loss_cls: 0.1933, acc: 93.0188, loss_bbox: 0.2474, loss_mask: 0.2492, loss: 0.7552 2024-05-28 17:29:46,971 - mmdet - INFO - Epoch [5][7050/7330] lr: 1.000e-04, eta: 11:29:44, time: 0.777, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0434, loss_cls: 0.1944, acc: 92.7722, loss_bbox: 0.2474, loss_mask: 0.2468, loss: 0.7537 2024-05-28 17:30:30,187 - mmdet - INFO - Epoch [5][7100/7330] lr: 1.000e-04, eta: 11:29:09, time: 0.864, 
data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0455, loss_cls: 0.1943, acc: 92.8931, loss_bbox: 0.2449, loss_mask: 0.2558, loss: 0.7638
2024-05-28 17:31:08,715 - mmdet - INFO - Epoch [5][7150/7330] lr: 1.000e-04, eta: 11:28:26, time: 0.771, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0428, loss_cls: 0.1942, acc: 93.0300, loss_bbox: 0.2420, loss_mask: 0.2468, loss: 0.7496
2024-05-28 17:31:47,204 - mmdet - INFO - Epoch [5][7200/7330] lr: 1.000e-04, eta: 11:27:44, time: 0.770, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0455, loss_cls: 0.1908, acc: 93.0715, loss_bbox: 0.2416, loss_mask: 0.2482, loss: 0.7484
2024-05-28 17:32:25,579 - mmdet - INFO - Epoch [5][7250/7330] lr: 1.000e-04, eta: 11:27:01, time: 0.767, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0447, loss_cls: 0.2006, acc: 92.6890, loss_bbox: 0.2471, loss_mask: 0.2548, loss: 0.7690
2024-05-28 17:33:06,232 - mmdet - INFO - Epoch [5][7300/7330] lr: 1.000e-04, eta: 11:26:22, time: 0.813, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0460, loss_cls: 0.2065, acc: 92.5891, loss_bbox: 0.2579, loss_mask: 0.2469, loss: 0.7804
2024-05-28 17:33:34,794 - mmdet - INFO - Saving checkpoint at 5 epochs
2024-05-28 17:36:01,686 - mmdet - INFO - Evaluating bbox...
2024-05-28 17:36:29,197 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.429
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.669
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.469
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.254
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.464
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.604
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.553
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.553
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.553
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.352
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.594
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.727
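The twelve `Average Precision` / `Average Recall` lines above are the standard pycocotools summary, indexed by metric (AP/AR), IoU range, object area and `maxDets`. A block like this can be turned into a lookup table with a short helper such as the sketch below; the regex and function name are assumptions for illustration, not part of mmdet.

```python
import re

# One pycocotools summary line, e.g.
#   "Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.254"
SUMMARY_RE = re.compile(
    r"Average (?:Precision|Recall)\s+\((AP|AR)\) @\[ IoU=([\d.:]+)\s*\| "
    r"area=\s*(\w+) \| maxDets=\s*(\d+) \] = ([\d.]+)"
)

def parse_coco_summary(block):
    """Map (AP|AR, IoU range, area, maxDets) -> value for one summary block."""
    metrics = {}
    for kind, iou, area, max_dets, value in SUMMARY_RE.findall(block):
        metrics[(kind, iou, area, int(max_dets))] = float(value)
    return metrics
```

For the bbox block just above, `parse_coco_summary(block)[('AP', '0.50:0.95', 'all', 100)]` would return 0.429.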
2024-05-28 17:36:29,198 - mmdet - INFO - Evaluating segm...
2024-05-28 17:36:56,819 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.387
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.627
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.411
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.171
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.418
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.614
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.500
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.500
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.500
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.280
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.546
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.705
2024-05-28 17:36:57,279 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 17:36:57,282 - mmdet - INFO - Epoch(val) [5][625] bbox_mAP: 0.4290, bbox_mAP_50: 0.6690, bbox_mAP_75: 0.4690, bbox_mAP_s: 0.2540, bbox_mAP_m: 0.4640, bbox_mAP_l: 0.6040, bbox_mAP_copypaste: 0.429 0.669 0.469 0.254 0.464 0.604, segm_mAP: 0.3870, segm_mAP_50: 0.6270, segm_mAP_75: 0.4110, segm_mAP_s: 0.1710, segm_mAP_m: 0.4180, segm_mAP_l: 0.6140, segm_mAP_copypaste: 0.387 0.627 0.411 0.171 0.418 0.614
2024-05-28 17:37:45,411 - mmdet - INFO - Epoch [6][50/7330] lr: 1.000e-04, eta: 11:24:56, time: 0.962, data_time: 0.119, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0467, loss_cls: 0.1889, acc: 93.0647, loss_bbox: 0.2490, loss_mask: 0.2498, loss: 0.7569
2024-05-28 17:38:29,371 - mmdet - INFO - Epoch [6][100/7330] lr: 1.000e-04, eta: 11:24:21, time: 0.879, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0430, loss_cls: 0.1922, acc: 92.9521, loss_bbox: 0.2511, loss_mask: 0.2435, loss: 0.7525
2024-05-28 17:39:08,022 - mmdet - INFO - Epoch [6][150/7330] lr: 1.000e-04, eta: 11:23:39, time: 0.773, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0433, loss_cls: 0.1907, acc: 92.9272, loss_bbox: 0.2524, loss_mask: 0.2450, loss: 0.7505
2024-05-28 17:39:46,855 - mmdet - INFO - Epoch [6][200/7330] lr: 1.000e-04, eta: 11:22:57, time: 0.777, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0454, loss_cls: 0.1946, acc: 92.8738, loss_bbox: 0.2524, loss_mask: 0.2453, loss: 0.7587
2024-05-28 17:40:25,825 - mmdet - INFO - Epoch [6][250/7330] lr: 1.000e-04, eta: 11:22:15, time: 0.779, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0452, loss_cls: 0.1939, acc: 92.9358, loss_bbox: 0.2461, loss_mask: 0.2421, loss: 0.7481
2024-05-28 17:41:07,065 - mmdet - INFO - Epoch [6][300/7330] lr: 1.000e-04, eta: 11:21:37, time: 0.825, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0468, loss_cls: 0.1976, acc: 92.7195, loss_bbox: 0.2578, loss_mask: 0.2489, loss: 0.7736
2024-05-28 17:41:45,555 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 17:41:45,556 - mmdet - INFO - Epoch [6][350/7330] lr: 1.000e-04, eta: 11:20:54, time: 0.770, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0412, loss_cls: 0.1853, acc: 93.2036, loss_bbox: 0.2397, loss_mask: 0.2356, loss: 0.7215
2024-05-28 17:42:24,079 - mmdet - INFO - Epoch [6][400/7330] lr: 1.000e-04, eta: 11:20:12, time: 0.770, data_time: 0.035,
memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0407, loss_cls: 0.1745, acc: 93.5752, loss_bbox: 0.2319, loss_mask: 0.2317, loss: 0.6983 2024-05-28 17:43:04,915 - mmdet - INFO - Epoch [6][450/7330] lr: 1.000e-04, eta: 11:19:33, time: 0.817, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0418, loss_cls: 0.1848, acc: 93.1973, loss_bbox: 0.2380, loss_mask: 0.2416, loss: 0.7262 2024-05-28 17:43:43,426 - mmdet - INFO - Epoch [6][500/7330] lr: 1.000e-04, eta: 11:18:51, time: 0.770, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0418, loss_cls: 0.1814, acc: 93.3909, loss_bbox: 0.2262, loss_mask: 0.2380, loss: 0.7072 2024-05-28 17:44:25,633 - mmdet - INFO - Epoch [6][550/7330] lr: 1.000e-04, eta: 11:18:14, time: 0.844, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0441, loss_cls: 0.1897, acc: 93.0754, loss_bbox: 0.2468, loss_mask: 0.2438, loss: 0.7452 2024-05-28 17:45:04,188 - mmdet - INFO - Epoch [6][600/7330] lr: 1.000e-04, eta: 11:17:32, time: 0.771, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0421, loss_cls: 0.1834, acc: 93.2471, loss_bbox: 0.2404, loss_mask: 0.2477, loss: 0.7325 2024-05-28 17:45:42,852 - mmdet - INFO - Epoch [6][650/7330] lr: 1.000e-04, eta: 11:16:50, time: 0.773, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0463, loss_cls: 0.1957, acc: 92.8669, loss_bbox: 0.2480, loss_mask: 0.2471, loss: 0.7582 2024-05-28 17:46:27,240 - mmdet - INFO - Epoch [6][700/7330] lr: 1.000e-04, eta: 11:16:15, time: 0.888, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0409, loss_cls: 0.1838, acc: 93.3206, loss_bbox: 0.2388, loss_mask: 0.2434, loss: 0.7268 2024-05-28 17:47:07,922 - mmdet - INFO - Epoch [6][750/7330] lr: 1.000e-04, eta: 11:15:36, time: 0.814, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0431, loss_cls: 0.1917, acc: 93.0681, loss_bbox: 0.2460, loss_mask: 0.2457, loss: 0.7492 2024-05-28 17:47:49,035 - mmdet - INFO - Epoch [6][800/7330] lr: 1.000e-04, eta: 11:14:57, time: 0.822, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0444, loss_cls: 0.1862, acc: 93.1421, loss_bbox: 0.2390, loss_mask: 0.2404, loss: 0.7312 2024-05-28 17:48:30,002 - mmdet - INFO - Epoch [6][850/7330] lr: 1.000e-04, eta: 11:14:18, time: 0.819, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0436, loss_cls: 0.1870, acc: 93.1274, loss_bbox: 0.2458, loss_mask: 0.2401, loss: 0.7376 2024-05-28 17:49:08,501 - mmdet - INFO - Epoch [6][900/7330] lr: 1.000e-04, eta: 11:13:36, time: 0.770, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0456, loss_cls: 0.1900, acc: 93.0110, loss_bbox: 0.2465, loss_mask: 0.2471, loss: 0.7517 2024-05-28 17:49:51,525 - mmdet - INFO - Epoch [6][950/7330] lr: 1.000e-04, eta: 11:13:00, time: 0.860, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0414, loss_cls: 0.1735, acc: 93.5806, loss_bbox: 0.2325, loss_mask: 0.2354, loss: 0.7038 2024-05-28 17:50:30,197 - mmdet - INFO - Epoch [6][1000/7330] lr: 1.000e-04, eta: 11:12:18, time: 0.773, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0435, loss_cls: 0.1853, acc: 93.1392, loss_bbox: 0.2411, loss_mask: 0.2393, loss: 0.7304 2024-05-28 17:51:08,789 - mmdet - INFO - Epoch [6][1050/7330] lr: 1.000e-04, eta: 11:11:36, time: 0.772, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0464, loss_cls: 0.1919, acc: 92.8386, loss_bbox: 
0.2535, loss_mask: 0.2480, loss: 0.7612 2024-05-28 17:51:49,775 - mmdet - INFO - Epoch [6][1100/7330] lr: 1.000e-04, eta: 11:10:57, time: 0.820, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0441, loss_cls: 0.1895, acc: 92.9124, loss_bbox: 0.2492, loss_mask: 0.2509, loss: 0.7547 2024-05-28 17:52:28,492 - mmdet - INFO - Epoch [6][1150/7330] lr: 1.000e-04, eta: 11:10:15, time: 0.774, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0446, loss_cls: 0.1858, acc: 93.1931, loss_bbox: 0.2420, loss_mask: 0.2448, loss: 0.7375 2024-05-28 17:53:07,156 - mmdet - INFO - Epoch [6][1200/7330] lr: 1.000e-04, eta: 11:09:33, time: 0.773, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0420, loss_cls: 0.1826, acc: 93.3347, loss_bbox: 0.2354, loss_mask: 0.2407, loss: 0.7204 2024-05-28 17:53:50,121 - mmdet - INFO - Epoch [6][1250/7330] lr: 1.000e-04, eta: 11:08:57, time: 0.859, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0434, loss_cls: 0.1859, acc: 93.2031, loss_bbox: 0.2387, loss_mask: 0.2385, loss: 0.7257 2024-05-28 17:54:28,549 - mmdet - INFO - Epoch [6][1300/7330] lr: 1.000e-04, eta: 11:08:15, time: 0.769, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0435, loss_cls: 0.1862, acc: 93.1641, loss_bbox: 0.2362, loss_mask: 0.2352, loss: 0.7210 2024-05-28 17:55:10,131 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 17:55:10,131 - mmdet - INFO - Epoch [6][1350/7330] lr: 1.000e-04, eta: 11:07:37, time: 0.832, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0442, loss_cls: 0.1847, acc: 93.1260, loss_bbox: 0.2386, loss_mask: 0.2422, loss: 0.7330 2024-05-28 17:55:53,516 - mmdet - INFO - Epoch [6][1400/7330] lr: 1.000e-04, eta: 11:07:01, time: 0.868, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0445, loss_cls: 0.1908, acc: 92.9216, loss_bbox: 0.2499, loss_mask: 0.2496, loss: 0.7567 2024-05-28 17:56:34,805 - mmdet - INFO - Epoch [6][1450/7330] lr: 1.000e-04, eta: 11:06:22, time: 0.826, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0449, loss_cls: 0.1871, acc: 93.0859, loss_bbox: 0.2454, loss_mask: 0.2458, loss: 0.7446 2024-05-28 17:57:15,785 - mmdet - INFO - Epoch [6][1500/7330] lr: 1.000e-04, eta: 11:05:43, time: 0.819, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0432, loss_cls: 0.1878, acc: 93.1150, loss_bbox: 0.2448, loss_mask: 0.2423, loss: 0.7401 2024-05-28 17:57:54,285 - mmdet - INFO - Epoch [6][1550/7330] lr: 1.000e-04, eta: 11:05:01, time: 0.770, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0458, loss_cls: 0.2012, acc: 92.5388, loss_bbox: 0.2592, loss_mask: 0.2491, loss: 0.7773 2024-05-28 17:58:33,058 - mmdet - INFO - Epoch [6][1600/7330] lr: 1.000e-04, eta: 11:04:19, time: 0.775, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0443, loss_cls: 0.1976, acc: 92.7810, loss_bbox: 0.2485, loss_mask: 0.2498, loss: 0.7626 2024-05-28 17:59:11,608 - mmdet - INFO - Epoch [6][1650/7330] lr: 1.000e-04, eta: 11:03:37, time: 0.771, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0461, loss_cls: 0.1874, acc: 93.1028, loss_bbox: 0.2446, loss_mask: 0.2449, loss: 0.7430 2024-05-28 17:59:53,411 - mmdet - INFO - Epoch [6][1700/7330] lr: 1.000e-04, eta: 11:02:59, time: 0.836, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0481, loss_cls: 0.2047, acc: 92.5103, 
loss_bbox: 0.2571, loss_mask: 0.2511, loss: 0.7853 2024-05-28 18:00:31,907 - mmdet - INFO - Epoch [6][1750/7330] lr: 1.000e-04, eta: 11:02:17, time: 0.770, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0440, loss_cls: 0.1878, acc: 93.1223, loss_bbox: 0.2440, loss_mask: 0.2389, loss: 0.7346 2024-05-28 18:01:10,674 - mmdet - INFO - Epoch [6][1800/7330] lr: 1.000e-04, eta: 11:01:35, time: 0.775, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0446, loss_cls: 0.1982, acc: 92.5894, loss_bbox: 0.2552, loss_mask: 0.2478, loss: 0.7681 2024-05-28 18:01:53,183 - mmdet - INFO - Epoch [6][1850/7330] lr: 1.000e-04, eta: 11:00:58, time: 0.850, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0427, loss_cls: 0.1811, acc: 93.3455, loss_bbox: 0.2352, loss_mask: 0.2404, loss: 0.7194 2024-05-28 18:02:31,930 - mmdet - INFO - Epoch [6][1900/7330] lr: 1.000e-04, eta: 11:00:17, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0442, loss_cls: 0.1938, acc: 93.0095, loss_bbox: 0.2504, loss_mask: 0.2506, loss: 0.7604 2024-05-28 18:03:15,469 - mmdet - INFO - Epoch [6][1950/7330] lr: 1.000e-04, eta: 10:59:41, time: 0.871, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0433, loss_cls: 0.1927, acc: 93.0222, loss_bbox: 0.2466, loss_mask: 0.2497, loss: 0.7538 2024-05-28 18:04:01,233 - mmdet - INFO - Epoch [6][2000/7330] lr: 1.000e-04, eta: 10:59:08, time: 0.915, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0437, loss_cls: 0.1839, acc: 93.2844, loss_bbox: 0.2396, loss_mask: 0.2444, loss: 0.7329 2024-05-28 18:04:40,082 - mmdet - INFO - Epoch [6][2050/7330] lr: 1.000e-04, eta: 10:58:26, time: 0.777, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0445, loss_cls: 0.1937, acc: 92.8855, loss_bbox: 0.2511, loss_mask: 0.2507, loss: 0.7618 2024-05-28 18:05:18,672 - mmdet - INFO - Epoch [6][2100/7330] lr: 1.000e-04, eta: 10:57:44, time: 0.772, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0418, loss_cls: 0.1921, acc: 93.0481, loss_bbox: 0.2396, loss_mask: 0.2406, loss: 0.7344 2024-05-28 18:05:57,419 - mmdet - INFO - Epoch [6][2150/7330] lr: 1.000e-04, eta: 10:57:03, time: 0.775, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0429, loss_cls: 0.1886, acc: 93.0400, loss_bbox: 0.2452, loss_mask: 0.2384, loss: 0.7367 2024-05-28 18:06:36,102 - mmdet - INFO - Epoch [6][2200/7330] lr: 1.000e-04, eta: 10:56:21, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0432, loss_cls: 0.1900, acc: 92.9653, loss_bbox: 0.2447, loss_mask: 0.2399, loss: 0.7399 2024-05-28 18:07:14,654 - mmdet - INFO - Epoch [6][2250/7330] lr: 1.000e-04, eta: 10:55:39, time: 0.771, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0415, loss_cls: 0.1878, acc: 93.2266, loss_bbox: 0.2390, loss_mask: 0.2437, loss: 0.7314 2024-05-28 18:07:55,529 - mmdet - INFO - Epoch [6][2300/7330] lr: 1.000e-04, eta: 10:54:59, time: 0.817, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0432, loss_cls: 0.1838, acc: 93.2717, loss_bbox: 0.2350, loss_mask: 0.2422, loss: 0.7254 2024-05-28 18:08:33,818 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 18:08:33,818 - mmdet - INFO - Epoch [6][2350/7330] lr: 1.000e-04, eta: 10:54:17, time: 0.766, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0408, loss_cls: 0.1754, 
acc: 93.5474, loss_bbox: 0.2307, loss_mask: 0.2405, loss: 0.7061 2024-05-28 18:09:12,126 - mmdet - INFO - Epoch [6][2400/7330] lr: 1.000e-04, eta: 10:53:35, time: 0.766, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0435, loss_cls: 0.1921, acc: 93.0083, loss_bbox: 0.2453, loss_mask: 0.2395, loss: 0.7422 2024-05-28 18:09:52,851 - mmdet - INFO - Epoch [6][2450/7330] lr: 1.000e-04, eta: 10:52:56, time: 0.814, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0431, loss_cls: 0.1854, acc: 93.2043, loss_bbox: 0.2404, loss_mask: 0.2457, loss: 0.7348 2024-05-28 18:10:35,335 - mmdet - INFO - Epoch [6][2500/7330] lr: 1.000e-04, eta: 10:52:18, time: 0.850, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0428, loss_cls: 0.1840, acc: 93.2422, loss_bbox: 0.2338, loss_mask: 0.2383, loss: 0.7200 2024-05-28 18:11:18,292 - mmdet - INFO - Epoch [6][2550/7330] lr: 1.000e-04, eta: 10:51:42, time: 0.859, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0425, loss_cls: 0.1932, acc: 92.9102, loss_bbox: 0.2450, loss_mask: 0.2409, loss: 0.7421 2024-05-28 18:11:59,261 - mmdet - INFO - Epoch [6][2600/7330] lr: 1.000e-04, eta: 10:51:03, time: 0.819, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0399, loss_cls: 0.1845, acc: 93.2463, loss_bbox: 0.2328, loss_mask: 0.2365, loss: 0.7141 2024-05-28 18:12:40,509 - mmdet - INFO - Epoch [6][2650/7330] lr: 1.000e-04, eta: 10:50:24, time: 0.825, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0409, loss_cls: 0.1858, acc: 93.3032, loss_bbox: 0.2334, loss_mask: 0.2365, loss: 0.7163 2024-05-28 18:13:18,872 - mmdet - INFO - Epoch [6][2700/7330] lr: 1.000e-04, eta: 10:49:42, time: 0.767, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0470, loss_cls: 0.1914, acc: 92.9216, loss_bbox: 0.2474, loss_mask: 0.2513, loss: 0.7591 2024-05-28 18:13:57,646 - mmdet - INFO - Epoch [6][2750/7330] lr: 1.000e-04, eta: 10:49:00, time: 0.776, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0428, loss_cls: 0.1879, acc: 93.0979, loss_bbox: 0.2461, loss_mask: 0.2414, loss: 0.7383 2024-05-28 18:14:36,409 - mmdet - INFO - Epoch [6][2800/7330] lr: 1.000e-04, eta: 10:48:18, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0428, loss_cls: 0.1836, acc: 93.3252, loss_bbox: 0.2375, loss_mask: 0.2412, loss: 0.7265 2024-05-28 18:15:14,994 - mmdet - INFO - Epoch [6][2850/7330] lr: 1.000e-04, eta: 10:47:37, time: 0.772, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0449, loss_cls: 0.1889, acc: 93.1865, loss_bbox: 0.2443, loss_mask: 0.2421, loss: 0.7422 2024-05-28 18:15:53,425 - mmdet - INFO - Epoch [6][2900/7330] lr: 1.000e-04, eta: 10:46:54, time: 0.769, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0405, loss_cls: 0.1850, acc: 93.1980, loss_bbox: 0.2378, loss_mask: 0.2394, loss: 0.7217 2024-05-28 18:16:34,793 - mmdet - INFO - Epoch [6][2950/7330] lr: 1.000e-04, eta: 10:46:16, time: 0.827, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0429, loss_cls: 0.1837, acc: 93.3535, loss_bbox: 0.2391, loss_mask: 0.2382, loss: 0.7242 2024-05-28 18:17:15,625 - mmdet - INFO - Epoch [6][3000/7330] lr: 1.000e-04, eta: 10:45:37, time: 0.817, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0417, loss_cls: 0.1856, acc: 93.2930, loss_bbox: 0.2336, loss_mask: 0.2422, loss: 0.7232 2024-05-28 18:17:56,141 - mmdet 
- INFO - Epoch [6][3050/7330] lr: 1.000e-04, eta: 10:44:55, time: 0.768, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0448, loss_cls: 0.1881, acc: 93.1936, loss_bbox: 0.2477, loss_mask: 0.2461, loss: 0.7488 2024-05-28 18:18:41,421 - mmdet - INFO - Epoch [6][3100/7330] lr: 1.000e-04, eta: 10:44:23, time: 0.948, data_time: 0.077, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0421, loss_cls: 0.1867, acc: 93.2158, loss_bbox: 0.2393, loss_mask: 0.2484, loss: 0.7367 2024-05-28 18:19:23,909 - mmdet - INFO - Epoch [6][3150/7330] lr: 1.000e-04, eta: 10:43:46, time: 0.850, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0418, loss_cls: 0.1799, acc: 93.4358, loss_bbox: 0.2306, loss_mask: 0.2436, loss: 0.7168 2024-05-28 18:20:02,473 - mmdet - INFO - Epoch [6][3200/7330] lr: 1.000e-04, eta: 10:43:04, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0446, loss_cls: 0.1796, acc: 93.3918, loss_bbox: 0.2375, loss_mask: 0.2378, loss: 0.7205 2024-05-28 18:20:40,578 - mmdet - INFO - Epoch [6][3250/7330] lr: 1.000e-04, eta: 10:42:22, time: 0.762, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0419, loss_cls: 0.1859, acc: 93.1340, loss_bbox: 0.2397, loss_mask: 0.2396, loss: 0.7278 2024-05-28 18:21:19,345 - mmdet - INFO - Epoch [6][3300/7330] lr: 1.000e-04, eta: 10:41:40, time: 0.775, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0418, loss_cls: 0.1792, acc: 93.4697, loss_bbox: 0.2330, loss_mask: 0.2374, loss: 0.7123 2024-05-28 18:21:57,790 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 18:21:57,791 - mmdet - INFO - Epoch [6][3350/7330] lr: 1.000e-04, eta: 10:40:58, time: 0.769, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0438, loss_cls: 0.1947, acc: 92.9297, loss_bbox: 0.2476, loss_mask: 0.2460, loss: 0.7531 2024-05-28 18:22:36,072 - mmdet - INFO - Epoch [6][3400/7330] lr: 1.000e-04, eta: 10:40:16, time: 0.766, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0395, loss_cls: 0.1828, acc: 93.3865, loss_bbox: 0.2306, loss_mask: 0.2327, loss: 0.7050 2024-05-28 18:23:14,752 - mmdet - INFO - Epoch [6][3450/7330] lr: 1.000e-04, eta: 10:39:34, time: 0.773, data_time: 0.024, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0428, loss_cls: 0.1879, acc: 93.1863, loss_bbox: 0.2406, loss_mask: 0.2404, loss: 0.7331 2024-05-28 18:23:53,510 - mmdet - INFO - Epoch [6][3500/7330] lr: 1.000e-04, eta: 10:38:52, time: 0.776, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0439, loss_cls: 0.1865, acc: 93.2078, loss_bbox: 0.2391, loss_mask: 0.2409, loss: 0.7321 2024-05-28 18:24:34,419 - mmdet - INFO - Epoch [6][3550/7330] lr: 1.000e-04, eta: 10:38:13, time: 0.818, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0401, loss_cls: 0.1809, acc: 93.3660, loss_bbox: 0.2341, loss_mask: 0.2367, loss: 0.7111 2024-05-28 18:25:19,877 - mmdet - INFO - Epoch [6][3600/7330] lr: 1.000e-04, eta: 10:37:39, time: 0.909, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0444, loss_cls: 0.1904, acc: 93.0608, loss_bbox: 0.2457, loss_mask: 0.2391, loss: 0.7406 2024-05-28 18:26:01,054 - mmdet - INFO - Epoch [6][3650/7330] lr: 1.000e-04, eta: 10:37:00, time: 0.824, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0421, loss_cls: 0.1986, acc: 92.8269, loss_bbox: 0.2518, loss_mask: 0.2422, loss: 0.7550 2024-05-28 
18:26:44,204 - mmdet - INFO - Epoch [6][3700/7330] lr: 1.000e-04, eta: 10:36:24, time: 0.863, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0439, loss_cls: 0.1914, acc: 92.9929, loss_bbox: 0.2442, loss_mask: 0.2418, loss: 0.7430 2024-05-28 18:27:25,699 - mmdet - INFO - Epoch [6][3750/7330] lr: 1.000e-04, eta: 10:35:45, time: 0.830, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0430, loss_cls: 0.1820, acc: 93.3875, loss_bbox: 0.2331, loss_mask: 0.2379, loss: 0.7175 2024-05-28 18:28:04,242 - mmdet - INFO - Epoch [6][3800/7330] lr: 1.000e-04, eta: 10:35:04, time: 0.771, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0432, loss_cls: 0.1860, acc: 93.1277, loss_bbox: 0.2413, loss_mask: 0.2423, loss: 0.7368 2024-05-28 18:28:43,031 - mmdet - INFO - Epoch [6][3850/7330] lr: 1.000e-04, eta: 10:34:22, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0417, loss_cls: 0.1903, acc: 92.9509, loss_bbox: 0.2491, loss_mask: 0.2445, loss: 0.7461 2024-05-28 18:29:21,333 - mmdet - INFO - Epoch [6][3900/7330] lr: 1.000e-04, eta: 10:33:40, time: 0.766, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0413, loss_cls: 0.1806, acc: 93.3721, loss_bbox: 0.2311, loss_mask: 0.2292, loss: 0.7025 2024-05-28 18:30:00,332 - mmdet - INFO - Epoch [6][3950/7330] lr: 1.000e-04, eta: 10:32:58, time: 0.780, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0408, loss_cls: 0.1827, acc: 93.2212, loss_bbox: 0.2439, loss_mask: 0.2318, loss: 0.7180 2024-05-28 18:30:38,909 - mmdet - INFO - Epoch [6][4000/7330] lr: 1.000e-04, eta: 10:32:16, time: 0.772, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0452, loss_cls: 0.1925, acc: 92.9448, loss_bbox: 0.2447, loss_mask: 0.2452, loss: 0.7500 2024-05-28 18:31:17,335 - mmdet - INFO - Epoch [6][4050/7330] lr: 1.000e-04, eta: 10:31:34, time: 0.769, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0425, loss_cls: 0.1820, acc: 93.2693, loss_bbox: 0.2341, loss_mask: 0.2333, loss: 0.7135 2024-05-28 18:32:01,568 - mmdet - INFO - Epoch [6][4100/7330] lr: 1.000e-04, eta: 10:30:59, time: 0.885, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0426, loss_cls: 0.1854, acc: 93.1455, loss_bbox: 0.2429, loss_mask: 0.2461, loss: 0.7390 2024-05-28 18:32:46,935 - mmdet - INFO - Epoch [6][4150/7330] lr: 1.000e-04, eta: 10:30:25, time: 0.907, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0438, loss_cls: 0.1838, acc: 93.2671, loss_bbox: 0.2407, loss_mask: 0.2437, loss: 0.7334 2024-05-28 18:33:27,413 - mmdet - INFO - Epoch [6][4200/7330] lr: 1.000e-04, eta: 10:29:45, time: 0.810, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0417, loss_cls: 0.1812, acc: 93.4233, loss_bbox: 0.2266, loss_mask: 0.2362, loss: 0.7072 2024-05-28 18:34:05,853 - mmdet - INFO - Epoch [6][4250/7330] lr: 1.000e-04, eta: 10:29:03, time: 0.769, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0433, loss_cls: 0.1883, acc: 93.1150, loss_bbox: 0.2401, loss_mask: 0.2456, loss: 0.7376 2024-05-28 18:34:48,417 - mmdet - INFO - Epoch [6][4300/7330] lr: 1.000e-04, eta: 10:28:26, time: 0.851, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0446, loss_cls: 0.1944, acc: 92.8323, loss_bbox: 0.2490, loss_mask: 0.2485, loss: 0.7577 2024-05-28 18:35:26,783 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 
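As a rough cross-check on the `eta` column: each epoch runs 7330 iterations and the `time` field above stays around 0.77–0.87 s per iteration, so one epoch should take on the order of 1.6–1.8 hours of training, with evaluation and checkpointing adding a few minutes on top. A back-of-the-envelope sketch with illustrative numbers only:

```python
# Back-of-the-envelope epoch-time estimate from the values printed above.
iters_per_epoch = 7330      # "Epoch [6][.../7330]"
sec_per_iter = 0.80         # typical "time:" value in this stretch of the log

epoch_minutes = iters_per_epoch * sec_per_iter / 60
print(f"~{epoch_minutes:.0f} min per training epoch")  # ~98 min

# Consistent with the wall clock: Epoch [6] iteration 50 is stamped 17:37:45
# and iteration 4300 is stamped 18:34:48, i.e. about 0.8 s per iteration.
```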
2024-05-28 18:35:26,784 - mmdet - INFO - Epoch [6][4350/7330] lr: 1.000e-04, eta: 10:27:44, time: 0.767, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0408, loss_cls: 0.1800, acc: 93.4529, loss_bbox: 0.2380, loss_mask: 0.2396, loss: 0.7191 2024-05-28 18:36:05,177 - mmdet - INFO - Epoch [6][4400/7330] lr: 1.000e-04, eta: 10:27:02, time: 0.768, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0421, loss_cls: 0.1842, acc: 93.3242, loss_bbox: 0.2399, loss_mask: 0.2422, loss: 0.7287 2024-05-28 18:36:43,936 - mmdet - INFO - Epoch [6][4450/7330] lr: 1.000e-04, eta: 10:26:20, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0422, loss_cls: 0.1901, acc: 93.1030, loss_bbox: 0.2455, loss_mask: 0.2403, loss: 0.7383 2024-05-28 18:37:23,066 - mmdet - INFO - Epoch [6][4500/7330] lr: 1.000e-04, eta: 10:25:39, time: 0.783, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0458, loss_cls: 0.1838, acc: 93.2925, loss_bbox: 0.2416, loss_mask: 0.2460, loss: 0.7394 2024-05-28 18:38:01,486 - mmdet - INFO - Epoch [6][4550/7330] lr: 1.000e-04, eta: 10:24:57, time: 0.768, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0432, loss_cls: 0.1874, acc: 93.1812, loss_bbox: 0.2406, loss_mask: 0.2439, loss: 0.7347 2024-05-28 18:38:39,567 - mmdet - INFO - Epoch [6][4600/7330] lr: 1.000e-04, eta: 10:24:15, time: 0.762, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0424, loss_cls: 0.1857, acc: 93.2295, loss_bbox: 0.2394, loss_mask: 0.2420, loss: 0.7294 2024-05-28 18:39:20,394 - mmdet - INFO - Epoch [6][4650/7330] lr: 1.000e-04, eta: 10:23:36, time: 0.816, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0437, loss_cls: 0.1911, acc: 92.9971, loss_bbox: 0.2401, loss_mask: 0.2399, loss: 0.7360 2024-05-28 18:40:03,704 - mmdet - INFO - Epoch [6][4700/7330] lr: 1.000e-04, eta: 10:22:59, time: 0.866, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0446, loss_cls: 0.1928, acc: 92.9414, loss_bbox: 0.2434, loss_mask: 0.2453, loss: 0.7476 2024-05-28 18:40:48,768 - mmdet - INFO - Epoch [6][4750/7330] lr: 1.000e-04, eta: 10:22:25, time: 0.901, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0425, loss_cls: 0.1913, acc: 92.9321, loss_bbox: 0.2444, loss_mask: 0.2462, loss: 0.7450 2024-05-28 18:41:27,209 - mmdet - INFO - Epoch [6][4800/7330] lr: 1.000e-04, eta: 10:21:43, time: 0.769, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0410, loss_cls: 0.1890, acc: 93.1145, loss_bbox: 0.2421, loss_mask: 0.2422, loss: 0.7336 2024-05-28 18:42:08,006 - mmdet - INFO - Epoch [6][4850/7330] lr: 1.000e-04, eta: 10:21:03, time: 0.816, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0410, loss_cls: 0.1818, acc: 93.3325, loss_bbox: 0.2329, loss_mask: 0.2311, loss: 0.7066 2024-05-28 18:42:48,857 - mmdet - INFO - Epoch [6][4900/7330] lr: 1.000e-04, eta: 10:20:24, time: 0.817, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0441, loss_cls: 0.1835, acc: 93.2400, loss_bbox: 0.2356, loss_mask: 0.2413, loss: 0.7256 2024-05-28 18:43:27,511 - mmdet - INFO - Epoch [6][4950/7330] lr: 1.000e-04, eta: 10:19:42, time: 0.773, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0398, loss_cls: 0.1846, acc: 93.3503, loss_bbox: 0.2336, loss_mask: 0.2392, loss: 0.7177 2024-05-28 18:44:06,426 - mmdet - INFO - Epoch [6][5000/7330] lr: 1.000e-04, eta: 10:19:01, 
time: 0.778, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0460, loss_cls: 0.1935, acc: 92.9114, loss_bbox: 0.2489, loss_mask: 0.2448, loss: 0.7549 2024-05-28 18:44:45,320 - mmdet - INFO - Epoch [6][5050/7330] lr: 1.000e-04, eta: 10:18:19, time: 0.778, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0463, loss_cls: 0.1931, acc: 92.9805, loss_bbox: 0.2451, loss_mask: 0.2427, loss: 0.7472 2024-05-28 18:45:23,903 - mmdet - INFO - Epoch [6][5100/7330] lr: 1.000e-04, eta: 10:17:38, time: 0.772, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0435, loss_cls: 0.1923, acc: 92.9563, loss_bbox: 0.2498, loss_mask: 0.2446, loss: 0.7545 2024-05-28 18:46:05,333 - mmdet - INFO - Epoch [6][5150/7330] lr: 1.000e-04, eta: 10:16:59, time: 0.829, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0421, loss_cls: 0.1842, acc: 93.3269, loss_bbox: 0.2327, loss_mask: 0.2391, loss: 0.7181 2024-05-28 18:46:48,464 - mmdet - INFO - Epoch [6][5200/7330] lr: 1.000e-04, eta: 10:16:22, time: 0.863, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0435, loss_cls: 0.1961, acc: 92.8862, loss_bbox: 0.2477, loss_mask: 0.2431, loss: 0.7522 2024-05-28 18:47:27,356 - mmdet - INFO - Epoch [6][5250/7330] lr: 1.000e-04, eta: 10:15:41, time: 0.778, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0470, loss_cls: 0.2021, acc: 92.5327, loss_bbox: 0.2619, loss_mask: 0.2490, loss: 0.7836 2024-05-28 18:48:08,286 - mmdet - INFO - Epoch [6][5300/7330] lr: 1.000e-04, eta: 10:15:02, time: 0.818, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0441, loss_cls: 0.1934, acc: 92.9255, loss_bbox: 0.2421, loss_mask: 0.2474, loss: 0.7482 2024-05-28 18:48:51,878 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 18:48:51,878 - mmdet - INFO - Epoch [6][5350/7330] lr: 1.000e-04, eta: 10:14:25, time: 0.872, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0441, loss_cls: 0.1911, acc: 92.9993, loss_bbox: 0.2420, loss_mask: 0.2461, loss: 0.7452 2024-05-28 18:49:31,083 - mmdet - INFO - Epoch [6][5400/7330] lr: 1.000e-04, eta: 10:13:44, time: 0.784, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0438, loss_cls: 0.1912, acc: 92.8872, loss_bbox: 0.2531, loss_mask: 0.2430, loss: 0.7522 2024-05-28 18:50:14,484 - mmdet - INFO - Epoch [6][5450/7330] lr: 1.000e-04, eta: 10:13:08, time: 0.868, data_time: 0.047, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0439, loss_cls: 0.1938, acc: 92.8760, loss_bbox: 0.2473, loss_mask: 0.2476, loss: 0.7532 2024-05-28 18:50:53,126 - mmdet - INFO - Epoch [6][5500/7330] lr: 1.000e-04, eta: 10:12:26, time: 0.773, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0434, loss_cls: 0.1912, acc: 92.9871, loss_bbox: 0.2406, loss_mask: 0.2445, loss: 0.7414 2024-05-28 18:51:31,662 - mmdet - INFO - Epoch [6][5550/7330] lr: 1.000e-04, eta: 10:11:44, time: 0.771, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0393, loss_cls: 0.1828, acc: 93.3708, loss_bbox: 0.2347, loss_mask: 0.2369, loss: 0.7123 2024-05-28 18:52:10,180 - mmdet - INFO - Epoch [6][5600/7330] lr: 1.000e-04, eta: 10:11:02, time: 0.770, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0422, loss_cls: 0.1909, acc: 92.9626, loss_bbox: 0.2438, loss_mask: 0.2408, loss: 0.7383 2024-05-28 18:52:48,881 - mmdet - INFO - Epoch [6][5650/7330] lr: 1.000e-04, 
eta: 10:10:21, time: 0.774, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0450, loss_cls: 0.1931, acc: 92.9189, loss_bbox: 0.2508, loss_mask: 0.2423, loss: 0.7522 2024-05-28 18:53:29,416 - mmdet - INFO - Epoch [6][5700/7330] lr: 1.000e-04, eta: 10:09:41, time: 0.811, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0421, loss_cls: 0.1859, acc: 93.1707, loss_bbox: 0.2401, loss_mask: 0.2387, loss: 0.7304 2024-05-28 18:54:10,473 - mmdet - INFO - Epoch [6][5750/7330] lr: 1.000e-04, eta: 10:09:02, time: 0.821, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0427, loss_cls: 0.1818, acc: 93.3208, loss_bbox: 0.2391, loss_mask: 0.2394, loss: 0.7222 2024-05-28 18:54:51,221 - mmdet - INFO - Epoch [6][5800/7330] lr: 1.000e-04, eta: 10:08:23, time: 0.815, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0450, loss_cls: 0.1944, acc: 92.9360, loss_bbox: 0.2469, loss_mask: 0.2477, loss: 0.7556 2024-05-28 18:55:32,163 - mmdet - INFO - Epoch [6][5850/7330] lr: 1.000e-04, eta: 10:07:43, time: 0.819, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0414, loss_cls: 0.1778, acc: 93.5032, loss_bbox: 0.2289, loss_mask: 0.2426, loss: 0.7114 2024-05-28 18:56:12,634 - mmdet - INFO - Epoch [6][5900/7330] lr: 1.000e-04, eta: 10:07:04, time: 0.809, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0411, loss_cls: 0.1888, acc: 93.1152, loss_bbox: 0.2425, loss_mask: 0.2418, loss: 0.7349 2024-05-28 18:56:55,206 - mmdet - INFO - Epoch [6][5950/7330] lr: 1.000e-04, eta: 10:06:26, time: 0.851, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0428, loss_cls: 0.1795, acc: 93.3835, loss_bbox: 0.2366, loss_mask: 0.2400, loss: 0.7192 2024-05-28 18:57:33,982 - mmdet - INFO - Epoch [6][6000/7330] lr: 1.000e-04, eta: 10:05:45, time: 0.775, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0427, loss_cls: 0.1893, acc: 93.0439, loss_bbox: 0.2383, loss_mask: 0.2467, loss: 0.7375 2024-05-28 18:58:14,870 - mmdet - INFO - Epoch [6][6050/7330] lr: 1.000e-04, eta: 10:05:05, time: 0.818, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0408, loss_cls: 0.1838, acc: 93.2693, loss_bbox: 0.2377, loss_mask: 0.2381, loss: 0.7202 2024-05-28 18:58:53,383 - mmdet - INFO - Epoch [6][6100/7330] lr: 1.000e-04, eta: 10:04:24, time: 0.770, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0404, loss_cls: 0.1783, acc: 93.5354, loss_bbox: 0.2301, loss_mask: 0.2368, loss: 0.7055 2024-05-28 18:59:31,925 - mmdet - INFO - Epoch [6][6150/7330] lr: 1.000e-04, eta: 10:03:42, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0462, loss_cls: 0.2008, acc: 92.6506, loss_bbox: 0.2525, loss_mask: 0.2471, loss: 0.7680 2024-05-28 19:00:10,966 - mmdet - INFO - Epoch [6][6200/7330] lr: 1.000e-04, eta: 10:03:01, time: 0.781, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0461, loss_cls: 0.1930, acc: 92.9351, loss_bbox: 0.2467, loss_mask: 0.2481, loss: 0.7550 2024-05-28 19:00:52,663 - mmdet - INFO - Epoch [6][6250/7330] lr: 1.000e-04, eta: 10:02:22, time: 0.833, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0432, loss_cls: 0.1853, acc: 93.2007, loss_bbox: 0.2369, loss_mask: 0.2412, loss: 0.7281 2024-05-28 19:01:33,472 - mmdet - INFO - Epoch [6][6300/7330] lr: 1.000e-04, eta: 10:01:43, time: 0.817, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0215, 
loss_rpn_bbox: 0.0447, loss_cls: 0.1876, acc: 93.0339, loss_bbox: 0.2485, loss_mask: 0.2444, loss: 0.7466 2024-05-28 19:02:12,044 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-28 19:02:12,044 - mmdet - INFO - Epoch [6][6350/7330] lr: 1.000e-04, eta: 10:01:01, time: 0.772, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0417, loss_cls: 0.1858, acc: 93.1914, loss_bbox: 0.2379, loss_mask: 0.2390, loss: 0.7246 2024-05-28 19:02:50,535 - mmdet - INFO - Epoch [6][6400/7330] lr: 1.000e-04, eta: 10:00:19, time: 0.770, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0436, loss_cls: 0.1918, acc: 92.9258, loss_bbox: 0.2480, loss_mask: 0.2446, loss: 0.7492 2024-05-28 19:03:33,629 - mmdet - INFO - Epoch [6][6450/7330] lr: 1.000e-04, eta: 9:59:42, time: 0.862, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0418, loss_cls: 0.1894, acc: 93.1829, loss_bbox: 0.2405, loss_mask: 0.2421, loss: 0.7351 2024-05-28 19:04:16,938 - mmdet - INFO - Epoch [6][6500/7330] lr: 1.000e-04, eta: 9:59:06, time: 0.866, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0418, loss_cls: 0.1866, acc: 93.1970, loss_bbox: 0.2401, loss_mask: 0.2400, loss: 0.7311 2024-05-28 19:04:55,739 - mmdet - INFO - Epoch [6][6550/7330] lr: 1.000e-04, eta: 9:58:24, time: 0.775, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0451, loss_cls: 0.1759, acc: 93.6169, loss_bbox: 0.2256, loss_mask: 0.2358, loss: 0.7042 2024-05-28 19:05:39,134 - mmdet - INFO - Epoch [6][6600/7330] lr: 1.000e-04, eta: 9:57:47, time: 0.868, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0466, loss_cls: 0.1890, acc: 93.0808, loss_bbox: 0.2436, loss_mask: 0.2396, loss: 0.7422 2024-05-28 19:06:17,721 - mmdet - INFO - Epoch [6][6650/7330] lr: 1.000e-04, eta: 9:57:06, time: 0.772, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0422, loss_cls: 0.1821, acc: 93.3242, loss_bbox: 0.2354, loss_mask: 0.2387, loss: 0.7191 2024-05-28 19:06:56,338 - mmdet - INFO - Epoch [6][6700/7330] lr: 1.000e-04, eta: 9:56:24, time: 0.772, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0392, loss_cls: 0.1819, acc: 93.4438, loss_bbox: 0.2283, loss_mask: 0.2322, loss: 0.7014 2024-05-28 19:07:35,098 - mmdet - INFO - Epoch [6][6750/7330] lr: 1.000e-04, eta: 9:55:43, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0427, loss_cls: 0.1887, acc: 93.0361, loss_bbox: 0.2452, loss_mask: 0.2390, loss: 0.7367 2024-05-28 19:08:13,820 - mmdet - INFO - Epoch [6][6800/7330] lr: 1.000e-04, eta: 9:55:01, time: 0.774, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0441, loss_cls: 0.1888, acc: 93.0820, loss_bbox: 0.2386, loss_mask: 0.2383, loss: 0.7320 2024-05-28 19:08:52,457 - mmdet - INFO - Epoch [6][6850/7330] lr: 1.000e-04, eta: 9:54:19, time: 0.773, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0428, loss_cls: 0.1863, acc: 93.0518, loss_bbox: 0.2420, loss_mask: 0.2445, loss: 0.7370 2024-05-28 19:09:33,375 - mmdet - INFO - Epoch [6][6900/7330] lr: 1.000e-04, eta: 9:53:40, time: 0.818, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0412, loss_cls: 0.1832, acc: 93.3486, loss_bbox: 0.2335, loss_mask: 0.2452, loss: 0.7232 2024-05-28 19:10:18,525 - mmdet - INFO - Epoch [6][6950/7330] lr: 1.000e-04, eta: 9:53:05, time: 0.903, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0229, 
loss_rpn_bbox: 0.0435, loss_cls: 0.1949, acc: 92.9546, loss_bbox: 0.2439, loss_mask: 0.2472, loss: 0.7525
2024-05-28 19:10:59,182 - mmdet - INFO - Epoch [6][7000/7330] lr: 1.000e-04, eta: 9:52:26, time: 0.813, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0428, loss_cls: 0.1880, acc: 93.0991, loss_bbox: 0.2405, loss_mask: 0.2452, loss: 0.7357
2024-05-28 19:11:40,163 - mmdet - INFO - Epoch [6][7050/7330] lr: 1.000e-04, eta: 9:51:46, time: 0.820, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0422, loss_cls: 0.1831, acc: 93.3567, loss_bbox: 0.2352, loss_mask: 0.2439, loss: 0.7258
2024-05-28 19:12:20,860 - mmdet - INFO - Epoch [6][7100/7330] lr: 1.000e-04, eta: 9:51:07, time: 0.814, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0437, loss_cls: 0.1842, acc: 93.2312, loss_bbox: 0.2401, loss_mask: 0.2371, loss: 0.7247
2024-05-28 19:12:59,463 - mmdet - INFO - Epoch [6][7150/7330] lr: 1.000e-04, eta: 9:50:25, time: 0.772, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0437, loss_cls: 0.1909, acc: 93.1267, loss_bbox: 0.2354, loss_mask: 0.2391, loss: 0.7295
2024-05-28 19:13:40,444 - mmdet - INFO - Epoch [6][7200/7330] lr: 1.000e-04, eta: 9:49:46, time: 0.820, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0438, loss_cls: 0.1899, acc: 93.0471, loss_bbox: 0.2404, loss_mask: 0.2426, loss: 0.7365
2024-05-28 19:14:19,192 - mmdet - INFO - Epoch [6][7250/7330] lr: 1.000e-04, eta: 9:49:05, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0425, loss_cls: 0.1885, acc: 93.0598, loss_bbox: 0.2446, loss_mask: 0.2459, loss: 0.7432
2024-05-28 19:14:57,902 - mmdet - INFO - Epoch [6][7300/7330] lr: 1.000e-04, eta: 9:48:23, time: 0.774, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0414, loss_cls: 0.1849, acc: 93.1833, loss_bbox: 0.2398, loss_mask: 0.2391, loss: 0.7259
2024-05-28 19:15:21,695 - mmdet - INFO - Saving checkpoint at 6 epochs
2024-05-28 19:17:45,480 - mmdet - INFO - Evaluating bbox...
2024-05-28 19:18:10,115 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.438
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.677
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.473
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.254
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.478
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.615
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.558
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.558
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.558
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.350
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.604
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.737
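Besides the full pycocotools tables, mmdet condenses each validation pass into a single `Epoch(val)` line (as in the epoch-5 summary earlier in this log), which is the most convenient thing to scan when comparing checkpoints epoch over epoch. The sketch below collects those summaries; the regex and the `train.log` path are assumptions for illustration.

```python
import re

# Matches the one-line validation summaries, e.g.
#   "Epoch(val) [5][625] bbox_mAP: 0.4290, ..., segm_mAP: 0.3870, ..."
VAL_RE = re.compile(
    r"Epoch\(val\) \[(\d+)\]\[\d+\].*?bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)"
)

def val_history(log_path="train.log"):
    """Return {epoch: (bbox_mAP, segm_mAP)} from an mmdet text log."""
    history = {}
    with open(log_path) as f:
        for line in f:
            for m in VAL_RE.finditer(line):
                epoch, bbox_map, segm_map = m.groups()
                history[int(epoch)] = (float(bbox_map), float(segm_map))
    return history

if __name__ == "__main__":
    for epoch, (bbox_map, segm_map) in sorted(val_history().items()):
        print(f"epoch {epoch}: bbox_mAP={bbox_map:.3f}  segm_mAP={segm_map:.3f}")
```

Over the log so far this would give epoch 5 → bbox_mAP 0.429 / segm_mAP 0.387, with the epoch-6 summary printed just below.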
2024-05-28 19:18:10,115 - mmdet - INFO - Evaluating segm...
2024-05-28 19:18:34,181 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.394
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.637
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.414
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.177
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.424
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.630
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.503
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.503
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.503
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.281
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.551
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.711
2024-05-28 19:18:34,600 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 19:18:34,601 - mmdet - INFO - Epoch(val) [6][625] bbox_mAP: 0.4380, bbox_mAP_50: 0.6770, bbox_mAP_75: 0.4730, bbox_mAP_s: 0.2540, bbox_mAP_m: 0.4780, bbox_mAP_l: 0.6150, bbox_mAP_copypaste: 0.438 0.677 0.473 0.254 0.478 0.615, segm_mAP: 0.3940, segm_mAP_50: 0.6370, segm_mAP_75: 0.4140, segm_mAP_s: 0.1770, segm_mAP_m: 0.4240, segm_mAP_l: 0.6300, segm_mAP_copypaste: 0.394 0.637 0.414 0.177 0.424 0.630
2024-05-28 19:19:21,426 - mmdet - INFO - Epoch [7][50/7330] lr: 1.000e-04, eta: 9:47:02, time: 0.936, data_time: 0.148, memory: 13582, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0400, loss_cls: 0.1712, acc: 93.6682, loss_bbox: 0.2236, loss_mask: 0.2345, loss: 0.6879
2024-05-28 19:20:00,711 - mmdet - INFO - Epoch [7][100/7330] lr: 1.000e-04, eta: 9:46:21, time: 0.786, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0430, loss_cls: 0.1809, acc: 93.3481, loss_bbox: 0.2390, loss_mask: 0.2375, loss: 0.7197
2024-05-28 19:20:41,731 - mmdet - INFO - Epoch [7][150/7330] lr: 1.000e-04, eta: 9:45:42, time: 0.820, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0411, loss_cls: 0.1773, acc: 93.4983, loss_bbox: 0.2315, loss_mask: 0.2375, loss: 0.7073
2024-05-28 19:21:20,372 - mmdet - INFO - Epoch [7][200/7330] lr: 1.000e-04, eta: 9:45:00, time: 0.773, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0421, loss_cls: 0.1786, acc: 93.4519, loss_bbox: 0.2312, loss_mask: 0.2414, loss: 0.7123
2024-05-28 19:22:01,729 - mmdet - INFO - Epoch [7][250/7330] lr: 1.000e-04, eta: 9:44:21, time: 0.827, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0406, loss_cls: 0.1699, acc: 93.7258, loss_bbox: 0.2261, loss_mask: 0.2303, loss: 0.6849
2024-05-28 19:22:40,408 - mmdet - INFO - Epoch [7][300/7330] lr: 1.000e-04, eta: 9:43:40, time: 0.773, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0411, loss_cls: 0.1787, acc: 93.3855, loss_bbox: 0.2344, loss_mask: 0.2360, loss: 0.7102
2024-05-28 19:23:21,300 - mmdet - INFO - Epoch [7][350/7330] lr: 1.000e-04, eta: 9:43:00, time: 0.818, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0413, loss_cls: 0.1723, acc: 93.6111, loss_bbox: 0.2263, loss_mask: 0.2312, loss: 0.6910
2024-05-28 19:24:02,598 - mmdet - INFO - Epoch [7][400/7330] lr: 1.000e-04, eta: 9:42:21, time: 0.826, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0446, loss_cls: 0.1863, acc: 93.0359, loss_bbox: 0.2441,
loss_mask: 0.2421, loss: 0.7377 2024-05-28 19:24:41,060 - mmdet - INFO - Epoch [7][450/7330] lr: 1.000e-04, eta: 9:41:40, time: 0.769, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0410, loss_cls: 0.1766, acc: 93.3088, loss_bbox: 0.2344, loss_mask: 0.2397, loss: 0.7103 2024-05-28 19:25:22,172 - mmdet - INFO - Epoch [7][500/7330] lr: 1.000e-04, eta: 9:41:01, time: 0.822, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0410, loss_cls: 0.1822, acc: 93.3730, loss_bbox: 0.2327, loss_mask: 0.2299, loss: 0.7040 2024-05-28 19:26:03,060 - mmdet - INFO - Epoch [7][550/7330] lr: 1.000e-04, eta: 9:40:21, time: 0.818, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0403, loss_cls: 0.1815, acc: 93.3389, loss_bbox: 0.2359, loss_mask: 0.2323, loss: 0.7092 2024-05-28 19:26:41,643 - mmdet - INFO - Epoch [7][600/7330] lr: 1.000e-04, eta: 9:39:40, time: 0.772, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0398, loss_cls: 0.1732, acc: 93.5139, loss_bbox: 0.2315, loss_mask: 0.2353, loss: 0.6979 2024-05-28 19:27:22,445 - mmdet - INFO - Epoch [7][650/7330] lr: 1.000e-04, eta: 9:39:00, time: 0.816, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0445, loss_cls: 0.1843, acc: 93.2183, loss_bbox: 0.2435, loss_mask: 0.2420, loss: 0.7347 2024-05-28 19:28:00,837 - mmdet - INFO - Epoch [7][700/7330] lr: 1.000e-04, eta: 9:38:19, time: 0.768, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0421, loss_cls: 0.1758, acc: 93.4490, loss_bbox: 0.2301, loss_mask: 0.2398, loss: 0.7073 2024-05-28 19:28:44,175 - mmdet - INFO - Epoch [7][750/7330] lr: 1.000e-04, eta: 9:37:42, time: 0.867, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0395, loss_cls: 0.1671, acc: 93.8560, loss_bbox: 0.2237, loss_mask: 0.2299, loss: 0.6765 2024-05-28 19:29:23,215 - mmdet - INFO - Epoch [7][800/7330] lr: 1.000e-04, eta: 9:37:01, time: 0.781, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0415, loss_cls: 0.1748, acc: 93.5200, loss_bbox: 0.2332, loss_mask: 0.2382, loss: 0.7080 2024-05-28 19:30:04,982 - mmdet - INFO - Epoch [7][850/7330] lr: 1.000e-04, eta: 9:36:22, time: 0.835, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0419, loss_cls: 0.1783, acc: 93.3521, loss_bbox: 0.2346, loss_mask: 0.2378, loss: 0.7107 2024-05-28 19:30:46,819 - mmdet - INFO - Epoch [7][900/7330] lr: 1.000e-04, eta: 9:35:44, time: 0.837, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0408, loss_cls: 0.1688, acc: 93.7212, loss_bbox: 0.2276, loss_mask: 0.2316, loss: 0.6852 2024-05-28 19:31:26,027 - mmdet - INFO - Epoch [7][950/7330] lr: 1.000e-04, eta: 9:35:03, time: 0.784, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0410, loss_cls: 0.1786, acc: 93.4368, loss_bbox: 0.2312, loss_mask: 0.2382, loss: 0.7075 2024-05-28 19:32:05,018 - mmdet - INFO - Epoch [7][1000/7330] lr: 1.000e-04, eta: 9:34:22, time: 0.780, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0422, loss_cls: 0.1792, acc: 93.4050, loss_bbox: 0.2328, loss_mask: 0.2392, loss: 0.7131 2024-05-28 19:32:48,845 - mmdet - INFO - Epoch [7][1050/7330] lr: 1.000e-04, eta: 9:33:45, time: 0.877, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0429, loss_cls: 0.1790, acc: 93.4915, loss_bbox: 0.2303, loss_mask: 0.2397, loss: 0.7103 2024-05-28 19:33:29,941 - mmdet - INFO - Epoch [7][1100/7330] lr: 1.000e-04, eta: 
9:33:06, time: 0.822, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0418, loss_cls: 0.1800, acc: 93.3174, loss_bbox: 0.2352, loss_mask: 0.2365, loss: 0.7132 2024-05-28 19:34:08,657 - mmdet - INFO - Epoch [7][1150/7330] lr: 1.000e-04, eta: 9:32:25, time: 0.774, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0438, loss_cls: 0.1793, acc: 93.3496, loss_bbox: 0.2385, loss_mask: 0.2389, loss: 0.7191 2024-05-28 19:34:47,594 - mmdet - INFO - Epoch [7][1200/7330] lr: 1.000e-04, eta: 9:31:43, time: 0.779, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0419, loss_cls: 0.1779, acc: 93.3911, loss_bbox: 0.2370, loss_mask: 0.2341, loss: 0.7100 2024-05-28 19:35:28,503 - mmdet - INFO - Epoch [7][1250/7330] lr: 1.000e-04, eta: 9:31:04, time: 0.818, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0394, loss_cls: 0.1780, acc: 93.4580, loss_bbox: 0.2338, loss_mask: 0.2382, loss: 0.7074 2024-05-28 19:36:11,964 - mmdet - INFO - Epoch [7][1300/7330] lr: 1.000e-04, eta: 9:30:27, time: 0.869, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0421, loss_cls: 0.1776, acc: 93.3635, loss_bbox: 0.2351, loss_mask: 0.2417, loss: 0.7144 2024-05-28 19:36:50,626 - mmdet - INFO - Epoch [7][1350/7330] lr: 1.000e-04, eta: 9:29:46, time: 0.773, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0432, loss_cls: 0.1854, acc: 93.1995, loss_bbox: 0.2353, loss_mask: 0.2304, loss: 0.7150 2024-05-28 19:37:33,443 - mmdet - INFO - Epoch [7][1400/7330] lr: 1.000e-04, eta: 9:29:08, time: 0.856, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0414, loss_cls: 0.1839, acc: 93.1714, loss_bbox: 0.2394, loss_mask: 0.2369, loss: 0.7196 2024-05-28 19:38:12,401 - mmdet - INFO - Epoch [7][1450/7330] lr: 1.000e-04, eta: 9:28:27, time: 0.779, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0417, loss_cls: 0.1760, acc: 93.4570, loss_bbox: 0.2285, loss_mask: 0.2337, loss: 0.6982 2024-05-28 19:38:55,024 - mmdet - INFO - Epoch [7][1500/7330] lr: 1.000e-04, eta: 9:27:49, time: 0.852, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0395, loss_cls: 0.1745, acc: 93.5635, loss_bbox: 0.2236, loss_mask: 0.2292, loss: 0.6846 2024-05-28 19:39:37,141 - mmdet - INFO - Epoch [7][1550/7330] lr: 1.000e-04, eta: 9:27:11, time: 0.842, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0419, loss_cls: 0.1826, acc: 93.2595, loss_bbox: 0.2397, loss_mask: 0.2368, loss: 0.7217 2024-05-28 19:40:16,180 - mmdet - INFO - Epoch [7][1600/7330] lr: 1.000e-04, eta: 9:26:30, time: 0.781, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0414, loss_cls: 0.1799, acc: 93.2595, loss_bbox: 0.2428, loss_mask: 0.2458, loss: 0.7299 2024-05-28 19:40:57,465 - mmdet - INFO - Epoch [7][1650/7330] lr: 1.000e-04, eta: 9:25:51, time: 0.826, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0411, loss_cls: 0.1725, acc: 93.6228, loss_bbox: 0.2294, loss_mask: 0.2344, loss: 0.6975 2024-05-28 19:41:39,168 - mmdet - INFO - Epoch [7][1700/7330] lr: 1.000e-04, eta: 9:25:12, time: 0.834, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0418, loss_cls: 0.1837, acc: 93.1650, loss_bbox: 0.2396, loss_mask: 0.2387, loss: 0.7238 2024-05-28 19:42:18,078 - mmdet - INFO - Epoch [7][1750/7330] lr: 1.000e-04, eta: 9:24:31, time: 0.778, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0438, loss_cls: 
0.1770, acc: 93.4617, loss_bbox: 0.2345, loss_mask: 0.2364, loss: 0.7099 2024-05-28 19:42:59,706 - mmdet - INFO - Epoch [7][1800/7330] lr: 1.000e-04, eta: 9:23:52, time: 0.833, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0414, loss_cls: 0.1810, acc: 93.3579, loss_bbox: 0.2334, loss_mask: 0.2430, loss: 0.7194 2024-05-28 19:43:41,333 - mmdet - INFO - Epoch [7][1850/7330] lr: 1.000e-04, eta: 9:23:14, time: 0.833, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0445, loss_cls: 0.1843, acc: 93.0403, loss_bbox: 0.2366, loss_mask: 0.2376, loss: 0.7227 2024-05-28 19:44:20,260 - mmdet - INFO - Epoch [7][1900/7330] lr: 1.000e-04, eta: 9:22:33, time: 0.778, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0412, loss_cls: 0.1722, acc: 93.5896, loss_bbox: 0.2290, loss_mask: 0.2360, loss: 0.6979 2024-05-28 19:44:59,213 - mmdet - INFO - Epoch [7][1950/7330] lr: 1.000e-04, eta: 9:21:51, time: 0.779, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0420, loss_cls: 0.1741, acc: 93.6448, loss_bbox: 0.2325, loss_mask: 0.2359, loss: 0.7035 2024-05-28 19:45:42,967 - mmdet - INFO - Epoch [7][2000/7330] lr: 1.000e-04, eta: 9:21:15, time: 0.875, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0413, loss_cls: 0.1730, acc: 93.5325, loss_bbox: 0.2256, loss_mask: 0.2348, loss: 0.6935 2024-05-28 19:46:21,604 - mmdet - INFO - Epoch [7][2050/7330] lr: 1.000e-04, eta: 9:20:33, time: 0.773, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0414, loss_cls: 0.1813, acc: 93.3357, loss_bbox: 0.2355, loss_mask: 0.2336, loss: 0.7122 2024-05-28 19:47:00,441 - mmdet - INFO - Epoch [7][2100/7330] lr: 1.000e-04, eta: 9:19:52, time: 0.777, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0413, loss_cls: 0.1824, acc: 93.2271, loss_bbox: 0.2388, loss_mask: 0.2428, loss: 0.7246 2024-05-28 19:47:42,817 - mmdet - INFO - Epoch [7][2150/7330] lr: 1.000e-04, eta: 9:19:14, time: 0.848, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0430, loss_cls: 0.1767, acc: 93.4448, loss_bbox: 0.2367, loss_mask: 0.2361, loss: 0.7132 2024-05-28 19:48:24,499 - mmdet - INFO - Epoch [7][2200/7330] lr: 1.000e-04, eta: 9:18:35, time: 0.834, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0434, loss_cls: 0.1870, acc: 93.0859, loss_bbox: 0.2419, loss_mask: 0.2368, loss: 0.7298 2024-05-28 19:49:05,426 - mmdet - INFO - Epoch [7][2250/7330] lr: 1.000e-04, eta: 9:17:56, time: 0.819, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0399, loss_cls: 0.1782, acc: 93.5730, loss_bbox: 0.2245, loss_mask: 0.2363, loss: 0.6991 2024-05-28 19:49:44,589 - mmdet - INFO - Epoch [7][2300/7330] lr: 1.000e-04, eta: 9:17:15, time: 0.783, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0439, loss_cls: 0.1831, acc: 93.2844, loss_bbox: 0.2385, loss_mask: 0.2445, loss: 0.7307 2024-05-28 19:50:23,357 - mmdet - INFO - Epoch [7][2350/7330] lr: 1.000e-04, eta: 9:16:33, time: 0.775, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0411, loss_cls: 0.1822, acc: 93.3367, loss_bbox: 0.2328, loss_mask: 0.2388, loss: 0.7138 2024-05-28 19:51:07,009 - mmdet - INFO - Epoch [7][2400/7330] lr: 1.000e-04, eta: 9:15:57, time: 0.873, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0400, loss_cls: 0.1798, acc: 93.4426, loss_bbox: 0.2344, loss_mask: 0.2318, loss: 0.7061 2024-05-28 19:51:46,169 - mmdet - 
INFO - Epoch [7][2450/7330] lr: 1.000e-04, eta: 9:15:16, time: 0.783, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0423, loss_cls: 0.1868, acc: 93.1218, loss_bbox: 0.2399, loss_mask: 0.2403, loss: 0.7287 2024-05-28 19:52:27,133 - mmdet - INFO - Epoch [7][2500/7330] lr: 1.000e-04, eta: 9:14:36, time: 0.820, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0405, loss_cls: 0.1803, acc: 93.3867, loss_bbox: 0.2308, loss_mask: 0.2415, loss: 0.7135 2024-05-28 19:53:08,767 - mmdet - INFO - Epoch [7][2550/7330] lr: 1.000e-04, eta: 9:13:57, time: 0.833, data_time: 0.053, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0404, loss_cls: 0.1734, acc: 93.4927, loss_bbox: 0.2328, loss_mask: 0.2300, loss: 0.6957 2024-05-28 19:53:47,505 - mmdet - INFO - Epoch [7][2600/7330] lr: 1.000e-04, eta: 9:13:16, time: 0.775, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0436, loss_cls: 0.1821, acc: 93.1470, loss_bbox: 0.2424, loss_mask: 0.2363, loss: 0.7242 2024-05-28 19:54:26,590 - mmdet - INFO - Epoch [7][2650/7330] lr: 1.000e-04, eta: 9:12:35, time: 0.782, data_time: 0.049, memory: 13582, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0438, loss_cls: 0.1957, acc: 92.8330, loss_bbox: 0.2504, loss_mask: 0.2434, loss: 0.7555 2024-05-28 19:55:05,135 - mmdet - INFO - Epoch [7][2700/7330] lr: 1.000e-04, eta: 9:11:54, time: 0.771, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0393, loss_cls: 0.1684, acc: 93.8037, loss_bbox: 0.2227, loss_mask: 0.2346, loss: 0.6832 2024-05-28 19:55:46,092 - mmdet - INFO - Epoch [7][2750/7330] lr: 1.000e-04, eta: 9:11:14, time: 0.819, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0414, loss_cls: 0.1779, acc: 93.4509, loss_bbox: 0.2307, loss_mask: 0.2380, loss: 0.7088 2024-05-28 19:56:28,066 - mmdet - INFO - Epoch [7][2800/7330] lr: 1.000e-04, eta: 9:10:36, time: 0.839, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0413, loss_cls: 0.1725, acc: 93.5972, loss_bbox: 0.2311, loss_mask: 0.2382, loss: 0.7013 2024-05-28 19:57:09,121 - mmdet - INFO - Epoch [7][2850/7330] lr: 1.000e-04, eta: 9:09:57, time: 0.821, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0401, loss_cls: 0.1841, acc: 93.2847, loss_bbox: 0.2375, loss_mask: 0.2383, loss: 0.7186 2024-05-28 19:57:47,969 - mmdet - INFO - Epoch [7][2900/7330] lr: 1.000e-04, eta: 9:09:15, time: 0.777, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0420, loss_cls: 0.1818, acc: 93.2625, loss_bbox: 0.2447, loss_mask: 0.2411, loss: 0.7277 2024-05-28 19:58:27,014 - mmdet - INFO - Epoch [7][2950/7330] lr: 1.000e-04, eta: 9:08:34, time: 0.781, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0403, loss_cls: 0.1730, acc: 93.6165, loss_bbox: 0.2285, loss_mask: 0.2334, loss: 0.6953 2024-05-28 19:59:10,917 - mmdet - INFO - Epoch [7][3000/7330] lr: 1.000e-04, eta: 9:07:57, time: 0.878, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0414, loss_cls: 0.1836, acc: 93.2253, loss_bbox: 0.2345, loss_mask: 0.2364, loss: 0.7146 2024-05-28 19:59:49,889 - mmdet - INFO - Epoch [7][3050/7330] lr: 1.000e-04, eta: 9:07:16, time: 0.779, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0425, loss_cls: 0.1831, acc: 93.2827, loss_bbox: 0.2334, loss_mask: 0.2356, loss: 0.7134 2024-05-28 20:00:31,074 - mmdet - INFO - Epoch [7][3100/7330] lr: 1.000e-04, eta: 9:06:37, time: 0.824, data_time: 0.032, memory: 13582, 
loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0401, loss_cls: 0.1748, acc: 93.6570, loss_bbox: 0.2211, loss_mask: 0.2314, loss: 0.6848 2024-05-28 20:01:12,994 - mmdet - INFO - Epoch [7][3150/7330] lr: 1.000e-04, eta: 9:05:59, time: 0.838, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0454, loss_cls: 0.1827, acc: 93.2795, loss_bbox: 0.2372, loss_mask: 0.2417, loss: 0.7285 2024-05-28 20:01:52,098 - mmdet - INFO - Epoch [7][3200/7330] lr: 1.000e-04, eta: 9:05:18, time: 0.782, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0418, loss_cls: 0.1821, acc: 93.3225, loss_bbox: 0.2356, loss_mask: 0.2350, loss: 0.7136 2024-05-28 20:02:31,421 - mmdet - INFO - Epoch [7][3250/7330] lr: 1.000e-04, eta: 9:04:37, time: 0.786, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0415, loss_cls: 0.1825, acc: 93.1294, loss_bbox: 0.2374, loss_mask: 0.2361, loss: 0.7173 2024-05-28 20:03:13,970 - mmdet - INFO - Epoch [7][3300/7330] lr: 1.000e-04, eta: 9:03:59, time: 0.851, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0417, loss_cls: 0.1781, acc: 93.4778, loss_bbox: 0.2327, loss_mask: 0.2414, loss: 0.7126 2024-05-28 20:03:52,906 - mmdet - INFO - Epoch [7][3350/7330] lr: 1.000e-04, eta: 9:03:18, time: 0.779, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0442, loss_cls: 0.1846, acc: 93.1384, loss_bbox: 0.2379, loss_mask: 0.2421, loss: 0.7285 2024-05-28 20:04:35,257 - mmdet - INFO - Epoch [7][3400/7330] lr: 1.000e-04, eta: 9:02:39, time: 0.847, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0411, loss_cls: 0.1732, acc: 93.6047, loss_bbox: 0.2256, loss_mask: 0.2277, loss: 0.6865 2024-05-28 20:05:16,784 - mmdet - INFO - Epoch [7][3450/7330] lr: 1.000e-04, eta: 9:02:01, time: 0.831, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0427, loss_cls: 0.1877, acc: 93.1497, loss_bbox: 0.2437, loss_mask: 0.2418, loss: 0.7366 2024-05-28 20:05:58,294 - mmdet - INFO - Epoch [7][3500/7330] lr: 1.000e-04, eta: 9:01:22, time: 0.830, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0411, loss_cls: 0.1803, acc: 93.4119, loss_bbox: 0.2350, loss_mask: 0.2370, loss: 0.7126 2024-05-28 20:06:40,130 - mmdet - INFO - Epoch [7][3550/7330] lr: 1.000e-04, eta: 9:00:43, time: 0.837, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0452, loss_cls: 0.1777, acc: 93.4832, loss_bbox: 0.2306, loss_mask: 0.2397, loss: 0.7142 2024-05-28 20:07:18,868 - mmdet - INFO - Epoch [7][3600/7330] lr: 1.000e-04, eta: 9:00:02, time: 0.775, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0404, loss_cls: 0.1699, acc: 93.7786, loss_bbox: 0.2213, loss_mask: 0.2289, loss: 0.6773 2024-05-28 20:08:00,200 - mmdet - INFO - Epoch [7][3650/7330] lr: 1.000e-04, eta: 8:59:23, time: 0.827, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0393, loss_cls: 0.1779, acc: 93.5698, loss_bbox: 0.2242, loss_mask: 0.2350, loss: 0.6959 2024-05-28 20:08:39,398 - mmdet - INFO - Epoch [7][3700/7330] lr: 1.000e-04, eta: 8:58:42, time: 0.784, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0475, loss_cls: 0.1924, acc: 92.9817, loss_bbox: 0.2487, loss_mask: 0.2435, loss: 0.7539 2024-05-28 20:09:20,986 - mmdet - INFO - Epoch [7][3750/7330] lr: 1.000e-04, eta: 8:58:03, time: 0.832, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0447, loss_cls: 0.1883, acc: 93.0112, loss_bbox: 0.2386, loss_mask: 
0.2396, loss: 0.7306 2024-05-28 20:09:59,888 - mmdet - INFO - Epoch [7][3800/7330] lr: 1.000e-04, eta: 8:57:22, time: 0.778, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0408, loss_cls: 0.1836, acc: 93.1843, loss_bbox: 0.2371, loss_mask: 0.2395, loss: 0.7198 2024-05-28 20:10:39,287 - mmdet - INFO - Epoch [7][3850/7330] lr: 1.000e-04, eta: 8:56:41, time: 0.788, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0446, loss_cls: 0.1872, acc: 93.0266, loss_bbox: 0.2483, loss_mask: 0.2465, loss: 0.7489 2024-05-28 20:11:20,850 - mmdet - INFO - Epoch [7][3900/7330] lr: 1.000e-04, eta: 8:56:02, time: 0.831, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0412, loss_cls: 0.1764, acc: 93.4836, loss_bbox: 0.2315, loss_mask: 0.2328, loss: 0.7006 2024-05-28 20:11:59,662 - mmdet - INFO - Epoch [7][3950/7330] lr: 1.000e-04, eta: 8:55:21, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0395, loss_cls: 0.1758, acc: 93.6111, loss_bbox: 0.2252, loss_mask: 0.2289, loss: 0.6888 2024-05-28 20:12:41,831 - mmdet - INFO - Epoch [7][4000/7330] lr: 1.000e-04, eta: 8:54:42, time: 0.843, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0421, loss_cls: 0.1852, acc: 93.1855, loss_bbox: 0.2394, loss_mask: 0.2363, loss: 0.7218 2024-05-28 20:13:22,804 - mmdet - INFO - Epoch [7][4050/7330] lr: 1.000e-04, eta: 8:54:03, time: 0.819, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0416, loss_cls: 0.1807, acc: 93.3325, loss_bbox: 0.2287, loss_mask: 0.2384, loss: 0.7083 2024-05-28 20:14:01,419 - mmdet - INFO - Epoch [7][4100/7330] lr: 1.000e-04, eta: 8:53:22, time: 0.772, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0399, loss_cls: 0.1804, acc: 93.3723, loss_bbox: 0.2331, loss_mask: 0.2314, loss: 0.7021 2024-05-28 20:14:45,283 - mmdet - INFO - Epoch [7][4150/7330] lr: 1.000e-04, eta: 8:52:44, time: 0.877, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0412, loss_cls: 0.1739, acc: 93.5259, loss_bbox: 0.2304, loss_mask: 0.2372, loss: 0.7025 2024-05-28 20:15:24,317 - mmdet - INFO - Epoch [7][4200/7330] lr: 1.000e-04, eta: 8:52:03, time: 0.781, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0405, loss_cls: 0.1738, acc: 93.5493, loss_bbox: 0.2274, loss_mask: 0.2326, loss: 0.6931 2024-05-28 20:16:05,564 - mmdet - INFO - Epoch [7][4250/7330] lr: 1.000e-04, eta: 8:51:24, time: 0.825, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0444, loss_cls: 0.1917, acc: 92.9509, loss_bbox: 0.2525, loss_mask: 0.2457, loss: 0.7561 2024-05-28 20:16:44,294 - mmdet - INFO - Epoch [7][4300/7330] lr: 1.000e-04, eta: 8:50:43, time: 0.775, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0412, loss_cls: 0.1768, acc: 93.4253, loss_bbox: 0.2327, loss_mask: 0.2358, loss: 0.7056 2024-05-28 20:17:25,838 - mmdet - INFO - Epoch [7][4350/7330] lr: 1.000e-04, eta: 8:50:04, time: 0.831, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0427, loss_cls: 0.1795, acc: 93.2649, loss_bbox: 0.2347, loss_mask: 0.2344, loss: 0.7110 2024-05-28 20:18:07,987 - mmdet - INFO - Epoch [7][4400/7330] lr: 1.000e-04, eta: 8:49:26, time: 0.843, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0407, loss_cls: 0.1850, acc: 93.1956, loss_bbox: 0.2333, loss_mask: 0.2349, loss: 0.7125 2024-05-28 20:18:47,132 - mmdet - INFO - Epoch [7][4450/7330] lr: 1.000e-04, eta: 
8:48:45, time: 0.783, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0422, loss_cls: 0.1782, acc: 93.3965, loss_bbox: 0.2327, loss_mask: 0.2374, loss: 0.7097 2024-05-28 20:19:26,095 - mmdet - INFO - Epoch [7][4500/7330] lr: 1.000e-04, eta: 8:48:04, time: 0.779, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0426, loss_cls: 0.1854, acc: 93.1255, loss_bbox: 0.2431, loss_mask: 0.2368, loss: 0.7271 2024-05-28 20:20:07,605 - mmdet - INFO - Epoch [7][4550/7330] lr: 1.000e-04, eta: 8:47:25, time: 0.830, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0436, loss_cls: 0.1849, acc: 93.1584, loss_bbox: 0.2382, loss_mask: 0.2339, loss: 0.7203 2024-05-28 20:20:46,802 - mmdet - INFO - Epoch [7][4600/7330] lr: 1.000e-04, eta: 8:46:44, time: 0.784, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0416, loss_cls: 0.1787, acc: 93.3813, loss_bbox: 0.2317, loss_mask: 0.2369, loss: 0.7083 2024-05-28 20:21:30,723 - mmdet - INFO - Epoch [7][4650/7330] lr: 1.000e-04, eta: 8:46:07, time: 0.878, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0419, loss_cls: 0.1847, acc: 93.2417, loss_bbox: 0.2375, loss_mask: 0.2382, loss: 0.7204 2024-05-28 20:22:09,979 - mmdet - INFO - Epoch [7][4700/7330] lr: 1.000e-04, eta: 8:45:26, time: 0.785, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0420, loss_cls: 0.1865, acc: 93.1641, loss_bbox: 0.2392, loss_mask: 0.2403, loss: 0.7268 2024-05-28 20:22:50,901 - mmdet - INFO - Epoch [7][4750/7330] lr: 1.000e-04, eta: 8:44:46, time: 0.818, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0400, loss_cls: 0.1816, acc: 93.3958, loss_bbox: 0.2351, loss_mask: 0.2399, loss: 0.7156 2024-05-28 20:23:29,931 - mmdet - INFO - Epoch [7][4800/7330] lr: 1.000e-04, eta: 8:44:05, time: 0.781, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0406, loss_cls: 0.1770, acc: 93.6196, loss_bbox: 0.2261, loss_mask: 0.2324, loss: 0.6936 2024-05-28 20:24:11,648 - mmdet - INFO - Epoch [7][4850/7330] lr: 1.000e-04, eta: 8:43:26, time: 0.834, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0415, loss_cls: 0.1707, acc: 93.6553, loss_bbox: 0.2246, loss_mask: 0.2351, loss: 0.6900 2024-05-28 20:24:50,947 - mmdet - INFO - Epoch [7][4900/7330] lr: 1.000e-04, eta: 8:42:46, time: 0.786, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0435, loss_cls: 0.1814, acc: 93.3096, loss_bbox: 0.2353, loss_mask: 0.2361, loss: 0.7174 2024-05-28 20:25:35,252 - mmdet - INFO - Epoch [7][4950/7330] lr: 1.000e-04, eta: 8:42:09, time: 0.886, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0419, loss_cls: 0.1767, acc: 93.5103, loss_bbox: 0.2294, loss_mask: 0.2322, loss: 0.6991 2024-05-28 20:26:13,936 - mmdet - INFO - Epoch [7][5000/7330] lr: 1.000e-04, eta: 8:41:27, time: 0.773, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0413, loss_cls: 0.1789, acc: 93.3650, loss_bbox: 0.2359, loss_mask: 0.2419, loss: 0.7178 2024-05-28 20:26:52,723 - mmdet - INFO - Epoch [7][5050/7330] lr: 1.000e-04, eta: 8:40:46, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0417, loss_cls: 0.1844, acc: 93.2554, loss_bbox: 0.2362, loss_mask: 0.2355, loss: 0.7183 2024-05-28 20:27:31,825 - mmdet - INFO - Epoch [7][5100/7330] lr: 1.000e-04, eta: 8:40:05, time: 0.782, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0410, loss_cls: 
0.1807, acc: 93.3096, loss_bbox: 0.2326, loss_mask: 0.2306, loss: 0.7044 2024-05-28 20:28:13,346 - mmdet - INFO - Epoch [7][5150/7330] lr: 1.000e-04, eta: 8:39:26, time: 0.830, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0421, loss_cls: 0.1782, acc: 93.4119, loss_bbox: 0.2284, loss_mask: 0.2365, loss: 0.7063 2024-05-28 20:28:54,284 - mmdet - INFO - Epoch [7][5200/7330] lr: 1.000e-04, eta: 8:38:47, time: 0.819, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0433, loss_cls: 0.1820, acc: 93.2683, loss_bbox: 0.2412, loss_mask: 0.2391, loss: 0.7255 2024-05-28 20:29:33,118 - mmdet - INFO - Epoch [7][5250/7330] lr: 1.000e-04, eta: 8:38:06, time: 0.776, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0429, loss_cls: 0.1831, acc: 93.1392, loss_bbox: 0.2454, loss_mask: 0.2361, loss: 0.7256 2024-05-28 20:30:17,645 - mmdet - INFO - Epoch [7][5300/7330] lr: 1.000e-04, eta: 8:37:29, time: 0.891, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0451, loss_cls: 0.1931, acc: 92.9658, loss_bbox: 0.2463, loss_mask: 0.2368, loss: 0.7421 2024-05-28 20:30:58,289 - mmdet - INFO - Epoch [7][5350/7330] lr: 1.000e-04, eta: 8:36:49, time: 0.813, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0410, loss_cls: 0.1705, acc: 93.7056, loss_bbox: 0.2190, loss_mask: 0.2288, loss: 0.6790 2024-05-28 20:31:37,032 - mmdet - INFO - Epoch [7][5400/7330] lr: 1.000e-04, eta: 8:36:08, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0402, loss_cls: 0.1685, acc: 93.7695, loss_bbox: 0.2217, loss_mask: 0.2291, loss: 0.6782 2024-05-28 20:32:20,822 - mmdet - INFO - Epoch [7][5450/7330] lr: 1.000e-04, eta: 8:35:31, time: 0.876, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0473, loss_cls: 0.1935, acc: 92.8533, loss_bbox: 0.2533, loss_mask: 0.2403, loss: 0.7546 2024-05-28 20:33:01,995 - mmdet - INFO - Epoch [7][5500/7330] lr: 1.000e-04, eta: 8:34:51, time: 0.824, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0394, loss_cls: 0.1792, acc: 93.4897, loss_bbox: 0.2265, loss_mask: 0.2295, loss: 0.6935 2024-05-28 20:33:41,097 - mmdet - INFO - Epoch [7][5550/7330] lr: 1.000e-04, eta: 8:34:10, time: 0.782, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0400, loss_cls: 0.1803, acc: 93.3953, loss_bbox: 0.2349, loss_mask: 0.2371, loss: 0.7115 2024-05-28 20:34:19,914 - mmdet - INFO - Epoch [7][5600/7330] lr: 1.000e-04, eta: 8:33:29, time: 0.776, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0408, loss_cls: 0.1805, acc: 93.3250, loss_bbox: 0.2397, loss_mask: 0.2414, loss: 0.7223 2024-05-28 20:35:02,724 - mmdet - INFO - Epoch [7][5650/7330] lr: 1.000e-04, eta: 8:32:51, time: 0.856, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0395, loss_cls: 0.1777, acc: 93.3953, loss_bbox: 0.2306, loss_mask: 0.2332, loss: 0.6993 2024-05-28 20:35:42,039 - mmdet - INFO - Epoch [7][5700/7330] lr: 1.000e-04, eta: 8:32:10, time: 0.786, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0413, loss_cls: 0.1877, acc: 93.0959, loss_bbox: 0.2384, loss_mask: 0.2359, loss: 0.7232 2024-05-28 20:36:20,567 - mmdet - INFO - Epoch [7][5750/7330] lr: 1.000e-04, eta: 8:31:29, time: 0.770, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0393, loss_cls: 0.1700, acc: 93.7285, loss_bbox: 0.2205, loss_mask: 0.2304, loss: 0.6791 2024-05-28 20:37:04,235 - mmdet - 
INFO - Epoch [7][5800/7330] lr: 1.000e-04, eta: 8:30:52, time: 0.874, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0388, loss_cls: 0.1744, acc: 93.6360, loss_bbox: 0.2267, loss_mask: 0.2297, loss: 0.6880 2024-05-28 20:37:45,658 - mmdet - INFO - Epoch [7][5850/7330] lr: 1.000e-04, eta: 8:30:12, time: 0.829, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0420, loss_cls: 0.1829, acc: 93.3098, loss_bbox: 0.2356, loss_mask: 0.2406, loss: 0.7216 2024-05-28 20:38:27,943 - mmdet - INFO - Epoch [7][5900/7330] lr: 1.000e-04, eta: 8:29:34, time: 0.846, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0443, loss_cls: 0.1859, acc: 93.1311, loss_bbox: 0.2429, loss_mask: 0.2417, loss: 0.7346 2024-05-28 20:39:06,814 - mmdet - INFO - Epoch [7][5950/7330] lr: 1.000e-04, eta: 8:28:53, time: 0.777, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0412, loss_cls: 0.1735, acc: 93.6689, loss_bbox: 0.2252, loss_mask: 0.2308, loss: 0.6884 2024-05-28 20:39:45,962 - mmdet - INFO - Epoch [7][6000/7330] lr: 1.000e-04, eta: 8:28:12, time: 0.783, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0433, loss_cls: 0.1906, acc: 92.9707, loss_bbox: 0.2421, loss_mask: 0.2349, loss: 0.7284 2024-05-28 20:40:30,429 - mmdet - INFO - Epoch [7][6050/7330] lr: 1.000e-04, eta: 8:27:35, time: 0.889, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0401, loss_cls: 0.1835, acc: 93.3340, loss_bbox: 0.2366, loss_mask: 0.2372, loss: 0.7153 2024-05-28 20:41:11,672 - mmdet - INFO - Epoch [7][6100/7330] lr: 1.000e-04, eta: 8:26:56, time: 0.825, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0428, loss_cls: 0.1834, acc: 93.2043, loss_bbox: 0.2404, loss_mask: 0.2435, loss: 0.7307 2024-05-28 20:41:50,368 - mmdet - INFO - Epoch [7][6150/7330] lr: 1.000e-04, eta: 8:26:15, time: 0.774, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0396, loss_cls: 0.1812, acc: 93.4849, loss_bbox: 0.2242, loss_mask: 0.2319, loss: 0.6961 2024-05-28 20:42:33,078 - mmdet - INFO - Epoch [7][6200/7330] lr: 1.000e-04, eta: 8:25:36, time: 0.854, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0429, loss_cls: 0.1875, acc: 92.9751, loss_bbox: 0.2432, loss_mask: 0.2400, loss: 0.7340 2024-05-28 20:43:12,101 - mmdet - INFO - Epoch [7][6250/7330] lr: 1.000e-04, eta: 8:24:55, time: 0.780, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0422, loss_cls: 0.1752, acc: 93.5586, loss_bbox: 0.2283, loss_mask: 0.2379, loss: 0.7036 2024-05-28 20:43:53,030 - mmdet - INFO - Epoch [7][6300/7330] lr: 1.000e-04, eta: 8:24:16, time: 0.819, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0392, loss_cls: 0.1802, acc: 93.2407, loss_bbox: 0.2372, loss_mask: 0.2361, loss: 0.7108 2024-05-28 20:44:31,703 - mmdet - INFO - Epoch [7][6350/7330] lr: 1.000e-04, eta: 8:23:34, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0409, loss_cls: 0.1731, acc: 93.6094, loss_bbox: 0.2259, loss_mask: 0.2334, loss: 0.6925 2024-05-28 20:45:15,257 - mmdet - INFO - Epoch [7][6400/7330] lr: 1.000e-04, eta: 8:22:57, time: 0.871, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0403, loss_cls: 0.1738, acc: 93.6221, loss_bbox: 0.2261, loss_mask: 0.2365, loss: 0.6943 2024-05-28 20:45:53,923 - mmdet - INFO - Epoch [7][6450/7330] lr: 1.000e-04, eta: 8:22:16, time: 0.773, data_time: 0.034, memory: 13582, 
loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0427, loss_cls: 0.1812, acc: 93.2893, loss_bbox: 0.2351, loss_mask: 0.2412, loss: 0.7206 2024-05-28 20:46:35,115 - mmdet - INFO - Epoch [7][6500/7330] lr: 1.000e-04, eta: 8:21:36, time: 0.824, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0430, loss_cls: 0.1877, acc: 93.2336, loss_bbox: 0.2350, loss_mask: 0.2384, loss: 0.7244 2024-05-28 20:47:14,226 - mmdet - INFO - Epoch [7][6550/7330] lr: 1.000e-04, eta: 8:20:55, time: 0.782, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0400, loss_cls: 0.1814, acc: 93.3345, loss_bbox: 0.2316, loss_mask: 0.2334, loss: 0.7054 2024-05-28 20:47:55,326 - mmdet - INFO - Epoch [7][6600/7330] lr: 1.000e-04, eta: 8:20:16, time: 0.822, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0431, loss_cls: 0.1880, acc: 93.1956, loss_bbox: 0.2394, loss_mask: 0.2400, loss: 0.7313 2024-05-28 20:48:36,721 - mmdet - INFO - Epoch [7][6650/7330] lr: 1.000e-04, eta: 8:19:37, time: 0.828, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0379, loss_cls: 0.1763, acc: 93.4294, loss_bbox: 0.2311, loss_mask: 0.2329, loss: 0.6961 2024-05-28 20:49:20,456 - mmdet - INFO - Epoch [7][6700/7330] lr: 1.000e-04, eta: 8:18:59, time: 0.875, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0420, loss_cls: 0.1875, acc: 93.1316, loss_bbox: 0.2397, loss_mask: 0.2367, loss: 0.7252 2024-05-28 20:49:59,637 - mmdet - INFO - Epoch [7][6750/7330] lr: 1.000e-04, eta: 8:18:18, time: 0.784, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0410, loss_cls: 0.1819, acc: 93.4187, loss_bbox: 0.2359, loss_mask: 0.2379, loss: 0.7162 2024-05-28 20:50:37,992 - mmdet - INFO - Epoch [7][6800/7330] lr: 1.000e-04, eta: 8:17:37, time: 0.767, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0428, loss_cls: 0.1736, acc: 93.5691, loss_bbox: 0.2271, loss_mask: 0.2357, loss: 0.6988 2024-05-28 20:51:20,567 - mmdet - INFO - Epoch [7][6850/7330] lr: 1.000e-04, eta: 8:16:58, time: 0.852, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0432, loss_cls: 0.1797, acc: 93.3845, loss_bbox: 0.2379, loss_mask: 0.2415, loss: 0.7231 2024-05-28 20:52:01,948 - mmdet - INFO - Epoch [7][6900/7330] lr: 1.000e-04, eta: 8:16:19, time: 0.828, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0441, loss_cls: 0.1897, acc: 92.8870, loss_bbox: 0.2495, loss_mask: 0.2394, loss: 0.7419 2024-05-28 20:52:40,907 - mmdet - INFO - Epoch [7][6950/7330] lr: 1.000e-04, eta: 8:15:38, time: 0.779, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0388, loss_cls: 0.1732, acc: 93.5869, loss_bbox: 0.2274, loss_mask: 0.2332, loss: 0.6911 2024-05-28 20:53:25,109 - mmdet - INFO - Epoch [7][7000/7330] lr: 1.000e-04, eta: 8:15:01, time: 0.884, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0432, loss_cls: 0.1821, acc: 93.2839, loss_bbox: 0.2358, loss_mask: 0.2385, loss: 0.7198 2024-05-28 20:54:04,265 - mmdet - INFO - Epoch [7][7050/7330] lr: 1.000e-04, eta: 8:14:20, time: 0.783, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0398, loss_cls: 0.1808, acc: 93.2488, loss_bbox: 0.2328, loss_mask: 0.2341, loss: 0.7063 2024-05-28 20:54:43,079 - mmdet - INFO - Epoch [7][7100/7330] lr: 1.000e-04, eta: 8:13:39, time: 0.776, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0408, loss_cls: 0.1765, acc: 93.6116, loss_bbox: 0.2292, loss_mask: 
0.2400, loss: 0.7049
2024-05-28 20:55:22,273 - mmdet - INFO - Epoch [7][7150/7330] lr: 1.000e-04, eta: 8:12:58, time: 0.784, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0421, loss_cls: 0.1788, acc: 93.4324, loss_bbox: 0.2297, loss_mask: 0.2370, loss: 0.7073
2024-05-28 20:56:05,124 - mmdet - INFO - Epoch [7][7200/7330] lr: 1.000e-04, eta: 8:12:20, time: 0.857, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0411, loss_cls: 0.1791, acc: 93.3455, loss_bbox: 0.2365, loss_mask: 0.2382, loss: 0.7145
2024-05-28 20:56:43,866 - mmdet - INFO - Epoch [7][7250/7330] lr: 1.000e-04, eta: 8:11:39, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0413, loss_cls: 0.1782, acc: 93.4519, loss_bbox: 0.2307, loss_mask: 0.2317, loss: 0.7017
2024-05-28 20:57:27,692 - mmdet - INFO - Epoch [7][7300/7330] lr: 1.000e-04, eta: 8:11:01, time: 0.877, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0401, loss_cls: 0.1772, acc: 93.4802, loss_bbox: 0.2288, loss_mask: 0.2333, loss: 0.6979
2024-05-28 20:57:51,677 - mmdet - INFO - Saving checkpoint at 7 epochs
2024-05-28 21:00:17,268 - mmdet - INFO - Evaluating bbox...
2024-05-28 21:00:38,922 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.446
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.685
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.483
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.260
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.483
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.623
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.566
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.566
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.566
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.360
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.615
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.740
2024-05-28 21:00:38,922 - mmdet - INFO - Evaluating segm...
2024-05-28 21:01:06,449 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.399
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.648
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.422
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.179
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.430
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.630
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.509
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.509
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.509
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.287
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.558
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.712
2024-05-28 21:01:06,760 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 21:01:06,763 - mmdet - INFO - Epoch(val) [7][625] bbox_mAP: 0.4460, bbox_mAP_50: 0.6850, bbox_mAP_75: 0.4830, bbox_mAP_s: 0.2600, bbox_mAP_m: 0.4830, bbox_mAP_l: 0.6230, bbox_mAP_copypaste: 0.446 0.685 0.483 0.260 0.483 0.623, segm_mAP: 0.3990, segm_mAP_50: 0.6480, segm_mAP_75: 0.4220, segm_mAP_s: 0.1790, segm_mAP_m: 0.4300, segm_mAP_l: 0.6300, segm_mAP_copypaste: 0.399 0.648 0.422 0.179 0.430 0.630
2024-05-28 21:01:52,547 - mmdet - INFO - Epoch [8][50/7330] lr: 1.000e-04, eta: 8:09:44, time: 0.915, data_time: 0.102, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0421, loss_cls: 0.1721, acc: 93.5659, loss_bbox: 0.2258, loss_mask: 0.2302, loss: 0.6874
2024-05-28 21:02:33,655 - mmdet - INFO - Epoch [8][100/7330] lr: 1.000e-04, eta: 8:09:04, time: 0.822, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0426, loss_cls: 0.1685, acc: 93.7900, loss_bbox: 0.2218, loss_mask: 0.2292, loss: 0.6808
2024-05-28 21:03:12,063 - mmdet - INFO - Epoch [8][150/7330] lr: 1.000e-04, eta: 8:08:23, time: 0.768, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0390, loss_cls: 0.1700, acc: 93.6326, loss_bbox: 0.2246, loss_mask: 0.2320, loss: 0.6831
2024-05-28 21:03:52,832 - mmdet - INFO - Epoch [8][200/7330] lr: 1.000e-04, eta: 8:07:43, time: 0.815, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0397, loss_cls: 0.1680, acc: 93.7510, loss_bbox: 0.2248, loss_mask: 0.2279, loss: 0.6773
2024-05-28 21:04:34,311 - mmdet - INFO - Epoch [8][250/7330] lr: 1.000e-04, eta: 8:07:04, time: 0.829, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0414, loss_cls: 0.1718, acc: 93.5935, loss_bbox: 0.2327, loss_mask: 0.2348, loss: 0.6978
2024-05-28 21:05:12,816 - mmdet - INFO - Epoch [8][300/7330] lr: 1.000e-04, eta: 8:06:23, time: 0.770, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0420, loss_cls: 0.1840, acc: 93.1252, loss_bbox: 0.2371, loss_mask: 0.2298, loss: 0.7105
2024-05-28 21:05:51,314 - mmdet - INFO - Epoch [8][350/7330] lr: 1.000e-04, eta: 8:05:41, time: 0.770, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0401, loss_cls: 0.1684, acc: 93.6367, loss_bbox: 0.2196, loss_mask: 0.2300, loss: 0.6747
2024-05-28 21:06:30,089 - mmdet - INFO - Epoch [8][400/7330] lr: 1.000e-04, eta: 8:05:00, time: 0.776, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0420, loss_cls: 0.1810, acc: 93.2244, loss_bbox: 0.2359,
loss_mask: 0.2343, loss: 0.7112 2024-05-28 21:07:08,782 - mmdet - INFO - Epoch [8][450/7330] lr: 1.000e-04, eta: 8:04:19, time: 0.774, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0407, loss_cls: 0.1718, acc: 93.6084, loss_bbox: 0.2277, loss_mask: 0.2297, loss: 0.6858 2024-05-28 21:07:47,724 - mmdet - INFO - Epoch [8][500/7330] lr: 1.000e-04, eta: 8:03:38, time: 0.779, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0438, loss_cls: 0.1758, acc: 93.4651, loss_bbox: 0.2314, loss_mask: 0.2328, loss: 0.7016 2024-05-28 21:08:26,564 - mmdet - INFO - Epoch [8][550/7330] lr: 1.000e-04, eta: 8:02:57, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0382, loss_cls: 0.1674, acc: 93.7124, loss_bbox: 0.2297, loss_mask: 0.2308, loss: 0.6825 2024-05-28 21:09:12,330 - mmdet - INFO - Epoch [8][600/7330] lr: 1.000e-04, eta: 8:02:21, time: 0.915, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0425, loss_cls: 0.1765, acc: 93.3394, loss_bbox: 0.2347, loss_mask: 0.2364, loss: 0.7088 2024-05-28 21:09:51,049 - mmdet - INFO - Epoch [8][650/7330] lr: 1.000e-04, eta: 8:01:40, time: 0.774, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0418, loss_cls: 0.1749, acc: 93.4861, loss_bbox: 0.2312, loss_mask: 0.2296, loss: 0.6965 2024-05-28 21:10:32,098 - mmdet - INFO - Epoch [8][700/7330] lr: 1.000e-04, eta: 8:01:00, time: 0.821, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0390, loss_cls: 0.1670, acc: 93.7546, loss_bbox: 0.2274, loss_mask: 0.2308, loss: 0.6808 2024-05-28 21:11:15,328 - mmdet - INFO - Epoch [8][750/7330] lr: 1.000e-04, eta: 8:00:22, time: 0.865, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0411, loss_cls: 0.1806, acc: 93.2800, loss_bbox: 0.2370, loss_mask: 0.2369, loss: 0.7137 2024-05-28 21:11:59,603 - mmdet - INFO - Epoch [8][800/7330] lr: 1.000e-04, eta: 7:59:45, time: 0.885, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0419, loss_cls: 0.1655, acc: 93.7646, loss_bbox: 0.2243, loss_mask: 0.2314, loss: 0.6801 2024-05-28 21:12:38,261 - mmdet - INFO - Epoch [8][850/7330] lr: 1.000e-04, eta: 7:59:04, time: 0.773, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0381, loss_cls: 0.1705, acc: 93.6704, loss_bbox: 0.2242, loss_mask: 0.2282, loss: 0.6780 2024-05-28 21:13:16,535 - mmdet - INFO - Epoch [8][900/7330] lr: 1.000e-04, eta: 7:58:22, time: 0.765, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0378, loss_cls: 0.1677, acc: 93.6619, loss_bbox: 0.2252, loss_mask: 0.2267, loss: 0.6732 2024-05-28 21:13:55,306 - mmdet - INFO - Epoch [8][950/7330] lr: 1.000e-04, eta: 7:57:41, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0405, loss_cls: 0.1724, acc: 93.5640, loss_bbox: 0.2297, loss_mask: 0.2326, loss: 0.6933 2024-05-28 21:14:33,720 - mmdet - INFO - Epoch [8][1000/7330] lr: 1.000e-04, eta: 7:57:00, time: 0.768, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0392, loss_cls: 0.1657, acc: 93.8418, loss_bbox: 0.2213, loss_mask: 0.2298, loss: 0.6741 2024-05-28 21:15:12,755 - mmdet - INFO - Epoch [8][1050/7330] lr: 1.000e-04, eta: 7:56:19, time: 0.781, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0395, loss_cls: 0.1653, acc: 93.9158, loss_bbox: 0.2158, loss_mask: 0.2272, loss: 0.6641 2024-05-28 21:15:51,588 - mmdet - INFO - Epoch [8][1100/7330] lr: 1.000e-04, eta: 
7:55:38, time: 0.777, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0410, loss_cls: 0.1718, acc: 93.6392, loss_bbox: 0.2282, loss_mask: 0.2315, loss: 0.6893 2024-05-28 21:16:30,469 - mmdet - INFO - Epoch [8][1150/7330] lr: 1.000e-04, eta: 7:54:57, time: 0.778, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0426, loss_cls: 0.1765, acc: 93.4785, loss_bbox: 0.2318, loss_mask: 0.2296, loss: 0.6983 2024-05-28 21:17:13,674 - mmdet - INFO - Epoch [8][1200/7330] lr: 1.000e-04, eta: 7:54:19, time: 0.864, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0398, loss_cls: 0.1685, acc: 93.6697, loss_bbox: 0.2306, loss_mask: 0.2320, loss: 0.6878 2024-05-28 21:18:02,477 - mmdet - INFO - Epoch [8][1250/7330] lr: 1.000e-04, eta: 7:53:45, time: 0.976, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0444, loss_cls: 0.1827, acc: 93.1631, loss_bbox: 0.2404, loss_mask: 0.2385, loss: 0.7235 2024-05-28 21:18:43,425 - mmdet - INFO - Epoch [8][1300/7330] lr: 1.000e-04, eta: 7:53:05, time: 0.819, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0401, loss_cls: 0.1738, acc: 93.6069, loss_bbox: 0.2250, loss_mask: 0.2302, loss: 0.6859 2024-05-28 21:19:25,372 - mmdet - INFO - Epoch [8][1350/7330] lr: 1.000e-04, eta: 7:52:26, time: 0.839, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0395, loss_cls: 0.1716, acc: 93.5088, loss_bbox: 0.2285, loss_mask: 0.2306, loss: 0.6893 2024-05-28 21:20:04,607 - mmdet - INFO - Epoch [8][1400/7330] lr: 1.000e-04, eta: 7:51:45, time: 0.784, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0390, loss_cls: 0.1766, acc: 93.3145, loss_bbox: 0.2295, loss_mask: 0.2352, loss: 0.6966 2024-05-28 21:20:45,652 - mmdet - INFO - Epoch [8][1450/7330] lr: 1.000e-04, eta: 7:51:06, time: 0.821, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0417, loss_cls: 0.1735, acc: 93.5483, loss_bbox: 0.2286, loss_mask: 0.2372, loss: 0.6981 2024-05-28 21:21:24,226 - mmdet - INFO - Epoch [8][1500/7330] lr: 1.000e-04, eta: 7:50:25, time: 0.771, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0425, loss_cls: 0.1792, acc: 93.3069, loss_bbox: 0.2366, loss_mask: 0.2361, loss: 0.7132 2024-05-28 21:22:03,144 - mmdet - INFO - Epoch [8][1550/7330] lr: 1.000e-04, eta: 7:49:44, time: 0.778, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0378, loss_cls: 0.1601, acc: 93.9917, loss_bbox: 0.2183, loss_mask: 0.2233, loss: 0.6556 2024-05-28 21:22:41,726 - mmdet - INFO - Epoch [8][1600/7330] lr: 1.000e-04, eta: 7:49:02, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0421, loss_cls: 0.1729, acc: 93.5591, loss_bbox: 0.2283, loss_mask: 0.2345, loss: 0.6958 2024-05-28 21:23:20,944 - mmdet - INFO - Epoch [8][1650/7330] lr: 1.000e-04, eta: 7:48:22, time: 0.784, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0414, loss_cls: 0.1654, acc: 93.8186, loss_bbox: 0.2242, loss_mask: 0.2309, loss: 0.6807 2024-05-28 21:23:59,681 - mmdet - INFO - Epoch [8][1700/7330] lr: 1.000e-04, eta: 7:47:41, time: 0.775, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0424, loss_cls: 0.1679, acc: 93.7300, loss_bbox: 0.2227, loss_mask: 0.2305, loss: 0.6819 2024-05-28 21:24:40,232 - mmdet - INFO - Epoch [8][1750/7330] lr: 1.000e-04, eta: 7:47:01, time: 0.810, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0366, loss_cls: 
0.1618, acc: 94.0503, loss_bbox: 0.2168, loss_mask: 0.2230, loss: 0.6540 2024-05-28 21:25:21,318 - mmdet - INFO - Epoch [8][1800/7330] lr: 1.000e-04, eta: 7:46:21, time: 0.822, data_time: 0.024, memory: 13582, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0398, loss_cls: 0.1742, acc: 93.4236, loss_bbox: 0.2326, loss_mask: 0.2296, loss: 0.6938 2024-05-28 21:26:06,786 - mmdet - INFO - Epoch [8][1850/7330] lr: 1.000e-04, eta: 7:45:45, time: 0.909, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0425, loss_cls: 0.1778, acc: 93.4272, loss_bbox: 0.2312, loss_mask: 0.2358, loss: 0.7062 2024-05-28 21:26:47,919 - mmdet - INFO - Epoch [8][1900/7330] lr: 1.000e-04, eta: 7:45:05, time: 0.823, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0413, loss_cls: 0.1762, acc: 93.3572, loss_bbox: 0.2308, loss_mask: 0.2381, loss: 0.7050 2024-05-28 21:27:30,147 - mmdet - INFO - Epoch [8][1950/7330] lr: 1.000e-04, eta: 7:44:26, time: 0.845, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0397, loss_cls: 0.1681, acc: 93.7168, loss_bbox: 0.2197, loss_mask: 0.2304, loss: 0.6737 2024-05-28 21:28:09,008 - mmdet - INFO - Epoch [8][2000/7330] lr: 1.000e-04, eta: 7:43:45, time: 0.777, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0400, loss_cls: 0.1738, acc: 93.5669, loss_bbox: 0.2271, loss_mask: 0.2362, loss: 0.6954 2024-05-28 21:28:49,729 - mmdet - INFO - Epoch [8][2050/7330] lr: 1.000e-04, eta: 7:43:05, time: 0.814, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0429, loss_cls: 0.1795, acc: 93.3311, loss_bbox: 0.2349, loss_mask: 0.2337, loss: 0.7104 2024-05-28 21:29:28,575 - mmdet - INFO - Epoch [8][2100/7330] lr: 1.000e-04, eta: 7:42:24, time: 0.777, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0407, loss_cls: 0.1690, acc: 93.7014, loss_bbox: 0.2233, loss_mask: 0.2289, loss: 0.6783 2024-05-28 21:30:07,290 - mmdet - INFO - Epoch [8][2150/7330] lr: 1.000e-04, eta: 7:41:43, time: 0.774, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0399, loss_cls: 0.1820, acc: 93.2134, loss_bbox: 0.2367, loss_mask: 0.2329, loss: 0.7084 2024-05-28 21:30:46,104 - mmdet - INFO - Epoch [8][2200/7330] lr: 1.000e-04, eta: 7:41:02, time: 0.776, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0399, loss_cls: 0.1733, acc: 93.5774, loss_bbox: 0.2303, loss_mask: 0.2366, loss: 0.6978 2024-05-28 21:31:24,780 - mmdet - INFO - Epoch [8][2250/7330] lr: 1.000e-04, eta: 7:40:21, time: 0.774, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0388, loss_cls: 0.1729, acc: 93.6316, loss_bbox: 0.2253, loss_mask: 0.2308, loss: 0.6844 2024-05-28 21:32:05,543 - mmdet - INFO - Epoch [8][2300/7330] lr: 1.000e-04, eta: 7:39:41, time: 0.815, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0396, loss_cls: 0.1745, acc: 93.5464, loss_bbox: 0.2282, loss_mask: 0.2319, loss: 0.6914 2024-05-28 21:32:46,200 - mmdet - INFO - Epoch [8][2350/7330] lr: 1.000e-04, eta: 7:39:02, time: 0.813, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0380, loss_cls: 0.1621, acc: 93.9678, loss_bbox: 0.2081, loss_mask: 0.2212, loss: 0.6458 2024-05-28 21:33:24,845 - mmdet - INFO - Epoch [8][2400/7330] lr: 1.000e-04, eta: 7:38:21, time: 0.773, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0380, loss_cls: 0.1618, acc: 93.9111, loss_bbox: 0.2196, loss_mask: 0.2268, loss: 0.6615 2024-05-28 21:34:12,617 - mmdet - 
INFO - Epoch [8][2450/7330] lr: 1.000e-04, eta: 7:37:45, time: 0.955, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0379, loss_cls: 0.1703, acc: 93.6953, loss_bbox: 0.2207, loss_mask: 0.2272, loss: 0.6733 2024-05-28 21:34:54,382 - mmdet - INFO - Epoch [8][2500/7330] lr: 1.000e-04, eta: 7:37:06, time: 0.835, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0415, loss_cls: 0.1785, acc: 93.4275, loss_bbox: 0.2286, loss_mask: 0.2332, loss: 0.7000 2024-05-28 21:35:33,371 - mmdet - INFO - Epoch [8][2550/7330] lr: 1.000e-04, eta: 7:36:25, time: 0.780, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0381, loss_cls: 0.1706, acc: 93.7593, loss_bbox: 0.2242, loss_mask: 0.2288, loss: 0.6798 2024-05-28 21:36:12,060 - mmdet - INFO - Epoch [8][2600/7330] lr: 1.000e-04, eta: 7:35:44, time: 0.774, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0400, loss_cls: 0.1701, acc: 93.6187, loss_bbox: 0.2308, loss_mask: 0.2262, loss: 0.6853 2024-05-28 21:36:53,510 - mmdet - INFO - Epoch [8][2650/7330] lr: 1.000e-04, eta: 7:35:05, time: 0.829, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0407, loss_cls: 0.1725, acc: 93.6331, loss_bbox: 0.2253, loss_mask: 0.2295, loss: 0.6874 2024-05-28 21:37:32,050 - mmdet - INFO - Epoch [8][2700/7330] lr: 1.000e-04, eta: 7:34:24, time: 0.771, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0405, loss_cls: 0.1710, acc: 93.5537, loss_bbox: 0.2309, loss_mask: 0.2349, loss: 0.6937 2024-05-28 21:38:11,067 - mmdet - INFO - Epoch [8][2750/7330] lr: 1.000e-04, eta: 7:33:43, time: 0.781, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0390, loss_cls: 0.1745, acc: 93.4961, loss_bbox: 0.2230, loss_mask: 0.2261, loss: 0.6814 2024-05-28 21:38:52,190 - mmdet - INFO - Epoch [8][2800/7330] lr: 1.000e-04, eta: 7:33:03, time: 0.822, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0413, loss_cls: 0.1752, acc: 93.4941, loss_bbox: 0.2286, loss_mask: 0.2309, loss: 0.6935 2024-05-28 21:39:30,978 - mmdet - INFO - Epoch [8][2850/7330] lr: 1.000e-04, eta: 7:32:22, time: 0.776, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0412, loss_cls: 0.1725, acc: 93.5564, loss_bbox: 0.2251, loss_mask: 0.2298, loss: 0.6864 2024-05-28 21:40:11,811 - mmdet - INFO - Epoch [8][2900/7330] lr: 1.000e-04, eta: 7:31:42, time: 0.817, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0400, loss_cls: 0.1714, acc: 93.4905, loss_bbox: 0.2296, loss_mask: 0.2302, loss: 0.6887 2024-05-28 21:40:52,780 - mmdet - INFO - Epoch [8][2950/7330] lr: 1.000e-04, eta: 7:31:03, time: 0.819, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0417, loss_cls: 0.1746, acc: 93.4805, loss_bbox: 0.2304, loss_mask: 0.2305, loss: 0.6943 2024-05-28 21:41:33,795 - mmdet - INFO - Epoch [8][3000/7330] lr: 1.000e-04, eta: 7:30:23, time: 0.820, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0397, loss_cls: 0.1667, acc: 93.6846, loss_bbox: 0.2248, loss_mask: 0.2327, loss: 0.6805 2024-05-28 21:42:16,794 - mmdet - INFO - Epoch [8][3050/7330] lr: 1.000e-04, eta: 7:29:45, time: 0.860, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0411, loss_cls: 0.1745, acc: 93.6172, loss_bbox: 0.2281, loss_mask: 0.2285, loss: 0.6897 2024-05-28 21:42:59,479 - mmdet - INFO - Epoch [8][3100/7330] lr: 1.000e-04, eta: 7:29:06, time: 0.854, data_time: 0.029, memory: 13582, 
loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0396, loss_cls: 0.1772, acc: 93.5120, loss_bbox: 0.2285, loss_mask: 0.2297, loss: 0.6919 2024-05-28 21:43:38,642 - mmdet - INFO - Epoch [8][3150/7330] lr: 1.000e-04, eta: 7:28:25, time: 0.783, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0431, loss_cls: 0.1777, acc: 93.3806, loss_bbox: 0.2366, loss_mask: 0.2335, loss: 0.7093 2024-05-28 21:44:17,554 - mmdet - INFO - Epoch [8][3200/7330] lr: 1.000e-04, eta: 7:27:45, time: 0.778, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0401, loss_cls: 0.1772, acc: 93.3770, loss_bbox: 0.2350, loss_mask: 0.2345, loss: 0.7055 2024-05-28 21:44:58,334 - mmdet - INFO - Epoch [8][3250/7330] lr: 1.000e-04, eta: 7:27:05, time: 0.816, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0392, loss_cls: 0.1774, acc: 93.5283, loss_bbox: 0.2288, loss_mask: 0.2323, loss: 0.6955 2024-05-28 21:45:37,219 - mmdet - INFO - Epoch [8][3300/7330] lr: 1.000e-04, eta: 7:26:24, time: 0.778, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0398, loss_cls: 0.1751, acc: 93.4934, loss_bbox: 0.2252, loss_mask: 0.2277, loss: 0.6868 2024-05-28 21:46:15,723 - mmdet - INFO - Epoch [8][3350/7330] lr: 1.000e-04, eta: 7:25:43, time: 0.770, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0416, loss_cls: 0.1750, acc: 93.4211, loss_bbox: 0.2315, loss_mask: 0.2293, loss: 0.6966 2024-05-28 21:46:56,811 - mmdet - INFO - Epoch [8][3400/7330] lr: 1.000e-04, eta: 7:25:03, time: 0.822, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0432, loss_cls: 0.1734, acc: 93.5305, loss_bbox: 0.2263, loss_mask: 0.2347, loss: 0.6970 2024-05-28 21:47:35,711 - mmdet - INFO - Epoch [8][3450/7330] lr: 1.000e-04, eta: 7:24:22, time: 0.778, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0406, loss_cls: 0.1743, acc: 93.4402, loss_bbox: 0.2300, loss_mask: 0.2328, loss: 0.6953 2024-05-28 21:48:19,047 - mmdet - INFO - Epoch [8][3500/7330] lr: 1.000e-04, eta: 7:23:44, time: 0.867, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0391, loss_cls: 0.1753, acc: 93.5938, loss_bbox: 0.2221, loss_mask: 0.2299, loss: 0.6848 2024-05-28 21:49:00,335 - mmdet - INFO - Epoch [8][3550/7330] lr: 1.000e-04, eta: 7:23:04, time: 0.826, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0411, loss_cls: 0.1718, acc: 93.6335, loss_bbox: 0.2223, loss_mask: 0.2305, loss: 0.6847 2024-05-28 21:49:41,478 - mmdet - INFO - Epoch [8][3600/7330] lr: 1.000e-04, eta: 7:22:25, time: 0.823, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0414, loss_cls: 0.1798, acc: 93.3179, loss_bbox: 0.2337, loss_mask: 0.2329, loss: 0.7053 2024-05-28 21:50:24,522 - mmdet - INFO - Epoch [8][3650/7330] lr: 1.000e-04, eta: 7:21:46, time: 0.861, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0410, loss_cls: 0.1737, acc: 93.5525, loss_bbox: 0.2288, loss_mask: 0.2334, loss: 0.6948 2024-05-28 21:51:03,655 - mmdet - INFO - Epoch [8][3700/7330] lr: 1.000e-04, eta: 7:21:06, time: 0.783, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0407, loss_cls: 0.1752, acc: 93.4424, loss_bbox: 0.2290, loss_mask: 0.2382, loss: 0.7002 2024-05-28 21:51:42,556 - mmdet - INFO - Epoch [8][3750/7330] lr: 1.000e-04, eta: 7:20:25, time: 0.778, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0401, loss_cls: 0.1696, acc: 93.7078, loss_bbox: 0.2225, loss_mask: 
0.2285, loss: 0.6785 2024-05-28 21:52:21,212 - mmdet - INFO - Epoch [8][3800/7330] lr: 1.000e-04, eta: 7:19:44, time: 0.773, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0424, loss_cls: 0.1821, acc: 93.3357, loss_bbox: 0.2347, loss_mask: 0.2354, loss: 0.7125 2024-05-28 21:53:00,220 - mmdet - INFO - Epoch [8][3850/7330] lr: 1.000e-04, eta: 7:19:03, time: 0.780, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0407, loss_cls: 0.1701, acc: 93.7639, loss_bbox: 0.2235, loss_mask: 0.2294, loss: 0.6812 2024-05-28 21:53:40,996 - mmdet - INFO - Epoch [8][3900/7330] lr: 1.000e-04, eta: 7:18:23, time: 0.815, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0387, loss_cls: 0.1628, acc: 94.0422, loss_bbox: 0.2146, loss_mask: 0.2271, loss: 0.6589 2024-05-28 21:54:19,750 - mmdet - INFO - Epoch [8][3950/7330] lr: 1.000e-04, eta: 7:17:42, time: 0.775, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0388, loss_cls: 0.1737, acc: 93.5593, loss_bbox: 0.2264, loss_mask: 0.2329, loss: 0.6885 2024-05-28 21:54:58,363 - mmdet - INFO - Epoch [8][4000/7330] lr: 1.000e-04, eta: 7:17:01, time: 0.772, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0390, loss_cls: 0.1713, acc: 93.5574, loss_bbox: 0.2287, loss_mask: 0.2314, loss: 0.6886 2024-05-28 21:55:39,597 - mmdet - INFO - Epoch [8][4050/7330] lr: 1.000e-04, eta: 7:16:21, time: 0.825, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0420, loss_cls: 0.1705, acc: 93.6357, loss_bbox: 0.2235, loss_mask: 0.2286, loss: 0.6827 2024-05-28 21:56:27,299 - mmdet - INFO - Epoch [8][4100/7330] lr: 1.000e-04, eta: 7:15:46, time: 0.954, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0396, loss_cls: 0.1641, acc: 93.8271, loss_bbox: 0.2209, loss_mask: 0.2330, loss: 0.6745 2024-05-28 21:57:08,473 - mmdet - INFO - Epoch [8][4150/7330] lr: 1.000e-04, eta: 7:15:06, time: 0.823, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0376, loss_cls: 0.1691, acc: 93.7842, loss_bbox: 0.2234, loss_mask: 0.2290, loss: 0.6764 2024-05-28 21:57:47,059 - mmdet - INFO - Epoch [8][4200/7330] lr: 1.000e-04, eta: 7:14:25, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0403, loss_cls: 0.1734, acc: 93.5830, loss_bbox: 0.2274, loss_mask: 0.2314, loss: 0.6923 2024-05-28 21:58:25,829 - mmdet - INFO - Epoch [8][4250/7330] lr: 1.000e-04, eta: 7:13:44, time: 0.776, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0408, loss_cls: 0.1693, acc: 93.6343, loss_bbox: 0.2242, loss_mask: 0.2314, loss: 0.6840 2024-05-28 21:59:06,475 - mmdet - INFO - Epoch [8][4300/7330] lr: 1.000e-04, eta: 7:13:04, time: 0.813, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0395, loss_cls: 0.1691, acc: 93.6897, loss_bbox: 0.2230, loss_mask: 0.2355, loss: 0.6848 2024-05-28 21:59:45,383 - mmdet - INFO - Epoch [8][4350/7330] lr: 1.000e-04, eta: 7:12:23, time: 0.778, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0418, loss_cls: 0.1796, acc: 93.3838, loss_bbox: 0.2371, loss_mask: 0.2393, loss: 0.7162 2024-05-28 22:00:24,270 - mmdet - INFO - Epoch [8][4400/7330] lr: 1.000e-04, eta: 7:11:42, time: 0.778, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0414, loss_cls: 0.1769, acc: 93.4006, loss_bbox: 0.2335, loss_mask: 0.2385, loss: 0.7105 2024-05-28 22:01:03,183 - mmdet - INFO - Epoch [8][4450/7330] lr: 1.000e-04, eta: 
7:11:02, time: 0.778, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0406, loss_cls: 0.1702, acc: 93.7849, loss_bbox: 0.2198, loss_mask: 0.2267, loss: 0.6748 2024-05-28 22:01:44,448 - mmdet - INFO - Epoch [8][4500/7330] lr: 1.000e-04, eta: 7:10:22, time: 0.825, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0399, loss_cls: 0.1706, acc: 93.6179, loss_bbox: 0.2252, loss_mask: 0.2298, loss: 0.6834 2024-05-28 22:02:23,303 - mmdet - INFO - Epoch [8][4550/7330] lr: 1.000e-04, eta: 7:09:41, time: 0.777, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0402, loss_cls: 0.1757, acc: 93.4678, loss_bbox: 0.2298, loss_mask: 0.2296, loss: 0.6949 2024-05-28 22:03:02,303 - mmdet - INFO - Epoch [8][4600/7330] lr: 1.000e-04, eta: 7:09:00, time: 0.781, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0392, loss_cls: 0.1653, acc: 93.8994, loss_bbox: 0.2171, loss_mask: 0.2250, loss: 0.6636 2024-05-28 22:03:47,923 - mmdet - INFO - Epoch [8][4650/7330] lr: 1.000e-04, eta: 7:08:23, time: 0.912, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0409, loss_cls: 0.1803, acc: 93.2952, loss_bbox: 0.2352, loss_mask: 0.2350, loss: 0.7098 2024-05-28 22:04:31,498 - mmdet - INFO - Epoch [8][4700/7330] lr: 1.000e-04, eta: 7:07:45, time: 0.871, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0404, loss_cls: 0.1742, acc: 93.6101, loss_bbox: 0.2237, loss_mask: 0.2298, loss: 0.6875 2024-05-28 22:05:12,422 - mmdet - INFO - Epoch [8][4750/7330] lr: 1.000e-04, eta: 7:07:05, time: 0.818, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0407, loss_cls: 0.1725, acc: 93.5967, loss_bbox: 0.2278, loss_mask: 0.2342, loss: 0.6929 2024-05-28 22:05:51,352 - mmdet - INFO - Epoch [8][4800/7330] lr: 1.000e-04, eta: 7:06:24, time: 0.779, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0417, loss_cls: 0.1750, acc: 93.4729, loss_bbox: 0.2295, loss_mask: 0.2317, loss: 0.6961 2024-05-28 22:06:30,090 - mmdet - INFO - Epoch [8][4850/7330] lr: 1.000e-04, eta: 7:05:44, time: 0.775, data_time: 0.024, memory: 13582, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0382, loss_cls: 0.1625, acc: 94.0178, loss_bbox: 0.2087, loss_mask: 0.2244, loss: 0.6496 2024-05-28 22:07:11,198 - mmdet - INFO - Epoch [8][4900/7330] lr: 1.000e-04, eta: 7:05:04, time: 0.822, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0427, loss_cls: 0.1798, acc: 93.2500, loss_bbox: 0.2354, loss_mask: 0.2333, loss: 0.7100 2024-05-28 22:07:50,206 - mmdet - INFO - Epoch [8][4950/7330] lr: 1.000e-04, eta: 7:04:23, time: 0.780, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0419, loss_cls: 0.1728, acc: 93.5859, loss_bbox: 0.2278, loss_mask: 0.2310, loss: 0.6903 2024-05-28 22:08:29,010 - mmdet - INFO - Epoch [8][5000/7330] lr: 1.000e-04, eta: 7:03:42, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0410, loss_cls: 0.1721, acc: 93.6101, loss_bbox: 0.2211, loss_mask: 0.2321, loss: 0.6848 2024-05-28 22:09:07,785 - mmdet - INFO - Epoch [8][5050/7330] lr: 1.000e-04, eta: 7:03:01, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0403, loss_cls: 0.1690, acc: 93.6658, loss_bbox: 0.2268, loss_mask: 0.2330, loss: 0.6886 2024-05-28 22:09:48,626 - mmdet - INFO - Epoch [8][5100/7330] lr: 1.000e-04, eta: 7:02:21, time: 0.817, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0411, loss_cls: 
0.1733, acc: 93.6116, loss_bbox: 0.2270, loss_mask: 0.2362, loss: 0.6948 2024-05-28 22:10:29,422 - mmdet - INFO - Epoch [8][5150/7330] lr: 1.000e-04, eta: 7:01:42, time: 0.816, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0416, loss_cls: 0.1724, acc: 93.5652, loss_bbox: 0.2276, loss_mask: 0.2331, loss: 0.6931 2024-05-28 22:11:07,834 - mmdet - INFO - Epoch [8][5200/7330] lr: 1.000e-04, eta: 7:01:00, time: 0.768, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0373, loss_cls: 0.1656, acc: 93.8342, loss_bbox: 0.2223, loss_mask: 0.2303, loss: 0.6707 2024-05-28 22:11:51,169 - mmdet - INFO - Epoch [8][5250/7330] lr: 1.000e-04, eta: 7:00:22, time: 0.867, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0386, loss_cls: 0.1675, acc: 93.7476, loss_bbox: 0.2199, loss_mask: 0.2288, loss: 0.6723 2024-05-28 22:12:34,974 - mmdet - INFO - Epoch [8][5300/7330] lr: 1.000e-04, eta: 6:59:44, time: 0.876, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0409, loss_cls: 0.1736, acc: 93.4968, loss_bbox: 0.2283, loss_mask: 0.2303, loss: 0.6928 2024-05-28 22:13:15,913 - mmdet - INFO - Epoch [8][5350/7330] lr: 1.000e-04, eta: 6:59:04, time: 0.819, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0416, loss_cls: 0.1831, acc: 93.1880, loss_bbox: 0.2368, loss_mask: 0.2356, loss: 0.7163 2024-05-28 22:13:54,861 - mmdet - INFO - Epoch [8][5400/7330] lr: 1.000e-04, eta: 6:58:23, time: 0.779, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0410, loss_cls: 0.1730, acc: 93.4788, loss_bbox: 0.2324, loss_mask: 0.2279, loss: 0.6924 2024-05-28 22:14:33,417 - mmdet - INFO - Epoch [8][5450/7330] lr: 1.000e-04, eta: 6:57:42, time: 0.771, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0377, loss_cls: 0.1658, acc: 93.9602, loss_bbox: 0.2137, loss_mask: 0.2216, loss: 0.6557 2024-05-28 22:15:14,663 - mmdet - INFO - Epoch [8][5500/7330] lr: 1.000e-04, eta: 6:57:03, time: 0.825, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0426, loss_cls: 0.1766, acc: 93.4104, loss_bbox: 0.2295, loss_mask: 0.2364, loss: 0.7033 2024-05-28 22:15:53,286 - mmdet - INFO - Epoch [8][5550/7330] lr: 1.000e-04, eta: 6:56:22, time: 0.772, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0402, loss_cls: 0.1710, acc: 93.6555, loss_bbox: 0.2246, loss_mask: 0.2330, loss: 0.6863 2024-05-28 22:16:31,707 - mmdet - INFO - Epoch [8][5600/7330] lr: 1.000e-04, eta: 6:55:41, time: 0.768, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0389, loss_cls: 0.1662, acc: 93.8762, loss_bbox: 0.2199, loss_mask: 0.2290, loss: 0.6720 2024-05-28 22:17:10,424 - mmdet - INFO - Epoch [8][5650/7330] lr: 1.000e-04, eta: 6:55:00, time: 0.774, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0421, loss_cls: 0.1796, acc: 93.3882, loss_bbox: 0.2305, loss_mask: 0.2313, loss: 0.7024 2024-05-28 22:17:53,192 - mmdet - INFO - Epoch [8][5700/7330] lr: 1.000e-04, eta: 6:54:21, time: 0.855, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0392, loss_cls: 0.1781, acc: 93.3325, loss_bbox: 0.2297, loss_mask: 0.2359, loss: 0.7011 2024-05-28 22:18:32,011 - mmdet - INFO - Epoch [8][5750/7330] lr: 1.000e-04, eta: 6:53:40, time: 0.776, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0422, loss_cls: 0.1769, acc: 93.4685, loss_bbox: 0.2286, loss_mask: 0.2300, loss: 0.6977 2024-05-28 22:19:12,446 - mmdet - 
INFO - Epoch [8][5800/7330] lr: 1.000e-04, eta: 6:53:00, time: 0.809, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0402, loss_cls: 0.1735, acc: 93.6477, loss_bbox: 0.2216, loss_mask: 0.2273, loss: 0.6802 2024-05-28 22:19:56,637 - mmdet - INFO - Epoch [8][5850/7330] lr: 1.000e-04, eta: 6:52:22, time: 0.883, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0410, loss_cls: 0.1767, acc: 93.4031, loss_bbox: 0.2332, loss_mask: 0.2351, loss: 0.7049 2024-05-28 22:20:37,797 - mmdet - INFO - Epoch [8][5900/7330] lr: 1.000e-04, eta: 6:51:42, time: 0.824, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0408, loss_cls: 0.1749, acc: 93.4861, loss_bbox: 0.2354, loss_mask: 0.2350, loss: 0.7035 2024-05-28 22:21:18,589 - mmdet - INFO - Epoch [8][5950/7330] lr: 1.000e-04, eta: 6:51:03, time: 0.815, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0388, loss_cls: 0.1693, acc: 93.7842, loss_bbox: 0.2173, loss_mask: 0.2320, loss: 0.6749 2024-05-28 22:21:57,118 - mmdet - INFO - Epoch [8][6000/7330] lr: 1.000e-04, eta: 6:50:21, time: 0.771, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0404, loss_cls: 0.1751, acc: 93.4441, loss_bbox: 0.2297, loss_mask: 0.2369, loss: 0.7002 2024-05-28 22:22:36,120 - mmdet - INFO - Epoch [8][6050/7330] lr: 1.000e-04, eta: 6:49:41, time: 0.780, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0409, loss_cls: 0.1747, acc: 93.5386, loss_bbox: 0.2315, loss_mask: 0.2337, loss: 0.6991 2024-05-28 22:23:17,366 - mmdet - INFO - Epoch [8][6100/7330] lr: 1.000e-04, eta: 6:49:01, time: 0.825, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0420, loss_cls: 0.1774, acc: 93.3215, loss_bbox: 0.2341, loss_mask: 0.2327, loss: 0.7064 2024-05-28 22:23:56,263 - mmdet - INFO - Epoch [8][6150/7330] lr: 1.000e-04, eta: 6:48:20, time: 0.778, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0412, loss_cls: 0.1777, acc: 93.3523, loss_bbox: 0.2346, loss_mask: 0.2329, loss: 0.7050 2024-05-28 22:24:37,436 - mmdet - INFO - Epoch [8][6200/7330] lr: 1.000e-04, eta: 6:47:41, time: 0.823, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0413, loss_cls: 0.1728, acc: 93.5940, loss_bbox: 0.2273, loss_mask: 0.2321, loss: 0.6920 2024-05-28 22:25:16,328 - mmdet - INFO - Epoch [8][6250/7330] lr: 1.000e-04, eta: 6:47:00, time: 0.778, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0402, loss_cls: 0.1690, acc: 93.6804, loss_bbox: 0.2232, loss_mask: 0.2310, loss: 0.6813 2024-05-28 22:25:55,129 - mmdet - INFO - Epoch [8][6300/7330] lr: 1.000e-04, eta: 6:46:19, time: 0.776, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0400, loss_cls: 0.1702, acc: 93.6611, loss_bbox: 0.2271, loss_mask: 0.2273, loss: 0.6822 2024-05-28 22:26:36,782 - mmdet - INFO - Epoch [8][6350/7330] lr: 1.000e-04, eta: 6:45:40, time: 0.833, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0412, loss_cls: 0.1779, acc: 93.4009, loss_bbox: 0.2327, loss_mask: 0.2346, loss: 0.7044 2024-05-28 22:27:17,761 - mmdet - INFO - Epoch [8][6400/7330] lr: 1.000e-04, eta: 6:45:00, time: 0.820, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0431, loss_cls: 0.1777, acc: 93.3184, loss_bbox: 0.2314, loss_mask: 0.2389, loss: 0.7102 2024-05-28 22:28:00,731 - mmdet - INFO - Epoch [8][6450/7330] lr: 1.000e-04, eta: 6:44:21, time: 0.859, data_time: 0.041, memory: 13582, 
loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0412, loss_cls: 0.1757, acc: 93.5005, loss_bbox: 0.2270, loss_mask: 0.2310, loss: 0.6929 2024-05-28 22:28:39,475 - mmdet - INFO - Epoch [8][6500/7330] lr: 1.000e-04, eta: 6:43:40, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0414, loss_cls: 0.1769, acc: 93.3342, loss_bbox: 0.2341, loss_mask: 0.2331, loss: 0.7045 2024-05-28 22:29:20,583 - mmdet - INFO - Epoch [8][6550/7330] lr: 1.000e-04, eta: 6:43:01, time: 0.822, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0403, loss_cls: 0.1805, acc: 93.3833, loss_bbox: 0.2346, loss_mask: 0.2326, loss: 0.7061 2024-05-28 22:29:59,572 - mmdet - INFO - Epoch [8][6600/7330] lr: 1.000e-04, eta: 6:42:20, time: 0.780, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0408, loss_cls: 0.1834, acc: 93.2722, loss_bbox: 0.2320, loss_mask: 0.2311, loss: 0.7048 2024-05-28 22:30:41,631 - mmdet - INFO - Epoch [8][6650/7330] lr: 1.000e-04, eta: 6:41:41, time: 0.841, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0401, loss_cls: 0.1758, acc: 93.5007, loss_bbox: 0.2275, loss_mask: 0.2312, loss: 0.6936 2024-05-28 22:31:20,574 - mmdet - INFO - Epoch [8][6700/7330] lr: 1.000e-04, eta: 6:41:00, time: 0.779, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0407, loss_cls: 0.1757, acc: 93.4663, loss_bbox: 0.2300, loss_mask: 0.2360, loss: 0.7013 2024-05-28 22:31:59,448 - mmdet - INFO - Epoch [8][6750/7330] lr: 1.000e-04, eta: 6:40:19, time: 0.778, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0403, loss_cls: 0.1787, acc: 93.3862, loss_bbox: 0.2321, loss_mask: 0.2318, loss: 0.7002 2024-05-28 22:32:40,611 - mmdet - INFO - Epoch [8][6800/7330] lr: 1.000e-04, eta: 6:39:39, time: 0.823, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0418, loss_cls: 0.1777, acc: 93.4482, loss_bbox: 0.2280, loss_mask: 0.2280, loss: 0.6960 2024-05-28 22:33:19,174 - mmdet - INFO - Epoch [8][6850/7330] lr: 1.000e-04, eta: 6:38:58, time: 0.771, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0410, loss_cls: 0.1694, acc: 93.7695, loss_bbox: 0.2215, loss_mask: 0.2311, loss: 0.6812 2024-05-28 22:33:58,224 - mmdet - INFO - Epoch [8][6900/7330] lr: 1.000e-04, eta: 6:38:18, time: 0.781, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0434, loss_cls: 0.1758, acc: 93.4895, loss_bbox: 0.2308, loss_mask: 0.2361, loss: 0.7048 2024-05-28 22:34:44,263 - mmdet - INFO - Epoch [8][6950/7330] lr: 1.000e-04, eta: 6:37:40, time: 0.921, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0400, loss_cls: 0.1708, acc: 93.6726, loss_bbox: 0.2247, loss_mask: 0.2332, loss: 0.6869 2024-05-28 22:35:27,495 - mmdet - INFO - Epoch [8][7000/7330] lr: 1.000e-04, eta: 6:37:02, time: 0.865, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0395, loss_cls: 0.1684, acc: 93.7612, loss_bbox: 0.2235, loss_mask: 0.2320, loss: 0.6799 2024-05-28 22:36:06,160 - mmdet - INFO - Epoch [8][7050/7330] lr: 1.000e-04, eta: 6:36:21, time: 0.773, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0410, loss_cls: 0.1783, acc: 93.3281, loss_bbox: 0.2346, loss_mask: 0.2368, loss: 0.7102 2024-05-28 22:36:47,160 - mmdet - INFO - Epoch [8][7100/7330] lr: 1.000e-04, eta: 6:35:41, time: 0.820, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0406, loss_cls: 0.1697, acc: 93.6702, loss_bbox: 0.2239, loss_mask: 
0.2281, loss: 0.6809
2024-05-28 22:37:26,055 - mmdet - INFO - Epoch [8][7150/7330] lr: 1.000e-04, eta: 6:35:00, time: 0.778, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0405, loss_cls: 0.1717, acc: 93.6799, loss_bbox: 0.2259, loss_mask: 0.2289, loss: 0.6846
2024-05-28 22:38:04,825 - mmdet - INFO - Epoch [8][7200/7330] lr: 1.000e-04, eta: 6:34:19, time: 0.775, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0415, loss_cls: 0.1754, acc: 93.5691, loss_bbox: 0.2280, loss_mask: 0.2317, loss: 0.6956
2024-05-28 22:38:48,086 - mmdet - INFO - Epoch [8][7250/7330] lr: 1.000e-04, eta: 6:33:41, time: 0.865, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0408, loss_cls: 0.1845, acc: 93.1309, loss_bbox: 0.2394, loss_mask: 0.2364, loss: 0.7196
2024-05-28 22:39:26,892 - mmdet - INFO - Epoch [8][7300/7330] lr: 1.000e-04, eta: 6:33:00, time: 0.776, data_time: 0.024, memory: 13582, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0395, loss_cls: 0.1742, acc: 93.6370, loss_bbox: 0.2234, loss_mask: 0.2304, loss: 0.6862
2024-05-28 22:39:50,819 - mmdet - INFO - Saving checkpoint at 8 epochs
2024-05-28 22:42:13,521 - mmdet - INFO - Evaluating bbox...
2024-05-28 22:42:36,131 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.446
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.683
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.487
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.261
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.484
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.624
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.564
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.564
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.564
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.365
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.608
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.744
2024-05-28 22:42:36,131 - mmdet - INFO - Evaluating segm...
2024-05-28 22:42:59,357 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.400
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.644
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.422
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.181
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.431
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.630
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.509
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.509
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.509
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.291
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.557
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.715
2024-05-28 22:42:59,714 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-28 22:42:59,717 - mmdet - INFO - Epoch(val) [8][625] bbox_mAP: 0.4460, bbox_mAP_50: 0.6830, bbox_mAP_75: 0.4870, bbox_mAP_s: 0.2610, bbox_mAP_m: 0.4840, bbox_mAP_l: 0.6240, bbox_mAP_copypaste: 0.446 0.683 0.487 0.261 0.484 0.624, segm_mAP: 0.4000, segm_mAP_50: 0.6440, segm_mAP_75: 0.4220, segm_mAP_s: 0.1810, segm_mAP_m: 0.4310, segm_mAP_l: 0.6300, segm_mAP_copypaste: 0.400 0.644 0.422 0.181 0.431 0.630
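
The Epoch(val) record above is the machine-readable per-epoch summary of the two tables (bbox_mAP 0.446 / segm_mAP 0.400 after epoch 8). Below is a minimal sketch, not an mmdet utility, for collecting these numbers from a log in this format; the log path is hypothetical and the regex simply mirrors the Epoch(val) line layout shown above.

```python
# Minimal sketch, not an mmdet API: collect per-epoch validation mAP from a
# training log whose "Epoch(val)" lines look like the record above.
import re

VAL_RE = re.compile(
    r"Epoch\(val\) \[(\d+)\]\[\d+\]\s+bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)")

def read_val_map(log_path):
    """Return {epoch: (bbox_mAP, segm_mAP)} parsed from the log at `log_path`."""
    results = {}
    with open(log_path) as f:
        for line in f:
            m = VAL_RE.search(line)
            if m:
                results[int(m.group(1))] = (float(m.group(2)), float(m.group(3)))
    return results

# Hypothetical usage: read_val_map("20240528_090519.log")
# would yield {8: (0.446, 0.4), 9: (0.469, 0.416), ...} for this run.
```
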
2024-05-28 22:43:56,122 - mmdet - INFO - Epoch [9][50/7330] lr: 1.000e-05, eta: 6:31:52, time: 1.128, data_time: 0.107, memory: 13582, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0409, loss_cls: 0.1673, acc: 93.7466, loss_bbox: 0.2211, loss_mask: 0.2271, loss: 0.6735
2024-05-28 22:44:34,760 - mmdet - INFO - Epoch [9][100/7330] lr: 1.000e-05, eta: 6:31:11, time: 0.773, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0375, loss_cls: 0.1604, acc: 94.0337, loss_bbox: 0.2125, loss_mask: 0.2232, loss: 0.6489
2024-05-28 22:45:16,148 - mmdet - INFO - Epoch [9][150/7330] lr: 1.000e-05, eta: 6:30:31, time: 0.828, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0384, loss_cls: 0.1579, acc: 94.0605, loss_bbox: 0.2113, loss_mask: 0.2235, loss: 0.6467
2024-05-28 22:45:55,273 - mmdet - INFO - Epoch [9][200/7330] lr: 1.000e-05, eta: 6:29:51, time: 0.782, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0399, loss_cls: 0.1675, acc: 93.6602, loss_bbox: 0.2248, loss_mask: 0.2259, loss: 0.6751
2024-05-28 22:46:34,011 - mmdet - INFO - Epoch [9][250/7330] lr: 1.000e-05, eta: 6:29:10, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0370, loss_cls: 0.1593, acc: 94.0730, loss_bbox: 0.2096, loss_mask: 0.2195, loss: 0.6405
2024-05-28 22:47:17,446 - mmdet - INFO - Epoch [9][300/7330] lr: 1.000e-05, eta: 6:28:31, time: 0.869, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0390, loss_cls: 0.1612, acc: 93.8721, loss_bbox: 0.2134, loss_mask: 0.2209, loss: 0.6498
2024-05-28 22:47:58,812 - mmdet - INFO - Epoch [9][350/7330] lr: 1.000e-05, eta: 6:27:52, time: 0.827, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0405, loss_cls: 0.1702, acc: 93.5759, loss_bbox: 0.2263, loss_mask: 0.2271, loss: 0.6806
2024-05-28 22:48:38,179 - mmdet - INFO - Epoch [9][400/7330] lr: 1.000e-05, eta: 6:27:11, time: 0.787, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0394, loss_cls: 0.1626, acc: 93.8533, loss_bbox: 0.2140,
loss_mask: 0.2149, loss: 0.6467 2024-05-28 22:49:20,582 - mmdet - INFO - Epoch [9][450/7330] lr: 1.000e-05, eta: 6:26:32, time: 0.848, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0377, loss_cls: 0.1600, acc: 93.9363, loss_bbox: 0.2142, loss_mask: 0.2246, loss: 0.6516 2024-05-28 22:49:58,984 - mmdet - INFO - Epoch [9][500/7330] lr: 1.000e-05, eta: 6:25:51, time: 0.768, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0350, loss_cls: 0.1548, acc: 94.1467, loss_bbox: 0.2087, loss_mask: 0.2189, loss: 0.6306 2024-05-28 22:50:37,502 - mmdet - INFO - Epoch [9][550/7330] lr: 1.000e-05, eta: 6:25:10, time: 0.770, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0347, loss_cls: 0.1584, acc: 94.0281, loss_bbox: 0.2100, loss_mask: 0.2176, loss: 0.6360 2024-05-28 22:51:20,420 - mmdet - INFO - Epoch [9][600/7330] lr: 1.000e-05, eta: 6:24:31, time: 0.858, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0369, loss_cls: 0.1558, acc: 93.9873, loss_bbox: 0.2124, loss_mask: 0.2259, loss: 0.6457 2024-05-28 22:52:01,758 - mmdet - INFO - Epoch [9][650/7330] lr: 1.000e-05, eta: 6:23:51, time: 0.827, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0370, loss_cls: 0.1591, acc: 93.9346, loss_bbox: 0.2157, loss_mask: 0.2233, loss: 0.6497 2024-05-28 22:52:40,863 - mmdet - INFO - Epoch [9][700/7330] lr: 1.000e-05, eta: 6:23:11, time: 0.782, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0391, loss_cls: 0.1576, acc: 93.9358, loss_bbox: 0.2154, loss_mask: 0.2231, loss: 0.6502 2024-05-28 22:53:19,481 - mmdet - INFO - Epoch [9][750/7330] lr: 1.000e-05, eta: 6:22:30, time: 0.772, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0378, loss_cls: 0.1564, acc: 94.1069, loss_bbox: 0.2095, loss_mask: 0.2206, loss: 0.6382 2024-05-28 22:54:02,532 - mmdet - INFO - Epoch [9][800/7330] lr: 1.000e-05, eta: 6:21:51, time: 0.861, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0338, loss_cls: 0.1488, acc: 94.3167, loss_bbox: 0.2045, loss_mask: 0.2150, loss: 0.6148 2024-05-28 22:54:43,904 - mmdet - INFO - Epoch [9][850/7330] lr: 1.000e-05, eta: 6:21:12, time: 0.828, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0378, loss_cls: 0.1641, acc: 93.7998, loss_bbox: 0.2206, loss_mask: 0.2225, loss: 0.6599 2024-05-28 22:55:25,902 - mmdet - INFO - Epoch [9][900/7330] lr: 1.000e-05, eta: 6:20:32, time: 0.840, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0383, loss_cls: 0.1569, acc: 94.0623, loss_bbox: 0.2093, loss_mask: 0.2159, loss: 0.6362 2024-05-28 22:56:04,645 - mmdet - INFO - Epoch [9][950/7330] lr: 1.000e-05, eta: 6:19:51, time: 0.775, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0389, loss_cls: 0.1570, acc: 94.0984, loss_bbox: 0.2142, loss_mask: 0.2216, loss: 0.6465 2024-05-28 22:56:43,406 - mmdet - INFO - Epoch [9][1000/7330] lr: 1.000e-05, eta: 6:19:11, time: 0.775, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0373, loss_cls: 0.1675, acc: 93.7170, loss_bbox: 0.2250, loss_mask: 0.2249, loss: 0.6701 2024-05-28 22:57:24,795 - mmdet - INFO - Epoch [9][1050/7330] lr: 1.000e-05, eta: 6:18:31, time: 0.828, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0375, loss_cls: 0.1547, acc: 94.1086, loss_bbox: 0.2103, loss_mask: 0.2212, loss: 0.6375 2024-05-28 22:58:03,391 - mmdet - INFO - Epoch [9][1100/7330] lr: 1.000e-05, eta: 
6:17:50, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0348, loss_cls: 0.1553, acc: 94.1328, loss_bbox: 0.2074, loss_mask: 0.2181, loss: 0.6299 2024-05-28 22:58:44,200 - mmdet - INFO - Epoch [9][1150/7330] lr: 1.000e-05, eta: 6:17:10, time: 0.816, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0379, loss_cls: 0.1575, acc: 93.9236, loss_bbox: 0.2141, loss_mask: 0.2182, loss: 0.6434 2024-05-28 22:59:26,114 - mmdet - INFO - Epoch [9][1200/7330] lr: 1.000e-05, eta: 6:16:31, time: 0.838, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0387, loss_cls: 0.1565, acc: 94.0071, loss_bbox: 0.2124, loss_mask: 0.2188, loss: 0.6412 2024-05-28 23:00:07,404 - mmdet - INFO - Epoch [9][1250/7330] lr: 1.000e-05, eta: 6:15:51, time: 0.826, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0358, loss_cls: 0.1535, acc: 94.1440, loss_bbox: 0.2104, loss_mask: 0.2218, loss: 0.6362 2024-05-28 23:00:46,714 - mmdet - INFO - Epoch [9][1300/7330] lr: 1.000e-05, eta: 6:15:11, time: 0.786, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0392, loss_cls: 0.1596, acc: 93.9482, loss_bbox: 0.2178, loss_mask: 0.2240, loss: 0.6569 2024-05-28 23:01:28,367 - mmdet - INFO - Epoch [9][1350/7330] lr: 1.000e-05, eta: 6:14:31, time: 0.833, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0398, loss_cls: 0.1593, acc: 93.9824, loss_bbox: 0.2154, loss_mask: 0.2256, loss: 0.6555 2024-05-28 23:02:14,534 - mmdet - INFO - Epoch [9][1400/7330] lr: 1.000e-05, eta: 6:13:54, time: 0.923, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0355, loss_cls: 0.1540, acc: 94.0950, loss_bbox: 0.2059, loss_mask: 0.2208, loss: 0.6310 2024-05-28 23:02:52,972 - mmdet - INFO - Epoch [9][1450/7330] lr: 1.000e-05, eta: 6:13:13, time: 0.769, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0349, loss_cls: 0.1519, acc: 94.2493, loss_bbox: 0.2067, loss_mask: 0.2174, loss: 0.6243 2024-05-28 23:03:31,677 - mmdet - INFO - Epoch [9][1500/7330] lr: 1.000e-05, eta: 6:12:32, time: 0.774, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0380, loss_cls: 0.1574, acc: 94.0938, loss_bbox: 0.2151, loss_mask: 0.2209, loss: 0.6464 2024-05-28 23:04:10,534 - mmdet - INFO - Epoch [9][1550/7330] lr: 1.000e-05, eta: 6:11:51, time: 0.777, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0365, loss_cls: 0.1540, acc: 94.1172, loss_bbox: 0.2119, loss_mask: 0.2171, loss: 0.6336 2024-05-28 23:04:49,123 - mmdet - INFO - Epoch [9][1600/7330] lr: 1.000e-05, eta: 6:11:10, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0372, loss_cls: 0.1533, acc: 94.2097, loss_bbox: 0.2088, loss_mask: 0.2193, loss: 0.6337 2024-05-28 23:05:30,244 - mmdet - INFO - Epoch [9][1650/7330] lr: 1.000e-05, eta: 6:10:30, time: 0.822, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0369, loss_cls: 0.1512, acc: 94.1948, loss_bbox: 0.2048, loss_mask: 0.2244, loss: 0.6320 2024-05-28 23:06:11,587 - mmdet - INFO - Epoch [9][1700/7330] lr: 1.000e-05, eta: 6:09:51, time: 0.827, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0395, loss_cls: 0.1621, acc: 93.8516, loss_bbox: 0.2152, loss_mask: 0.2292, loss: 0.6619 2024-05-28 23:06:50,564 - mmdet - INFO - Epoch [9][1750/7330] lr: 1.000e-05, eta: 6:09:10, time: 0.780, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0392, loss_cls: 
0.1618, acc: 93.7830, loss_bbox: 0.2193, loss_mask: 0.2220, loss: 0.6576 2024-05-28 23:07:31,786 - mmdet - INFO - Epoch [9][1800/7330] lr: 1.000e-05, eta: 6:08:30, time: 0.824, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0360, loss_cls: 0.1582, acc: 93.9141, loss_bbox: 0.2168, loss_mask: 0.2247, loss: 0.6495 2024-05-28 23:08:12,562 - mmdet - INFO - Epoch [9][1850/7330] lr: 1.000e-05, eta: 6:07:51, time: 0.815, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0377, loss_cls: 0.1559, acc: 94.0159, loss_bbox: 0.2162, loss_mask: 0.2188, loss: 0.6424 2024-05-28 23:08:51,015 - mmdet - INFO - Epoch [9][1900/7330] lr: 1.000e-05, eta: 6:07:10, time: 0.769, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0339, loss_cls: 0.1543, acc: 94.1216, loss_bbox: 0.2089, loss_mask: 0.2180, loss: 0.6281 2024-05-28 23:09:32,114 - mmdet - INFO - Epoch [9][1950/7330] lr: 1.000e-05, eta: 6:06:30, time: 0.822, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0375, loss_cls: 0.1564, acc: 93.9810, loss_bbox: 0.2099, loss_mask: 0.2160, loss: 0.6339 2024-05-28 23:10:19,316 - mmdet - INFO - Epoch [9][2000/7330] lr: 1.000e-05, eta: 6:05:53, time: 0.944, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0374, loss_cls: 0.1551, acc: 94.0669, loss_bbox: 0.2103, loss_mask: 0.2203, loss: 0.6376 2024-05-28 23:10:58,087 - mmdet - INFO - Epoch [9][2050/7330] lr: 1.000e-05, eta: 6:05:12, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0393, loss_cls: 0.1580, acc: 93.9600, loss_bbox: 0.2192, loss_mask: 0.2231, loss: 0.6541 2024-05-28 23:11:36,647 - mmdet - INFO - Epoch [9][2100/7330] lr: 1.000e-05, eta: 6:04:31, time: 0.771, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0338, loss_cls: 0.1491, acc: 94.2634, loss_bbox: 0.2104, loss_mask: 0.2186, loss: 0.6245 2024-05-28 23:12:15,338 - mmdet - INFO - Epoch [9][2150/7330] lr: 1.000e-05, eta: 6:03:50, time: 0.774, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0359, loss_cls: 0.1561, acc: 94.0603, loss_bbox: 0.2122, loss_mask: 0.2227, loss: 0.6410 2024-05-28 23:12:54,535 - mmdet - INFO - Epoch [9][2200/7330] lr: 1.000e-05, eta: 6:03:10, time: 0.784, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0387, loss_cls: 0.1591, acc: 93.9482, loss_bbox: 0.2161, loss_mask: 0.2233, loss: 0.6525 2024-05-28 23:13:37,878 - mmdet - INFO - Epoch [9][2250/7330] lr: 1.000e-05, eta: 6:02:31, time: 0.867, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0376, loss_cls: 0.1510, acc: 94.2051, loss_bbox: 0.2102, loss_mask: 0.2178, loss: 0.6314 2024-05-28 23:14:16,389 - mmdet - INFO - Epoch [9][2300/7330] lr: 1.000e-05, eta: 6:01:50, time: 0.770, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0347, loss_cls: 0.1474, acc: 94.4329, loss_bbox: 0.1998, loss_mask: 0.2120, loss: 0.6078 2024-05-28 23:14:58,787 - mmdet - INFO - Epoch [9][2350/7330] lr: 1.000e-05, eta: 6:01:11, time: 0.848, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0372, loss_cls: 0.1494, acc: 94.2712, loss_bbox: 0.2054, loss_mask: 0.2164, loss: 0.6228 2024-05-28 23:15:37,722 - mmdet - INFO - Epoch [9][2400/7330] lr: 1.000e-05, eta: 6:00:30, time: 0.779, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0372, loss_cls: 0.1553, acc: 94.0979, loss_bbox: 0.2126, loss_mask: 0.2228, loss: 0.6419 2024-05-28 23:16:21,342 - mmdet - 
INFO - Epoch [9][2450/7330] lr: 1.000e-05, eta: 5:59:51, time: 0.872, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0372, loss_cls: 0.1537, acc: 94.1160, loss_bbox: 0.2138, loss_mask: 0.2217, loss: 0.6414 2024-05-28 23:17:04,756 - mmdet - INFO - Epoch [9][2500/7330] lr: 1.000e-05, eta: 5:59:13, time: 0.868, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0382, loss_cls: 0.1577, acc: 94.0137, loss_bbox: 0.2135, loss_mask: 0.2212, loss: 0.6456 2024-05-28 23:17:45,875 - mmdet - INFO - Epoch [9][2550/7330] lr: 1.000e-05, eta: 5:58:33, time: 0.822, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0392, loss_cls: 0.1589, acc: 93.9666, loss_bbox: 0.2150, loss_mask: 0.2233, loss: 0.6511 2024-05-28 23:18:24,968 - mmdet - INFO - Epoch [9][2600/7330] lr: 1.000e-05, eta: 5:57:52, time: 0.782, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0348, loss_cls: 0.1469, acc: 94.3428, loss_bbox: 0.2061, loss_mask: 0.2200, loss: 0.6215 2024-05-28 23:19:04,085 - mmdet - INFO - Epoch [9][2650/7330] lr: 1.000e-05, eta: 5:57:12, time: 0.782, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0369, loss_cls: 0.1532, acc: 94.2073, loss_bbox: 0.2113, loss_mask: 0.2202, loss: 0.6372 2024-05-28 23:19:43,160 - mmdet - INFO - Epoch [9][2700/7330] lr: 1.000e-05, eta: 5:56:31, time: 0.781, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0359, loss_cls: 0.1509, acc: 94.2378, loss_bbox: 0.2110, loss_mask: 0.2212, loss: 0.6326 2024-05-28 23:20:22,581 - mmdet - INFO - Epoch [9][2750/7330] lr: 1.000e-05, eta: 5:55:50, time: 0.788, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0369, loss_cls: 0.1569, acc: 94.0176, loss_bbox: 0.2121, loss_mask: 0.2202, loss: 0.6408 2024-05-28 23:21:07,904 - mmdet - INFO - Epoch [9][2800/7330] lr: 1.000e-05, eta: 5:55:12, time: 0.906, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0381, loss_cls: 0.1571, acc: 93.9736, loss_bbox: 0.2170, loss_mask: 0.2255, loss: 0.6522 2024-05-28 23:21:47,127 - mmdet - INFO - Epoch [9][2850/7330] lr: 1.000e-05, eta: 5:54:32, time: 0.785, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0367, loss_cls: 0.1527, acc: 94.2122, loss_bbox: 0.2075, loss_mask: 0.2184, loss: 0.6286 2024-05-28 23:22:26,143 - mmdet - INFO - Epoch [9][2900/7330] lr: 1.000e-05, eta: 5:53:51, time: 0.780, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0367, loss_cls: 0.1553, acc: 94.0339, loss_bbox: 0.2058, loss_mask: 0.2180, loss: 0.6299 2024-05-28 23:23:08,298 - mmdet - INFO - Epoch [9][2950/7330] lr: 1.000e-05, eta: 5:53:12, time: 0.844, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0386, loss_cls: 0.1561, acc: 94.0405, loss_bbox: 0.2116, loss_mask: 0.2221, loss: 0.6435 2024-05-28 23:23:47,511 - mmdet - INFO - Epoch [9][3000/7330] lr: 1.000e-05, eta: 5:52:31, time: 0.784, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0346, loss_cls: 0.1426, acc: 94.5059, loss_bbox: 0.1964, loss_mask: 0.2130, loss: 0.5994 2024-05-28 23:24:30,781 - mmdet - INFO - Epoch [9][3050/7330] lr: 1.000e-05, eta: 5:51:52, time: 0.866, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0390, loss_cls: 0.1610, acc: 93.8506, loss_bbox: 0.2175, loss_mask: 0.2240, loss: 0.6562 2024-05-28 23:25:14,706 - mmdet - INFO - Epoch [9][3100/7330] lr: 1.000e-05, eta: 5:51:14, time: 0.878, data_time: 0.031, memory: 13582, 
loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0385, loss_cls: 0.1596, acc: 93.9460, loss_bbox: 0.2173, loss_mask: 0.2250, loss: 0.6564 2024-05-28 23:25:55,341 - mmdet - INFO - Epoch [9][3150/7330] lr: 1.000e-05, eta: 5:50:34, time: 0.813, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0356, loss_cls: 0.1487, acc: 94.3428, loss_bbox: 0.2001, loss_mask: 0.2125, loss: 0.6105 2024-05-28 23:26:34,450 - mmdet - INFO - Epoch [9][3200/7330] lr: 1.000e-05, eta: 5:49:53, time: 0.782, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0377, loss_cls: 0.1626, acc: 93.8142, loss_bbox: 0.2185, loss_mask: 0.2237, loss: 0.6563 2024-05-28 23:27:13,204 - mmdet - INFO - Epoch [9][3250/7330] lr: 1.000e-05, eta: 5:49:12, time: 0.775, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0376, loss_cls: 0.1539, acc: 94.1331, loss_bbox: 0.2112, loss_mask: 0.2198, loss: 0.6371 2024-05-28 23:27:51,891 - mmdet - INFO - Epoch [9][3300/7330] lr: 1.000e-05, eta: 5:48:32, time: 0.774, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0377, loss_cls: 0.1608, acc: 93.9077, loss_bbox: 0.2174, loss_mask: 0.2209, loss: 0.6509 2024-05-28 23:28:33,252 - mmdet - INFO - Epoch [9][3350/7330] lr: 1.000e-05, eta: 5:47:52, time: 0.827, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0383, loss_cls: 0.1539, acc: 94.1365, loss_bbox: 0.2102, loss_mask: 0.2175, loss: 0.6334 2024-05-28 23:29:14,905 - mmdet - INFO - Epoch [9][3400/7330] lr: 1.000e-05, eta: 5:47:12, time: 0.833, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0348, loss_cls: 0.1502, acc: 94.3008, loss_bbox: 0.2034, loss_mask: 0.2128, loss: 0.6152 2024-05-28 23:29:53,602 - mmdet - INFO - Epoch [9][3450/7330] lr: 1.000e-05, eta: 5:46:31, time: 0.774, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0387, loss_cls: 0.1552, acc: 94.0527, loss_bbox: 0.2098, loss_mask: 0.2181, loss: 0.6364 2024-05-28 23:30:32,282 - mmdet - INFO - Epoch [9][3500/7330] lr: 1.000e-05, eta: 5:45:51, time: 0.774, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0373, loss_cls: 0.1505, acc: 94.3232, loss_bbox: 0.2026, loss_mask: 0.2168, loss: 0.6217 2024-05-28 23:31:14,102 - mmdet - INFO - Epoch [9][3550/7330] lr: 1.000e-05, eta: 5:45:11, time: 0.836, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0366, loss_cls: 0.1540, acc: 94.1392, loss_bbox: 0.2076, loss_mask: 0.2173, loss: 0.6293 2024-05-28 23:31:54,723 - mmdet - INFO - Epoch [9][3600/7330] lr: 1.000e-05, eta: 5:44:31, time: 0.812, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0362, loss_cls: 0.1449, acc: 94.4832, loss_bbox: 0.2008, loss_mask: 0.2125, loss: 0.6078 2024-05-28 23:32:36,109 - mmdet - INFO - Epoch [9][3650/7330] lr: 1.000e-05, eta: 5:43:51, time: 0.828, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0362, loss_cls: 0.1516, acc: 94.2551, loss_bbox: 0.2081, loss_mask: 0.2169, loss: 0.6261 2024-05-28 23:33:17,731 - mmdet - INFO - Epoch [9][3700/7330] lr: 1.000e-05, eta: 5:43:12, time: 0.832, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0380, loss_cls: 0.1547, acc: 94.0969, loss_bbox: 0.2132, loss_mask: 0.2202, loss: 0.6396 2024-05-28 23:34:01,830 - mmdet - INFO - Epoch [9][3750/7330] lr: 1.000e-05, eta: 5:42:33, time: 0.882, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0346, loss_cls: 0.1488, acc: 94.3350, loss_bbox: 0.2050, loss_mask: 
0.2141, loss: 0.6162 2024-05-28 23:34:40,738 - mmdet - INFO - Epoch [9][3800/7330] lr: 1.000e-05, eta: 5:41:53, time: 0.778, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0380, loss_cls: 0.1595, acc: 93.9221, loss_bbox: 0.2154, loss_mask: 0.2222, loss: 0.6491 2024-05-28 23:35:19,522 - mmdet - INFO - Epoch [9][3850/7330] lr: 1.000e-05, eta: 5:41:12, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0361, loss_cls: 0.1534, acc: 94.2129, loss_bbox: 0.2045, loss_mask: 0.2145, loss: 0.6219 2024-05-28 23:35:58,205 - mmdet - INFO - Epoch [9][3900/7330] lr: 1.000e-05, eta: 5:40:31, time: 0.774, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0357, loss_cls: 0.1512, acc: 94.2026, loss_bbox: 0.2074, loss_mask: 0.2194, loss: 0.6276 2024-05-28 23:36:38,773 - mmdet - INFO - Epoch [9][3950/7330] lr: 1.000e-05, eta: 5:39:51, time: 0.811, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0358, loss_cls: 0.1512, acc: 94.2268, loss_bbox: 0.2080, loss_mask: 0.2199, loss: 0.6288 2024-05-28 23:37:20,056 - mmdet - INFO - Epoch [9][4000/7330] lr: 1.000e-05, eta: 5:39:11, time: 0.826, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0372, loss_cls: 0.1527, acc: 94.1880, loss_bbox: 0.2061, loss_mask: 0.2221, loss: 0.6341 2024-05-28 23:37:58,547 - mmdet - INFO - Epoch [9][4050/7330] lr: 1.000e-05, eta: 5:38:30, time: 0.770, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0373, loss_cls: 0.1542, acc: 94.0686, loss_bbox: 0.2115, loss_mask: 0.2234, loss: 0.6407 2024-05-28 23:38:37,541 - mmdet - INFO - Epoch [9][4100/7330] lr: 1.000e-05, eta: 5:37:50, time: 0.780, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0363, loss_cls: 0.1478, acc: 94.3977, loss_bbox: 0.2018, loss_mask: 0.2142, loss: 0.6139 2024-05-28 23:39:21,431 - mmdet - INFO - Epoch [9][4150/7330] lr: 1.000e-05, eta: 5:37:11, time: 0.878, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0381, loss_cls: 0.1518, acc: 94.2747, loss_bbox: 0.2050, loss_mask: 0.2174, loss: 0.6271 2024-05-28 23:40:04,702 - mmdet - INFO - Epoch [9][4200/7330] lr: 1.000e-05, eta: 5:36:32, time: 0.865, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0350, loss_cls: 0.1490, acc: 94.3035, loss_bbox: 0.2018, loss_mask: 0.2135, loss: 0.6134 2024-05-28 23:40:46,068 - mmdet - INFO - Epoch [9][4250/7330] lr: 1.000e-05, eta: 5:35:52, time: 0.827, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0372, loss_cls: 0.1546, acc: 94.0554, loss_bbox: 0.2128, loss_mask: 0.2196, loss: 0.6377 2024-05-28 23:41:25,097 - mmdet - INFO - Epoch [9][4300/7330] lr: 1.000e-05, eta: 5:35:12, time: 0.781, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0375, loss_cls: 0.1505, acc: 94.1987, loss_bbox: 0.2059, loss_mask: 0.2170, loss: 0.6243 2024-05-28 23:42:06,313 - mmdet - INFO - Epoch [9][4350/7330] lr: 1.000e-05, eta: 5:34:32, time: 0.824, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0371, loss_cls: 0.1541, acc: 94.1924, loss_bbox: 0.2055, loss_mask: 0.2151, loss: 0.6272 2024-05-28 23:42:45,449 - mmdet - INFO - Epoch [9][4400/7330] lr: 1.000e-05, eta: 5:33:51, time: 0.783, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0380, loss_cls: 0.1566, acc: 94.0371, loss_bbox: 0.2129, loss_mask: 0.2195, loss: 0.6420 2024-05-28 23:43:24,381 - mmdet - INFO - Epoch [9][4450/7330] lr: 1.000e-05, eta: 
5:33:11, time: 0.779, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0385, loss_cls: 0.1595, acc: 93.9646, loss_bbox: 0.2142, loss_mask: 0.2285, loss: 0.6564 2024-05-28 23:44:05,115 - mmdet - INFO - Epoch [9][4500/7330] lr: 1.000e-05, eta: 5:32:31, time: 0.815, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0392, loss_cls: 0.1581, acc: 94.0100, loss_bbox: 0.2147, loss_mask: 0.2253, loss: 0.6524 2024-05-28 23:44:44,048 - mmdet - INFO - Epoch [9][4550/7330] lr: 1.000e-05, eta: 5:31:50, time: 0.779, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0364, loss_cls: 0.1447, acc: 94.5427, loss_bbox: 0.1987, loss_mask: 0.2077, loss: 0.6001 2024-05-28 23:45:24,905 - mmdet - INFO - Epoch [9][4600/7330] lr: 1.000e-05, eta: 5:31:10, time: 0.817, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0395, loss_cls: 0.1571, acc: 94.0088, loss_bbox: 0.2155, loss_mask: 0.2220, loss: 0.6496 2024-05-28 23:46:03,595 - mmdet - INFO - Epoch [9][4650/7330] lr: 1.000e-05, eta: 5:30:29, time: 0.774, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0347, loss_cls: 0.1506, acc: 94.2205, loss_bbox: 0.2071, loss_mask: 0.2204, loss: 0.6260 2024-05-28 23:46:42,715 - mmdet - INFO - Epoch [9][4700/7330] lr: 1.000e-05, eta: 5:29:49, time: 0.782, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0402, loss_cls: 0.1623, acc: 93.8257, loss_bbox: 0.2174, loss_mask: 0.2238, loss: 0.6593 2024-05-28 23:47:27,014 - mmdet - INFO - Epoch [9][4750/7330] lr: 1.000e-05, eta: 5:29:10, time: 0.886, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0382, loss_cls: 0.1581, acc: 94.0464, loss_bbox: 0.2114, loss_mask: 0.2184, loss: 0.6414 2024-05-28 23:48:10,266 - mmdet - INFO - Epoch [9][4800/7330] lr: 1.000e-05, eta: 5:28:31, time: 0.865, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0361, loss_cls: 0.1475, acc: 94.3704, loss_bbox: 0.2005, loss_mask: 0.2144, loss: 0.6126 2024-05-28 23:48:51,653 - mmdet - INFO - Epoch [9][4850/7330] lr: 1.000e-05, eta: 5:27:51, time: 0.828, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0353, loss_cls: 0.1460, acc: 94.3718, loss_bbox: 0.2022, loss_mask: 0.2142, loss: 0.6110 2024-05-28 23:49:32,742 - mmdet - INFO - Epoch [9][4900/7330] lr: 1.000e-05, eta: 5:27:11, time: 0.822, data_time: 0.023, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0355, loss_cls: 0.1492, acc: 94.3372, loss_bbox: 0.2011, loss_mask: 0.2137, loss: 0.6127 2024-05-28 23:50:11,748 - mmdet - INFO - Epoch [9][4950/7330] lr: 1.000e-05, eta: 5:26:31, time: 0.780, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0351, loss_cls: 0.1494, acc: 94.3755, loss_bbox: 0.2041, loss_mask: 0.2185, loss: 0.6211 2024-05-28 23:50:50,933 - mmdet - INFO - Epoch [9][5000/7330] lr: 1.000e-05, eta: 5:25:50, time: 0.784, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0365, loss_cls: 0.1601, acc: 93.9373, loss_bbox: 0.2129, loss_mask: 0.2193, loss: 0.6425 2024-05-28 23:51:31,909 - mmdet - INFO - Epoch [9][5050/7330] lr: 1.000e-05, eta: 5:25:10, time: 0.820, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0360, loss_cls: 0.1490, acc: 94.2605, loss_bbox: 0.2033, loss_mask: 0.2199, loss: 0.6230 2024-05-28 23:52:12,825 - mmdet - INFO - Epoch [9][5100/7330] lr: 1.000e-05, eta: 5:24:30, time: 0.818, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0384, loss_cls: 
0.1545, acc: 94.0371, loss_bbox: 0.2114, loss_mask: 0.2217, loss: 0.6400 2024-05-28 23:52:51,445 - mmdet - INFO - Epoch [9][5150/7330] lr: 1.000e-05, eta: 5:23:50, time: 0.772, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0385, loss_cls: 0.1569, acc: 93.9927, loss_bbox: 0.2163, loss_mask: 0.2223, loss: 0.6481 2024-05-28 23:53:30,137 - mmdet - INFO - Epoch [9][5200/7330] lr: 1.000e-05, eta: 5:23:09, time: 0.774, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0380, loss_cls: 0.1560, acc: 94.1274, loss_bbox: 0.2139, loss_mask: 0.2270, loss: 0.6495 2024-05-28 23:54:09,047 - mmdet - INFO - Epoch [9][5250/7330] lr: 1.000e-05, eta: 5:22:28, time: 0.778, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0406, loss_cls: 0.1560, acc: 93.9797, loss_bbox: 0.2143, loss_mask: 0.2247, loss: 0.6516 2024-05-28 23:54:54,806 - mmdet - INFO - Epoch [9][5300/7330] lr: 1.000e-05, eta: 5:21:50, time: 0.915, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0367, loss_cls: 0.1532, acc: 94.1316, loss_bbox: 0.2120, loss_mask: 0.2160, loss: 0.6321 2024-05-28 23:55:35,849 - mmdet - INFO - Epoch [9][5350/7330] lr: 1.000e-05, eta: 5:21:10, time: 0.821, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0361, loss_cls: 0.1472, acc: 94.2668, loss_bbox: 0.2053, loss_mask: 0.2166, loss: 0.6186 2024-05-28 23:56:14,327 - mmdet - INFO - Epoch [9][5400/7330] lr: 1.000e-05, eta: 5:20:29, time: 0.770, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0345, loss_cls: 0.1522, acc: 94.2451, loss_bbox: 0.2058, loss_mask: 0.2129, loss: 0.6185 2024-05-28 23:56:56,277 - mmdet - INFO - Epoch [9][5450/7330] lr: 1.000e-05, eta: 5:19:50, time: 0.839, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0359, loss_cls: 0.1502, acc: 94.2700, loss_bbox: 0.2075, loss_mask: 0.2206, loss: 0.6274 2024-05-28 23:57:36,991 - mmdet - INFO - Epoch [9][5500/7330] lr: 1.000e-05, eta: 5:19:10, time: 0.814, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0368, loss_cls: 0.1542, acc: 94.1479, loss_bbox: 0.2053, loss_mask: 0.2175, loss: 0.6284 2024-05-28 23:58:15,848 - mmdet - INFO - Epoch [9][5550/7330] lr: 1.000e-05, eta: 5:18:29, time: 0.777, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0358, loss_cls: 0.1545, acc: 94.1504, loss_bbox: 0.2125, loss_mask: 0.2196, loss: 0.6367 2024-05-28 23:58:56,865 - mmdet - INFO - Epoch [9][5600/7330] lr: 1.000e-05, eta: 5:17:49, time: 0.820, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0338, loss_cls: 0.1476, acc: 94.3638, loss_bbox: 0.2014, loss_mask: 0.2154, loss: 0.6112 2024-05-28 23:59:37,901 - mmdet - INFO - Epoch [9][5650/7330] lr: 1.000e-05, eta: 5:17:09, time: 0.821, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0365, loss_cls: 0.1497, acc: 94.2122, loss_bbox: 0.2045, loss_mask: 0.2092, loss: 0.6139 2024-05-29 00:00:16,891 - mmdet - INFO - Epoch [9][5700/7330] lr: 1.000e-05, eta: 5:16:29, time: 0.780, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0391, loss_cls: 0.1547, acc: 94.0610, loss_bbox: 0.2064, loss_mask: 0.2193, loss: 0.6349 2024-05-29 00:00:55,777 - mmdet - INFO - Epoch [9][5750/7330] lr: 1.000e-05, eta: 5:15:48, time: 0.778, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0370, loss_cls: 0.1531, acc: 94.1919, loss_bbox: 0.2058, loss_mask: 0.2175, loss: 0.6269 2024-05-29 00:01:36,993 - mmdet - 
INFO - Epoch [9][5800/7330] lr: 1.000e-05, eta: 5:15:08, time: 0.824, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0370, loss_cls: 0.1541, acc: 94.1523, loss_bbox: 0.2055, loss_mask: 0.2169, loss: 0.6285 2024-05-29 00:02:18,336 - mmdet - INFO - Epoch [9][5850/7330] lr: 1.000e-05, eta: 5:14:28, time: 0.827, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0380, loss_cls: 0.1513, acc: 94.3008, loss_bbox: 0.2058, loss_mask: 0.2177, loss: 0.6275 2024-05-29 00:02:59,254 - mmdet - INFO - Epoch [9][5900/7330] lr: 1.000e-05, eta: 5:13:48, time: 0.818, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0382, loss_cls: 0.1615, acc: 93.8337, loss_bbox: 0.2162, loss_mask: 0.2224, loss: 0.6535 2024-05-29 00:03:39,985 - mmdet - INFO - Epoch [9][5950/7330] lr: 1.000e-05, eta: 5:13:08, time: 0.815, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0344, loss_cls: 0.1450, acc: 94.5591, loss_bbox: 0.1963, loss_mask: 0.2068, loss: 0.5956 2024-05-29 00:04:18,694 - mmdet - INFO - Epoch [9][6000/7330] lr: 1.000e-05, eta: 5:12:28, time: 0.774, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0363, loss_cls: 0.1518, acc: 94.2566, loss_bbox: 0.2066, loss_mask: 0.2201, loss: 0.6290 2024-05-29 00:04:59,887 - mmdet - INFO - Epoch [9][6050/7330] lr: 1.000e-05, eta: 5:11:48, time: 0.824, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0358, loss_cls: 0.1519, acc: 94.1929, loss_bbox: 0.2038, loss_mask: 0.2168, loss: 0.6225 2024-05-29 00:05:40,985 - mmdet - INFO - Epoch [9][6100/7330] lr: 1.000e-05, eta: 5:11:08, time: 0.822, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0383, loss_cls: 0.1522, acc: 94.1582, loss_bbox: 0.2077, loss_mask: 0.2198, loss: 0.6322 2024-05-29 00:06:21,856 - mmdet - INFO - Epoch [9][6150/7330] lr: 1.000e-05, eta: 5:10:28, time: 0.817, data_time: 0.025, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0376, loss_cls: 0.1510, acc: 94.1887, loss_bbox: 0.2079, loss_mask: 0.2204, loss: 0.6315 2024-05-29 00:07:00,753 - mmdet - INFO - Epoch [9][6200/7330] lr: 1.000e-05, eta: 5:09:47, time: 0.778, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0359, loss_cls: 0.1509, acc: 94.2661, loss_bbox: 0.2036, loss_mask: 0.2139, loss: 0.6188 2024-05-29 00:07:41,649 - mmdet - INFO - Epoch [9][6250/7330] lr: 1.000e-05, eta: 5:09:07, time: 0.817, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0373, loss_cls: 0.1552, acc: 94.1057, loss_bbox: 0.2140, loss_mask: 0.2185, loss: 0.6394 2024-05-29 00:08:20,414 - mmdet - INFO - Epoch [9][6300/7330] lr: 1.000e-05, eta: 5:08:27, time: 0.776, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0359, loss_cls: 0.1528, acc: 94.1382, loss_bbox: 0.2085, loss_mask: 0.2177, loss: 0.6283 2024-05-29 00:08:59,636 - mmdet - INFO - Epoch [9][6350/7330] lr: 1.000e-05, eta: 5:07:46, time: 0.784, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0386, loss_cls: 0.1604, acc: 93.8796, loss_bbox: 0.2185, loss_mask: 0.2222, loss: 0.6556 2024-05-29 00:09:41,441 - mmdet - INFO - Epoch [9][6400/7330] lr: 1.000e-05, eta: 5:07:06, time: 0.836, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0376, loss_cls: 0.1577, acc: 94.0452, loss_bbox: 0.2112, loss_mask: 0.2178, loss: 0.6396 2024-05-29 00:10:22,472 - mmdet - INFO - Epoch [9][6450/7330] lr: 1.000e-05, eta: 5:06:27, time: 0.821, data_time: 0.026, memory: 13582, 
loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0351, loss_cls: 0.1475, acc: 94.3477, loss_bbox: 0.2083, loss_mask: 0.2198, loss: 0.6244 2024-05-29 00:11:03,541 - mmdet - INFO - Epoch [9][6500/7330] lr: 1.000e-05, eta: 5:05:47, time: 0.821, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0350, loss_cls: 0.1477, acc: 94.3032, loss_bbox: 0.2035, loss_mask: 0.2211, loss: 0.6215 2024-05-29 00:11:44,714 - mmdet - INFO - Epoch [9][6550/7330] lr: 1.000e-05, eta: 5:05:07, time: 0.824, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0378, loss_cls: 0.1526, acc: 94.1296, loss_bbox: 0.2120, loss_mask: 0.2184, loss: 0.6347 2024-05-29 00:12:24,060 - mmdet - INFO - Epoch [9][6600/7330] lr: 1.000e-05, eta: 5:04:26, time: 0.787, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0368, loss_cls: 0.1508, acc: 94.2607, loss_bbox: 0.2078, loss_mask: 0.2153, loss: 0.6239 2024-05-29 00:13:02,760 - mmdet - INFO - Epoch [9][6650/7330] lr: 1.000e-05, eta: 5:03:46, time: 0.774, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0359, loss_cls: 0.1542, acc: 94.0796, loss_bbox: 0.2114, loss_mask: 0.2219, loss: 0.6370 2024-05-29 00:13:48,220 - mmdet - INFO - Epoch [9][6700/7330] lr: 1.000e-05, eta: 5:03:07, time: 0.909, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0364, loss_cls: 0.1457, acc: 94.3975, loss_bbox: 0.2019, loss_mask: 0.2178, loss: 0.6145 2024-05-29 00:14:29,346 - mmdet - INFO - Epoch [9][6750/7330] lr: 1.000e-05, eta: 5:02:27, time: 0.823, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0380, loss_cls: 0.1553, acc: 94.0894, loss_bbox: 0.2103, loss_mask: 0.2214, loss: 0.6395 2024-05-29 00:15:08,505 - mmdet - INFO - Epoch [9][6800/7330] lr: 1.000e-05, eta: 5:01:47, time: 0.783, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0374, loss_cls: 0.1532, acc: 94.1277, loss_bbox: 0.2047, loss_mask: 0.2148, loss: 0.6259 2024-05-29 00:15:47,530 - mmdet - INFO - Epoch [9][6850/7330] lr: 1.000e-05, eta: 5:01:06, time: 0.780, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0380, loss_cls: 0.1511, acc: 94.2651, loss_bbox: 0.2117, loss_mask: 0.2174, loss: 0.6329 2024-05-29 00:16:26,680 - mmdet - INFO - Epoch [9][6900/7330] lr: 1.000e-05, eta: 5:00:26, time: 0.783, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0373, loss_cls: 0.1621, acc: 93.8640, loss_bbox: 0.2160, loss_mask: 0.2195, loss: 0.6500 2024-05-29 00:17:05,256 - mmdet - INFO - Epoch [9][6950/7330] lr: 1.000e-05, eta: 4:59:45, time: 0.772, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0367, loss_cls: 0.1522, acc: 94.2185, loss_bbox: 0.2118, loss_mask: 0.2233, loss: 0.6373 2024-05-29 00:17:47,158 - mmdet - INFO - Epoch [9][7000/7330] lr: 1.000e-05, eta: 4:59:05, time: 0.838, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0379, loss_cls: 0.1529, acc: 94.0974, loss_bbox: 0.2112, loss_mask: 0.2194, loss: 0.6368 2024-05-29 00:18:28,472 - mmdet - INFO - Epoch [9][7050/7330] lr: 1.000e-05, eta: 4:58:25, time: 0.826, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0364, loss_cls: 0.1565, acc: 94.0347, loss_bbox: 0.2132, loss_mask: 0.2181, loss: 0.6383 2024-05-29 00:19:09,503 - mmdet - INFO - Epoch [9][7100/7330] lr: 1.000e-05, eta: 4:57:45, time: 0.821, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0364, loss_cls: 0.1484, acc: 94.3918, loss_bbox: 0.2024, loss_mask: 
0.2156, loss: 0.6158 2024-05-29 00:19:51,315 - mmdet - INFO - Epoch [9][7150/7330] lr: 1.000e-05, eta: 4:57:06, time: 0.836, data_time: 0.047, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0359, loss_cls: 0.1506, acc: 94.2478, loss_bbox: 0.2033, loss_mask: 0.2160, loss: 0.6204 2024-05-29 00:20:30,384 - mmdet - INFO - Epoch [9][7200/7330] lr: 1.000e-05, eta: 4:56:25, time: 0.782, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0353, loss_cls: 0.1486, acc: 94.2783, loss_bbox: 0.2052, loss_mask: 0.2149, loss: 0.6175 2024-05-29 00:21:13,981 - mmdet - INFO - Epoch [9][7250/7330] lr: 1.000e-05, eta: 4:55:46, time: 0.872, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0368, loss_cls: 0.1487, acc: 94.2422, loss_bbox: 0.2064, loss_mask: 0.2172, loss: 0.6233 2024-05-29 00:21:57,376 - mmdet - INFO - Epoch [9][7300/7330] lr: 1.000e-05, eta: 4:55:07, time: 0.868, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0364, loss_cls: 0.1487, acc: 94.3003, loss_bbox: 0.2061, loss_mask: 0.2164, loss: 0.6204 2024-05-29 00:22:21,525 - mmdet - INFO - Saving checkpoint at 9 epochs 2024-05-29 00:24:45,807 - mmdet - INFO - Evaluating bbox... 2024-05-29 00:25:05,799 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.469 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.699 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.507 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.279 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.504 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.653 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.581 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.581 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.581 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.376 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.622 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.760 2024-05-29 00:25:05,799 - mmdet - INFO - Evaluating segm... 
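The twelve Average Precision / Average Recall lines printed by the COCO evaluator above (and repeated for segm below) can be pulled back out of the raw log with a small regex if you want the numbers programmatically. The sketch below is only an illustration and assumes the exact pycocotools summary format shown in this log; the helper name parse_coco_summary and the log filename are made up here, not part of mmdet.

```python
import re

# One pycocotools summary line as it appears in this log, e.g.
# "Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.469"
_SUMMARY_RE = re.compile(
    r"Average (?:Precision|Recall) \((AP|AR)\) @\[ "
    r"IoU=([\d.:]+)\s*\|\s*area=\s*(\w+)\s*\|\s*maxDets=\s*(\d+)\s*\]\s*=\s*([\d.]+)"
)

def parse_coco_summary(text):
    """Return (metric, iou, area, max_dets, value) tuples for every summary line in `text`."""
    return [
        (metric, iou, area, int(max_dets), float(value))
        for metric, iou, area, max_dets, value in _SUMMARY_RE.findall(text)
    ]

# Example with a hypothetical log path: parse_coco_summary(open("train.log").read())
# -> [("AP", "0.50:0.95", "all", 100, 0.469), ("AP", "0.50", "all", 1000, 0.699), ...]
```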
2024-05-29 00:25:28,631 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.416 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.659 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.443 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.195 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.448 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.642 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.519 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.519 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.519 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.303 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.566 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.718 2024-05-29 00:25:28,998 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 00:25:29,000 - mmdet - INFO - Epoch(val) [9][625] bbox_mAP: 0.4690, bbox_mAP_50: 0.6990, bbox_mAP_75: 0.5070, bbox_mAP_s: 0.2790, bbox_mAP_m: 0.5040, bbox_mAP_l: 0.6530, bbox_mAP_copypaste: 0.469 0.699 0.507 0.279 0.504 0.653, segm_mAP: 0.4160, segm_mAP_50: 0.6590, segm_mAP_75: 0.4430, segm_mAP_s: 0.1950, segm_mAP_m: 0.4480, segm_mAP_l: 0.6420, segm_mAP_copypaste: 0.416 0.659 0.443 0.195 0.448 0.642 2024-05-29 00:26:17,989 - mmdet - INFO - Epoch [10][50/7330] lr: 1.000e-05, eta: 4:53:58, time: 0.979, data_time: 0.096, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0358, loss_cls: 0.1495, acc: 94.2695, loss_bbox: 0.2052, loss_mask: 0.2161, loss: 0.6204 2024-05-29 00:27:01,277 - mmdet - INFO - Epoch [10][100/7330] lr: 1.000e-05, eta: 4:53:18, time: 0.866, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0379, loss_cls: 0.1520, acc: 94.1414, loss_bbox: 0.2121, loss_mask: 0.2189, loss: 0.6347 2024-05-29 00:27:40,032 - mmdet - INFO - Epoch [10][150/7330] lr: 1.000e-05, eta: 4:52:38, time: 0.775, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0372, loss_cls: 0.1507, acc: 94.2312, loss_bbox: 0.2046, loss_mask: 0.2144, loss: 0.6208 2024-05-29 00:28:18,885 - mmdet - INFO - Epoch [10][200/7330] lr: 1.000e-05, eta: 4:51:57, time: 0.777, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0360, loss_cls: 0.1596, acc: 93.8777, loss_bbox: 0.2179, loss_mask: 0.2232, loss: 0.6508 2024-05-29 00:28:57,637 - mmdet - INFO - Epoch [10][250/7330] lr: 1.000e-05, eta: 4:51:16, time: 0.775, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0358, loss_cls: 0.1472, acc: 94.3511, loss_bbox: 0.2062, loss_mask: 0.2164, loss: 0.6189 2024-05-29 00:29:35,930 - mmdet - INFO - Epoch [10][300/7330] lr: 1.000e-05, eta: 4:50:36, time: 0.766, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0367, loss_cls: 0.1491, acc: 94.2898, loss_bbox: 0.2079, loss_mask: 0.2189, loss: 0.6266 2024-05-29 00:30:14,395 - mmdet - INFO - Epoch [10][350/7330] lr: 1.000e-05, eta: 4:49:55, time: 0.769, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0388, loss_cls: 0.1540, acc: 94.0618, loss_bbox: 0.2117, loss_mask: 0.2232, loss: 0.6416 2024-05-29 00:30:52,672 - mmdet - INFO - Epoch [10][400/7330] lr: 1.000e-05, eta: 4:49:14, time: 0.766, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0319, loss_cls: 0.1385, acc: 94.6899, loss_bbox: 0.1896, 
loss_mask: 0.2090, loss: 0.5809 2024-05-29 00:31:31,075 - mmdet - INFO - Epoch [10][450/7330] lr: 1.000e-05, eta: 4:48:33, time: 0.768, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0359, loss_cls: 0.1444, acc: 94.4368, loss_bbox: 0.2029, loss_mask: 0.2128, loss: 0.6082 2024-05-29 00:32:12,182 - mmdet - INFO - Epoch [10][500/7330] lr: 1.000e-05, eta: 4:47:53, time: 0.822, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0360, loss_cls: 0.1449, acc: 94.4697, loss_bbox: 0.2001, loss_mask: 0.2146, loss: 0.6090 2024-05-29 00:32:50,560 - mmdet - INFO - Epoch [10][550/7330] lr: 1.000e-05, eta: 4:47:13, time: 0.768, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0355, loss_cls: 0.1523, acc: 94.2664, loss_bbox: 0.2089, loss_mask: 0.2188, loss: 0.6290 2024-05-29 00:33:32,805 - mmdet - INFO - Epoch [10][600/7330] lr: 1.000e-05, eta: 4:46:33, time: 0.845, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0361, loss_cls: 0.1477, acc: 94.3225, loss_bbox: 0.2046, loss_mask: 0.2132, loss: 0.6151 2024-05-29 00:34:18,690 - mmdet - INFO - Epoch [10][650/7330] lr: 1.000e-05, eta: 4:45:55, time: 0.918, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0371, loss_cls: 0.1534, acc: 94.0974, loss_bbox: 0.2080, loss_mask: 0.2197, loss: 0.6322 2024-05-29 00:34:57,368 - mmdet - INFO - Epoch [10][700/7330] lr: 1.000e-05, eta: 4:45:14, time: 0.773, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0383, loss_cls: 0.1543, acc: 94.0881, loss_bbox: 0.2118, loss_mask: 0.2151, loss: 0.6334 2024-05-29 00:35:38,530 - mmdet - INFO - Epoch [10][750/7330] lr: 1.000e-05, eta: 4:44:34, time: 0.823, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0358, loss_cls: 0.1472, acc: 94.3391, loss_bbox: 0.2049, loss_mask: 0.2155, loss: 0.6174 2024-05-29 00:36:16,957 - mmdet - INFO - Epoch [10][800/7330] lr: 1.000e-05, eta: 4:43:53, time: 0.768, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0353, loss_cls: 0.1457, acc: 94.4597, loss_bbox: 0.2021, loss_mask: 0.2148, loss: 0.6107 2024-05-29 00:36:58,002 - mmdet - INFO - Epoch [10][850/7330] lr: 1.000e-05, eta: 4:43:13, time: 0.821, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0350, loss_cls: 0.1515, acc: 94.2366, loss_bbox: 0.2069, loss_mask: 0.2155, loss: 0.6227 2024-05-29 00:37:36,837 - mmdet - INFO - Epoch [10][900/7330] lr: 1.000e-05, eta: 4:42:33, time: 0.777, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0383, loss_cls: 0.1511, acc: 94.1677, loss_bbox: 0.2103, loss_mask: 0.2200, loss: 0.6337 2024-05-29 00:38:15,490 - mmdet - INFO - Epoch [10][950/7330] lr: 1.000e-05, eta: 4:41:52, time: 0.773, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0362, loss_cls: 0.1542, acc: 94.0830, loss_bbox: 0.2159, loss_mask: 0.2198, loss: 0.6391 2024-05-29 00:38:57,064 - mmdet - INFO - Epoch [10][1000/7330] lr: 1.000e-05, eta: 4:41:12, time: 0.831, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0369, loss_cls: 0.1481, acc: 94.3352, loss_bbox: 0.2028, loss_mask: 0.2147, loss: 0.6163 2024-05-29 00:39:35,550 - mmdet - INFO - Epoch [10][1050/7330] lr: 1.000e-05, eta: 4:40:32, time: 0.770, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0348, loss_cls: 0.1462, acc: 94.4207, loss_bbox: 0.1988, loss_mask: 0.2133, loss: 0.6069 2024-05-29 00:40:14,174 - mmdet - INFO - Epoch [10][1100/7330] lr: 
1.000e-05, eta: 4:39:51, time: 0.772, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0350, loss_cls: 0.1426, acc: 94.5977, loss_bbox: 0.1929, loss_mask: 0.2074, loss: 0.5907 2024-05-29 00:40:54,631 - mmdet - INFO - Epoch [10][1150/7330] lr: 1.000e-05, eta: 4:39:10, time: 0.767, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0369, loss_cls: 0.1505, acc: 94.2397, loss_bbox: 0.2074, loss_mask: 0.2180, loss: 0.6270 2024-05-29 00:41:40,382 - mmdet - INFO - Epoch [10][1200/7330] lr: 1.000e-05, eta: 4:38:32, time: 0.957, data_time: 0.073, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0356, loss_cls: 0.1474, acc: 94.3181, loss_bbox: 0.2069, loss_mask: 0.2237, loss: 0.6275 2024-05-29 00:42:24,703 - mmdet - INFO - Epoch [10][1250/7330] lr: 1.000e-05, eta: 4:37:53, time: 0.886, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0372, loss_cls: 0.1566, acc: 93.9944, loss_bbox: 0.2111, loss_mask: 0.2186, loss: 0.6381 2024-05-29 00:43:03,197 - mmdet - INFO - Epoch [10][1300/7330] lr: 1.000e-05, eta: 4:37:13, time: 0.770, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0360, loss_cls: 0.1504, acc: 94.2920, loss_bbox: 0.2049, loss_mask: 0.2146, loss: 0.6201 2024-05-29 00:43:43,771 - mmdet - INFO - Epoch [10][1350/7330] lr: 1.000e-05, eta: 4:36:33, time: 0.811, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0367, loss_cls: 0.1501, acc: 94.2922, loss_bbox: 0.2047, loss_mask: 0.2176, loss: 0.6238 2024-05-29 00:44:22,240 - mmdet - INFO - Epoch [10][1400/7330] lr: 1.000e-05, eta: 4:35:52, time: 0.770, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0344, loss_cls: 0.1432, acc: 94.5024, loss_bbox: 0.1973, loss_mask: 0.2101, loss: 0.5983 2024-05-29 00:45:01,045 - mmdet - INFO - Epoch [10][1450/7330] lr: 1.000e-05, eta: 4:35:11, time: 0.776, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0360, loss_cls: 0.1454, acc: 94.4094, loss_bbox: 0.2049, loss_mask: 0.2156, loss: 0.6157 2024-05-29 00:45:39,434 - mmdet - INFO - Epoch [10][1500/7330] lr: 1.000e-05, eta: 4:34:31, time: 0.768, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0351, loss_cls: 0.1483, acc: 94.3516, loss_bbox: 0.2011, loss_mask: 0.2175, loss: 0.6148 2024-05-29 00:46:18,239 - mmdet - INFO - Epoch [10][1550/7330] lr: 1.000e-05, eta: 4:33:50, time: 0.776, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0361, loss_cls: 0.1458, acc: 94.4097, loss_bbox: 0.1998, loss_mask: 0.2145, loss: 0.6095 2024-05-29 00:46:59,108 - mmdet - INFO - Epoch [10][1600/7330] lr: 1.000e-05, eta: 4:33:10, time: 0.817, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0371, loss_cls: 0.1515, acc: 94.1592, loss_bbox: 0.2083, loss_mask: 0.2181, loss: 0.6288 2024-05-29 00:47:37,865 - mmdet - INFO - Epoch [10][1650/7330] lr: 1.000e-05, eta: 4:32:29, time: 0.775, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0340, loss_cls: 0.1453, acc: 94.4739, loss_bbox: 0.2009, loss_mask: 0.2111, loss: 0.6036 2024-05-29 00:48:16,273 - mmdet - INFO - Epoch [10][1700/7330] lr: 1.000e-05, eta: 4:31:49, time: 0.768, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0334, loss_cls: 0.1404, acc: 94.6533, loss_bbox: 0.1961, loss_mask: 0.2077, loss: 0.5897 2024-05-29 00:48:59,142 - mmdet - INFO - Epoch [10][1750/7330] lr: 1.000e-05, eta: 4:31:09, time: 0.857, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0129, 
loss_rpn_bbox: 0.0343, loss_cls: 0.1495, acc: 94.3091, loss_bbox: 0.2034, loss_mask: 0.2198, loss: 0.6199 2024-05-29 00:49:48,707 - mmdet - INFO - Epoch [10][1800/7330] lr: 1.000e-05, eta: 4:30:32, time: 0.991, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0361, loss_cls: 0.1476, acc: 94.3340, loss_bbox: 0.2025, loss_mask: 0.2175, loss: 0.6187 2024-05-29 00:50:27,241 - mmdet - INFO - Epoch [10][1850/7330] lr: 1.000e-05, eta: 4:29:51, time: 0.771, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0354, loss_cls: 0.1486, acc: 94.2917, loss_bbox: 0.2069, loss_mask: 0.2169, loss: 0.6205 2024-05-29 00:51:06,060 - mmdet - INFO - Epoch [10][1900/7330] lr: 1.000e-05, eta: 4:29:11, time: 0.776, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0362, loss_cls: 0.1518, acc: 94.1167, loss_bbox: 0.2085, loss_mask: 0.2158, loss: 0.6253 2024-05-29 00:51:46,761 - mmdet - INFO - Epoch [10][1950/7330] lr: 1.000e-05, eta: 4:28:30, time: 0.814, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0354, loss_cls: 0.1447, acc: 94.4736, loss_bbox: 0.2018, loss_mask: 0.2157, loss: 0.6111 2024-05-29 00:52:25,411 - mmdet - INFO - Epoch [10][2000/7330] lr: 1.000e-05, eta: 4:27:50, time: 0.773, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0363, loss_cls: 0.1527, acc: 94.1367, loss_bbox: 0.2086, loss_mask: 0.2173, loss: 0.6276 2024-05-29 00:53:03,868 - mmdet - INFO - Epoch [10][2050/7330] lr: 1.000e-05, eta: 4:27:09, time: 0.769, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0357, loss_cls: 0.1462, acc: 94.4434, loss_bbox: 0.2015, loss_mask: 0.2124, loss: 0.6094 2024-05-29 00:53:42,723 - mmdet - INFO - Epoch [10][2100/7330] lr: 1.000e-05, eta: 4:26:29, time: 0.777, data_time: 0.052, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0386, loss_cls: 0.1543, acc: 94.0293, loss_bbox: 0.2164, loss_mask: 0.2192, loss: 0.6428 2024-05-29 00:54:21,541 - mmdet - INFO - Epoch [10][2150/7330] lr: 1.000e-05, eta: 4:25:48, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0373, loss_cls: 0.1557, acc: 94.0918, loss_bbox: 0.2115, loss_mask: 0.2167, loss: 0.6351 2024-05-29 00:55:02,489 - mmdet - INFO - Epoch [10][2200/7330] lr: 1.000e-05, eta: 4:25:08, time: 0.819, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0351, loss_cls: 0.1426, acc: 94.4692, loss_bbox: 0.1955, loss_mask: 0.2086, loss: 0.5949 2024-05-29 00:55:43,554 - mmdet - INFO - Epoch [10][2250/7330] lr: 1.000e-05, eta: 4:24:28, time: 0.821, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0379, loss_cls: 0.1581, acc: 93.9409, loss_bbox: 0.2145, loss_mask: 0.2167, loss: 0.6416 2024-05-29 00:56:24,657 - mmdet - INFO - Epoch [10][2300/7330] lr: 1.000e-05, eta: 4:23:48, time: 0.822, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0375, loss_cls: 0.1487, acc: 94.4104, loss_bbox: 0.2062, loss_mask: 0.2227, loss: 0.6297 2024-05-29 00:57:03,890 - mmdet - INFO - Epoch [10][2350/7330] lr: 1.000e-05, eta: 4:23:08, time: 0.785, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0401, loss_cls: 0.1606, acc: 93.8655, loss_bbox: 0.2225, loss_mask: 0.2241, loss: 0.6618 2024-05-29 00:57:51,522 - mmdet - INFO - Epoch [10][2400/7330] lr: 1.000e-05, eta: 4:22:30, time: 0.953, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0365, loss_cls: 0.1487, acc: 94.3281, loss_bbox: 0.2059, loss_mask: 0.2180, 
loss: 0.6226 2024-05-29 00:58:30,000 - mmdet - INFO - Epoch [10][2450/7330] lr: 1.000e-05, eta: 4:21:49, time: 0.770, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0378, loss_cls: 0.1570, acc: 93.9822, loss_bbox: 0.2152, loss_mask: 0.2211, loss: 0.6459 2024-05-29 00:59:10,825 - mmdet - INFO - Epoch [10][2500/7330] lr: 1.000e-05, eta: 4:21:09, time: 0.816, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0379, loss_cls: 0.1549, acc: 94.1150, loss_bbox: 0.2139, loss_mask: 0.2202, loss: 0.6412 2024-05-29 00:59:49,978 - mmdet - INFO - Epoch [10][2550/7330] lr: 1.000e-05, eta: 4:20:28, time: 0.783, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0391, loss_cls: 0.1552, acc: 94.0105, loss_bbox: 0.2159, loss_mask: 0.2227, loss: 0.6482 2024-05-29 01:00:28,514 - mmdet - INFO - Epoch [10][2600/7330] lr: 1.000e-05, eta: 4:19:48, time: 0.771, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0348, loss_cls: 0.1432, acc: 94.5471, loss_bbox: 0.2001, loss_mask: 0.2103, loss: 0.6017 2024-05-29 01:01:07,505 - mmdet - INFO - Epoch [10][2650/7330] lr: 1.000e-05, eta: 4:19:07, time: 0.780, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0370, loss_cls: 0.1526, acc: 94.1160, loss_bbox: 0.2070, loss_mask: 0.2172, loss: 0.6281 2024-05-29 01:01:46,227 - mmdet - INFO - Epoch [10][2700/7330] lr: 1.000e-05, eta: 4:18:27, time: 0.774, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0369, loss_cls: 0.1510, acc: 94.2800, loss_bbox: 0.2111, loss_mask: 0.2180, loss: 0.6311 2024-05-29 01:02:26,775 - mmdet - INFO - Epoch [10][2750/7330] lr: 1.000e-05, eta: 4:17:46, time: 0.811, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0353, loss_cls: 0.1448, acc: 94.4116, loss_bbox: 0.2024, loss_mask: 0.2144, loss: 0.6090 2024-05-29 01:03:05,510 - mmdet - INFO - Epoch [10][2800/7330] lr: 1.000e-05, eta: 4:17:06, time: 0.775, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0360, loss_cls: 0.1520, acc: 94.1809, loss_bbox: 0.2092, loss_mask: 0.2191, loss: 0.6311 2024-05-29 01:03:46,299 - mmdet - INFO - Epoch [10][2850/7330] lr: 1.000e-05, eta: 4:16:26, time: 0.816, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0344, loss_cls: 0.1466, acc: 94.3843, loss_bbox: 0.1993, loss_mask: 0.2134, loss: 0.6057 2024-05-29 01:04:27,508 - mmdet - INFO - Epoch [10][2900/7330] lr: 1.000e-05, eta: 4:15:46, time: 0.824, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0373, loss_cls: 0.1537, acc: 94.2307, loss_bbox: 0.2102, loss_mask: 0.2198, loss: 0.6345 2024-05-29 01:05:06,138 - mmdet - INFO - Epoch [10][2950/7330] lr: 1.000e-05, eta: 4:15:05, time: 0.773, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0393, loss_cls: 0.1549, acc: 94.0715, loss_bbox: 0.2124, loss_mask: 0.2242, loss: 0.6455 2024-05-29 01:05:55,685 - mmdet - INFO - Epoch [10][3000/7330] lr: 1.000e-05, eta: 4:14:28, time: 0.991, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0373, loss_cls: 0.1491, acc: 94.3113, loss_bbox: 0.2027, loss_mask: 0.2182, loss: 0.6203 2024-05-29 01:06:34,189 - mmdet - INFO - Epoch [10][3050/7330] lr: 1.000e-05, eta: 4:13:47, time: 0.770, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0382, loss_cls: 0.1547, acc: 94.1353, loss_bbox: 0.2106, loss_mask: 0.2215, loss: 0.6395 2024-05-29 01:07:12,902 - mmdet - INFO - Epoch [10][3100/7330] lr: 1.000e-05, eta: 
4:13:06, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0356, loss_cls: 0.1539, acc: 94.1316, loss_bbox: 0.2063, loss_mask: 0.2142, loss: 0.6234 2024-05-29 01:07:51,272 - mmdet - INFO - Epoch [10][3150/7330] lr: 1.000e-05, eta: 4:12:26, time: 0.767, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0364, loss_cls: 0.1495, acc: 94.2463, loss_bbox: 0.2033, loss_mask: 0.2163, loss: 0.6197 2024-05-29 01:08:29,712 - mmdet - INFO - Epoch [10][3200/7330] lr: 1.000e-05, eta: 4:11:45, time: 0.769, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0374, loss_cls: 0.1497, acc: 94.2163, loss_bbox: 0.2060, loss_mask: 0.2143, loss: 0.6206 2024-05-29 01:09:08,102 - mmdet - INFO - Epoch [10][3250/7330] lr: 1.000e-05, eta: 4:11:04, time: 0.768, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0359, loss_cls: 0.1457, acc: 94.3860, loss_bbox: 0.2010, loss_mask: 0.2169, loss: 0.6122 2024-05-29 01:09:46,795 - mmdet - INFO - Epoch [10][3300/7330] lr: 1.000e-05, eta: 4:10:24, time: 0.774, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0358, loss_cls: 0.1464, acc: 94.3813, loss_bbox: 0.2046, loss_mask: 0.2130, loss: 0.6127 2024-05-29 01:10:27,543 - mmdet - INFO - Epoch [10][3350/7330] lr: 1.000e-05, eta: 4:09:44, time: 0.815, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0349, loss_cls: 0.1446, acc: 94.4224, loss_bbox: 0.1999, loss_mask: 0.2113, loss: 0.6038 2024-05-29 01:11:06,334 - mmdet - INFO - Epoch [10][3400/7330] lr: 1.000e-05, eta: 4:09:03, time: 0.776, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0361, loss_cls: 0.1490, acc: 94.2971, loss_bbox: 0.2051, loss_mask: 0.2145, loss: 0.6176 2024-05-29 01:11:47,034 - mmdet - INFO - Epoch [10][3450/7330] lr: 1.000e-05, eta: 4:08:23, time: 0.814, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0359, loss_cls: 0.1457, acc: 94.4172, loss_bbox: 0.2053, loss_mask: 0.2188, loss: 0.6188 2024-05-29 01:12:30,057 - mmdet - INFO - Epoch [10][3500/7330] lr: 1.000e-05, eta: 4:07:44, time: 0.860, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0376, loss_cls: 0.1567, acc: 93.9907, loss_bbox: 0.2118, loss_mask: 0.2213, loss: 0.6420 2024-05-29 01:13:08,928 - mmdet - INFO - Epoch [10][3550/7330] lr: 1.000e-05, eta: 4:07:03, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0349, loss_cls: 0.1515, acc: 94.2373, loss_bbox: 0.2089, loss_mask: 0.2173, loss: 0.6266 2024-05-29 01:13:54,323 - mmdet - INFO - Epoch [10][3600/7330] lr: 1.000e-05, eta: 4:06:24, time: 0.908, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0369, loss_cls: 0.1533, acc: 94.0920, loss_bbox: 0.2124, loss_mask: 0.2183, loss: 0.6343 2024-05-29 01:14:35,832 - mmdet - INFO - Epoch [10][3650/7330] lr: 1.000e-05, eta: 4:05:44, time: 0.830, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0376, loss_cls: 0.1494, acc: 94.2449, loss_bbox: 0.2054, loss_mask: 0.2165, loss: 0.6225 2024-05-29 01:15:14,515 - mmdet - INFO - Epoch [10][3700/7330] lr: 1.000e-05, eta: 4:05:04, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0343, loss_cls: 0.1457, acc: 94.4651, loss_bbox: 0.1992, loss_mask: 0.2109, loss: 0.6033 2024-05-29 01:15:53,008 - mmdet - INFO - Epoch [10][3750/7330] lr: 1.000e-05, eta: 4:04:23, time: 0.770, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 
0.0356, loss_cls: 0.1494, acc: 94.3645, loss_bbox: 0.2066, loss_mask: 0.2162, loss: 0.6203 2024-05-29 01:16:31,649 - mmdet - INFO - Epoch [10][3800/7330] lr: 1.000e-05, eta: 4:03:43, time: 0.773, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0356, loss_cls: 0.1469, acc: 94.3154, loss_bbox: 0.2024, loss_mask: 0.2173, loss: 0.6142 2024-05-29 01:17:10,545 - mmdet - INFO - Epoch [10][3850/7330] lr: 1.000e-05, eta: 4:03:02, time: 0.778, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0363, loss_cls: 0.1525, acc: 94.1484, loss_bbox: 0.2114, loss_mask: 0.2169, loss: 0.6311 2024-05-29 01:17:51,511 - mmdet - INFO - Epoch [10][3900/7330] lr: 1.000e-05, eta: 4:02:22, time: 0.819, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0379, loss_cls: 0.1537, acc: 94.1038, loss_bbox: 0.2087, loss_mask: 0.2188, loss: 0.6336 2024-05-29 01:18:30,410 - mmdet - INFO - Epoch [10][3950/7330] lr: 1.000e-05, eta: 4:01:42, time: 0.778, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0354, loss_cls: 0.1454, acc: 94.3936, loss_bbox: 0.2011, loss_mask: 0.2136, loss: 0.6095 2024-05-29 01:19:09,078 - mmdet - INFO - Epoch [10][4000/7330] lr: 1.000e-05, eta: 4:01:01, time: 0.773, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0361, loss_cls: 0.1501, acc: 94.3271, loss_bbox: 0.2063, loss_mask: 0.2185, loss: 0.6243 2024-05-29 01:19:54,472 - mmdet - INFO - Epoch [10][4050/7330] lr: 1.000e-05, eta: 4:00:22, time: 0.908, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0390, loss_cls: 0.1585, acc: 93.8877, loss_bbox: 0.2163, loss_mask: 0.2186, loss: 0.6468 2024-05-29 01:20:33,195 - mmdet - INFO - Epoch [10][4100/7330] lr: 1.000e-05, eta: 3:59:41, time: 0.774, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0367, loss_cls: 0.1517, acc: 94.2148, loss_bbox: 0.2085, loss_mask: 0.2151, loss: 0.6254 2024-05-29 01:21:18,031 - mmdet - INFO - Epoch [10][4150/7330] lr: 1.000e-05, eta: 3:59:02, time: 0.897, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0393, loss_cls: 0.1556, acc: 94.1199, loss_bbox: 0.2166, loss_mask: 0.2235, loss: 0.6506 2024-05-29 01:22:01,793 - mmdet - INFO - Epoch [10][4200/7330] lr: 1.000e-05, eta: 3:58:23, time: 0.875, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0366, loss_cls: 0.1505, acc: 94.2839, loss_bbox: 0.2065, loss_mask: 0.2198, loss: 0.6277 2024-05-29 01:22:40,632 - mmdet - INFO - Epoch [10][4250/7330] lr: 1.000e-05, eta: 3:57:43, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0385, loss_cls: 0.1486, acc: 94.2551, loss_bbox: 0.2044, loss_mask: 0.2199, loss: 0.6254 2024-05-29 01:23:18,950 - mmdet - INFO - Epoch [10][4300/7330] lr: 1.000e-05, eta: 3:57:02, time: 0.766, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0339, loss_cls: 0.1408, acc: 94.6340, loss_bbox: 0.1978, loss_mask: 0.2182, loss: 0.6029 2024-05-29 01:23:57,286 - mmdet - INFO - Epoch [10][4350/7330] lr: 1.000e-05, eta: 3:56:21, time: 0.767, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0347, loss_cls: 0.1411, acc: 94.6399, loss_bbox: 0.1946, loss_mask: 0.2145, loss: 0.5995 2024-05-29 01:24:36,163 - mmdet - INFO - Epoch [10][4400/7330] lr: 1.000e-05, eta: 3:55:41, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0358, loss_cls: 0.1489, acc: 94.3342, loss_bbox: 0.2043, loss_mask: 0.2184, loss: 0.6204 
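A quick sanity check on the per-iteration entries above: the reported loss is the sum of the five logged loss terms (acc is an accuracy, not a loss, and is not included), up to the rounding of the individual values. Using the Epoch [10][4400/7330] entry immediately above as a worked example, this is plain arithmetic on the logged numbers, not mmdet code:

```python
# Values copied from the Epoch [10][4400/7330] entry above
loss_rpn_cls, loss_rpn_bbox = 0.0130, 0.0358
loss_cls, loss_bbox, loss_mask = 0.1489, 0.2043, 0.2184

total = loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask
print(round(total, 4))  # 0.6204 -- matches the reported "loss: 0.6204"
```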
2024-05-29 01:25:14,731 - mmdet - INFO - Epoch [10][4450/7330] lr: 1.000e-05, eta: 3:55:00, time: 0.771, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0364, loss_cls: 0.1516, acc: 94.2053, loss_bbox: 0.2013, loss_mask: 0.2109, loss: 0.6143 2024-05-29 01:25:55,120 - mmdet - INFO - Epoch [10][4500/7330] lr: 1.000e-05, eta: 3:54:20, time: 0.808, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0349, loss_cls: 0.1451, acc: 94.4905, loss_bbox: 0.1995, loss_mask: 0.2112, loss: 0.6028 2024-05-29 01:26:36,802 - mmdet - INFO - Epoch [10][4550/7330] lr: 1.000e-05, eta: 3:53:40, time: 0.834, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0393, loss_cls: 0.1557, acc: 93.9983, loss_bbox: 0.2159, loss_mask: 0.2180, loss: 0.6433 2024-05-29 01:27:15,623 - mmdet - INFO - Epoch [10][4600/7330] lr: 1.000e-05, eta: 3:53:00, time: 0.776, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0357, loss_cls: 0.1520, acc: 94.0891, loss_bbox: 0.2101, loss_mask: 0.2183, loss: 0.6295 2024-05-29 01:28:03,375 - mmdet - INFO - Epoch [10][4650/7330] lr: 1.000e-05, eta: 3:52:21, time: 0.955, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0364, loss_cls: 0.1510, acc: 94.1907, loss_bbox: 0.2051, loss_mask: 0.2151, loss: 0.6208 2024-05-29 01:28:41,718 - mmdet - INFO - Epoch [10][4700/7330] lr: 1.000e-05, eta: 3:51:41, time: 0.767, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0327, loss_cls: 0.1392, acc: 94.6860, loss_bbox: 0.1931, loss_mask: 0.2072, loss: 0.5852 2024-05-29 01:29:20,110 - mmdet - INFO - Epoch [10][4750/7330] lr: 1.000e-05, eta: 3:51:00, time: 0.768, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0365, loss_cls: 0.1500, acc: 94.1978, loss_bbox: 0.2033, loss_mask: 0.2181, loss: 0.6214 2024-05-29 01:30:03,084 - mmdet - INFO - Epoch [10][4800/7330] lr: 1.000e-05, eta: 3:50:20, time: 0.859, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0356, loss_cls: 0.1448, acc: 94.3718, loss_bbox: 0.2010, loss_mask: 0.2165, loss: 0.6111 2024-05-29 01:30:41,621 - mmdet - INFO - Epoch [10][4850/7330] lr: 1.000e-05, eta: 3:49:40, time: 0.770, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0389, loss_cls: 0.1590, acc: 93.8906, loss_bbox: 0.2145, loss_mask: 0.2222, loss: 0.6495 2024-05-29 01:31:20,332 - mmdet - INFO - Epoch [10][4900/7330] lr: 1.000e-05, eta: 3:48:59, time: 0.775, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0333, loss_cls: 0.1376, acc: 94.6965, loss_bbox: 0.1905, loss_mask: 0.2106, loss: 0.5854 2024-05-29 01:31:59,205 - mmdet - INFO - Epoch [10][4950/7330] lr: 1.000e-05, eta: 3:48:19, time: 0.777, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0382, loss_cls: 0.1536, acc: 94.1711, loss_bbox: 0.2055, loss_mask: 0.2131, loss: 0.6249 2024-05-29 01:32:39,959 - mmdet - INFO - Epoch [10][5000/7330] lr: 1.000e-05, eta: 3:47:39, time: 0.815, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0363, loss_cls: 0.1513, acc: 94.2952, loss_bbox: 0.2078, loss_mask: 0.2184, loss: 0.6280 2024-05-29 01:33:18,895 - mmdet - INFO - Epoch [10][5050/7330] lr: 1.000e-05, eta: 3:46:58, time: 0.779, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0362, loss_cls: 0.1446, acc: 94.3884, loss_bbox: 0.1946, loss_mask: 0.2142, loss: 0.6023 2024-05-29 01:33:57,673 - mmdet - INFO - Epoch [10][5100/7330] lr: 1.000e-05, eta: 3:46:18, 
time: 0.776, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0366, loss_cls: 0.1455, acc: 94.3835, loss_bbox: 0.2031, loss_mask: 0.2200, loss: 0.6177 2024-05-29 01:34:41,281 - mmdet - INFO - Epoch [10][5150/7330] lr: 1.000e-05, eta: 3:45:38, time: 0.872, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0361, loss_cls: 0.1434, acc: 94.5227, loss_bbox: 0.1996, loss_mask: 0.2174, loss: 0.6099 2024-05-29 01:35:19,877 - mmdet - INFO - Epoch [10][5200/7330] lr: 1.000e-05, eta: 3:44:58, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0352, loss_cls: 0.1424, acc: 94.5891, loss_bbox: 0.1945, loss_mask: 0.2132, loss: 0.5978 2024-05-29 01:36:05,046 - mmdet - INFO - Epoch [10][5250/7330] lr: 1.000e-05, eta: 3:44:19, time: 0.903, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0388, loss_cls: 0.1542, acc: 94.0417, loss_bbox: 0.2155, loss_mask: 0.2189, loss: 0.6407 2024-05-29 01:36:43,785 - mmdet - INFO - Epoch [10][5300/7330] lr: 1.000e-05, eta: 3:43:38, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0371, loss_cls: 0.1511, acc: 94.2629, loss_bbox: 0.2047, loss_mask: 0.2121, loss: 0.6188 2024-05-29 01:37:24,404 - mmdet - INFO - Epoch [10][5350/7330] lr: 1.000e-05, eta: 3:42:58, time: 0.812, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0362, loss_cls: 0.1475, acc: 94.3267, loss_bbox: 0.2025, loss_mask: 0.2170, loss: 0.6162 2024-05-29 01:38:05,148 - mmdet - INFO - Epoch [10][5400/7330] lr: 1.000e-05, eta: 3:42:18, time: 0.815, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0368, loss_cls: 0.1484, acc: 94.3562, loss_bbox: 0.2028, loss_mask: 0.2137, loss: 0.6149 2024-05-29 01:38:43,805 - mmdet - INFO - Epoch [10][5450/7330] lr: 1.000e-05, eta: 3:41:37, time: 0.773, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0359, loss_cls: 0.1513, acc: 94.1641, loss_bbox: 0.2035, loss_mask: 0.2195, loss: 0.6232 2024-05-29 01:39:21,906 - mmdet - INFO - Epoch [10][5500/7330] lr: 1.000e-05, eta: 3:40:57, time: 0.762, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0353, loss_cls: 0.1464, acc: 94.3811, loss_bbox: 0.2008, loss_mask: 0.2153, loss: 0.6103 2024-05-29 01:40:02,985 - mmdet - INFO - Epoch [10][5550/7330] lr: 1.000e-05, eta: 3:40:17, time: 0.822, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0364, loss_cls: 0.1524, acc: 94.1333, loss_bbox: 0.2114, loss_mask: 0.2189, loss: 0.6337 2024-05-29 01:40:41,323 - mmdet - INFO - Epoch [10][5600/7330] lr: 1.000e-05, eta: 3:39:36, time: 0.767, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0378, loss_cls: 0.1503, acc: 94.2693, loss_bbox: 0.2061, loss_mask: 0.2190, loss: 0.6279 2024-05-29 01:41:19,974 - mmdet - INFO - Epoch [10][5650/7330] lr: 1.000e-05, eta: 3:38:56, time: 0.773, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0374, loss_cls: 0.1489, acc: 94.3325, loss_bbox: 0.2062, loss_mask: 0.2190, loss: 0.6258 2024-05-29 01:41:58,792 - mmdet - INFO - Epoch [10][5700/7330] lr: 1.000e-05, eta: 3:38:15, time: 0.776, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0372, loss_cls: 0.1519, acc: 94.1462, loss_bbox: 0.2141, loss_mask: 0.2179, loss: 0.6340 2024-05-29 01:42:39,627 - mmdet - INFO - Epoch [10][5750/7330] lr: 1.000e-05, eta: 3:37:35, time: 0.817, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0370, 
loss_cls: 0.1481, acc: 94.3164, loss_bbox: 0.2059, loss_mask: 0.2130, loss: 0.6172 2024-05-29 01:43:22,393 - mmdet - INFO - Epoch [10][5800/7330] lr: 1.000e-05, eta: 3:36:55, time: 0.855, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0341, loss_cls: 0.1413, acc: 94.6191, loss_bbox: 0.1957, loss_mask: 0.2145, loss: 0.5976 2024-05-29 01:44:07,813 - mmdet - INFO - Epoch [10][5850/7330] lr: 1.000e-05, eta: 3:36:16, time: 0.909, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0352, loss_cls: 0.1456, acc: 94.4614, loss_bbox: 0.2040, loss_mask: 0.2183, loss: 0.6166 2024-05-29 01:44:46,666 - mmdet - INFO - Epoch [10][5900/7330] lr: 1.000e-05, eta: 3:35:36, time: 0.777, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0383, loss_cls: 0.1551, acc: 94.0864, loss_bbox: 0.2123, loss_mask: 0.2195, loss: 0.6394 2024-05-29 01:45:27,145 - mmdet - INFO - Epoch [10][5950/7330] lr: 1.000e-05, eta: 3:34:56, time: 0.810, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0382, loss_cls: 0.1519, acc: 94.1416, loss_bbox: 0.2127, loss_mask: 0.2191, loss: 0.6363 2024-05-29 01:46:05,704 - mmdet - INFO - Epoch [10][6000/7330] lr: 1.000e-05, eta: 3:34:15, time: 0.771, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0340, loss_cls: 0.1483, acc: 94.3225, loss_bbox: 0.2086, loss_mask: 0.2149, loss: 0.6189 2024-05-29 01:46:46,697 - mmdet - INFO - Epoch [10][6050/7330] lr: 1.000e-05, eta: 3:33:35, time: 0.820, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0379, loss_cls: 0.1488, acc: 94.3335, loss_bbox: 0.2074, loss_mask: 0.2198, loss: 0.6279 2024-05-29 01:47:25,093 - mmdet - INFO - Epoch [10][6100/7330] lr: 1.000e-05, eta: 3:32:55, time: 0.768, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0342, loss_cls: 0.1448, acc: 94.4470, loss_bbox: 0.1988, loss_mask: 0.2153, loss: 0.6051 2024-05-29 01:48:03,900 - mmdet - INFO - Epoch [10][6150/7330] lr: 1.000e-05, eta: 3:32:14, time: 0.776, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0381, loss_cls: 0.1500, acc: 94.3171, loss_bbox: 0.2120, loss_mask: 0.2235, loss: 0.6371 2024-05-29 01:48:42,577 - mmdet - INFO - Epoch [10][6200/7330] lr: 1.000e-05, eta: 3:31:34, time: 0.774, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0372, loss_cls: 0.1511, acc: 94.2100, loss_bbox: 0.2111, loss_mask: 0.2234, loss: 0.6351 2024-05-29 01:49:21,254 - mmdet - INFO - Epoch [10][6250/7330] lr: 1.000e-05, eta: 3:30:53, time: 0.773, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0377, loss_cls: 0.1485, acc: 94.3164, loss_bbox: 0.2059, loss_mask: 0.2161, loss: 0.6206 2024-05-29 01:49:59,953 - mmdet - INFO - Epoch [10][6300/7330] lr: 1.000e-05, eta: 3:30:12, time: 0.774, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0386, loss_cls: 0.1554, acc: 94.1526, loss_bbox: 0.2077, loss_mask: 0.2174, loss: 0.6341 2024-05-29 01:50:40,693 - mmdet - INFO - Epoch [10][6350/7330] lr: 1.000e-05, eta: 3:29:32, time: 0.815, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0363, loss_cls: 0.1479, acc: 94.3538, loss_bbox: 0.2057, loss_mask: 0.2184, loss: 0.6221 2024-05-29 01:51:24,027 - mmdet - INFO - Epoch [10][6400/7330] lr: 1.000e-05, eta: 3:28:53, time: 0.867, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0359, loss_cls: 0.1504, acc: 94.2283, loss_bbox: 0.2097, loss_mask: 0.2165, loss: 0.6263 2024-05-29 
01:52:09,271 - mmdet - INFO - Epoch [10][6450/7330] lr: 1.000e-05, eta: 3:28:14, time: 0.905, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0357, loss_cls: 0.1436, acc: 94.4746, loss_bbox: 0.1990, loss_mask: 0.2157, loss: 0.6060 2024-05-29 01:52:47,800 - mmdet - INFO - Epoch [10][6500/7330] lr: 1.000e-05, eta: 3:27:33, time: 0.771, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0361, loss_cls: 0.1507, acc: 94.2473, loss_bbox: 0.2101, loss_mask: 0.2201, loss: 0.6304 2024-05-29 01:53:28,298 - mmdet - INFO - Epoch [10][6550/7330] lr: 1.000e-05, eta: 3:26:53, time: 0.810, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0369, loss_cls: 0.1477, acc: 94.3564, loss_bbox: 0.2070, loss_mask: 0.2152, loss: 0.6207 2024-05-29 01:54:06,582 - mmdet - INFO - Epoch [10][6600/7330] lr: 1.000e-05, eta: 3:26:12, time: 0.766, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0354, loss_cls: 0.1504, acc: 94.2175, loss_bbox: 0.2075, loss_mask: 0.2151, loss: 0.6216 2024-05-29 01:54:46,907 - mmdet - INFO - Epoch [10][6650/7330] lr: 1.000e-05, eta: 3:25:32, time: 0.807, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0371, loss_cls: 0.1525, acc: 94.1570, loss_bbox: 0.2083, loss_mask: 0.2168, loss: 0.6278 2024-05-29 01:55:25,618 - mmdet - INFO - Epoch [10][6700/7330] lr: 1.000e-05, eta: 3:24:52, time: 0.774, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0343, loss_cls: 0.1510, acc: 94.2095, loss_bbox: 0.2033, loss_mask: 0.2191, loss: 0.6207 2024-05-29 01:56:04,834 - mmdet - INFO - Epoch [10][6750/7330] lr: 1.000e-05, eta: 3:24:11, time: 0.784, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0358, loss_cls: 0.1448, acc: 94.4004, loss_bbox: 0.2007, loss_mask: 0.2125, loss: 0.6078 2024-05-29 01:56:43,169 - mmdet - INFO - Epoch [10][6800/7330] lr: 1.000e-05, eta: 3:23:31, time: 0.767, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0330, loss_cls: 0.1398, acc: 94.6404, loss_bbox: 0.1930, loss_mask: 0.2123, loss: 0.5902 2024-05-29 01:57:22,158 - mmdet - INFO - Epoch [10][6850/7330] lr: 1.000e-05, eta: 3:22:50, time: 0.779, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0365, loss_cls: 0.1537, acc: 94.1619, loss_bbox: 0.2112, loss_mask: 0.2204, loss: 0.6353 2024-05-29 01:58:00,670 - mmdet - INFO - Epoch [10][6900/7330] lr: 1.000e-05, eta: 3:22:10, time: 0.771, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0355, loss_cls: 0.1427, acc: 94.5884, loss_bbox: 0.1945, loss_mask: 0.2136, loss: 0.5987 2024-05-29 01:58:39,525 - mmdet - INFO - Epoch [10][6950/7330] lr: 1.000e-05, eta: 3:21:29, time: 0.777, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0364, loss_cls: 0.1473, acc: 94.3579, loss_bbox: 0.2047, loss_mask: 0.2124, loss: 0.6142 2024-05-29 01:59:27,654 - mmdet - INFO - Epoch [10][7000/7330] lr: 1.000e-05, eta: 3:20:51, time: 0.963, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0369, loss_cls: 0.1492, acc: 94.2820, loss_bbox: 0.2030, loss_mask: 0.2189, loss: 0.6224 2024-05-29 02:00:08,788 - mmdet - INFO - Epoch [10][7050/7330] lr: 1.000e-05, eta: 3:20:11, time: 0.823, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0354, loss_cls: 0.1462, acc: 94.3713, loss_bbox: 0.2038, loss_mask: 0.2140, loss: 0.6128 2024-05-29 02:00:47,182 - mmdet - INFO - Epoch [10][7100/7330] lr: 1.000e-05, eta: 3:19:30, time: 0.768, 
data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0357, loss_cls: 0.1441, acc: 94.6157, loss_bbox: 0.1968, loss_mask: 0.2124, loss: 0.6015 2024-05-29 02:01:27,903 - mmdet - INFO - Epoch [10][7150/7330] lr: 1.000e-05, eta: 3:18:50, time: 0.814, data_time: 0.026, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0354, loss_cls: 0.1448, acc: 94.5339, loss_bbox: 0.1989, loss_mask: 0.2135, loss: 0.6066 2024-05-29 02:02:06,374 - mmdet - INFO - Epoch [10][7200/7330] lr: 1.000e-05, eta: 3:18:10, time: 0.770, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0357, loss_cls: 0.1478, acc: 94.3718, loss_bbox: 0.2023, loss_mask: 0.2161, loss: 0.6155 2024-05-29 02:02:47,075 - mmdet - INFO - Epoch [10][7250/7330] lr: 1.000e-05, eta: 3:17:29, time: 0.814, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0370, loss_cls: 0.1469, acc: 94.3704, loss_bbox: 0.2055, loss_mask: 0.2180, loss: 0.6205 2024-05-29 02:03:25,423 - mmdet - INFO - Epoch [10][7300/7330] lr: 1.000e-05, eta: 3:16:49, time: 0.767, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0343, loss_cls: 0.1408, acc: 94.6304, loss_bbox: 0.1940, loss_mask: 0.2125, loss: 0.5937 2024-05-29 02:03:49,346 - mmdet - INFO - Saving checkpoint at 10 epochs 2024-05-29 02:06:12,344 - mmdet - INFO - Evaluating bbox... 2024-05-29 02:06:31,785 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.469 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.699 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.505 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.279 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.505 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.653 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.579 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.579 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.579 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.376 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.620 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.761 2024-05-29 02:06:31,785 - mmdet - INFO - Evaluating segm... 
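The box AP block just printed for epoch 10 is essentially unchanged from epoch 9 (0.469 overall AP in both), so the validation metrics have plateaued at this point in training. To make that comparison without scrolling, the per-epoch Epoch(val) summary lines can be collected into a small curve. The sketch below assumes the Epoch(val) line format used throughout this log; val_curve is a made-up helper name, not an mmdet API.

```python
import re

# Matches the per-epoch validation summaries, e.g.
# "Epoch(val) [10][625]  bbox_mAP: 0.4690, ... segm_mAP: 0.4160, ..."
_VAL_RE = re.compile(
    r"Epoch\(val\) \[(\d+)\]\[\d+\].*?bbox_mAP: ([\d.]+).*?segm_mAP: ([\d.]+)"
)

def val_curve(log_text):
    """Return {epoch: (bbox_mAP, segm_mAP)} for every Epoch(val) line found in the log."""
    return {int(ep): (float(bbox), float(segm)) for ep, bbox, segm in _VAL_RE.findall(log_text)}

# On the summaries in this log this yields e.g. {9: (0.469, 0.416), 10: (0.469, 0.416), ...}
```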
2024-05-29 02:06:55,912 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.416 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.658 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.441 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.197 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.447 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.641 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.517 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.517 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.517 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.564 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.715 2024-05-29 02:06:56,220 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 02:06:56,223 - mmdet - INFO - Epoch(val) [10][625] bbox_mAP: 0.4690, bbox_mAP_50: 0.6990, bbox_mAP_75: 0.5050, bbox_mAP_s: 0.2790, bbox_mAP_m: 0.5050, bbox_mAP_l: 0.6530, bbox_mAP_copypaste: 0.469 0.699 0.505 0.279 0.505 0.653, segm_mAP: 0.4160, segm_mAP_50: 0.6580, segm_mAP_75: 0.4410, segm_mAP_s: 0.1970, segm_mAP_m: 0.4470, segm_mAP_l: 0.6410, segm_mAP_copypaste: 0.416 0.658 0.441 0.197 0.447 0.641 2024-05-29 02:07:45,538 - mmdet - INFO - Epoch [11][50/7330] lr: 1.000e-05, eta: 3:15:42, time: 0.985, data_time: 0.096, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0383, loss_cls: 0.1508, acc: 94.1211, loss_bbox: 0.2109, loss_mask: 0.2192, loss: 0.6323 2024-05-29 02:08:23,915 - mmdet - INFO - Epoch [11][100/7330] lr: 1.000e-05, eta: 3:15:01, time: 0.768, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0366, loss_cls: 0.1530, acc: 94.1028, loss_bbox: 0.2121, loss_mask: 0.2222, loss: 0.6375 2024-05-29 02:09:02,498 - mmdet - INFO - Epoch [11][150/7330] lr: 1.000e-05, eta: 3:14:21, time: 0.771, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0360, loss_cls: 0.1450, acc: 94.4573, loss_bbox: 0.2002, loss_mask: 0.2191, loss: 0.6127 2024-05-29 02:09:40,728 - mmdet - INFO - Epoch [11][200/7330] lr: 1.000e-05, eta: 3:13:40, time: 0.765, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0328, loss_cls: 0.1385, acc: 94.7312, loss_bbox: 0.1921, loss_mask: 0.2072, loss: 0.5822 2024-05-29 02:10:19,535 - mmdet - INFO - Epoch [11][250/7330] lr: 1.000e-05, eta: 3:13:00, time: 0.776, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0370, loss_cls: 0.1461, acc: 94.4419, loss_bbox: 0.2046, loss_mask: 0.2157, loss: 0.6165 2024-05-29 02:10:58,244 - mmdet - INFO - Epoch [11][300/7330] lr: 1.000e-05, eta: 3:12:19, time: 0.774, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0385, loss_cls: 0.1547, acc: 94.0662, loss_bbox: 0.2137, loss_mask: 0.2168, loss: 0.6365 2024-05-29 02:11:36,656 - mmdet - INFO - Epoch [11][350/7330] lr: 1.000e-05, eta: 3:11:39, time: 0.768, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0369, loss_cls: 0.1491, acc: 94.2373, loss_bbox: 0.2084, loss_mask: 0.2162, loss: 0.6231 2024-05-29 02:12:15,211 - mmdet - INFO - Epoch [11][400/7330] lr: 1.000e-05, eta: 3:10:58, time: 0.771, data_time: 0.047, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0363, loss_cls: 0.1468, acc: 94.4170, loss_bbox: 
0.2069, loss_mask: 0.2195, loss: 0.6223 2024-05-29 02:12:56,028 - mmdet - INFO - Epoch [11][450/7330] lr: 1.000e-05, eta: 3:10:18, time: 0.816, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0347, loss_cls: 0.1469, acc: 94.4062, loss_bbox: 0.2010, loss_mask: 0.2139, loss: 0.6099 2024-05-29 02:13:34,451 - mmdet - INFO - Epoch [11][500/7330] lr: 1.000e-05, eta: 3:09:37, time: 0.768, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0352, loss_cls: 0.1390, acc: 94.6860, loss_bbox: 0.1940, loss_mask: 0.2121, loss: 0.5934 2024-05-29 02:14:17,366 - mmdet - INFO - Epoch [11][550/7330] lr: 1.000e-05, eta: 3:08:58, time: 0.858, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0363, loss_cls: 0.1407, acc: 94.6006, loss_bbox: 0.1979, loss_mask: 0.2161, loss: 0.6033 2024-05-29 02:15:05,005 - mmdet - INFO - Epoch [11][600/7330] lr: 1.000e-05, eta: 3:08:19, time: 0.953, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0353, loss_cls: 0.1429, acc: 94.4446, loss_bbox: 0.1982, loss_mask: 0.2108, loss: 0.5993 2024-05-29 02:15:46,237 - mmdet - INFO - Epoch [11][650/7330] lr: 1.000e-05, eta: 3:07:39, time: 0.825, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0362, loss_cls: 0.1491, acc: 94.1909, loss_bbox: 0.2019, loss_mask: 0.2188, loss: 0.6192 2024-05-29 02:16:27,440 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 02:16:27,440 - mmdet - INFO - Epoch [11][700/7330] lr: 1.000e-05, eta: 3:06:59, time: 0.824, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0360, loss_cls: 0.1512, acc: 94.2285, loss_bbox: 0.2060, loss_mask: 0.2182, loss: 0.6256 2024-05-29 02:17:06,469 - mmdet - INFO - Epoch [11][750/7330] lr: 1.000e-05, eta: 3:06:19, time: 0.781, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0357, loss_cls: 0.1462, acc: 94.3970, loss_bbox: 0.2066, loss_mask: 0.2145, loss: 0.6163 2024-05-29 02:17:45,031 - mmdet - INFO - Epoch [11][800/7330] lr: 1.000e-05, eta: 3:05:38, time: 0.771, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0362, loss_cls: 0.1502, acc: 94.2427, loss_bbox: 0.2077, loss_mask: 0.2148, loss: 0.6231 2024-05-29 02:18:23,735 - mmdet - INFO - Epoch [11][850/7330] lr: 1.000e-05, eta: 3:04:58, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0356, loss_cls: 0.1489, acc: 94.3916, loss_bbox: 0.2022, loss_mask: 0.2119, loss: 0.6120 2024-05-29 02:19:02,063 - mmdet - INFO - Epoch [11][900/7330] lr: 1.000e-05, eta: 3:04:17, time: 0.767, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0363, loss_cls: 0.1489, acc: 94.2080, loss_bbox: 0.2092, loss_mask: 0.2174, loss: 0.6251 2024-05-29 02:19:40,676 - mmdet - INFO - Epoch [11][950/7330] lr: 1.000e-05, eta: 3:03:37, time: 0.772, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0361, loss_cls: 0.1457, acc: 94.4653, loss_bbox: 0.1991, loss_mask: 0.2166, loss: 0.6112 2024-05-29 02:20:19,716 - mmdet - INFO - Epoch [11][1000/7330] lr: 1.000e-05, eta: 3:02:56, time: 0.781, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0374, loss_cls: 0.1506, acc: 94.1914, loss_bbox: 0.2116, loss_mask: 0.2211, loss: 0.6337 2024-05-29 02:21:00,379 - mmdet - INFO - Epoch [11][1050/7330] lr: 1.000e-05, eta: 3:02:16, time: 0.813, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0351, loss_cls: 0.1397, acc: 94.6216, loss_bbox: 
0.1928, loss_mask: 0.2109, loss: 0.5918 2024-05-29 02:21:38,843 - mmdet - INFO - Epoch [11][1100/7330] lr: 1.000e-05, eta: 3:01:36, time: 0.769, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0349, loss_cls: 0.1489, acc: 94.3049, loss_bbox: 0.2041, loss_mask: 0.2145, loss: 0.6148 2024-05-29 02:22:24,471 - mmdet - INFO - Epoch [11][1150/7330] lr: 1.000e-05, eta: 3:00:57, time: 0.912, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0367, loss_cls: 0.1478, acc: 94.2981, loss_bbox: 0.2074, loss_mask: 0.2150, loss: 0.6198 2024-05-29 02:23:13,133 - mmdet - INFO - Epoch [11][1200/7330] lr: 1.000e-05, eta: 3:00:18, time: 0.973, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0359, loss_cls: 0.1513, acc: 94.1150, loss_bbox: 0.2093, loss_mask: 0.2151, loss: 0.6250 2024-05-29 02:23:52,241 - mmdet - INFO - Epoch [11][1250/7330] lr: 1.000e-05, eta: 2:59:38, time: 0.782, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0386, loss_cls: 0.1519, acc: 94.1218, loss_bbox: 0.2104, loss_mask: 0.2160, loss: 0.6308 2024-05-29 02:24:31,079 - mmdet - INFO - Epoch [11][1300/7330] lr: 1.000e-05, eta: 2:58:57, time: 0.777, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0345, loss_cls: 0.1384, acc: 94.6599, loss_bbox: 0.1893, loss_mask: 0.2075, loss: 0.5826 2024-05-29 02:25:10,263 - mmdet - INFO - Epoch [11][1350/7330] lr: 1.000e-05, eta: 2:58:17, time: 0.784, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0366, loss_cls: 0.1462, acc: 94.4333, loss_bbox: 0.2067, loss_mask: 0.2149, loss: 0.6180 2024-05-29 02:25:48,987 - mmdet - INFO - Epoch [11][1400/7330] lr: 1.000e-05, eta: 2:57:36, time: 0.774, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0346, loss_cls: 0.1450, acc: 94.3906, loss_bbox: 0.2008, loss_mask: 0.2133, loss: 0.6070 2024-05-29 02:26:27,992 - mmdet - INFO - Epoch [11][1450/7330] lr: 1.000e-05, eta: 2:56:56, time: 0.780, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0371, loss_cls: 0.1491, acc: 94.3025, loss_bbox: 0.2070, loss_mask: 0.2188, loss: 0.6255 2024-05-29 02:27:06,530 - mmdet - INFO - Epoch [11][1500/7330] lr: 1.000e-05, eta: 2:56:15, time: 0.771, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0353, loss_cls: 0.1439, acc: 94.4534, loss_bbox: 0.1989, loss_mask: 0.2132, loss: 0.6048 2024-05-29 02:27:45,120 - mmdet - INFO - Epoch [11][1550/7330] lr: 1.000e-05, eta: 2:55:35, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0330, loss_cls: 0.1423, acc: 94.5515, loss_bbox: 0.1989, loss_mask: 0.2170, loss: 0.6035 2024-05-29 02:28:23,470 - mmdet - INFO - Epoch [11][1600/7330] lr: 1.000e-05, eta: 2:54:54, time: 0.767, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0349, loss_cls: 0.1428, acc: 94.5288, loss_bbox: 0.2012, loss_mask: 0.2145, loss: 0.6054 2024-05-29 02:29:03,750 - mmdet - INFO - Epoch [11][1650/7330] lr: 1.000e-05, eta: 2:54:14, time: 0.806, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0332, loss_cls: 0.1366, acc: 94.7520, loss_bbox: 0.1928, loss_mask: 0.2111, loss: 0.5874 2024-05-29 02:29:47,289 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 02:29:47,289 - mmdet - INFO - Epoch [11][1700/7330] lr: 1.000e-05, eta: 2:53:35, time: 0.871, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0375, loss_cls: 0.1493, acc: 94.2981, 
loss_bbox: 0.2086, loss_mask: 0.2196, loss: 0.6290 2024-05-29 02:30:32,738 - mmdet - INFO - Epoch [11][1750/7330] lr: 1.000e-05, eta: 2:52:55, time: 0.909, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0395, loss_cls: 0.1529, acc: 94.0981, loss_bbox: 0.2133, loss_mask: 0.2228, loss: 0.6418 2024-05-29 02:31:13,925 - mmdet - INFO - Epoch [11][1800/7330] lr: 1.000e-05, eta: 2:52:15, time: 0.824, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0369, loss_cls: 0.1371, acc: 94.7131, loss_bbox: 0.1907, loss_mask: 0.2079, loss: 0.5839 2024-05-29 02:31:52,457 - mmdet - INFO - Epoch [11][1850/7330] lr: 1.000e-05, eta: 2:51:35, time: 0.771, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0337, loss_cls: 0.1405, acc: 94.6641, loss_bbox: 0.1952, loss_mask: 0.2102, loss: 0.5909 2024-05-29 02:32:31,227 - mmdet - INFO - Epoch [11][1900/7330] lr: 1.000e-05, eta: 2:50:55, time: 0.775, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0346, loss_cls: 0.1376, acc: 94.7251, loss_bbox: 0.1919, loss_mask: 0.2109, loss: 0.5886 2024-05-29 02:33:10,055 - mmdet - INFO - Epoch [11][1950/7330] lr: 1.000e-05, eta: 2:50:14, time: 0.777, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0387, loss_cls: 0.1512, acc: 94.2488, loss_bbox: 0.2120, loss_mask: 0.2177, loss: 0.6338 2024-05-29 02:33:48,554 - mmdet - INFO - Epoch [11][2000/7330] lr: 1.000e-05, eta: 2:49:34, time: 0.770, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0360, loss_cls: 0.1468, acc: 94.3567, loss_bbox: 0.2103, loss_mask: 0.2170, loss: 0.6224 2024-05-29 02:34:27,218 - mmdet - INFO - Epoch [11][2050/7330] lr: 1.000e-05, eta: 2:48:53, time: 0.773, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0348, loss_cls: 0.1451, acc: 94.4287, loss_bbox: 0.2060, loss_mask: 0.2134, loss: 0.6124 2024-05-29 02:35:05,653 - mmdet - INFO - Epoch [11][2100/7330] lr: 1.000e-05, eta: 2:48:13, time: 0.769, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0358, loss_cls: 0.1384, acc: 94.6226, loss_bbox: 0.1963, loss_mask: 0.2111, loss: 0.5951 2024-05-29 02:35:44,884 - mmdet - INFO - Epoch [11][2150/7330] lr: 1.000e-05, eta: 2:47:32, time: 0.784, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0355, loss_cls: 0.1432, acc: 94.4692, loss_bbox: 0.1976, loss_mask: 0.2124, loss: 0.6024 2024-05-29 02:36:23,516 - mmdet - INFO - Epoch [11][2200/7330] lr: 1.000e-05, eta: 2:46:52, time: 0.773, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0350, loss_cls: 0.1454, acc: 94.4258, loss_bbox: 0.2026, loss_mask: 0.2117, loss: 0.6075 2024-05-29 02:37:09,437 - mmdet - INFO - Epoch [11][2250/7330] lr: 1.000e-05, eta: 2:46:13, time: 0.918, data_time: 0.047, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0372, loss_cls: 0.1472, acc: 94.3418, loss_bbox: 0.2064, loss_mask: 0.2140, loss: 0.6182 2024-05-29 02:37:52,258 - mmdet - INFO - Epoch [11][2300/7330] lr: 1.000e-05, eta: 2:45:33, time: 0.856, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0351, loss_cls: 0.1458, acc: 94.4492, loss_bbox: 0.2032, loss_mask: 0.2163, loss: 0.6135 2024-05-29 02:38:33,019 - mmdet - INFO - Epoch [11][2350/7330] lr: 1.000e-05, eta: 2:44:53, time: 0.815, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0353, loss_cls: 0.1382, acc: 94.6611, loss_bbox: 0.1986, loss_mask: 0.2172, loss: 0.6015 2024-05-29 02:39:15,316 - mmdet - INFO - 
Epoch [11][2400/7330] lr: 1.000e-05, eta: 2:44:13, time: 0.846, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0356, loss_cls: 0.1468, acc: 94.3831, loss_bbox: 0.2044, loss_mask: 0.2121, loss: 0.6113 2024-05-29 02:39:54,499 - mmdet - INFO - Epoch [11][2450/7330] lr: 1.000e-05, eta: 2:43:33, time: 0.784, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0372, loss_cls: 0.1440, acc: 94.5159, loss_bbox: 0.1983, loss_mask: 0.2135, loss: 0.6073 2024-05-29 02:40:32,897 - mmdet - INFO - Epoch [11][2500/7330] lr: 1.000e-05, eta: 2:42:52, time: 0.768, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0346, loss_cls: 0.1439, acc: 94.5452, loss_bbox: 0.1963, loss_mask: 0.2105, loss: 0.5974 2024-05-29 02:41:11,566 - mmdet - INFO - Epoch [11][2550/7330] lr: 1.000e-05, eta: 2:42:12, time: 0.773, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0338, loss_cls: 0.1385, acc: 94.6196, loss_bbox: 0.1915, loss_mask: 0.2092, loss: 0.5847 2024-05-29 02:41:50,705 - mmdet - INFO - Epoch [11][2600/7330] lr: 1.000e-05, eta: 2:41:31, time: 0.783, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0363, loss_cls: 0.1541, acc: 94.0674, loss_bbox: 0.2119, loss_mask: 0.2207, loss: 0.6363 2024-05-29 02:42:29,376 - mmdet - INFO - Epoch [11][2650/7330] lr: 1.000e-05, eta: 2:40:51, time: 0.773, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0379, loss_cls: 0.1497, acc: 94.2507, loss_bbox: 0.2086, loss_mask: 0.2184, loss: 0.6280 2024-05-29 02:43:07,967 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 02:43:07,967 - mmdet - INFO - Epoch [11][2700/7330] lr: 1.000e-05, eta: 2:40:11, time: 0.772, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0335, loss_cls: 0.1369, acc: 94.7722, loss_bbox: 0.1917, loss_mask: 0.2083, loss: 0.5822 2024-05-29 02:43:49,101 - mmdet - INFO - Epoch [11][2750/7330] lr: 1.000e-05, eta: 2:39:31, time: 0.823, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0369, loss_cls: 0.1481, acc: 94.3291, loss_bbox: 0.2020, loss_mask: 0.2102, loss: 0.6103 2024-05-29 02:44:27,751 - mmdet - INFO - Epoch [11][2800/7330] lr: 1.000e-05, eta: 2:38:50, time: 0.773, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0341, loss_cls: 0.1393, acc: 94.6221, loss_bbox: 0.1966, loss_mask: 0.2071, loss: 0.5880 2024-05-29 02:45:11,338 - mmdet - INFO - Epoch [11][2850/7330] lr: 1.000e-05, eta: 2:38:11, time: 0.872, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0351, loss_cls: 0.1439, acc: 94.5010, loss_bbox: 0.2033, loss_mask: 0.2143, loss: 0.6097 2024-05-29 02:45:54,732 - mmdet - INFO - Epoch [11][2900/7330] lr: 1.000e-05, eta: 2:37:31, time: 0.868, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0362, loss_cls: 0.1565, acc: 94.0354, loss_bbox: 0.2135, loss_mask: 0.2232, loss: 0.6434 2024-05-29 02:46:35,974 - mmdet - INFO - Epoch [11][2950/7330] lr: 1.000e-05, eta: 2:36:51, time: 0.825, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0377, loss_cls: 0.1483, acc: 94.1670, loss_bbox: 0.2028, loss_mask: 0.2098, loss: 0.6117 2024-05-29 02:47:17,447 - mmdet - INFO - Epoch [11][3000/7330] lr: 1.000e-05, eta: 2:36:11, time: 0.830, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0345, loss_cls: 0.1461, acc: 94.3945, loss_bbox: 0.2028, loss_mask: 0.2152, loss: 0.6119 2024-05-29 02:47:56,433 - 
mmdet - INFO - Epoch [11][3050/7330] lr: 1.000e-05, eta: 2:35:30, time: 0.780, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0367, loss_cls: 0.1511, acc: 94.2317, loss_bbox: 0.2094, loss_mask: 0.2180, loss: 0.6299 2024-05-29 02:48:35,037 - mmdet - INFO - Epoch [11][3100/7330] lr: 1.000e-05, eta: 2:34:50, time: 0.772, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0364, loss_cls: 0.1449, acc: 94.5076, loss_bbox: 0.1962, loss_mask: 0.2147, loss: 0.6052 2024-05-29 02:49:13,694 - mmdet - INFO - Epoch [11][3150/7330] lr: 1.000e-05, eta: 2:34:10, time: 0.773, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0378, loss_cls: 0.1534, acc: 94.1660, loss_bbox: 0.2106, loss_mask: 0.2188, loss: 0.6346 2024-05-29 02:49:52,779 - mmdet - INFO - Epoch [11][3200/7330] lr: 1.000e-05, eta: 2:33:29, time: 0.782, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0371, loss_cls: 0.1483, acc: 94.2122, loss_bbox: 0.2080, loss_mask: 0.2194, loss: 0.6259 2024-05-29 02:50:31,991 - mmdet - INFO - Epoch [11][3250/7330] lr: 1.000e-05, eta: 2:32:49, time: 0.784, data_time: 0.046, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0387, loss_cls: 0.1537, acc: 94.1294, loss_bbox: 0.2116, loss_mask: 0.2201, loss: 0.6379 2024-05-29 02:51:10,722 - mmdet - INFO - Epoch [11][3300/7330] lr: 1.000e-05, eta: 2:32:09, time: 0.775, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0341, loss_cls: 0.1457, acc: 94.4026, loss_bbox: 0.2053, loss_mask: 0.2149, loss: 0.6120 2024-05-29 02:51:51,713 - mmdet - INFO - Epoch [11][3350/7330] lr: 1.000e-05, eta: 2:31:28, time: 0.820, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0351, loss_cls: 0.1452, acc: 94.4512, loss_bbox: 0.1989, loss_mask: 0.2097, loss: 0.6022 2024-05-29 02:52:30,727 - mmdet - INFO - Epoch [11][3400/7330] lr: 1.000e-05, eta: 2:30:48, time: 0.780, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0379, loss_cls: 0.1501, acc: 94.1687, loss_bbox: 0.2108, loss_mask: 0.2205, loss: 0.6333 2024-05-29 02:53:13,596 - mmdet - INFO - Epoch [11][3450/7330] lr: 1.000e-05, eta: 2:30:08, time: 0.857, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0350, loss_cls: 0.1446, acc: 94.4668, loss_bbox: 0.1989, loss_mask: 0.2137, loss: 0.6049 2024-05-29 02:53:57,096 - mmdet - INFO - Epoch [11][3500/7330] lr: 1.000e-05, eta: 2:29:29, time: 0.870, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0372, loss_cls: 0.1487, acc: 94.2209, loss_bbox: 0.2065, loss_mask: 0.2174, loss: 0.6235 2024-05-29 02:54:38,082 - mmdet - INFO - Epoch [11][3550/7330] lr: 1.000e-05, eta: 2:28:49, time: 0.820, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0339, loss_cls: 0.1480, acc: 94.4136, loss_bbox: 0.2034, loss_mask: 0.2154, loss: 0.6136 2024-05-29 02:55:19,947 - mmdet - INFO - Epoch [11][3600/7330] lr: 1.000e-05, eta: 2:28:09, time: 0.837, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0351, loss_cls: 0.1380, acc: 94.7546, loss_bbox: 0.1976, loss_mask: 0.2094, loss: 0.5914 2024-05-29 02:55:58,681 - mmdet - INFO - Epoch [11][3650/7330] lr: 1.000e-05, eta: 2:27:28, time: 0.775, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0348, loss_cls: 0.1415, acc: 94.5950, loss_bbox: 0.1974, loss_mask: 0.2160, loss: 0.6037 2024-05-29 02:56:37,571 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 
02:56:37,571 - mmdet - INFO - Epoch [11][3700/7330] lr: 1.000e-05, eta: 2:26:48, time: 0.778, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0362, loss_cls: 0.1535, acc: 94.0459, loss_bbox: 0.2119, loss_mask: 0.2164, loss: 0.6318 2024-05-29 02:57:16,181 - mmdet - INFO - Epoch [11][3750/7330] lr: 1.000e-05, eta: 2:26:07, time: 0.772, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0359, loss_cls: 0.1423, acc: 94.5793, loss_bbox: 0.1976, loss_mask: 0.2136, loss: 0.6021 2024-05-29 02:57:55,056 - mmdet - INFO - Epoch [11][3800/7330] lr: 1.000e-05, eta: 2:25:27, time: 0.777, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0335, loss_cls: 0.1401, acc: 94.6394, loss_bbox: 0.1947, loss_mask: 0.2069, loss: 0.5883 2024-05-29 02:58:33,637 - mmdet - INFO - Epoch [11][3850/7330] lr: 1.000e-05, eta: 2:24:47, time: 0.772, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0337, loss_cls: 0.1423, acc: 94.5913, loss_bbox: 0.1932, loss_mask: 0.2110, loss: 0.5924 2024-05-29 02:59:12,587 - mmdet - INFO - Epoch [11][3900/7330] lr: 1.000e-05, eta: 2:24:06, time: 0.779, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0370, loss_cls: 0.1473, acc: 94.3804, loss_bbox: 0.2060, loss_mask: 0.2182, loss: 0.6223 2024-05-29 02:59:54,243 - mmdet - INFO - Epoch [11][3950/7330] lr: 1.000e-05, eta: 2:23:26, time: 0.833, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0385, loss_cls: 0.1528, acc: 94.1721, loss_bbox: 0.2081, loss_mask: 0.2145, loss: 0.6286 2024-05-29 03:00:33,459 - mmdet - INFO - Epoch [11][4000/7330] lr: 1.000e-05, eta: 2:22:46, time: 0.784, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0375, loss_cls: 0.1519, acc: 94.1514, loss_bbox: 0.2094, loss_mask: 0.2155, loss: 0.6286 2024-05-29 03:01:16,786 - mmdet - INFO - Epoch [11][4050/7330] lr: 1.000e-05, eta: 2:22:06, time: 0.867, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0358, loss_cls: 0.1460, acc: 94.4446, loss_bbox: 0.1993, loss_mask: 0.2116, loss: 0.6050 2024-05-29 03:01:59,997 - mmdet - INFO - Epoch [11][4100/7330] lr: 1.000e-05, eta: 2:21:27, time: 0.864, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0354, loss_cls: 0.1454, acc: 94.4456, loss_bbox: 0.1994, loss_mask: 0.2149, loss: 0.6082 2024-05-29 03:02:41,313 - mmdet - INFO - Epoch [11][4150/7330] lr: 1.000e-05, eta: 2:20:46, time: 0.826, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0351, loss_cls: 0.1460, acc: 94.5120, loss_bbox: 0.2023, loss_mask: 0.2186, loss: 0.6155 2024-05-29 03:03:22,565 - mmdet - INFO - Epoch [11][4200/7330] lr: 1.000e-05, eta: 2:20:06, time: 0.825, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0350, loss_cls: 0.1409, acc: 94.5535, loss_bbox: 0.1974, loss_mask: 0.2085, loss: 0.5940 2024-05-29 03:04:01,226 - mmdet - INFO - Epoch [11][4250/7330] lr: 1.000e-05, eta: 2:19:26, time: 0.773, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0347, loss_cls: 0.1421, acc: 94.5496, loss_bbox: 0.2005, loss_mask: 0.2119, loss: 0.6010 2024-05-29 03:04:39,412 - mmdet - INFO - Epoch [11][4300/7330] lr: 1.000e-05, eta: 2:18:46, time: 0.764, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0332, loss_cls: 0.1371, acc: 94.7683, loss_bbox: 0.1922, loss_mask: 0.2115, loss: 0.5849 2024-05-29 03:05:18,028 - mmdet - INFO - Epoch [11][4350/7330] lr: 1.000e-05, eta: 2:18:05, time: 0.772, 
data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0361, loss_cls: 0.1479, acc: 94.3582, loss_bbox: 0.2035, loss_mask: 0.2193, loss: 0.6205 2024-05-29 03:05:56,786 - mmdet - INFO - Epoch [11][4400/7330] lr: 1.000e-05, eta: 2:17:25, time: 0.775, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0369, loss_cls: 0.1458, acc: 94.3862, loss_bbox: 0.2071, loss_mask: 0.2161, loss: 0.6199 2024-05-29 03:06:35,659 - mmdet - INFO - Epoch [11][4450/7330] lr: 1.000e-05, eta: 2:16:44, time: 0.777, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0346, loss_cls: 0.1418, acc: 94.5854, loss_bbox: 0.1969, loss_mask: 0.2103, loss: 0.5954 2024-05-29 03:07:14,445 - mmdet - INFO - Epoch [11][4500/7330] lr: 1.000e-05, eta: 2:16:04, time: 0.776, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0372, loss_cls: 0.1511, acc: 94.1433, loss_bbox: 0.2046, loss_mask: 0.2155, loss: 0.6209 2024-05-29 03:07:56,300 - mmdet - INFO - Epoch [11][4550/7330] lr: 1.000e-05, eta: 2:15:24, time: 0.837, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0357, loss_cls: 0.1430, acc: 94.3677, loss_bbox: 0.2019, loss_mask: 0.2183, loss: 0.6111 2024-05-29 03:08:35,021 - mmdet - INFO - Epoch [11][4600/7330] lr: 1.000e-05, eta: 2:14:44, time: 0.774, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0357, loss_cls: 0.1484, acc: 94.2847, loss_bbox: 0.2045, loss_mask: 0.2176, loss: 0.6183 2024-05-29 03:09:18,272 - mmdet - INFO - Epoch [11][4650/7330] lr: 1.000e-05, eta: 2:14:04, time: 0.865, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0361, loss_cls: 0.1468, acc: 94.3687, loss_bbox: 0.2055, loss_mask: 0.2194, loss: 0.6218 2024-05-29 03:10:02,084 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 03:10:02,085 - mmdet - INFO - Epoch [11][4700/7330] lr: 1.000e-05, eta: 2:13:24, time: 0.876, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0376, loss_cls: 0.1508, acc: 94.2520, loss_bbox: 0.2120, loss_mask: 0.2171, loss: 0.6310 2024-05-29 03:10:46,405 - mmdet - INFO - Epoch [11][4750/7330] lr: 1.000e-05, eta: 2:12:45, time: 0.886, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0362, loss_cls: 0.1474, acc: 94.3694, loss_bbox: 0.2082, loss_mask: 0.2130, loss: 0.6174 2024-05-29 03:11:25,108 - mmdet - INFO - Epoch [11][4800/7330] lr: 1.000e-05, eta: 2:12:04, time: 0.774, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0367, loss_cls: 0.1451, acc: 94.4646, loss_bbox: 0.2002, loss_mask: 0.2158, loss: 0.6111 2024-05-29 03:12:04,228 - mmdet - INFO - Epoch [11][4850/7330] lr: 1.000e-05, eta: 2:11:24, time: 0.782, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0348, loss_cls: 0.1457, acc: 94.3958, loss_bbox: 0.2015, loss_mask: 0.2164, loss: 0.6118 2024-05-29 03:12:43,197 - mmdet - INFO - Epoch [11][4900/7330] lr: 1.000e-05, eta: 2:10:44, time: 0.779, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0348, loss_cls: 0.1445, acc: 94.4521, loss_bbox: 0.1992, loss_mask: 0.2104, loss: 0.6010 2024-05-29 03:13:21,970 - mmdet - INFO - Epoch [11][4950/7330] lr: 1.000e-05, eta: 2:10:03, time: 0.775, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0359, loss_cls: 0.1494, acc: 94.2896, loss_bbox: 0.2078, loss_mask: 0.2186, loss: 0.6255 2024-05-29 03:14:00,820 - mmdet - INFO - Epoch [11][5000/7330] lr: 1.000e-05, eta: 2:09:23, 
time: 0.777, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0347, loss_cls: 0.1435, acc: 94.4324, loss_bbox: 0.1967, loss_mask: 0.2078, loss: 0.5955 2024-05-29 03:14:39,761 - mmdet - INFO - Epoch [11][5050/7330] lr: 1.000e-05, eta: 2:08:43, time: 0.779, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0360, loss_cls: 0.1478, acc: 94.3274, loss_bbox: 0.2065, loss_mask: 0.2178, loss: 0.6212 2024-05-29 03:15:18,448 - mmdet - INFO - Epoch [11][5100/7330] lr: 1.000e-05, eta: 2:08:02, time: 0.774, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0361, loss_cls: 0.1453, acc: 94.3923, loss_bbox: 0.2044, loss_mask: 0.2171, loss: 0.6148 2024-05-29 03:16:00,533 - mmdet - INFO - Epoch [11][5150/7330] lr: 1.000e-05, eta: 2:07:22, time: 0.842, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0355, loss_cls: 0.1436, acc: 94.5415, loss_bbox: 0.2000, loss_mask: 0.2179, loss: 0.6096 2024-05-29 03:16:41,842 - mmdet - INFO - Epoch [11][5200/7330] lr: 1.000e-05, eta: 2:06:42, time: 0.826, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0377, loss_cls: 0.1531, acc: 94.1299, loss_bbox: 0.2119, loss_mask: 0.2201, loss: 0.6360 2024-05-29 03:17:23,159 - mmdet - INFO - Epoch [11][5250/7330] lr: 1.000e-05, eta: 2:06:02, time: 0.826, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0356, loss_cls: 0.1442, acc: 94.4580, loss_bbox: 0.2034, loss_mask: 0.2126, loss: 0.6085 2024-05-29 03:18:09,378 - mmdet - INFO - Epoch [11][5300/7330] lr: 1.000e-05, eta: 2:05:23, time: 0.924, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0345, loss_cls: 0.1376, acc: 94.6814, loss_bbox: 0.1907, loss_mask: 0.2073, loss: 0.5811 2024-05-29 03:18:51,440 - mmdet - INFO - Epoch [11][5350/7330] lr: 1.000e-05, eta: 2:04:43, time: 0.841, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0349, loss_cls: 0.1471, acc: 94.3394, loss_bbox: 0.2026, loss_mask: 0.2199, loss: 0.6188 2024-05-29 03:19:30,563 - mmdet - INFO - Epoch [11][5400/7330] lr: 1.000e-05, eta: 2:04:02, time: 0.782, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0358, loss_cls: 0.1457, acc: 94.5325, loss_bbox: 0.2011, loss_mask: 0.2145, loss: 0.6102 2024-05-29 03:20:09,668 - mmdet - INFO - Epoch [11][5450/7330] lr: 1.000e-05, eta: 2:03:22, time: 0.782, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0381, loss_cls: 0.1579, acc: 94.0410, loss_bbox: 0.2131, loss_mask: 0.2234, loss: 0.6469 2024-05-29 03:20:48,247 - mmdet - INFO - Epoch [11][5500/7330] lr: 1.000e-05, eta: 2:02:42, time: 0.772, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0356, loss_cls: 0.1424, acc: 94.6101, loss_bbox: 0.1976, loss_mask: 0.2143, loss: 0.6029 2024-05-29 03:21:26,829 - mmdet - INFO - Epoch [11][5550/7330] lr: 1.000e-05, eta: 2:02:01, time: 0.772, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0367, loss_cls: 0.1466, acc: 94.4233, loss_bbox: 0.2029, loss_mask: 0.2171, loss: 0.6172 2024-05-29 03:22:05,420 - mmdet - INFO - Epoch [11][5600/7330] lr: 1.000e-05, eta: 2:01:21, time: 0.772, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0347, loss_cls: 0.1404, acc: 94.7056, loss_bbox: 0.1930, loss_mask: 0.2095, loss: 0.5894 2024-05-29 03:22:44,476 - mmdet - INFO - Epoch [11][5650/7330] lr: 1.000e-05, eta: 2:00:41, time: 0.781, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0367, 
loss_cls: 0.1498, acc: 94.2764, loss_bbox: 0.2058, loss_mask: 0.2116, loss: 0.6178 2024-05-29 03:23:23,142 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 03:23:23,142 - mmdet - INFO - Epoch [11][5700/7330] lr: 1.000e-05, eta: 2:00:00, time: 0.773, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0349, loss_cls: 0.1446, acc: 94.4771, loss_bbox: 0.2004, loss_mask: 0.2164, loss: 0.6093 2024-05-29 03:24:04,690 - mmdet - INFO - Epoch [11][5750/7330] lr: 1.000e-05, eta: 1:59:20, time: 0.831, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0345, loss_cls: 0.1479, acc: 94.3611, loss_bbox: 0.1999, loss_mask: 0.2125, loss: 0.6066 2024-05-29 03:24:45,582 - mmdet - INFO - Epoch [11][5800/7330] lr: 1.000e-05, eta: 1:58:40, time: 0.818, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0363, loss_cls: 0.1440, acc: 94.4998, loss_bbox: 0.2012, loss_mask: 0.2121, loss: 0.6067 2024-05-29 03:25:30,084 - mmdet - INFO - Epoch [11][5850/7330] lr: 1.000e-05, eta: 1:58:00, time: 0.890, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0362, loss_cls: 0.1481, acc: 94.3376, loss_bbox: 0.2015, loss_mask: 0.2143, loss: 0.6137 2024-05-29 03:26:14,281 - mmdet - INFO - Epoch [11][5900/7330] lr: 1.000e-05, eta: 1:57:21, time: 0.884, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0375, loss_cls: 0.1530, acc: 94.0833, loss_bbox: 0.2092, loss_mask: 0.2180, loss: 0.6312 2024-05-29 03:26:58,160 - mmdet - INFO - Epoch [11][5950/7330] lr: 1.000e-05, eta: 1:56:41, time: 0.878, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0361, loss_cls: 0.1458, acc: 94.3735, loss_bbox: 0.1990, loss_mask: 0.2134, loss: 0.6070 2024-05-29 03:27:37,036 - mmdet - INFO - Epoch [11][6000/7330] lr: 1.000e-05, eta: 1:56:01, time: 0.778, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0352, loss_cls: 0.1505, acc: 94.3760, loss_bbox: 0.2073, loss_mask: 0.2136, loss: 0.6187 2024-05-29 03:28:15,585 - mmdet - INFO - Epoch [11][6050/7330] lr: 1.000e-05, eta: 1:55:20, time: 0.771, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0354, loss_cls: 0.1468, acc: 94.3577, loss_bbox: 0.2046, loss_mask: 0.2204, loss: 0.6201 2024-05-29 03:28:54,643 - mmdet - INFO - Epoch [11][6100/7330] lr: 1.000e-05, eta: 1:54:40, time: 0.781, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0373, loss_cls: 0.1493, acc: 94.1912, loss_bbox: 0.2081, loss_mask: 0.2120, loss: 0.6208 2024-05-29 03:29:33,774 - mmdet - INFO - Epoch [11][6150/7330] lr: 1.000e-05, eta: 1:54:00, time: 0.783, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0369, loss_cls: 0.1461, acc: 94.3870, loss_bbox: 0.2032, loss_mask: 0.2155, loss: 0.6151 2024-05-29 03:30:12,873 - mmdet - INFO - Epoch [11][6200/7330] lr: 1.000e-05, eta: 1:53:19, time: 0.782, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0363, loss_cls: 0.1461, acc: 94.3479, loss_bbox: 0.2039, loss_mask: 0.2159, loss: 0.6149 2024-05-29 03:30:51,698 - mmdet - INFO - Epoch [11][6250/7330] lr: 1.000e-05, eta: 1:52:39, time: 0.776, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0345, loss_cls: 0.1416, acc: 94.5803, loss_bbox: 0.1983, loss_mask: 0.2123, loss: 0.6000 2024-05-29 03:31:33,155 - mmdet - INFO - Epoch [11][6300/7330] lr: 1.000e-05, eta: 1:51:59, time: 0.829, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0129, 
loss_rpn_bbox: 0.0348, loss_cls: 0.1504, acc: 94.2766, loss_bbox: 0.2044, loss_mask: 0.2139, loss: 0.6165 2024-05-29 03:32:14,277 - mmdet - INFO - Epoch [11][6350/7330] lr: 1.000e-05, eta: 1:51:19, time: 0.822, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0375, loss_cls: 0.1547, acc: 94.1526, loss_bbox: 0.2131, loss_mask: 0.2160, loss: 0.6356 2024-05-29 03:32:55,263 - mmdet - INFO - Epoch [11][6400/7330] lr: 1.000e-05, eta: 1:50:39, time: 0.819, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0366, loss_cls: 0.1508, acc: 94.1951, loss_bbox: 0.2080, loss_mask: 0.2144, loss: 0.6241 2024-05-29 03:33:38,568 - mmdet - INFO - Epoch [11][6450/7330] lr: 1.000e-05, eta: 1:49:59, time: 0.866, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0356, loss_cls: 0.1461, acc: 94.3354, loss_bbox: 0.2063, loss_mask: 0.2175, loss: 0.6184 2024-05-29 03:34:21,922 - mmdet - INFO - Epoch [11][6500/7330] lr: 1.000e-05, eta: 1:49:19, time: 0.867, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0355, loss_cls: 0.1414, acc: 94.5750, loss_bbox: 0.1998, loss_mask: 0.2147, loss: 0.6041 2024-05-29 03:35:02,817 - mmdet - INFO - Epoch [11][6550/7330] lr: 1.000e-05, eta: 1:48:39, time: 0.818, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0371, loss_cls: 0.1533, acc: 94.1787, loss_bbox: 0.2106, loss_mask: 0.2189, loss: 0.6331 2024-05-29 03:35:41,628 - mmdet - INFO - Epoch [11][6600/7330] lr: 1.000e-05, eta: 1:47:59, time: 0.776, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0334, loss_cls: 0.1392, acc: 94.6711, loss_bbox: 0.1936, loss_mask: 0.2089, loss: 0.5874 2024-05-29 03:36:20,266 - mmdet - INFO - Epoch [11][6650/7330] lr: 1.000e-05, eta: 1:47:18, time: 0.773, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0365, loss_cls: 0.1419, acc: 94.5640, loss_bbox: 0.1963, loss_mask: 0.2139, loss: 0.6012 2024-05-29 03:36:58,816 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py 2024-05-29 03:36:58,816 - mmdet - INFO - Epoch [11][6700/7330] lr: 1.000e-05, eta: 1:46:38, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0363, loss_cls: 0.1451, acc: 94.4270, loss_bbox: 0.2001, loss_mask: 0.2113, loss: 0.6063 2024-05-29 03:37:37,526 - mmdet - INFO - Epoch [11][6750/7330] lr: 1.000e-05, eta: 1:45:57, time: 0.774, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0374, loss_cls: 0.1493, acc: 94.3184, loss_bbox: 0.2061, loss_mask: 0.2159, loss: 0.6227 2024-05-29 03:38:18,722 - mmdet - INFO - Epoch [11][6800/7330] lr: 1.000e-05, eta: 1:45:17, time: 0.824, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0359, loss_cls: 0.1415, acc: 94.5676, loss_bbox: 0.1947, loss_mask: 0.2112, loss: 0.5961 2024-05-29 03:39:00,056 - mmdet - INFO - Epoch [11][6850/7330] lr: 1.000e-05, eta: 1:44:37, time: 0.827, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0354, loss_cls: 0.1460, acc: 94.4448, loss_bbox: 0.2029, loss_mask: 0.2170, loss: 0.6134 2024-05-29 03:39:38,885 - mmdet - INFO - Epoch [11][6900/7330] lr: 1.000e-05, eta: 1:43:57, time: 0.777, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0352, loss_cls: 0.1489, acc: 94.2959, loss_bbox: 0.2000, loss_mask: 0.2151, loss: 0.6127 2024-05-29 03:40:17,806 - mmdet - INFO - Epoch [11][6950/7330] lr: 1.000e-05, eta: 1:43:17, time: 0.778, data_time: 0.037, memory: 13582, 
loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0355, loss_cls: 0.1473, acc: 94.3425, loss_bbox: 0.2033, loss_mask: 0.2158, loss: 0.6153 2024-05-29 03:40:58,713 - mmdet - INFO - Epoch [11][7000/7330] lr: 1.000e-05, eta: 1:42:37, time: 0.818, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0358, loss_cls: 0.1450, acc: 94.3911, loss_bbox: 0.2053, loss_mask: 0.2179, loss: 0.6159 2024-05-29 03:41:42,161 - mmdet - INFO - Epoch [11][7050/7330] lr: 1.000e-05, eta: 1:41:57, time: 0.869, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0354, loss_cls: 0.1416, acc: 94.5510, loss_bbox: 0.1997, loss_mask: 0.2149, loss: 0.6036 2024-05-29 03:42:25,831 - mmdet - INFO - Epoch [11][7100/7330] lr: 1.000e-05, eta: 1:41:17, time: 0.874, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0361, loss_cls: 0.1443, acc: 94.4485, loss_bbox: 0.2020, loss_mask: 0.2106, loss: 0.6075 2024-05-29 03:43:06,653 - mmdet - INFO - Epoch [11][7150/7330] lr: 1.000e-05, eta: 1:40:37, time: 0.816, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0365, loss_cls: 0.1530, acc: 94.1826, loss_bbox: 0.2097, loss_mask: 0.2194, loss: 0.6323 2024-05-29 03:43:45,184 - mmdet - INFO - Epoch [11][7200/7330] lr: 1.000e-05, eta: 1:39:56, time: 0.771, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0352, loss_cls: 0.1448, acc: 94.5554, loss_bbox: 0.2011, loss_mask: 0.2216, loss: 0.6153 2024-05-29 03:44:24,012 - mmdet - INFO - Epoch [11][7250/7330] lr: 1.000e-05, eta: 1:39:16, time: 0.777, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0371, loss_cls: 0.1493, acc: 94.2771, loss_bbox: 0.2039, loss_mask: 0.2142, loss: 0.6179 2024-05-29 03:45:02,881 - mmdet - INFO - Epoch [11][7300/7330] lr: 1.000e-05, eta: 1:38:36, time: 0.777, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0368, loss_cls: 0.1484, acc: 94.3701, loss_bbox: 0.2048, loss_mask: 0.2164, loss: 0.6205
2024-05-29 03:45:26,881 - mmdet - INFO - Saving checkpoint at 11 epochs
2024-05-29 03:47:46,925 - mmdet - INFO - Evaluating bbox...
2024-05-29 03:48:11,273 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.470
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.699
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.508
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.278
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.506
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.654
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.579
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.579
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.579
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.377
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.620
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.759
2024-05-29 03:48:11,274 - mmdet - INFO - Evaluating segm...
2024-05-29 03:48:35,336 - mmdet - INFO -
Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.418
Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.661
Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.445
Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.198
Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.450
Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.642
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.519
Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.305
Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.564
Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.718
2024-05-29 03:48:35,847 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-29 03:48:35,850 - mmdet - INFO - Epoch(val) [11][625] bbox_mAP: 0.4700, bbox_mAP_50: 0.6990, bbox_mAP_75: 0.5080, bbox_mAP_s: 0.2780, bbox_mAP_m: 0.5060, bbox_mAP_l: 0.6540, bbox_mAP_copypaste: 0.470 0.699 0.508 0.278 0.506 0.654, segm_mAP: 0.4180, segm_mAP_50: 0.6610, segm_mAP_75: 0.4450, segm_mAP_s: 0.1980, segm_mAP_m: 0.4500, segm_mAP_l: 0.6420, segm_mAP_copypaste: 0.418 0.661 0.445 0.198 0.450 0.642
2024-05-29 03:49:23,582 - mmdet - INFO - Epoch [12][50/7330] lr: 1.000e-06, eta: 1:37:30, time: 0.954, data_time: 0.101, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0349, loss_cls: 0.1393, acc: 94.6426, loss_bbox: 0.1971, loss_mask: 0.2143, loss: 0.5981 2024-05-29 03:50:02,960 - mmdet - INFO - Epoch [12][100/7330] lr: 1.000e-06, eta: 1:36:50, time: 0.788, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0381, loss_cls: 0.1510, acc: 94.1042, loss_bbox: 0.2093, loss_mask: 0.2154, loss: 0.6275 2024-05-29 03:50:45,208 - mmdet - INFO - Epoch [12][150/7330] lr: 1.000e-06, eta: 1:36:10, time: 0.845, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0357, loss_cls: 0.1413, acc: 94.4409, loss_bbox: 0.2025, loss_mask: 0.2189, loss: 0.6112 2024-05-29 03:51:24,066 - mmdet - INFO - Epoch [12][200/7330] lr: 1.000e-06, eta: 1:35:29, time: 0.777, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0361, loss_cls: 0.1446, acc: 94.5000, loss_bbox: 0.2006, loss_mask: 0.2124, loss: 0.6064 2024-05-29 03:52:02,674 - mmdet - INFO - Epoch [12][250/7330] lr: 1.000e-06, eta: 1:34:49, time: 0.772, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0367, loss_cls: 0.1457, acc: 94.3687, loss_bbox: 0.1993, loss_mask: 0.2154, loss: 0.6104 2024-05-29 03:52:41,127 - mmdet - INFO - Epoch [12][300/7330] lr: 1.000e-06, eta: 1:34:09, time: 0.769, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0352, loss_cls: 0.1370, acc: 94.7046, loss_bbox: 0.1935, loss_mask: 0.2123, loss: 0.5897 2024-05-29 03:53:22,586 - mmdet - INFO - Epoch [12][350/7330] lr: 1.000e-06, eta: 1:33:29, time: 0.829, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0340, loss_cls: 0.1427, acc: 94.5015, loss_bbox: 0.2048, loss_mask: 0.2132, loss: 0.6068 2024-05-29 03:54:01,127 - mmdet - INFO - Epoch [12][400/7330] lr: 1.000e-06, eta: 1:32:48, time: 0.771, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0372, loss_cls: 0.1468, acc: 94.4180, loss_bbox:
0.2034, loss_mask: 0.2195, loss: 0.6200 2024-05-29 03:54:39,370 - mmdet - INFO - Epoch [12][450/7330] lr: 1.000e-06, eta: 1:32:08, time: 0.765, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0371, loss_cls: 0.1462, acc: 94.4209, loss_bbox: 0.2042, loss_mask: 0.2118, loss: 0.6117 2024-05-29 03:55:17,811 - mmdet - INFO - Epoch [12][500/7330] lr: 1.000e-06, eta: 1:31:28, time: 0.769, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0356, loss_cls: 0.1466, acc: 94.3723, loss_bbox: 0.2057, loss_mask: 0.2129, loss: 0.6125 2024-05-29 03:55:58,415 - mmdet - INFO - Epoch [12][550/7330] lr: 1.000e-06, eta: 1:30:48, time: 0.812, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0374, loss_cls: 0.1514, acc: 94.1958, loss_bbox: 0.2086, loss_mask: 0.2129, loss: 0.6250 2024-05-29 03:56:36,815 - mmdet - INFO - Epoch [12][600/7330] lr: 1.000e-06, eta: 1:30:07, time: 0.768, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0344, loss_cls: 0.1381, acc: 94.6655, loss_bbox: 0.1939, loss_mask: 0.2092, loss: 0.5884 2024-05-29 03:57:22,453 - mmdet - INFO - Epoch [12][650/7330] lr: 1.000e-06, eta: 1:29:27, time: 0.913, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0374, loss_cls: 0.1505, acc: 94.2222, loss_bbox: 0.2063, loss_mask: 0.2163, loss: 0.6237 2024-05-29 03:58:03,634 - mmdet - INFO - Epoch [12][700/7330] lr: 1.000e-06, eta: 1:28:47, time: 0.824, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0379, loss_cls: 0.1525, acc: 94.0471, loss_bbox: 0.2139, loss_mask: 0.2188, loss: 0.6364 2024-05-29 03:58:41,906 - mmdet - INFO - Epoch [12][750/7330] lr: 1.000e-06, eta: 1:28:07, time: 0.765, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0358, loss_cls: 0.1405, acc: 94.6755, loss_bbox: 0.1939, loss_mask: 0.2091, loss: 0.5921 2024-05-29 03:59:25,106 - mmdet - INFO - Epoch [12][800/7330] lr: 1.000e-06, eta: 1:27:27, time: 0.864, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0359, loss_cls: 0.1455, acc: 94.3894, loss_bbox: 0.2028, loss_mask: 0.2154, loss: 0.6110 2024-05-29 04:00:05,821 - mmdet - INFO - Epoch [12][850/7330] lr: 1.000e-06, eta: 1:26:47, time: 0.814, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0355, loss_cls: 0.1445, acc: 94.4561, loss_bbox: 0.2029, loss_mask: 0.2209, loss: 0.6168 2024-05-29 04:00:44,112 - mmdet - INFO - Epoch [12][900/7330] lr: 1.000e-06, eta: 1:26:07, time: 0.766, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0336, loss_cls: 0.1361, acc: 94.7761, loss_bbox: 0.1912, loss_mask: 0.2072, loss: 0.5793 2024-05-29 04:01:22,357 - mmdet - INFO - Epoch [12][950/7330] lr: 1.000e-06, eta: 1:25:26, time: 0.765, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0350, loss_cls: 0.1487, acc: 94.3145, loss_bbox: 0.2017, loss_mask: 0.2146, loss: 0.6126 2024-05-29 04:02:00,917 - mmdet - INFO - Epoch [12][1000/7330] lr: 1.000e-06, eta: 1:24:46, time: 0.771, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0332, loss_cls: 0.1430, acc: 94.4949, loss_bbox: 0.2009, loss_mask: 0.2091, loss: 0.5983 2024-05-29 04:02:39,280 - mmdet - INFO - Epoch [12][1050/7330] lr: 1.000e-06, eta: 1:24:06, time: 0.767, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0342, loss_cls: 0.1465, acc: 94.3950, loss_bbox: 0.1975, loss_mask: 0.2143, loss: 0.6052 2024-05-29 04:03:18,127 - mmdet - INFO - Epoch [12][1100/7330] lr: 
1.000e-06, eta: 1:23:25, time: 0.777, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0346, loss_cls: 0.1396, acc: 94.6208, loss_bbox: 0.1984, loss_mask: 0.2155, loss: 0.6008 2024-05-29 04:04:01,714 - mmdet - INFO - Epoch [12][1150/7330] lr: 1.000e-06, eta: 1:22:45, time: 0.872, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0361, loss_cls: 0.1424, acc: 94.5063, loss_bbox: 0.1980, loss_mask: 0.2106, loss: 0.5999 2024-05-29 04:04:44,226 - mmdet - INFO - Epoch [12][1200/7330] lr: 1.000e-06, eta: 1:22:05, time: 0.850, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0361, loss_cls: 0.1489, acc: 94.2043, loss_bbox: 0.2057, loss_mask: 0.2162, loss: 0.6205 2024-05-29 04:05:25,150 - mmdet - INFO - Epoch [12][1250/7330] lr: 1.000e-06, eta: 1:21:25, time: 0.818, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0339, loss_cls: 0.1412, acc: 94.5898, loss_bbox: 0.1941, loss_mask: 0.2106, loss: 0.5919 2024-05-29 04:06:07,865 - mmdet - INFO - Epoch [12][1300/7330] lr: 1.000e-06, eta: 1:20:45, time: 0.854, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0351, loss_cls: 0.1395, acc: 94.6565, loss_bbox: 0.1964, loss_mask: 0.2151, loss: 0.5978 2024-05-29 04:06:46,597 - mmdet - INFO - Epoch [12][1350/7330] lr: 1.000e-06, eta: 1:20:05, time: 0.775, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0388, loss_cls: 0.1522, acc: 94.1636, loss_bbox: 0.2113, loss_mask: 0.2169, loss: 0.6327 2024-05-29 04:07:25,167 - mmdet - INFO - Epoch [12][1400/7330] lr: 1.000e-06, eta: 1:19:25, time: 0.771, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0343, loss_cls: 0.1401, acc: 94.5623, loss_bbox: 0.1964, loss_mask: 0.2089, loss: 0.5923 2024-05-29 04:08:07,980 - mmdet - INFO - Epoch [12][1450/7330] lr: 1.000e-06, eta: 1:18:45, time: 0.856, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0355, loss_cls: 0.1461, acc: 94.3694, loss_bbox: 0.2073, loss_mask: 0.2143, loss: 0.6161 2024-05-29 04:08:46,198 - mmdet - INFO - Epoch [12][1500/7330] lr: 1.000e-06, eta: 1:18:04, time: 0.764, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0371, loss_cls: 0.1495, acc: 94.2710, loss_bbox: 0.2053, loss_mask: 0.2164, loss: 0.6220 2024-05-29 04:09:24,805 - mmdet - INFO - Epoch [12][1550/7330] lr: 1.000e-06, eta: 1:17:24, time: 0.772, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0364, loss_cls: 0.1490, acc: 94.2107, loss_bbox: 0.2068, loss_mask: 0.2126, loss: 0.6178 2024-05-29 04:10:03,100 - mmdet - INFO - Epoch [12][1600/7330] lr: 1.000e-06, eta: 1:16:44, time: 0.766, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0366, loss_cls: 0.1434, acc: 94.4438, loss_bbox: 0.2001, loss_mask: 0.2126, loss: 0.6056 2024-05-29 04:10:41,253 - mmdet - INFO - Epoch [12][1650/7330] lr: 1.000e-06, eta: 1:16:04, time: 0.763, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0362, loss_cls: 0.1485, acc: 94.2971, loss_bbox: 0.2029, loss_mask: 0.2180, loss: 0.6185 2024-05-29 04:11:22,092 - mmdet - INFO - Epoch [12][1700/7330] lr: 1.000e-06, eta: 1:15:23, time: 0.817, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0337, loss_cls: 0.1424, acc: 94.5173, loss_bbox: 0.2009, loss_mask: 0.2113, loss: 0.6004 2024-05-29 04:12:02,149 - mmdet - INFO - Epoch [12][1750/7330] lr: 1.000e-06, eta: 1:14:43, time: 0.801, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0105, 
loss_rpn_bbox: 0.0329, loss_cls: 0.1398, acc: 94.6262, loss_bbox: 0.1894, loss_mask: 0.2100, loss: 0.5828 2024-05-29 04:12:43,584 - mmdet - INFO - Epoch [12][1800/7330] lr: 1.000e-06, eta: 1:14:03, time: 0.829, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0364, loss_cls: 0.1448, acc: 94.4695, loss_bbox: 0.2032, loss_mask: 0.2178, loss: 0.6176 2024-05-29 04:13:24,609 - mmdet - INFO - Epoch [12][1850/7330] lr: 1.000e-06, eta: 1:13:23, time: 0.820, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0374, loss_cls: 0.1463, acc: 94.2986, loss_bbox: 0.2075, loss_mask: 0.2156, loss: 0.6205 2024-05-29 04:14:07,398 - mmdet - INFO - Epoch [12][1900/7330] lr: 1.000e-06, eta: 1:12:43, time: 0.856, data_time: 0.042, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0363, loss_cls: 0.1488, acc: 94.3257, loss_bbox: 0.2031, loss_mask: 0.2162, loss: 0.6172 2024-05-29 04:14:46,113 - mmdet - INFO - Epoch [12][1950/7330] lr: 1.000e-06, eta: 1:12:03, time: 0.774, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0369, loss_cls: 0.1422, acc: 94.5046, loss_bbox: 0.2028, loss_mask: 0.2118, loss: 0.6072 2024-05-29 04:15:24,737 - mmdet - INFO - Epoch [12][2000/7330] lr: 1.000e-06, eta: 1:11:22, time: 0.772, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0351, loss_cls: 0.1406, acc: 94.5981, loss_bbox: 0.1967, loss_mask: 0.2096, loss: 0.5945 2024-05-29 04:16:07,291 - mmdet - INFO - Epoch [12][2050/7330] lr: 1.000e-06, eta: 1:10:42, time: 0.851, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0349, loss_cls: 0.1395, acc: 94.6055, loss_bbox: 0.1955, loss_mask: 0.2073, loss: 0.5900 2024-05-29 04:16:45,598 - mmdet - INFO - Epoch [12][2100/7330] lr: 1.000e-06, eta: 1:10:02, time: 0.766, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0363, loss_cls: 0.1412, acc: 94.6052, loss_bbox: 0.1997, loss_mask: 0.2166, loss: 0.6063 2024-05-29 04:17:23,889 - mmdet - INFO - Epoch [12][2150/7330] lr: 1.000e-06, eta: 1:09:22, time: 0.766, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0339, loss_cls: 0.1427, acc: 94.4939, loss_bbox: 0.1983, loss_mask: 0.2122, loss: 0.5993 2024-05-29 04:18:02,513 - mmdet - INFO - Epoch [12][2200/7330] lr: 1.000e-06, eta: 1:08:42, time: 0.772, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0358, loss_cls: 0.1436, acc: 94.4622, loss_bbox: 0.2057, loss_mask: 0.2140, loss: 0.6118 2024-05-29 04:18:40,840 - mmdet - INFO - Epoch [12][2250/7330] lr: 1.000e-06, eta: 1:08:01, time: 0.766, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0315, loss_cls: 0.1369, acc: 94.6855, loss_bbox: 0.1924, loss_mask: 0.2094, loss: 0.5823 2024-05-29 04:19:23,992 - mmdet - INFO - Epoch [12][2300/7330] lr: 1.000e-06, eta: 1:07:21, time: 0.863, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0344, loss_cls: 0.1420, acc: 94.5486, loss_bbox: 0.1955, loss_mask: 0.2134, loss: 0.5982 2024-05-29 04:20:04,402 - mmdet - INFO - Epoch [12][2350/7330] lr: 1.000e-06, eta: 1:06:41, time: 0.808, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0341, loss_cls: 0.1380, acc: 94.7168, loss_bbox: 0.1940, loss_mask: 0.2072, loss: 0.5854 2024-05-29 04:20:43,074 - mmdet - INFO - Epoch [12][2400/7330] lr: 1.000e-06, eta: 1:06:01, time: 0.773, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0369, loss_cls: 0.1456, acc: 94.3862, loss_bbox: 0.2062, loss_mask: 0.2137, 
loss: 0.6150 2024-05-29 04:21:24,166 - mmdet - INFO - Epoch [12][2450/7330] lr: 1.000e-06, eta: 1:05:21, time: 0.822, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0363, loss_cls: 0.1414, acc: 94.5618, loss_bbox: 0.1988, loss_mask: 0.2131, loss: 0.6024 2024-05-29 04:22:06,705 - mmdet - INFO - Epoch [12][2500/7330] lr: 1.000e-06, eta: 1:04:41, time: 0.851, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0342, loss_cls: 0.1396, acc: 94.5986, loss_bbox: 0.1946, loss_mask: 0.2069, loss: 0.5881 2024-05-29 04:22:47,664 - mmdet - INFO - Epoch [12][2550/7330] lr: 1.000e-06, eta: 1:04:01, time: 0.819, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0355, loss_cls: 0.1397, acc: 94.6384, loss_bbox: 0.1982, loss_mask: 0.2107, loss: 0.5964 2024-05-29 04:23:26,063 - mmdet - INFO - Epoch [12][2600/7330] lr: 1.000e-06, eta: 1:03:20, time: 0.768, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0367, loss_cls: 0.1490, acc: 94.3181, loss_bbox: 0.2022, loss_mask: 0.2144, loss: 0.6149 2024-05-29 04:24:06,835 - mmdet - INFO - Epoch [12][2650/7330] lr: 1.000e-06, eta: 1:02:40, time: 0.815, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0337, loss_cls: 0.1429, acc: 94.5200, loss_bbox: 0.1991, loss_mask: 0.2113, loss: 0.5996 2024-05-29 04:24:44,945 - mmdet - INFO - Epoch [12][2700/7330] lr: 1.000e-06, eta: 1:02:00, time: 0.762, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0337, loss_cls: 0.1403, acc: 94.6387, loss_bbox: 0.1991, loss_mask: 0.2109, loss: 0.5964 2024-05-29 04:25:23,296 - mmdet - INFO - Epoch [12][2750/7330] lr: 1.000e-06, eta: 1:01:20, time: 0.767, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0325, loss_cls: 0.1285, acc: 95.0728, loss_bbox: 0.1790, loss_mask: 0.2055, loss: 0.5572 2024-05-29 04:26:04,283 - mmdet - INFO - Epoch [12][2800/7330] lr: 1.000e-06, eta: 1:00:39, time: 0.820, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0323, loss_cls: 0.1374, acc: 94.6907, loss_bbox: 0.1859, loss_mask: 0.2045, loss: 0.5719 2024-05-29 04:26:43,186 - mmdet - INFO - Epoch [12][2850/7330] lr: 1.000e-06, eta: 0:59:59, time: 0.777, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0351, loss_cls: 0.1466, acc: 94.4343, loss_bbox: 0.2015, loss_mask: 0.2158, loss: 0.6128 2024-05-29 04:27:23,934 - mmdet - INFO - Epoch [12][2900/7330] lr: 1.000e-06, eta: 0:59:19, time: 0.816, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0349, loss_cls: 0.1439, acc: 94.4470, loss_bbox: 0.2001, loss_mask: 0.2098, loss: 0.6014 2024-05-29 04:28:05,077 - mmdet - INFO - Epoch [12][2950/7330] lr: 1.000e-06, eta: 0:58:39, time: 0.823, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0341, loss_cls: 0.1415, acc: 94.4985, loss_bbox: 0.2013, loss_mask: 0.2095, loss: 0.5987 2024-05-29 04:28:43,550 - mmdet - INFO - Epoch [12][3000/7330] lr: 1.000e-06, eta: 0:57:59, time: 0.769, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0372, loss_cls: 0.1484, acc: 94.2432, loss_bbox: 0.2070, loss_mask: 0.2191, loss: 0.6249 2024-05-29 04:29:27,233 - mmdet - INFO - Epoch [12][3050/7330] lr: 1.000e-06, eta: 0:57:19, time: 0.874, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0362, loss_cls: 0.1464, acc: 94.3496, loss_bbox: 0.2034, loss_mask: 0.2132, loss: 0.6116 2024-05-29 04:30:10,085 - mmdet - INFO - Epoch [12][3100/7330] lr: 1.000e-06, eta: 
0:56:39, time: 0.857, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0349, loss_cls: 0.1406, acc: 94.5825, loss_bbox: 0.1978, loss_mask: 0.2077, loss: 0.5927 2024-05-29 04:30:48,517 - mmdet - INFO - Epoch [12][3150/7330] lr: 1.000e-06, eta: 0:55:58, time: 0.769, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0380, loss_cls: 0.1530, acc: 94.0754, loss_bbox: 0.2101, loss_mask: 0.2187, loss: 0.6336 2024-05-29 04:31:29,131 - mmdet - INFO - Epoch [12][3200/7330] lr: 1.000e-06, eta: 0:55:18, time: 0.812, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0386, loss_cls: 0.1511, acc: 94.1646, loss_bbox: 0.2116, loss_mask: 0.2175, loss: 0.6314 2024-05-29 04:32:07,384 - mmdet - INFO - Epoch [12][3250/7330] lr: 1.000e-06, eta: 0:54:38, time: 0.765, data_time: 0.027, memory: 13582, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0342, loss_cls: 0.1354, acc: 94.7100, loss_bbox: 0.1950, loss_mask: 0.2114, loss: 0.5872 2024-05-29 04:32:45,873 - mmdet - INFO - Epoch [12][3300/7330] lr: 1.000e-06, eta: 0:53:58, time: 0.770, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0360, loss_cls: 0.1413, acc: 94.5645, loss_bbox: 0.1993, loss_mask: 0.2140, loss: 0.6031 2024-05-29 04:33:26,706 - mmdet - INFO - Epoch [12][3350/7330] lr: 1.000e-06, eta: 0:53:18, time: 0.817, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0389, loss_cls: 0.1507, acc: 94.1550, loss_bbox: 0.2123, loss_mask: 0.2158, loss: 0.6311 2024-05-29 04:34:05,365 - mmdet - INFO - Epoch [12][3400/7330] lr: 1.000e-06, eta: 0:52:37, time: 0.773, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0345, loss_cls: 0.1396, acc: 94.5315, loss_bbox: 0.1996, loss_mask: 0.2144, loss: 0.5990 2024-05-29 04:34:43,718 - mmdet - INFO - Epoch [12][3450/7330] lr: 1.000e-06, eta: 0:51:57, time: 0.767, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0373, loss_cls: 0.1418, acc: 94.5796, loss_bbox: 0.1990, loss_mask: 0.2186, loss: 0.6096 2024-05-29 04:35:24,624 - mmdet - INFO - Epoch [12][3500/7330] lr: 1.000e-06, eta: 0:51:17, time: 0.818, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0368, loss_cls: 0.1393, acc: 94.5947, loss_bbox: 0.1961, loss_mask: 0.2124, loss: 0.5969 2024-05-29 04:36:04,998 - mmdet - INFO - Epoch [12][3550/7330] lr: 1.000e-06, eta: 0:50:37, time: 0.807, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0340, loss_cls: 0.1326, acc: 94.9160, loss_bbox: 0.1879, loss_mask: 0.2052, loss: 0.5720 2024-05-29 04:36:45,870 - mmdet - INFO - Epoch [12][3600/7330] lr: 1.000e-06, eta: 0:49:57, time: 0.817, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0365, loss_cls: 0.1429, acc: 94.4978, loss_bbox: 0.2009, loss_mask: 0.2115, loss: 0.6048 2024-05-29 04:37:26,358 - mmdet - INFO - Epoch [12][3650/7330] lr: 1.000e-06, eta: 0:49:16, time: 0.810, data_time: 0.043, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0338, loss_cls: 0.1392, acc: 94.5916, loss_bbox: 0.1973, loss_mask: 0.2103, loss: 0.5928 2024-05-29 04:38:04,830 - mmdet - INFO - Epoch [12][3700/7330] lr: 1.000e-06, eta: 0:48:36, time: 0.769, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0354, loss_cls: 0.1404, acc: 94.6143, loss_bbox: 0.1980, loss_mask: 0.2093, loss: 0.5949 2024-05-29 04:38:47,609 - mmdet - INFO - Epoch [12][3750/7330] lr: 1.000e-06, eta: 0:47:56, time: 0.856, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 
0.0366, loss_cls: 0.1488, acc: 94.3606, loss_bbox: 0.2028, loss_mask: 0.2150, loss: 0.6165 2024-05-29 04:39:28,336 - mmdet - INFO - Epoch [12][3800/7330] lr: 1.000e-06, eta: 0:47:16, time: 0.815, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0357, loss_cls: 0.1433, acc: 94.5010, loss_bbox: 0.1973, loss_mask: 0.2107, loss: 0.5989 2024-05-29 04:40:06,742 - mmdet - INFO - Epoch [12][3850/7330] lr: 1.000e-06, eta: 0:46:36, time: 0.768, data_time: 0.040, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0359, loss_cls: 0.1485, acc: 94.2898, loss_bbox: 0.2032, loss_mask: 0.2158, loss: 0.6155 2024-05-29 04:40:45,132 - mmdet - INFO - Epoch [12][3900/7330] lr: 1.000e-06, eta: 0:45:55, time: 0.768, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0359, loss_cls: 0.1450, acc: 94.3369, loss_bbox: 0.2009, loss_mask: 0.2101, loss: 0.6049 2024-05-29 04:41:26,148 - mmdet - INFO - Epoch [12][3950/7330] lr: 1.000e-06, eta: 0:45:15, time: 0.820, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0358, loss_cls: 0.1434, acc: 94.4565, loss_bbox: 0.1995, loss_mask: 0.2138, loss: 0.6048 2024-05-29 04:42:04,548 - mmdet - INFO - Epoch [12][4000/7330] lr: 1.000e-06, eta: 0:44:35, time: 0.768, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0334, loss_cls: 0.1358, acc: 94.6860, loss_bbox: 0.1947, loss_mask: 0.2101, loss: 0.5859 2024-05-29 04:42:43,044 - mmdet - INFO - Epoch [12][4050/7330] lr: 1.000e-06, eta: 0:43:55, time: 0.770, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0349, loss_cls: 0.1399, acc: 94.7107, loss_bbox: 0.1947, loss_mask: 0.2164, loss: 0.5977 2024-05-29 04:43:25,894 - mmdet - INFO - Epoch [12][4100/7330] lr: 1.000e-06, eta: 0:43:15, time: 0.857, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0368, loss_cls: 0.1455, acc: 94.3438, loss_bbox: 0.2071, loss_mask: 0.2165, loss: 0.6189 2024-05-29 04:44:04,433 - mmdet - INFO - Epoch [12][4150/7330] lr: 1.000e-06, eta: 0:42:35, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0358, loss_cls: 0.1427, acc: 94.4670, loss_bbox: 0.2006, loss_mask: 0.2147, loss: 0.6060 2024-05-29 04:44:44,899 - mmdet - INFO - Epoch [12][4200/7330] lr: 1.000e-06, eta: 0:41:54, time: 0.809, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0342, loss_cls: 0.1451, acc: 94.4260, loss_bbox: 0.2017, loss_mask: 0.2127, loss: 0.6061 2024-05-29 04:45:25,683 - mmdet - INFO - Epoch [12][4250/7330] lr: 1.000e-06, eta: 0:41:14, time: 0.816, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0351, loss_cls: 0.1452, acc: 94.4285, loss_bbox: 0.2044, loss_mask: 0.2112, loss: 0.6090 2024-05-29 04:46:04,197 - mmdet - INFO - Epoch [12][4300/7330] lr: 1.000e-06, eta: 0:40:34, time: 0.770, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0366, loss_cls: 0.1394, acc: 94.6663, loss_bbox: 0.1941, loss_mask: 0.2141, loss: 0.5968 2024-05-29 04:46:46,875 - mmdet - INFO - Epoch [12][4350/7330] lr: 1.000e-06, eta: 0:39:54, time: 0.854, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0359, loss_cls: 0.1397, acc: 94.5483, loss_bbox: 0.1963, loss_mask: 0.2114, loss: 0.5964 2024-05-29 04:47:27,606 - mmdet - INFO - Epoch [12][4400/7330] lr: 1.000e-06, eta: 0:39:14, time: 0.815, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0368, loss_cls: 0.1412, acc: 94.5825, loss_bbox: 0.2004, loss_mask: 0.2129, loss: 0.6029 
2024-05-29 04:48:08,925 - mmdet - INFO - Epoch [12][4450/7330] lr: 1.000e-06, eta: 0:38:34, time: 0.826, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0361, loss_cls: 0.1512, acc: 94.1948, loss_bbox: 0.2079, loss_mask: 0.2114, loss: 0.6200
2024-05-29 04:48:47,347 - mmdet - INFO - Epoch [12][4500/7330] lr: 1.000e-06, eta: 0:37:53, time: 0.769, data_time: 0.044, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0368, loss_cls: 0.1451, acc: 94.4016, loss_bbox: 0.2046, loss_mask: 0.2148, loss: 0.6140
2024-05-29 04:49:25,519 - mmdet - INFO - Epoch [12][4550/7330] lr: 1.000e-06, eta: 0:37:13, time: 0.763, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0345, loss_cls: 0.1419, acc: 94.5620, loss_bbox: 0.1968, loss_mask: 0.2122, loss: 0.5978
2024-05-29 04:50:06,576 - mmdet - INFO - Epoch [12][4600/7330] lr: 1.000e-06, eta: 0:36:33, time: 0.821, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0356, loss_cls: 0.1453, acc: 94.4412, loss_bbox: 0.1995, loss_mask: 0.2139, loss: 0.6075
2024-05-29 04:50:45,122 - mmdet - INFO - Epoch [12][4650/7330] lr: 1.000e-06, eta: 0:35:53, time: 0.771, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0364, loss_cls: 0.1400, acc: 94.6602, loss_bbox: 0.1997, loss_mask: 0.2150, loss: 0.6040
2024-05-29 04:51:26,185 - mmdet - INFO - Epoch [12][4700/7330] lr: 1.000e-06, eta: 0:35:13, time: 0.821, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0361, loss_cls: 0.1474, acc: 94.3508, loss_bbox: 0.2029, loss_mask: 0.2166, loss: 0.6155
2024-05-29 04:52:05,016 - mmdet - INFO - Epoch [12][4750/7330] lr: 1.000e-06, eta: 0:34:32, time: 0.777, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0356, loss_cls: 0.1376, acc: 94.6729, loss_bbox: 0.1948, loss_mask: 0.2056, loss: 0.5859
2024-05-29 04:52:45,826 - mmdet - INFO - Epoch [12][4800/7330] lr: 1.000e-06, eta: 0:33:52, time: 0.816, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0343, loss_cls: 0.1401, acc: 94.5469, loss_bbox: 0.1945, loss_mask: 0.2106, loss: 0.5917
2024-05-29 04:53:24,053 - mmdet - INFO - Epoch [12][4850/7330] lr: 1.000e-06, eta: 0:33:12, time: 0.765, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0347, loss_cls: 0.1383, acc: 94.7061, loss_bbox: 0.1960, loss_mask: 0.2167, loss: 0.5976
2024-05-29 04:54:04,388 - mmdet - INFO - Epoch [12][4900/7330] lr: 1.000e-06, eta: 0:32:32, time: 0.807, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0330, loss_cls: 0.1392, acc: 94.5967, loss_bbox: 0.1963, loss_mask: 0.2107, loss: 0.5917
2024-05-29 04:54:47,307 - mmdet - INFO - Epoch [12][4950/7330] lr: 1.000e-06, eta: 0:31:52, time: 0.858, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0360, loss_cls: 0.1420, acc: 94.5037, loss_bbox: 0.2004, loss_mask: 0.2108, loss: 0.6022
2024-05-29 04:55:30,064 - mmdet - INFO - Epoch [12][5000/7330] lr: 1.000e-06, eta: 0:31:12, time: 0.855, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0346, loss_cls: 0.1436, acc: 94.5752, loss_bbox: 0.1982, loss_mask: 0.2137, loss: 0.6022
2024-05-29 04:56:08,228 - mmdet - INFO - Epoch [12][5050/7330] lr: 1.000e-06, eta: 0:30:31, time: 0.763, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0322, loss_cls: 0.1355, acc: 94.7869, loss_bbox: 0.1895, loss_mask: 0.2120, loss: 0.5808
2024-05-29 04:56:46,602 - mmdet - INFO - Epoch [12][5100/7330] lr: 1.000e-06, eta: 0:29:51, time: 0.767, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0341, loss_cls: 0.1417, acc: 94.5488, loss_bbox: 0.1980, loss_mask: 0.2146, loss: 0.6006
2024-05-29 04:57:24,842 - mmdet - INFO - Epoch [12][5150/7330] lr: 1.000e-06, eta: 0:29:11, time: 0.765, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0361, loss_cls: 0.1457, acc: 94.4458, loss_bbox: 0.2051, loss_mask: 0.2156, loss: 0.6150
2024-05-29 04:58:05,455 - mmdet - INFO - Epoch [12][5200/7330] lr: 1.000e-06, eta: 0:28:31, time: 0.812, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0352, loss_cls: 0.1391, acc: 94.6150, loss_bbox: 0.1967, loss_mask: 0.2094, loss: 0.5916
2024-05-29 04:58:44,245 - mmdet - INFO - Epoch [12][5250/7330] lr: 1.000e-06, eta: 0:27:51, time: 0.776, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0356, loss_cls: 0.1444, acc: 94.3992, loss_bbox: 0.2025, loss_mask: 0.2130, loss: 0.6083
2024-05-29 04:59:25,592 - mmdet - INFO - Epoch [12][5300/7330] lr: 1.000e-06, eta: 0:27:10, time: 0.827, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0376, loss_cls: 0.1487, acc: 94.3044, loss_bbox: 0.2062, loss_mask: 0.2174, loss: 0.6230
2024-05-29 05:00:04,074 - mmdet - INFO - Epoch [12][5350/7330] lr: 1.000e-06, eta: 0:26:30, time: 0.770, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0341, loss_cls: 0.1382, acc: 94.6211, loss_bbox: 0.1968, loss_mask: 0.2120, loss: 0.5929
2024-05-29 05:00:44,379 - mmdet - INFO - Epoch [12][5400/7330] lr: 1.000e-06, eta: 0:25:50, time: 0.806, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0332, loss_cls: 0.1393, acc: 94.6333, loss_bbox: 0.1948, loss_mask: 0.2067, loss: 0.5866
2024-05-29 05:01:23,093 - mmdet - INFO - Epoch [12][5450/7330] lr: 1.000e-06, eta: 0:25:10, time: 0.774, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0384, loss_cls: 0.1540, acc: 94.1257, loss_bbox: 0.2114, loss_mask: 0.2140, loss: 0.6304
2024-05-29 05:02:03,449 - mmdet - INFO - Epoch [12][5500/7330] lr: 1.000e-06, eta: 0:24:30, time: 0.807, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0336, loss_cls: 0.1334, acc: 94.8220, loss_bbox: 0.1915, loss_mask: 0.2144, loss: 0.5843
2024-05-29 05:02:47,349 - mmdet - INFO - Epoch [12][5550/7330] lr: 1.000e-06, eta: 0:23:50, time: 0.878, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0354, loss_cls: 0.1477, acc: 94.3479, loss_bbox: 0.2047, loss_mask: 0.2137, loss: 0.6144
2024-05-29 05:03:30,628 - mmdet - INFO - Epoch [12][5600/7330] lr: 1.000e-06, eta: 0:23:10, time: 0.866, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0338, loss_cls: 0.1402, acc: 94.5171, loss_bbox: 0.1974, loss_mask: 0.2144, loss: 0.5986
2024-05-29 05:04:08,955 - mmdet - INFO - Epoch [12][5650/7330] lr: 1.000e-06, eta: 0:22:29, time: 0.767, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0326, loss_cls: 0.1356, acc: 94.7852, loss_bbox: 0.1896, loss_mask: 0.2045, loss: 0.5736
2024-05-29 05:04:47,513 - mmdet - INFO - Epoch [12][5700/7330] lr: 1.000e-06, eta: 0:21:49, time: 0.771, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0369, loss_cls: 0.1415, acc: 94.5520, loss_bbox: 0.2035, loss_mask: 0.2125, loss: 0.6070
2024-05-29 05:05:28,421 - mmdet - INFO - Epoch [12][5750/7330] lr: 1.000e-06, eta: 0:21:09, time: 0.818, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0371, loss_cls: 0.1525, acc: 94.1692, loss_bbox: 0.2090, loss_mask: 0.2183, loss: 0.6304
2024-05-29 05:06:07,125 - mmdet - INFO - Epoch [12][5800/7330] lr: 1.000e-06, eta: 0:20:29, time: 0.774, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0360, loss_cls: 0.1406, acc: 94.5173, loss_bbox: 0.2042, loss_mask: 0.2145, loss: 0.6083
2024-05-29 05:06:45,860 - mmdet - INFO - Epoch [12][5850/7330] lr: 1.000e-06, eta: 0:19:49, time: 0.775, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0349, loss_cls: 0.1433, acc: 94.4978, loss_bbox: 0.1993, loss_mask: 0.2115, loss: 0.6017
2024-05-29 05:07:24,355 - mmdet - INFO - Epoch [12][5900/7330] lr: 1.000e-06, eta: 0:19:08, time: 0.770, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0369, loss_cls: 0.1493, acc: 94.3330, loss_bbox: 0.2046, loss_mask: 0.2165, loss: 0.6207
2024-05-29 05:08:05,175 - mmdet - INFO - Epoch [12][5950/7330] lr: 1.000e-06, eta: 0:18:28, time: 0.817, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0363, loss_cls: 0.1465, acc: 94.3123, loss_bbox: 0.2038, loss_mask: 0.2139, loss: 0.6135
2024-05-29 05:08:46,139 - mmdet - INFO - Epoch [12][6000/7330] lr: 1.000e-06, eta: 0:17:48, time: 0.819, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0357, loss_cls: 0.1429, acc: 94.5078, loss_bbox: 0.2014, loss_mask: 0.2170, loss: 0.6092
2024-05-29 05:09:24,988 - mmdet - INFO - Epoch [12][6050/7330] lr: 1.000e-06, eta: 0:17:08, time: 0.777, data_time: 0.038, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0371, loss_cls: 0.1440, acc: 94.4067, loss_bbox: 0.2047, loss_mask: 0.2169, loss: 0.6149
2024-05-29 05:10:08,562 - mmdet - INFO - Epoch [12][6100/7330] lr: 1.000e-06, eta: 0:16:28, time: 0.871, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0347, loss_cls: 0.1410, acc: 94.5129, loss_bbox: 0.1948, loss_mask: 0.2091, loss: 0.5924
2024-05-29 05:10:51,534 - mmdet - INFO - Epoch [12][6150/7330] lr: 1.000e-06, eta: 0:15:48, time: 0.859, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0368, loss_cls: 0.1405, acc: 94.5977, loss_bbox: 0.1964, loss_mask: 0.2144, loss: 0.6000
2024-05-29 05:11:32,182 - mmdet - INFO - Epoch [12][6200/7330] lr: 1.000e-06, eta: 0:15:07, time: 0.813, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0350, loss_cls: 0.1389, acc: 94.5974, loss_bbox: 0.1936, loss_mask: 0.2111, loss: 0.5914
2024-05-29 05:12:12,838 - mmdet - INFO - Epoch [12][6250/7330] lr: 1.000e-06, eta: 0:14:27, time: 0.813, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0330, loss_cls: 0.1363, acc: 94.6846, loss_bbox: 0.1993, loss_mask: 0.2132, loss: 0.5933
2024-05-29 05:12:51,710 - mmdet - INFO - Epoch [12][6300/7330] lr: 1.000e-06, eta: 0:13:47, time: 0.777, data_time: 0.041, memory: 13582, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0383, loss_cls: 0.1520, acc: 94.1614, loss_bbox: 0.2133, loss_mask: 0.2163, loss: 0.6334
2024-05-29 05:13:30,173 - mmdet - INFO - Epoch [12][6350/7330] lr: 1.000e-06, eta: 0:13:07, time: 0.769, data_time: 0.035, memory: 13582, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0342, loss_cls: 0.1347, acc: 94.8391, loss_bbox: 0.1912, loss_mask: 0.2080, loss: 0.5791
2024-05-29 05:14:08,745 - mmdet - INFO - Epoch [12][6400/7330] lr: 1.000e-06, eta: 0:12:27, time: 0.771, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0350, loss_cls: 0.1418, acc: 94.5383, loss_bbox: 0.2003, loss_mask: 0.2151, loss: 0.6045
2024-05-29 05:14:47,222 - mmdet - INFO - Epoch [12][6450/7330] lr: 1.000e-06, eta: 0:11:46, time: 0.770, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0364, loss_cls: 0.1462, acc: 94.3818, loss_bbox: 0.2046, loss_mask: 0.2180, loss: 0.6177
2024-05-29 05:15:25,416 - mmdet - INFO - Epoch [12][6500/7330] lr: 1.000e-06, eta: 0:11:06, time: 0.764, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0369, loss_cls: 0.1434, acc: 94.5146, loss_bbox: 0.1995, loss_mask: 0.2144, loss: 0.6067
2024-05-29 05:16:06,244 - mmdet - INFO - Epoch [12][6550/7330] lr: 1.000e-06, eta: 0:10:26, time: 0.817, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0362, loss_cls: 0.1438, acc: 94.4009, loss_bbox: 0.2044, loss_mask: 0.2166, loss: 0.6132
2024-05-29 05:16:46,817 - mmdet - INFO - Epoch [12][6600/7330] lr: 1.000e-06, eta: 0:09:46, time: 0.811, data_time: 0.032, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0343, loss_cls: 0.1368, acc: 94.6538, loss_bbox: 0.1937, loss_mask: 0.2105, loss: 0.5871
2024-05-29 05:17:25,264 - mmdet - INFO - Epoch [12][6650/7330] lr: 1.000e-06, eta: 0:09:06, time: 0.769, data_time: 0.028, memory: 13582, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0348, loss_cls: 0.1374, acc: 94.6963, loss_bbox: 0.1901, loss_mask: 0.2089, loss: 0.5837
2024-05-29 05:18:08,005 - mmdet - INFO - Epoch [12][6700/7330] lr: 1.000e-06, eta: 0:08:26, time: 0.855, data_time: 0.029, memory: 13582, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0355, loss_cls: 0.1417, acc: 94.5957, loss_bbox: 0.1980, loss_mask: 0.2176, loss: 0.6047
2024-05-29 05:18:53,615 - mmdet - INFO - Epoch [12][6750/7330] lr: 1.000e-06, eta: 0:07:46, time: 0.912, data_time: 0.030, memory: 13582, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0343, loss_cls: 0.1429, acc: 94.6111, loss_bbox: 0.1918, loss_mask: 0.2081, loss: 0.5897
2024-05-29 05:19:31,836 - mmdet - INFO - Epoch [12][6800/7330] lr: 1.000e-06, eta: 0:07:05, time: 0.765, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0346, loss_cls: 0.1391, acc: 94.6626, loss_bbox: 0.1952, loss_mask: 0.2078, loss: 0.5885
2024-05-29 05:20:12,736 - mmdet - INFO - Epoch [12][6850/7330] lr: 1.000e-06, eta: 0:06:25, time: 0.818, data_time: 0.034, memory: 13582, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0354, loss_cls: 0.1343, acc: 94.7559, loss_bbox: 0.1898, loss_mask: 0.2131, loss: 0.5838
2024-05-29 05:20:51,321 - mmdet - INFO - Epoch [12][6900/7330] lr: 1.000e-06, eta: 0:05:45, time: 0.772, data_time: 0.045, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0356, loss_cls: 0.1438, acc: 94.4917, loss_bbox: 0.1993, loss_mask: 0.2113, loss: 0.6033
2024-05-29 05:21:29,642 - mmdet - INFO - Epoch [12][6950/7330] lr: 1.000e-06, eta: 0:05:05, time: 0.766, data_time: 0.036, memory: 13582, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0323, loss_cls: 0.1364, acc: 94.7544, loss_bbox: 0.1937, loss_mask: 0.2098, loss: 0.5838
2024-05-29 05:22:08,176 - mmdet - INFO - Epoch [12][7000/7330] lr: 1.000e-06, eta: 0:04:25, time: 0.771, data_time: 0.039, memory: 13582, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0343, loss_cls: 0.1345, acc: 94.8630, loss_bbox: 0.1891, loss_mask: 0.2113, loss: 0.5816
2024-05-29 05:22:47,121 - mmdet - INFO - Epoch [12][7050/7330] lr: 1.000e-06, eta: 0:03:44, time: 0.779, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0365, loss_cls: 0.1451, acc: 94.4438, loss_bbox: 0.2045, loss_mask: 0.2142, loss: 0.6132
2024-05-29 05:23:25,398 - mmdet - INFO - Epoch [12][7100/7330] lr: 1.000e-06, eta: 0:03:04, time: 0.766, data_time: 0.031, memory: 13582, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0324, loss_cls: 0.1351, acc: 94.7917, loss_bbox: 0.1896, loss_mask: 0.2087, loss: 0.5761
2024-05-29 05:24:07,167 - mmdet - INFO - Epoch [12][7150/7330] lr: 1.000e-06, eta: 0:02:24, time: 0.835, data_time: 0.037, memory: 13582, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0361, loss_cls: 0.1509, acc: 94.2478, loss_bbox: 0.2073, loss_mask: 0.2144, loss: 0.6228
2024-05-29 05:24:50,272 - mmdet - INFO - Epoch [12][7200/7330] lr: 1.000e-06, eta: 0:01:44, time: 0.862, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0359, loss_cls: 0.1374, acc: 94.6633, loss_bbox: 0.1957, loss_mask: 0.2105, loss: 0.5917
2024-05-29 05:25:28,838 - mmdet - INFO - Epoch [12][7250/7330] lr: 1.000e-06, eta: 0:01:04, time: 0.771, data_time: 0.033, memory: 13582, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0354, loss_cls: 0.1459, acc: 94.4358, loss_bbox: 0.2004, loss_mask: 0.2139, loss: 0.6089
2024-05-29 05:26:09,795 - mmdet - INFO - Epoch [12][7300/7330] lr: 1.000e-06, eta: 0:00:24, time: 0.819, data_time: 0.048, memory: 13582, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0368, loss_cls: 0.1459, acc: 94.3652, loss_bbox: 0.2021, loss_mask: 0.2125, loss: 0.6107
2024-05-29 05:26:35,812 - mmdet - INFO - Saving checkpoint at 12 epochs
2024-05-29 05:28:58,914 - mmdet - INFO - Evaluating bbox...
2024-05-29 05:29:19,416 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.471
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.700
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.508
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.276
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.507
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.657
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.580
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.580
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.580
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.374
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.623
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.762
2024-05-29 05:29:19,416 - mmdet - INFO - Evaluating segm...
2024-05-29 05:29:44,845 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.419
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.661
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.446
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.198
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.450
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.644
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.518
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.518
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.518
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.565
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.718
2024-05-29 05:29:45,296 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1344_896_672_448_fpn_1x_coco_bs16.py
2024-05-29 05:29:45,297 - mmdet - INFO - Epoch(val) [12][625] bbox_mAP: 0.4710, bbox_mAP_50: 0.7000, bbox_mAP_75: 0.5080, bbox_mAP_s: 0.2760, bbox_mAP_m: 0.5070, bbox_mAP_l: 0.6570, bbox_mAP_copypaste: 0.471 0.700 0.508 0.276 0.507 0.657, segm_mAP: 0.4190, segm_mAP_50: 0.6610, segm_mAP_75: 0.4460, segm_mAP_s: 0.1980, segm_mAP_m: 0.4500, segm_mAP_l: 0.6440, segm_mAP_copypaste: 0.419 0.661 0.446 0.198 0.450 0.644
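Editor's note (not part of the original log): the final Epoch(val) record above packs all COCO metrics for epoch 12 into one line (bbox_mAP 0.471, segm_mAP 0.419). Below is a minimal, illustrative Python sketch of how such a line could be pulled out of a log in this format using only the standard library; the file path and the helper name parse_val_metrics are hypothetical and not part of MMDetection.

import re

def parse_val_metrics(log_path):
    """Return a dict such as {'bbox_mAP': 0.471, ..., 'segm_mAP_l': 0.644}
    taken from the last 'Epoch(val)' line of an mmdet-style training log."""
    last = None
    with open(log_path) as f:
        for line in f:
            if "Epoch(val)" in line:
                last = line
    if last is None:
        return {}
    # Match "name: value," pairs such as "bbox_mAP: 0.4710,"; the *_copypaste
    # fields are not matched because their values are space-separated lists
    # rather than a single number followed by a comma.
    pairs = re.findall(r"(\w+_mAP(?:_\w+)?):\s([0-9.]+),", last)
    return {k: float(v) for k, v in pairs if not k.endswith("copypaste")}

if __name__ == "__main__":
    # Hypothetical path; point this at the actual training log file.
    print(parse_val_metrics("work_dirs/mask_rcnn_piip/train.log"))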