2024-05-30 12:01:15,684 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.9.19 (main, May 6 2024, 19:43:03) [GCC 11.2.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: NVIDIA A100-SXM4-80GB
CUDA_HOME: /mnt/petrelfs/share/cuda-11.7/
NVCC: Cuda compilation tools, release 11.7, V11.7.99
GCC: gcc (GCC) 7.3.0
PyTorch: 1.12.0+cu113
PyTorch compiling details: PyTorch built with:
  - GCC 9.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - LAPACK is enabled (usually provided by MKL)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.3
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.3.2 (built against CUDA 11.5)
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,
TorchVision: 0.13.0+cu113
OpenCV: 4.9.0
MMCV: 1.7.0
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.7
MMDetection: 2.25.3+b37f24c
------------------------------------------------------------
2024-05-30 12:01:17,569 - mmdet - INFO - Distributed training: True
2024-05-30 12:01:19,269 - mmdet - INFO - Config:
model = dict(
    type='MaskRCNN',
    backbone=dict(
        type='PIIPFourBranch',
        n_points=4,
        deform_num_heads=16,
        cffn_ratio=0.25,
        deform_ratio=0.5,
        with_cffn=True,
        interact_attn_type='deform',
        interaction_drop_path_rate=0.4,
        branch1=dict(
            real_size=448,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=24,
            embed_dim=1024,
            num_heads=16,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.4,
            init_scale=1,
            with_fpn=False,
            interaction_indexes=[[0, 1], [2, 3], [4, 5], [6, 7], [8, 9], [10, 11],
                                 [12, 13], [14, 15], [16, 17], [18, 19], [20, 21], [22, 23]],
            pretrained='./pretrained/deit_3_large_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True,
                         True, True, True, True, True, True, True, True,
                         True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28,
                         28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch2=dict(
            real_size=1120,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=768,
            num_heads=12,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.15,
            init_scale=1,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5],
                                 [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_base_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch3=dict(
            real_size=1568,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=384,
            num_heads=6,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.05,
            init_scale=1,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5],
                                 [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_3_small_224_21k.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5],
                to_rgb=True)),
        branch4=dict(
            real_size=1792,
            pretrain_img_size=224,
            patch_size=16,
            pretrain_patch_size=16,
            depth=12,
            embed_dim=192,
            num_heads=3,
            mlp_ratio=4,
            qkv_bias=True,
            drop_path_rate=0.05,
            init_scale=1,
            with_fpn=False,
            interaction_indexes=[[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5],
                                 [6, 6], [7, 7], [8, 8], [9, 9], [10, 10], [11, 11]],
            pretrained='./pretrained/deit_tiny_patch16_224-a1311bcf.pth',
            window_attn=[True, True, True, True, True, True, True, True, True, True, True, True],
            window_size=[28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28, 28],
            use_flash_attn=True,
            img_norm_cfg=dict(
                mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5],
                to_rgb=True))),
    neck=dict(
        type='FPN',
        in_channels=[1024, 1024, 1024, 1024],
        out_channels=256,
        num_outs=5),
    rpn_head=dict(
        type='RPNHead',
        in_channels=256,
        feat_channels=256,
        anchor_generator=dict(
            type='AnchorGenerator',
            scales=[8],
            ratios=[0.5, 1.0, 2.0],
            strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(
            type='DeltaXYWHBBoxCoder',
            target_means=[0.0, 0.0, 0.0, 0.0],
            target_stds=[1.0, 1.0, 1.0, 1.0]),
        loss_cls=dict(
            type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
    roi_head=dict(
        type='StandardRoIHead',
        bbox_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='Shared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=80,
            bbox_coder=dict(
                type='DeltaXYWHBBoxCoder',
                target_means=[0.0, 0.0, 0.0, 0.0],
                target_stds=[0.1, 0.1, 0.2, 0.2]),
            reg_class_agnostic=False,
            loss_cls=dict(
                type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
        mask_roi_extractor=dict(
            type='SingleRoIExtractor',
            roi_layer=dict(type='RoIAlign', output_size=14, sampling_ratio=0),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        mask_head=dict(
            type='FCNMaskHead',
            num_convs=4,
            in_channels=256,
            conv_out_channels=256,
            num_classes=80,
            loss_mask=dict(
                type='CrossEntropyLoss', use_mask=True, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.7,
                neg_iou_thr=0.3,
                min_pos_iou=0.3,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=256,
                pos_fraction=0.5,
                neg_pos_ub=-1,
                add_gt_as_proposals=False),
            allowed_border=-1,
            pos_weight=-1,
            debug=False),
        rpn_proposal=dict(
            nms_pre=2000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.5,
                neg_iou_thr=0.5,
                min_pos_iou=0.5,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=512,
                pos_fraction=0.25,
                neg_pos_ub=-1,
                add_gt_as_proposals=True),
            mask_size=28,
            pos_weight=-1,
            debug=False)),
    test_cfg=dict(
        rpn=dict(
            nms_pre=1000,
            max_per_img=1000,
            nms=dict(type='nms', iou_threshold=0.7),
            min_bbox_size=0),
        rcnn=dict(
            score_thr=0.05,
            nms=dict(type='nms', iou_threshold=0.5),
            max_per_img=100,
            mask_thr_binary=0.5)))
dataset_type = 'CocoDataset'
data_root = 'data/coco/'
img_norm_cfg = dict(
    mean=[127.5, 127.5, 127.5], std=[127.5, 127.5, 127.5], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
    dict(type='Resize', img_scale=(1792, 1075), keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
    dict(
        type='Normalize',
        mean=[127.5, 127.5, 127.5],
        std=[127.5, 127.5, 127.5],
        to_rgb=True),
    dict(type='Pad', size_divisor=224),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1792, 1075),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_train2017.json',
        img_prefix='data/coco/train2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
            dict(type='Resize', img_scale=(1792, 1075), keep_ratio=True),
            dict(type='RandomFlip', flip_ratio=0.5),
            dict(
                type='Normalize',
                mean=[127.5, 127.5, 127.5],
                std=[127.5, 127.5, 127.5],
                to_rgb=True),
            dict(type='Pad', size_divisor=224),
            dict(type='DefaultFormatBundle'),
            dict(
                type='Collect',
                keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks'])
        ]),
    val=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1792, 1075),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        ann_file='data/coco/annotations/instances_val2017.json',
        img_prefix='data/coco/val2017/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1792, 1075),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[127.5, 127.5, 127.5],
                        std=[127.5, 127.5, 127.5],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=224),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
evaluation = dict(metric=['bbox', 'segm'], interval=1, save_best=None)
optimizer = dict(
    type='AdamW',
    lr=0.0001,
    betas=(0.9, 0.999),
    weight_decay=0.05,
    constructor='CustomLayerDecayOptimizerConstructorMMDet',
    paramwise_cfg=dict(
        num_layers=24, layer_decay_rate=0.85, skip_stride=[2, 2, 2]))
optimizer_config = dict(grad_clip=None)
lr_config = dict(
    policy='step',
    warmup='linear',
    warmup_iters=500,
    warmup_ratio=0.001,
    step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=12)
checkpoint_config = dict(interval=1, deepspeed=True, max_keep_ckpts=1)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
custom_hooks = [dict(type='ToBFloat16HookMMDet', priority=49)]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
opencv_num_threads = 0
mp_start_method = 'fork'
auto_scale_lr = dict(enable=False, base_batch_size=16)
deepspeed = True
deepspeed_config = 'zero_configs/adam_zero1_bf16.json'
custom_imports = dict(
    imports=['mmdet.mmcv_custom'], allow_failed_imports=False)
work_dir = './work_dirs/mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16'
auto_resume = True
gpu_ids = range(0, 8)
2024-05-30 12:01:25,246 - mmdet - INFO - Set random seed to 69324157, deterministic: False
2024-05-30 12:01:33,314 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-30 12:01:34,397 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-30 12:01:36,421 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-30 12:01:38,934 - mmdet - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['cls_token', 'norm.weight', 'norm.bias', 'head.weight', 'head.bias'])
2024-05-30 12:03:24,873 - mmdet - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2024-05-30 12:03:25,158 - mmdet - INFO - initialize RPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2024-05-30 12:03:25,195 - mmdet - INFO - initialize Shared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'distribution': 'uniform', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
Name of parameter - Initialization information
backbone.w1 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.w2 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.w3 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.w4 - torch.Size([]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.pos_embed - torch.Size([1, 196, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.weight - torch.Size([1024, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.patch_embed.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN
backbone.branch1.blocks.0.gamma_1 - torch.Size([1024]): The value is the same before and
after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.0.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.1.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.2.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.3.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.4.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.weight - torch.Size([1024]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.5.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.6.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.weight - torch.Size([1024, 1024]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.7.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.8.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.9.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.10.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.weight - torch.Size([1024]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.11.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.12.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.13.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.13.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.14.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.15.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.norm2.bias - torch.Size([1024]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.16.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.17.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.attn.proj.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.18.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.19.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch1.blocks.20.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.20.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.21.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.weight - torch.Size([1024]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.branch1.blocks.22.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.22.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_1 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.gamma_2 - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.weight - torch.Size([3072, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.qkv.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.weight - torch.Size([1024, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.attn.proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.norm2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.weight - torch.Size([4096, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc1.bias - torch.Size([4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.weight - torch.Size([1024, 4096]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch1.blocks.23.mlp.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.pos_embed - torch.Size([1, 196, 768]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.weight - torch.Size([768, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.patch_embed.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.0.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.norm2.bias - torch.Size([768]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.1.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.2.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.attn.proj.bias - torch.Size([768]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.3.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.4.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.5.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.5.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.6.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.7.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.7.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.8.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch2.blocks.9.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.9.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.10.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.branch2.blocks.11.gamma_1 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.gamma_2 - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm1.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.weight - torch.Size([2304, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.qkv.bias - torch.Size([2304]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.proj.weight - torch.Size([768, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.attn.proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.norm2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.weight - torch.Size([3072, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc1.bias - torch.Size([3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.weight - torch.Size([768, 3072]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch2.blocks.11.mlp.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.0.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.0.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.1.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.2.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.2.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.3.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.4.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.4.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.5.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.6.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.6.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.7.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.8.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.8.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.9.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.9.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.10.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch3.blocks.11.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch3.blocks.11.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.pos_embed - torch.Size([1, 196, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.patch_embed.proj.weight - torch.Size([384, 3, 16, 16]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.patch_embed.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.0.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.1.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.1.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.2.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.3.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.3.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.4.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.5.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.5.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.6.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.7.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.7.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.8.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.8.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.9.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.branch4.blocks.10.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.10.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.gamma_1 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.gamma_2 - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm1.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm1.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.qkv.weight - torch.Size([1152, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.qkv.bias - torch.Size([1152]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.attn.proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm2.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.norm2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc1.weight - torch.Size([1536, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc1.bias - torch.Size([1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc2.weight - torch.Size([384, 1536]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.branch4.blocks.11.mlp.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.weight - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.1.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` 
of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.1.interaction_units_34.branch2to1_injector.ca_gamma - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN
[Identical initialization records follow for every remaining parameter of backbone.interactions.1.interaction_units_34, for all of backbone.interactions.2 and backbone.interactions.3 (interaction_units_12, interaction_units_23, interaction_units_34), and for backbone.interactions.4 up through interaction_units_23.branch1to2_injector.feat_norm. Each interaction unit reports a branch2to1_proj and a branch1to2_proj (weight [d_out, d_in], bias [d_out]; the branch dims are 1024/768 for interaction_units_12, 768/384 for interaction_units_23, and 384/384 for interaction_units_34), plus a branch2to1_injector and a branch1to2_injector. For an injector at dim d (1024, 768 or 384), the reported parameters and shapes are: ca_gamma and cffn_gamma [d]; query_norm, feat_norm and ffn_norm weight/bias [d]; attn.sampling_offsets weight [128, d], bias [128]; attn.attention_weights weight [64, d], bias [64]; attn.value_proj weight [d/2, d], bias [d/2]; attn.output_proj weight [d, d/2], bias [d]; ffn.fc1 weight [d/4, d], bias [d/4]; ffn.dwconv.dwconv weight [d/4, 1, 3, 3], bias [d/4]; ffn.fc2 weight [d, d/4], bias [d]. Every record states: "The value is the same before and after calling `init_weights` of MaskRCNN".]
backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling
`init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): 
The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 
512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight 
- torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is 
the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias - 
torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN 
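
The repeated "The value is the same before and after calling `init_weights` of MaskRCNN" entries are mmcv's per-parameter initialization trace; they usually mean the parameter was set when the PIIP backbone built and initialized its own interaction/injector modules (or was left at its construction-time value), so the detector's `init_weights` call did not re-initialize it. The tensor sizes listed for each injector follow directly from the backbone's deformable-attention settings (16 heads, 4 sampling points, deform_ratio 0.5, cffn_ratio 0.25) and the branch widths 1024/768/384. Below is a minimal sketch of that shape arithmetic, assuming a ViT-Adapter-style injector layout with a single feature level; `injector_shapes` is a hypothetical helper for illustration, not part of the codebase.

def injector_shapes(dim, num_heads=16, n_points=4,
                    deform_ratio=0.5, cffn_ratio=0.25, n_levels=1):
    # Assumed layout: deformable cross-attention followed by a ConvFFN,
    # mirroring the parameter names that appear in this log.
    value_dim = int(dim * deform_ratio)   # channels kept by attn.value_proj
    hidden_dim = int(dim * cffn_ratio)    # ConvFFN hidden width (ffn.fc1 output)
    return {
        'attn.sampling_offsets.weight': (num_heads * n_levels * n_points * 2, dim),
        'attn.attention_weights.weight': (num_heads * n_levels * n_points, dim),
        'attn.value_proj.weight': (value_dim, dim),
        'attn.output_proj.weight': (dim, value_dim),
        'ffn.fc1.weight': (hidden_dim, dim),
        'ffn.dwconv.dwconv.weight': (hidden_dim, 1, 3, 3),  # depthwise 3x3 conv
        'ffn.fc2.weight': (dim, hidden_dim),
    }

# Branch embedding widths that appear in this dump: 1024, 768, 384.
for dim in (1024, 768, 384):
    print(dim, injector_shapes(dim))

For dim=1024 this reproduces the logged sizes (sampling_offsets [128, 1024], attention_weights [64, 1024], value_proj [512, 1024], output_proj [1024, 512], fc1 [256, 1024], dwconv [256, 1, 3, 3], fc2 [1024, 256]); the 768- and 384-wide branches scale the same way.
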
backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.7.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of 
MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value 
is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The 
value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 
384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after 
calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma - 
torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - 
torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_proj.weight - 
torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.weight - torch.Size([1024, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_proj.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight - torch.Size([512, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias - torch.Size([512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight - torch.Size([1024, 512]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias - torch.Size([1024]): The value is the 
same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight - torch.Size([256, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([256, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight - torch.Size([1024, 256]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.weight - torch.Size([768, 1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight - torch.Size([768]): The value is the same before and after calling 
`init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight - torch.Size([768, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight - torch.Size([192, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([192, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight - torch.Size([768, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias - torch.Size([768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.weight - torch.Size([384, 768]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias - 
torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN 
backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_proj.weight - torch.Size([384, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ca_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.cffn_gamma - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight - torch.Size([128, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias - torch.Size([128]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.weight - torch.Size([64, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.bias - torch.Size([64]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.weight - torch.Size([192, 384]): The value is the same before and 
after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.bias - torch.Size([192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.weight - torch.Size([384, 192]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.weight - torch.Size([96, 384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight - torch.Size([96, 1, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias - torch.Size([96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.weight - torch.Size([384, 96]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.weight - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.bias - torch.Size([384]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.0.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch1.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.0.weight - torch.Size([1024, 768, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch2.4.bias - torch.Size([1024]): The value is the same 
before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.0.weight - torch.Size([1024, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch3.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.0.weight - torch.Size([1024, 384, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.3.weight - torch.Size([1024, 1024, 3, 3]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.4.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.merge_branch4.4.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.weight - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.1.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn1.3.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.weight - torch.Size([1024, 1024, 2, 2]): The value is the same before and after calling `init_weights` of MaskRCNN backbone.fpn2.0.bias - torch.Size([1024]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.0.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.1.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.2.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.lateral_convs.3.conv.weight - torch.Size([256, 1024, 1, 1]): XavierInit: gain=1, distribution=uniform, bias=0 neck.lateral_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling 
`init_weights` of MaskRCNN neck.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN neck.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): XavierInit: gain=1, distribution=uniform, bias=0 neck.fpn_convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN rpn_head.rpn_conv.weight - torch.Size([256, 256, 3, 3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_conv.bias - torch.Size([256]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.weight - torch.Size([3, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_cls.bias - torch.Size([3]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.weight - torch.Size([12, 256, 1, 1]): NormalInit: mean=0, std=0.01, bias=0 rpn_head.rpn_reg.bias - torch.Size([12]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.weight - torch.Size([81, 1024]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_cls.bias - torch.Size([81]): NormalInit: mean=0, std=0.01, bias=0 roi_head.bbox_head.fc_reg.weight - torch.Size([320, 1024]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.fc_reg.bias - torch.Size([320]): NormalInit: mean=0, std=0.001, bias=0 roi_head.bbox_head.shared_fcs.0.weight - torch.Size([1024, 12544]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.0.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.weight - torch.Size([1024, 1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.bbox_head.shared_fcs.1.bias - torch.Size([1024]): XavierInit: gain=1, distribution=uniform, bias=0 roi_head.mask_head.convs.0.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.0.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.1.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.1.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.2.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.2.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.convs.3.conv.weight - torch.Size([256, 256, 3, 3]): Initialized by user-defined `init_weights` in ConvModule roi_head.mask_head.convs.3.conv.bias - torch.Size([256]): The value is the same before and after calling `init_weights` of MaskRCNN roi_head.mask_head.upsample.weight - torch.Size([256, 256, 2, 2]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.upsample.bias - torch.Size([256]): Initialized by user-defined `init_weights` 
in FCNMaskHead roi_head.mask_head.conv_logits.weight - torch.Size([80, 256, 1, 1]): Initialized by user-defined `init_weights` in FCNMaskHead roi_head.mask_head.conv_logits.bias - torch.Size([80]): Initialized by user-defined `init_weights` in FCNMaskHead
2024-05-30 12:03:41,420 - mmdet - INFO - {'num_layers': 24, 'layer_decay_rate': 0.85, 'skip_stride': [2, 2, 2]}
2024-05-30 12:03:41,420 - mmdet - INFO - Build LayerDecayOptimizerConstructor 0.850000 - 26
2024-05-30 12:03:41,439 - mmdet - INFO - Param groups = { "layer_25_decay": { "param_names": [ "backbone.w1", "backbone.w2", "backbone.w3", "backbone.w4", "backbone.interactions.0.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_34.branch2to1_proj.weight", 
"backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.0.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_23.branch1to2_proj.weight", 
"backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.1.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch2to1_proj.weight", 
"backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.2.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_12.branch1to2_proj.weight", 
"backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.3.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch2to1_proj.weight", 
"backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.4.interaction_units_34.branch1to2_proj.weight", 
"backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_34.branch2to1_proj.weight", 
"backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.5.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_23.branch1to2_proj.weight", 
"backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.6.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch2to1_proj.weight", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.7.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_12.branch1to2_proj.weight", 
"backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.8.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch2to1_proj.weight", 
"backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.9.interaction_units_34.branch1to2_proj.weight", 
"backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.weight", 
"backbone.interactions.10.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.10.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_12.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.weight", 
"backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_23.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_34.branch2to1_proj.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.weight", "backbone.interactions.11.interaction_units_34.branch1to2_proj.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.weight", "backbone.merge_branch1.0.weight", "backbone.merge_branch1.3.weight", "backbone.merge_branch2.0.weight", "backbone.merge_branch2.3.weight", "backbone.merge_branch3.0.weight", "backbone.merge_branch3.3.weight", "backbone.merge_branch4.0.weight", "backbone.merge_branch4.3.weight", "backbone.fpn1.0.weight", "backbone.fpn1.3.weight", "backbone.fpn2.0.weight", "neck.lateral_convs.0.conv.weight", "neck.lateral_convs.1.conv.weight", "neck.lateral_convs.2.conv.weight", "neck.lateral_convs.3.conv.weight", "neck.fpn_convs.0.conv.weight", "neck.fpn_convs.1.conv.weight", "neck.fpn_convs.2.conv.weight", "neck.fpn_convs.3.conv.weight", "rpn_head.rpn_conv.weight", "rpn_head.rpn_cls.weight", "rpn_head.rpn_reg.weight", "roi_head.bbox_head.fc_cls.weight", "roi_head.bbox_head.fc_reg.weight", "roi_head.bbox_head.shared_fcs.0.weight", "roi_head.bbox_head.shared_fcs.1.weight", "roi_head.mask_head.convs.0.conv.weight", "roi_head.mask_head.convs.1.conv.weight", "roi_head.mask_head.convs.2.conv.weight", "roi_head.mask_head.convs.3.conv.weight", "roi_head.mask_head.upsample.weight", "roi_head.mask_head.conv_logits.weight" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.05 }, "layer_0_decay": { "param_names": [ "backbone.branch1.pos_embed", "backbone.branch1.patch_embed.proj.weight", "backbone.branch2.pos_embed", 
"backbone.branch2.patch_embed.proj.weight", "backbone.branch3.pos_embed", "backbone.branch3.patch_embed.proj.weight", "backbone.branch4.pos_embed", "backbone.branch4.patch_embed.proj.weight" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.05 }, "layer_0_no_decay": { "param_names": [ "backbone.branch1.patch_embed.proj.bias", "backbone.branch2.patch_embed.proj.bias", "backbone.branch3.patch_embed.proj.bias", "backbone.branch4.patch_embed.proj.bias" ], "lr_scale": 0.017197809852207896, "lr": 1.7197809852207897e-06, "weight_decay": 0.0 }, "layer_1_no_decay": { "param_names": [ "backbone.branch1.blocks.0.gamma_1", "backbone.branch1.blocks.0.gamma_2", "backbone.branch1.blocks.0.norm1.weight", "backbone.branch1.blocks.0.norm1.bias", "backbone.branch1.blocks.0.attn.qkv.bias", "backbone.branch1.blocks.0.attn.proj.bias", "backbone.branch1.blocks.0.norm2.weight", "backbone.branch1.blocks.0.norm2.bias", "backbone.branch1.blocks.0.mlp.fc1.bias", "backbone.branch1.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.0 }, "layer_1_decay": { "param_names": [ "backbone.branch1.blocks.0.attn.qkv.weight", "backbone.branch1.blocks.0.attn.proj.weight", "backbone.branch1.blocks.0.mlp.fc1.weight", "backbone.branch1.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.02023271747318576, "lr": 2.023271747318576e-06, "weight_decay": 0.05 }, "layer_2_no_decay": { "param_names": [ "backbone.branch1.blocks.1.gamma_1", "backbone.branch1.blocks.1.gamma_2", "backbone.branch1.blocks.1.norm1.weight", "backbone.branch1.blocks.1.norm1.bias", "backbone.branch1.blocks.1.attn.qkv.bias", "backbone.branch1.blocks.1.attn.proj.bias", "backbone.branch1.blocks.1.norm2.weight", "backbone.branch1.blocks.1.norm2.bias", "backbone.branch1.blocks.1.mlp.fc1.bias", "backbone.branch1.blocks.1.mlp.fc2.bias", "backbone.branch2.blocks.0.gamma_1", "backbone.branch2.blocks.0.gamma_2", "backbone.branch2.blocks.0.norm1.weight", "backbone.branch2.blocks.0.norm1.bias", "backbone.branch2.blocks.0.attn.qkv.bias", "backbone.branch2.blocks.0.attn.proj.bias", "backbone.branch2.blocks.0.norm2.weight", "backbone.branch2.blocks.0.norm2.bias", "backbone.branch2.blocks.0.mlp.fc1.bias", "backbone.branch2.blocks.0.mlp.fc2.bias", "backbone.branch3.blocks.0.gamma_1", "backbone.branch3.blocks.0.gamma_2", "backbone.branch3.blocks.0.norm1.weight", "backbone.branch3.blocks.0.norm1.bias", "backbone.branch3.blocks.0.attn.qkv.bias", "backbone.branch3.blocks.0.attn.proj.bias", "backbone.branch3.blocks.0.norm2.weight", "backbone.branch3.blocks.0.norm2.bias", "backbone.branch3.blocks.0.mlp.fc1.bias", "backbone.branch3.blocks.0.mlp.fc2.bias", "backbone.branch4.blocks.0.gamma_1", "backbone.branch4.blocks.0.gamma_2", "backbone.branch4.blocks.0.norm1.weight", "backbone.branch4.blocks.0.norm1.bias", "backbone.branch4.blocks.0.attn.qkv.bias", "backbone.branch4.blocks.0.attn.proj.bias", "backbone.branch4.blocks.0.norm2.weight", "backbone.branch4.blocks.0.norm2.bias", "backbone.branch4.blocks.0.mlp.fc1.bias", "backbone.branch4.blocks.0.mlp.fc2.bias" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.0 }, "layer_2_decay": { "param_names": [ "backbone.branch1.blocks.1.attn.qkv.weight", "backbone.branch1.blocks.1.attn.proj.weight", "backbone.branch1.blocks.1.mlp.fc1.weight", "backbone.branch1.blocks.1.mlp.fc2.weight", "backbone.branch2.blocks.0.attn.qkv.weight", "backbone.branch2.blocks.0.attn.proj.weight", "backbone.branch2.blocks.0.mlp.fc1.weight", 
"backbone.branch2.blocks.0.mlp.fc2.weight", "backbone.branch3.blocks.0.attn.qkv.weight", "backbone.branch3.blocks.0.attn.proj.weight", "backbone.branch3.blocks.0.mlp.fc1.weight", "backbone.branch3.blocks.0.mlp.fc2.weight", "backbone.branch4.blocks.0.attn.qkv.weight", "backbone.branch4.blocks.0.attn.proj.weight", "backbone.branch4.blocks.0.mlp.fc1.weight", "backbone.branch4.blocks.0.mlp.fc2.weight" ], "lr_scale": 0.023803197027277366, "lr": 2.380319702727737e-06, "weight_decay": 0.05 }, "layer_3_no_decay": { "param_names": [ "backbone.branch1.blocks.2.gamma_1", "backbone.branch1.blocks.2.gamma_2", "backbone.branch1.blocks.2.norm1.weight", "backbone.branch1.blocks.2.norm1.bias", "backbone.branch1.blocks.2.attn.qkv.bias", "backbone.branch1.blocks.2.attn.proj.bias", "backbone.branch1.blocks.2.norm2.weight", "backbone.branch1.blocks.2.norm2.bias", "backbone.branch1.blocks.2.mlp.fc1.bias", "backbone.branch1.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.0 }, "layer_3_decay": { "param_names": [ "backbone.branch1.blocks.2.attn.qkv.weight", "backbone.branch1.blocks.2.attn.proj.weight", "backbone.branch1.blocks.2.mlp.fc1.weight", "backbone.branch1.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.028003761208561607, "lr": 2.8003761208561607e-06, "weight_decay": 0.05 }, "layer_4_no_decay": { "param_names": [ "backbone.branch1.blocks.3.gamma_1", "backbone.branch1.blocks.3.gamma_2", "backbone.branch1.blocks.3.norm1.weight", "backbone.branch1.blocks.3.norm1.bias", "backbone.branch1.blocks.3.attn.qkv.bias", "backbone.branch1.blocks.3.attn.proj.bias", "backbone.branch1.blocks.3.norm2.weight", "backbone.branch1.blocks.3.norm2.bias", "backbone.branch1.blocks.3.mlp.fc1.bias", "backbone.branch1.blocks.3.mlp.fc2.bias", "backbone.branch2.blocks.1.gamma_1", "backbone.branch2.blocks.1.gamma_2", "backbone.branch2.blocks.1.norm1.weight", "backbone.branch2.blocks.1.norm1.bias", "backbone.branch2.blocks.1.attn.qkv.bias", "backbone.branch2.blocks.1.attn.proj.bias", "backbone.branch2.blocks.1.norm2.weight", "backbone.branch2.blocks.1.norm2.bias", "backbone.branch2.blocks.1.mlp.fc1.bias", "backbone.branch2.blocks.1.mlp.fc2.bias", "backbone.branch3.blocks.1.gamma_1", "backbone.branch3.blocks.1.gamma_2", "backbone.branch3.blocks.1.norm1.weight", "backbone.branch3.blocks.1.norm1.bias", "backbone.branch3.blocks.1.attn.qkv.bias", "backbone.branch3.blocks.1.attn.proj.bias", "backbone.branch3.blocks.1.norm2.weight", "backbone.branch3.blocks.1.norm2.bias", "backbone.branch3.blocks.1.mlp.fc1.bias", "backbone.branch3.blocks.1.mlp.fc2.bias", "backbone.branch4.blocks.1.gamma_1", "backbone.branch4.blocks.1.gamma_2", "backbone.branch4.blocks.1.norm1.weight", "backbone.branch4.blocks.1.norm1.bias", "backbone.branch4.blocks.1.attn.qkv.bias", "backbone.branch4.blocks.1.attn.proj.bias", "backbone.branch4.blocks.1.norm2.weight", "backbone.branch4.blocks.1.norm2.bias", "backbone.branch4.blocks.1.mlp.fc1.bias", "backbone.branch4.blocks.1.mlp.fc2.bias" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.0 }, "layer_4_decay": { "param_names": [ "backbone.branch1.blocks.3.attn.qkv.weight", "backbone.branch1.blocks.3.attn.proj.weight", "backbone.branch1.blocks.3.mlp.fc1.weight", "backbone.branch1.blocks.3.mlp.fc2.weight", "backbone.branch2.blocks.1.attn.qkv.weight", "backbone.branch2.blocks.1.attn.proj.weight", "backbone.branch2.blocks.1.mlp.fc1.weight", "backbone.branch2.blocks.1.mlp.fc2.weight", "backbone.branch3.blocks.1.attn.qkv.weight", 
"backbone.branch3.blocks.1.attn.proj.weight", "backbone.branch3.blocks.1.mlp.fc1.weight", "backbone.branch3.blocks.1.mlp.fc2.weight", "backbone.branch4.blocks.1.attn.qkv.weight", "backbone.branch4.blocks.1.attn.proj.weight", "backbone.branch4.blocks.1.mlp.fc1.weight", "backbone.branch4.blocks.1.mlp.fc2.weight" ], "lr_scale": 0.03294560142183718, "lr": 3.2945601421837183e-06, "weight_decay": 0.05 }, "layer_5_no_decay": { "param_names": [ "backbone.branch1.blocks.4.gamma_1", "backbone.branch1.blocks.4.gamma_2", "backbone.branch1.blocks.4.norm1.weight", "backbone.branch1.blocks.4.norm1.bias", "backbone.branch1.blocks.4.attn.qkv.bias", "backbone.branch1.blocks.4.attn.proj.bias", "backbone.branch1.blocks.4.norm2.weight", "backbone.branch1.blocks.4.norm2.bias", "backbone.branch1.blocks.4.mlp.fc1.bias", "backbone.branch1.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.0 }, "layer_5_decay": { "param_names": [ "backbone.branch1.blocks.4.attn.qkv.weight", "backbone.branch1.blocks.4.attn.proj.weight", "backbone.branch1.blocks.4.mlp.fc1.weight", "backbone.branch1.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.03875953108451433, "lr": 3.875953108451433e-06, "weight_decay": 0.05 }, "layer_6_no_decay": { "param_names": [ "backbone.branch1.blocks.5.gamma_1", "backbone.branch1.blocks.5.gamma_2", "backbone.branch1.blocks.5.norm1.weight", "backbone.branch1.blocks.5.norm1.bias", "backbone.branch1.blocks.5.attn.qkv.bias", "backbone.branch1.blocks.5.attn.proj.bias", "backbone.branch1.blocks.5.norm2.weight", "backbone.branch1.blocks.5.norm2.bias", "backbone.branch1.blocks.5.mlp.fc1.bias", "backbone.branch1.blocks.5.mlp.fc2.bias", "backbone.branch2.blocks.2.gamma_1", "backbone.branch2.blocks.2.gamma_2", "backbone.branch2.blocks.2.norm1.weight", "backbone.branch2.blocks.2.norm1.bias", "backbone.branch2.blocks.2.attn.qkv.bias", "backbone.branch2.blocks.2.attn.proj.bias", "backbone.branch2.blocks.2.norm2.weight", "backbone.branch2.blocks.2.norm2.bias", "backbone.branch2.blocks.2.mlp.fc1.bias", "backbone.branch2.blocks.2.mlp.fc2.bias", "backbone.branch3.blocks.2.gamma_1", "backbone.branch3.blocks.2.gamma_2", "backbone.branch3.blocks.2.norm1.weight", "backbone.branch3.blocks.2.norm1.bias", "backbone.branch3.blocks.2.attn.qkv.bias", "backbone.branch3.blocks.2.attn.proj.bias", "backbone.branch3.blocks.2.norm2.weight", "backbone.branch3.blocks.2.norm2.bias", "backbone.branch3.blocks.2.mlp.fc1.bias", "backbone.branch3.blocks.2.mlp.fc2.bias", "backbone.branch4.blocks.2.gamma_1", "backbone.branch4.blocks.2.gamma_2", "backbone.branch4.blocks.2.norm1.weight", "backbone.branch4.blocks.2.norm1.bias", "backbone.branch4.blocks.2.attn.qkv.bias", "backbone.branch4.blocks.2.attn.proj.bias", "backbone.branch4.blocks.2.norm2.weight", "backbone.branch4.blocks.2.norm2.bias", "backbone.branch4.blocks.2.mlp.fc1.bias", "backbone.branch4.blocks.2.mlp.fc2.bias" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.0 }, "layer_6_decay": { "param_names": [ "backbone.branch1.blocks.5.attn.qkv.weight", "backbone.branch1.blocks.5.attn.proj.weight", "backbone.branch1.blocks.5.mlp.fc1.weight", "backbone.branch1.blocks.5.mlp.fc2.weight", "backbone.branch2.blocks.2.attn.qkv.weight", "backbone.branch2.blocks.2.attn.proj.weight", "backbone.branch2.blocks.2.mlp.fc1.weight", "backbone.branch2.blocks.2.mlp.fc2.weight", "backbone.branch3.blocks.2.attn.qkv.weight", "backbone.branch3.blocks.2.attn.proj.weight", "backbone.branch3.blocks.2.mlp.fc1.weight", 
"backbone.branch3.blocks.2.mlp.fc2.weight", "backbone.branch4.blocks.2.attn.qkv.weight", "backbone.branch4.blocks.2.attn.proj.weight", "backbone.branch4.blocks.2.mlp.fc1.weight", "backbone.branch4.blocks.2.mlp.fc2.weight" ], "lr_scale": 0.04559944833472275, "lr": 4.5599448334722756e-06, "weight_decay": 0.05 }, "layer_7_no_decay": { "param_names": [ "backbone.branch1.blocks.6.gamma_1", "backbone.branch1.blocks.6.gamma_2", "backbone.branch1.blocks.6.norm1.weight", "backbone.branch1.blocks.6.norm1.bias", "backbone.branch1.blocks.6.attn.qkv.bias", "backbone.branch1.blocks.6.attn.proj.bias", "backbone.branch1.blocks.6.norm2.weight", "backbone.branch1.blocks.6.norm2.bias", "backbone.branch1.blocks.6.mlp.fc1.bias", "backbone.branch1.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.0 }, "layer_7_decay": { "param_names": [ "backbone.branch1.blocks.6.attn.qkv.weight", "backbone.branch1.blocks.6.attn.proj.weight", "backbone.branch1.blocks.6.mlp.fc1.weight", "backbone.branch1.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.053646409805556176, "lr": 5.364640980555618e-06, "weight_decay": 0.05 }, "layer_8_no_decay": { "param_names": [ "backbone.branch1.blocks.7.gamma_1", "backbone.branch1.blocks.7.gamma_2", "backbone.branch1.blocks.7.norm1.weight", "backbone.branch1.blocks.7.norm1.bias", "backbone.branch1.blocks.7.attn.qkv.bias", "backbone.branch1.blocks.7.attn.proj.bias", "backbone.branch1.blocks.7.norm2.weight", "backbone.branch1.blocks.7.norm2.bias", "backbone.branch1.blocks.7.mlp.fc1.bias", "backbone.branch1.blocks.7.mlp.fc2.bias", "backbone.branch2.blocks.3.gamma_1", "backbone.branch2.blocks.3.gamma_2", "backbone.branch2.blocks.3.norm1.weight", "backbone.branch2.blocks.3.norm1.bias", "backbone.branch2.blocks.3.attn.qkv.bias", "backbone.branch2.blocks.3.attn.proj.bias", "backbone.branch2.blocks.3.norm2.weight", "backbone.branch2.blocks.3.norm2.bias", "backbone.branch2.blocks.3.mlp.fc1.bias", "backbone.branch2.blocks.3.mlp.fc2.bias", "backbone.branch3.blocks.3.gamma_1", "backbone.branch3.blocks.3.gamma_2", "backbone.branch3.blocks.3.norm1.weight", "backbone.branch3.blocks.3.norm1.bias", "backbone.branch3.blocks.3.attn.qkv.bias", "backbone.branch3.blocks.3.attn.proj.bias", "backbone.branch3.blocks.3.norm2.weight", "backbone.branch3.blocks.3.norm2.bias", "backbone.branch3.blocks.3.mlp.fc1.bias", "backbone.branch3.blocks.3.mlp.fc2.bias", "backbone.branch4.blocks.3.gamma_1", "backbone.branch4.blocks.3.gamma_2", "backbone.branch4.blocks.3.norm1.weight", "backbone.branch4.blocks.3.norm1.bias", "backbone.branch4.blocks.3.attn.qkv.bias", "backbone.branch4.blocks.3.attn.proj.bias", "backbone.branch4.blocks.3.norm2.weight", "backbone.branch4.blocks.3.norm2.bias", "backbone.branch4.blocks.3.mlp.fc1.bias", "backbone.branch4.blocks.3.mlp.fc2.bias" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.0 }, "layer_8_decay": { "param_names": [ "backbone.branch1.blocks.7.attn.qkv.weight", "backbone.branch1.blocks.7.attn.proj.weight", "backbone.branch1.blocks.7.mlp.fc1.weight", "backbone.branch1.blocks.7.mlp.fc2.weight", "backbone.branch2.blocks.3.attn.qkv.weight", "backbone.branch2.blocks.3.attn.proj.weight", "backbone.branch2.blocks.3.mlp.fc1.weight", "backbone.branch2.blocks.3.mlp.fc2.weight", "backbone.branch3.blocks.3.attn.qkv.weight", "backbone.branch3.blocks.3.attn.proj.weight", "backbone.branch3.blocks.3.mlp.fc1.weight", "backbone.branch3.blocks.3.mlp.fc2.weight", "backbone.branch4.blocks.3.attn.qkv.weight", 
"backbone.branch4.blocks.3.attn.proj.weight", "backbone.branch4.blocks.3.mlp.fc1.weight", "backbone.branch4.blocks.3.mlp.fc2.weight" ], "lr_scale": 0.06311342330065432, "lr": 6.3113423300654325e-06, "weight_decay": 0.05 }, "layer_9_no_decay": { "param_names": [ "backbone.branch1.blocks.8.gamma_1", "backbone.branch1.blocks.8.gamma_2", "backbone.branch1.blocks.8.norm1.weight", "backbone.branch1.blocks.8.norm1.bias", "backbone.branch1.blocks.8.attn.qkv.bias", "backbone.branch1.blocks.8.attn.proj.bias", "backbone.branch1.blocks.8.norm2.weight", "backbone.branch1.blocks.8.norm2.bias", "backbone.branch1.blocks.8.mlp.fc1.bias", "backbone.branch1.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.0 }, "layer_9_decay": { "param_names": [ "backbone.branch1.blocks.8.attn.qkv.weight", "backbone.branch1.blocks.8.attn.proj.weight", "backbone.branch1.blocks.8.mlp.fc1.weight", "backbone.branch1.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.07425108623606391, "lr": 7.425108623606392e-06, "weight_decay": 0.05 }, "layer_10_no_decay": { "param_names": [ "backbone.branch1.blocks.9.gamma_1", "backbone.branch1.blocks.9.gamma_2", "backbone.branch1.blocks.9.norm1.weight", "backbone.branch1.blocks.9.norm1.bias", "backbone.branch1.blocks.9.attn.qkv.bias", "backbone.branch1.blocks.9.attn.proj.bias", "backbone.branch1.blocks.9.norm2.weight", "backbone.branch1.blocks.9.norm2.bias", "backbone.branch1.blocks.9.mlp.fc1.bias", "backbone.branch1.blocks.9.mlp.fc2.bias", "backbone.branch2.blocks.4.gamma_1", "backbone.branch2.blocks.4.gamma_2", "backbone.branch2.blocks.4.norm1.weight", "backbone.branch2.blocks.4.norm1.bias", "backbone.branch2.blocks.4.attn.qkv.bias", "backbone.branch2.blocks.4.attn.proj.bias", "backbone.branch2.blocks.4.norm2.weight", "backbone.branch2.blocks.4.norm2.bias", "backbone.branch2.blocks.4.mlp.fc1.bias", "backbone.branch2.blocks.4.mlp.fc2.bias", "backbone.branch3.blocks.4.gamma_1", "backbone.branch3.blocks.4.gamma_2", "backbone.branch3.blocks.4.norm1.weight", "backbone.branch3.blocks.4.norm1.bias", "backbone.branch3.blocks.4.attn.qkv.bias", "backbone.branch3.blocks.4.attn.proj.bias", "backbone.branch3.blocks.4.norm2.weight", "backbone.branch3.blocks.4.norm2.bias", "backbone.branch3.blocks.4.mlp.fc1.bias", "backbone.branch3.blocks.4.mlp.fc2.bias", "backbone.branch4.blocks.4.gamma_1", "backbone.branch4.blocks.4.gamma_2", "backbone.branch4.blocks.4.norm1.weight", "backbone.branch4.blocks.4.norm1.bias", "backbone.branch4.blocks.4.attn.qkv.bias", "backbone.branch4.blocks.4.attn.proj.bias", "backbone.branch4.blocks.4.norm2.weight", "backbone.branch4.blocks.4.norm2.bias", "backbone.branch4.blocks.4.mlp.fc1.bias", "backbone.branch4.blocks.4.mlp.fc2.bias" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.0 }, "layer_10_decay": { "param_names": [ "backbone.branch1.blocks.9.attn.qkv.weight", "backbone.branch1.blocks.9.attn.proj.weight", "backbone.branch1.blocks.9.mlp.fc1.weight", "backbone.branch1.blocks.9.mlp.fc2.weight", "backbone.branch2.blocks.4.attn.qkv.weight", "backbone.branch2.blocks.4.attn.proj.weight", "backbone.branch2.blocks.4.mlp.fc1.weight", "backbone.branch2.blocks.4.mlp.fc2.weight", "backbone.branch3.blocks.4.attn.qkv.weight", "backbone.branch3.blocks.4.attn.proj.weight", "backbone.branch3.blocks.4.mlp.fc1.weight", "backbone.branch3.blocks.4.mlp.fc2.weight", "backbone.branch4.blocks.4.attn.qkv.weight", "backbone.branch4.blocks.4.attn.proj.weight", "backbone.branch4.blocks.4.mlp.fc1.weight", 
"backbone.branch4.blocks.4.mlp.fc2.weight" ], "lr_scale": 0.08735421910125167, "lr": 8.735421910125167e-06, "weight_decay": 0.05 }, "layer_11_no_decay": { "param_names": [ "backbone.branch1.blocks.10.gamma_1", "backbone.branch1.blocks.10.gamma_2", "backbone.branch1.blocks.10.norm1.weight", "backbone.branch1.blocks.10.norm1.bias", "backbone.branch1.blocks.10.attn.qkv.bias", "backbone.branch1.blocks.10.attn.proj.bias", "backbone.branch1.blocks.10.norm2.weight", "backbone.branch1.blocks.10.norm2.bias", "backbone.branch1.blocks.10.mlp.fc1.bias", "backbone.branch1.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.0 }, "layer_11_decay": { "param_names": [ "backbone.branch1.blocks.10.attn.qkv.weight", "backbone.branch1.blocks.10.attn.proj.weight", "backbone.branch1.blocks.10.mlp.fc1.weight", "backbone.branch1.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.10276966953088432, "lr": 1.0276966953088432e-05, "weight_decay": 0.05 }, "layer_12_no_decay": { "param_names": [ "backbone.branch1.blocks.11.gamma_1", "backbone.branch1.blocks.11.gamma_2", "backbone.branch1.blocks.11.norm1.weight", "backbone.branch1.blocks.11.norm1.bias", "backbone.branch1.blocks.11.attn.qkv.bias", "backbone.branch1.blocks.11.attn.proj.bias", "backbone.branch1.blocks.11.norm2.weight", "backbone.branch1.blocks.11.norm2.bias", "backbone.branch1.blocks.11.mlp.fc1.bias", "backbone.branch1.blocks.11.mlp.fc2.bias", "backbone.branch2.blocks.5.gamma_1", "backbone.branch2.blocks.5.gamma_2", "backbone.branch2.blocks.5.norm1.weight", "backbone.branch2.blocks.5.norm1.bias", "backbone.branch2.blocks.5.attn.qkv.bias", "backbone.branch2.blocks.5.attn.proj.bias", "backbone.branch2.blocks.5.norm2.weight", "backbone.branch2.blocks.5.norm2.bias", "backbone.branch2.blocks.5.mlp.fc1.bias", "backbone.branch2.blocks.5.mlp.fc2.bias", "backbone.branch3.blocks.5.gamma_1", "backbone.branch3.blocks.5.gamma_2", "backbone.branch3.blocks.5.norm1.weight", "backbone.branch3.blocks.5.norm1.bias", "backbone.branch3.blocks.5.attn.qkv.bias", "backbone.branch3.blocks.5.attn.proj.bias", "backbone.branch3.blocks.5.norm2.weight", "backbone.branch3.blocks.5.norm2.bias", "backbone.branch3.blocks.5.mlp.fc1.bias", "backbone.branch3.blocks.5.mlp.fc2.bias", "backbone.branch4.blocks.5.gamma_1", "backbone.branch4.blocks.5.gamma_2", "backbone.branch4.blocks.5.norm1.weight", "backbone.branch4.blocks.5.norm1.bias", "backbone.branch4.blocks.5.attn.qkv.bias", "backbone.branch4.blocks.5.attn.proj.bias", "backbone.branch4.blocks.5.norm2.weight", "backbone.branch4.blocks.5.norm2.bias", "backbone.branch4.blocks.5.mlp.fc1.bias", "backbone.branch4.blocks.5.mlp.fc2.bias" ], "lr_scale": 0.12090549356574626, "lr": 1.2090549356574626e-05, "weight_decay": 0.0 }, "layer_12_decay": { "param_names": [ "backbone.branch1.blocks.11.attn.qkv.weight", "backbone.branch1.blocks.11.attn.proj.weight", "backbone.branch1.blocks.11.mlp.fc1.weight", "backbone.branch1.blocks.11.mlp.fc2.weight", "backbone.branch2.blocks.5.attn.qkv.weight", "backbone.branch2.blocks.5.attn.proj.weight", "backbone.branch2.blocks.5.mlp.fc1.weight", "backbone.branch2.blocks.5.mlp.fc2.weight", "backbone.branch3.blocks.5.attn.qkv.weight", "backbone.branch3.blocks.5.attn.proj.weight", "backbone.branch3.blocks.5.mlp.fc1.weight", "backbone.branch3.blocks.5.mlp.fc2.weight", "backbone.branch4.blocks.5.attn.qkv.weight", "backbone.branch4.blocks.5.attn.proj.weight", "backbone.branch4.blocks.5.mlp.fc1.weight", "backbone.branch4.blocks.5.mlp.fc2.weight" ], "lr_scale": 0.12090549356574626, 
"lr": 1.2090549356574626e-05, "weight_decay": 0.05 }, "layer_13_no_decay": { "param_names": [ "backbone.branch1.blocks.12.gamma_1", "backbone.branch1.blocks.12.gamma_2", "backbone.branch1.blocks.12.norm1.weight", "backbone.branch1.blocks.12.norm1.bias", "backbone.branch1.blocks.12.attn.qkv.bias", "backbone.branch1.blocks.12.attn.proj.bias", "backbone.branch1.blocks.12.norm2.weight", "backbone.branch1.blocks.12.norm2.bias", "backbone.branch1.blocks.12.mlp.fc1.bias", "backbone.branch1.blocks.12.mlp.fc2.bias" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.0 }, "layer_13_decay": { "param_names": [ "backbone.branch1.blocks.12.attn.qkv.weight", "backbone.branch1.blocks.12.attn.proj.weight", "backbone.branch1.blocks.12.mlp.fc1.weight", "backbone.branch1.blocks.12.mlp.fc2.weight" ], "lr_scale": 0.14224175713617207, "lr": 1.4224175713617208e-05, "weight_decay": 0.05 }, "layer_14_no_decay": { "param_names": [ "backbone.branch1.blocks.13.gamma_1", "backbone.branch1.blocks.13.gamma_2", "backbone.branch1.blocks.13.norm1.weight", "backbone.branch1.blocks.13.norm1.bias", "backbone.branch1.blocks.13.attn.qkv.bias", "backbone.branch1.blocks.13.attn.proj.bias", "backbone.branch1.blocks.13.norm2.weight", "backbone.branch1.blocks.13.norm2.bias", "backbone.branch1.blocks.13.mlp.fc1.bias", "backbone.branch1.blocks.13.mlp.fc2.bias", "backbone.branch2.blocks.6.gamma_1", "backbone.branch2.blocks.6.gamma_2", "backbone.branch2.blocks.6.norm1.weight", "backbone.branch2.blocks.6.norm1.bias", "backbone.branch2.blocks.6.attn.qkv.bias", "backbone.branch2.blocks.6.attn.proj.bias", "backbone.branch2.blocks.6.norm2.weight", "backbone.branch2.blocks.6.norm2.bias", "backbone.branch2.blocks.6.mlp.fc1.bias", "backbone.branch2.blocks.6.mlp.fc2.bias", "backbone.branch3.blocks.6.gamma_1", "backbone.branch3.blocks.6.gamma_2", "backbone.branch3.blocks.6.norm1.weight", "backbone.branch3.blocks.6.norm1.bias", "backbone.branch3.blocks.6.attn.qkv.bias", "backbone.branch3.blocks.6.attn.proj.bias", "backbone.branch3.blocks.6.norm2.weight", "backbone.branch3.blocks.6.norm2.bias", "backbone.branch3.blocks.6.mlp.fc1.bias", "backbone.branch3.blocks.6.mlp.fc2.bias", "backbone.branch4.blocks.6.gamma_1", "backbone.branch4.blocks.6.gamma_2", "backbone.branch4.blocks.6.norm1.weight", "backbone.branch4.blocks.6.norm1.bias", "backbone.branch4.blocks.6.attn.qkv.bias", "backbone.branch4.blocks.6.attn.proj.bias", "backbone.branch4.blocks.6.norm2.weight", "backbone.branch4.blocks.6.norm2.bias", "backbone.branch4.blocks.6.mlp.fc1.bias", "backbone.branch4.blocks.6.mlp.fc2.bias" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.0 }, "layer_14_decay": { "param_names": [ "backbone.branch1.blocks.13.attn.qkv.weight", "backbone.branch1.blocks.13.attn.proj.weight", "backbone.branch1.blocks.13.mlp.fc1.weight", "backbone.branch1.blocks.13.mlp.fc2.weight", "backbone.branch2.blocks.6.attn.qkv.weight", "backbone.branch2.blocks.6.attn.proj.weight", "backbone.branch2.blocks.6.mlp.fc1.weight", "backbone.branch2.blocks.6.mlp.fc2.weight", "backbone.branch3.blocks.6.attn.qkv.weight", "backbone.branch3.blocks.6.attn.proj.weight", "backbone.branch3.blocks.6.mlp.fc1.weight", "backbone.branch3.blocks.6.mlp.fc2.weight", "backbone.branch4.blocks.6.attn.qkv.weight", "backbone.branch4.blocks.6.attn.proj.weight", "backbone.branch4.blocks.6.mlp.fc1.weight", "backbone.branch4.blocks.6.mlp.fc2.weight" ], "lr_scale": 0.1673432436896142, "lr": 1.673432436896142e-05, "weight_decay": 0.05 }, "layer_15_no_decay": { 
"param_names": [ "backbone.branch1.blocks.14.gamma_1", "backbone.branch1.blocks.14.gamma_2", "backbone.branch1.blocks.14.norm1.weight", "backbone.branch1.blocks.14.norm1.bias", "backbone.branch1.blocks.14.attn.qkv.bias", "backbone.branch1.blocks.14.attn.proj.bias", "backbone.branch1.blocks.14.norm2.weight", "backbone.branch1.blocks.14.norm2.bias", "backbone.branch1.blocks.14.mlp.fc1.bias", "backbone.branch1.blocks.14.mlp.fc2.bias" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.0 }, "layer_15_decay": { "param_names": [ "backbone.branch1.blocks.14.attn.qkv.weight", "backbone.branch1.blocks.14.attn.proj.weight", "backbone.branch1.blocks.14.mlp.fc1.weight", "backbone.branch1.blocks.14.mlp.fc2.weight" ], "lr_scale": 0.1968744043407226, "lr": 1.968744043407226e-05, "weight_decay": 0.05 }, "layer_16_no_decay": { "param_names": [ "backbone.branch1.blocks.15.gamma_1", "backbone.branch1.blocks.15.gamma_2", "backbone.branch1.blocks.15.norm1.weight", "backbone.branch1.blocks.15.norm1.bias", "backbone.branch1.blocks.15.attn.qkv.bias", "backbone.branch1.blocks.15.attn.proj.bias", "backbone.branch1.blocks.15.norm2.weight", "backbone.branch1.blocks.15.norm2.bias", "backbone.branch1.blocks.15.mlp.fc1.bias", "backbone.branch1.blocks.15.mlp.fc2.bias", "backbone.branch2.blocks.7.gamma_1", "backbone.branch2.blocks.7.gamma_2", "backbone.branch2.blocks.7.norm1.weight", "backbone.branch2.blocks.7.norm1.bias", "backbone.branch2.blocks.7.attn.qkv.bias", "backbone.branch2.blocks.7.attn.proj.bias", "backbone.branch2.blocks.7.norm2.weight", "backbone.branch2.blocks.7.norm2.bias", "backbone.branch2.blocks.7.mlp.fc1.bias", "backbone.branch2.blocks.7.mlp.fc2.bias", "backbone.branch3.blocks.7.gamma_1", "backbone.branch3.blocks.7.gamma_2", "backbone.branch3.blocks.7.norm1.weight", "backbone.branch3.blocks.7.norm1.bias", "backbone.branch3.blocks.7.attn.qkv.bias", "backbone.branch3.blocks.7.attn.proj.bias", "backbone.branch3.blocks.7.norm2.weight", "backbone.branch3.blocks.7.norm2.bias", "backbone.branch3.blocks.7.mlp.fc1.bias", "backbone.branch3.blocks.7.mlp.fc2.bias", "backbone.branch4.blocks.7.gamma_1", "backbone.branch4.blocks.7.gamma_2", "backbone.branch4.blocks.7.norm1.weight", "backbone.branch4.blocks.7.norm1.bias", "backbone.branch4.blocks.7.attn.qkv.bias", "backbone.branch4.blocks.7.attn.proj.bias", "backbone.branch4.blocks.7.norm2.weight", "backbone.branch4.blocks.7.norm2.bias", "backbone.branch4.blocks.7.mlp.fc1.bias", "backbone.branch4.blocks.7.mlp.fc2.bias" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.0 }, "layer_16_decay": { "param_names": [ "backbone.branch1.blocks.15.attn.qkv.weight", "backbone.branch1.blocks.15.attn.proj.weight", "backbone.branch1.blocks.15.mlp.fc1.weight", "backbone.branch1.blocks.15.mlp.fc2.weight", "backbone.branch2.blocks.7.attn.qkv.weight", "backbone.branch2.blocks.7.attn.proj.weight", "backbone.branch2.blocks.7.mlp.fc1.weight", "backbone.branch2.blocks.7.mlp.fc2.weight", "backbone.branch3.blocks.7.attn.qkv.weight", "backbone.branch3.blocks.7.attn.proj.weight", "backbone.branch3.blocks.7.mlp.fc1.weight", "backbone.branch3.blocks.7.mlp.fc2.weight", "backbone.branch4.blocks.7.attn.qkv.weight", "backbone.branch4.blocks.7.attn.proj.weight", "backbone.branch4.blocks.7.mlp.fc1.weight", "backbone.branch4.blocks.7.mlp.fc2.weight" ], "lr_scale": 0.23161694628320306, "lr": 2.3161694628320308e-05, "weight_decay": 0.05 }, "layer_17_no_decay": { "param_names": [ "backbone.branch1.blocks.16.gamma_1", 
"backbone.branch1.blocks.16.gamma_2", "backbone.branch1.blocks.16.norm1.weight", "backbone.branch1.blocks.16.norm1.bias", "backbone.branch1.blocks.16.attn.qkv.bias", "backbone.branch1.blocks.16.attn.proj.bias", "backbone.branch1.blocks.16.norm2.weight", "backbone.branch1.blocks.16.norm2.bias", "backbone.branch1.blocks.16.mlp.fc1.bias", "backbone.branch1.blocks.16.mlp.fc2.bias" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.0 }, "layer_17_decay": { "param_names": [ "backbone.branch1.blocks.16.attn.qkv.weight", "backbone.branch1.blocks.16.attn.proj.weight", "backbone.branch1.blocks.16.mlp.fc1.weight", "backbone.branch1.blocks.16.mlp.fc2.weight" ], "lr_scale": 0.27249052503906246, "lr": 2.7249052503906248e-05, "weight_decay": 0.05 }, "layer_18_no_decay": { "param_names": [ "backbone.branch1.blocks.17.gamma_1", "backbone.branch1.blocks.17.gamma_2", "backbone.branch1.blocks.17.norm1.weight", "backbone.branch1.blocks.17.norm1.bias", "backbone.branch1.blocks.17.attn.qkv.bias", "backbone.branch1.blocks.17.attn.proj.bias", "backbone.branch1.blocks.17.norm2.weight", "backbone.branch1.blocks.17.norm2.bias", "backbone.branch1.blocks.17.mlp.fc1.bias", "backbone.branch1.blocks.17.mlp.fc2.bias", "backbone.branch2.blocks.8.gamma_1", "backbone.branch2.blocks.8.gamma_2", "backbone.branch2.blocks.8.norm1.weight", "backbone.branch2.blocks.8.norm1.bias", "backbone.branch2.blocks.8.attn.qkv.bias", "backbone.branch2.blocks.8.attn.proj.bias", "backbone.branch2.blocks.8.norm2.weight", "backbone.branch2.blocks.8.norm2.bias", "backbone.branch2.blocks.8.mlp.fc1.bias", "backbone.branch2.blocks.8.mlp.fc2.bias", "backbone.branch3.blocks.8.gamma_1", "backbone.branch3.blocks.8.gamma_2", "backbone.branch3.blocks.8.norm1.weight", "backbone.branch3.blocks.8.norm1.bias", "backbone.branch3.blocks.8.attn.qkv.bias", "backbone.branch3.blocks.8.attn.proj.bias", "backbone.branch3.blocks.8.norm2.weight", "backbone.branch3.blocks.8.norm2.bias", "backbone.branch3.blocks.8.mlp.fc1.bias", "backbone.branch3.blocks.8.mlp.fc2.bias", "backbone.branch4.blocks.8.gamma_1", "backbone.branch4.blocks.8.gamma_2", "backbone.branch4.blocks.8.norm1.weight", "backbone.branch4.blocks.8.norm1.bias", "backbone.branch4.blocks.8.attn.qkv.bias", "backbone.branch4.blocks.8.attn.proj.bias", "backbone.branch4.blocks.8.norm2.weight", "backbone.branch4.blocks.8.norm2.bias", "backbone.branch4.blocks.8.mlp.fc1.bias", "backbone.branch4.blocks.8.mlp.fc2.bias" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.0 }, "layer_18_decay": { "param_names": [ "backbone.branch1.blocks.17.attn.qkv.weight", "backbone.branch1.blocks.17.attn.proj.weight", "backbone.branch1.blocks.17.mlp.fc1.weight", "backbone.branch1.blocks.17.mlp.fc2.weight", "backbone.branch2.blocks.8.attn.qkv.weight", "backbone.branch2.blocks.8.attn.proj.weight", "backbone.branch2.blocks.8.mlp.fc1.weight", "backbone.branch2.blocks.8.mlp.fc2.weight", "backbone.branch3.blocks.8.attn.qkv.weight", "backbone.branch3.blocks.8.attn.proj.weight", "backbone.branch3.blocks.8.mlp.fc1.weight", "backbone.branch3.blocks.8.mlp.fc2.weight", "backbone.branch4.blocks.8.attn.qkv.weight", "backbone.branch4.blocks.8.attn.proj.weight", "backbone.branch4.blocks.8.mlp.fc1.weight", "backbone.branch4.blocks.8.mlp.fc2.weight" ], "lr_scale": 0.3205770882812499, "lr": 3.2057708828124995e-05, "weight_decay": 0.05 }, "layer_19_no_decay": { "param_names": [ "backbone.branch1.blocks.18.gamma_1", "backbone.branch1.blocks.18.gamma_2", "backbone.branch1.blocks.18.norm1.weight", 
"backbone.branch1.blocks.18.norm1.bias", "backbone.branch1.blocks.18.attn.qkv.bias", "backbone.branch1.blocks.18.attn.proj.bias", "backbone.branch1.blocks.18.norm2.weight", "backbone.branch1.blocks.18.norm2.bias", "backbone.branch1.blocks.18.mlp.fc1.bias", "backbone.branch1.blocks.18.mlp.fc2.bias" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.0 }, "layer_19_decay": { "param_names": [ "backbone.branch1.blocks.18.attn.qkv.weight", "backbone.branch1.blocks.18.attn.proj.weight", "backbone.branch1.blocks.18.mlp.fc1.weight", "backbone.branch1.blocks.18.mlp.fc2.weight" ], "lr_scale": 0.37714951562499993, "lr": 3.77149515625e-05, "weight_decay": 0.05 }, "layer_20_no_decay": { "param_names": [ "backbone.branch1.blocks.19.gamma_1", "backbone.branch1.blocks.19.gamma_2", "backbone.branch1.blocks.19.norm1.weight", "backbone.branch1.blocks.19.norm1.bias", "backbone.branch1.blocks.19.attn.qkv.bias", "backbone.branch1.blocks.19.attn.proj.bias", "backbone.branch1.blocks.19.norm2.weight", "backbone.branch1.blocks.19.norm2.bias", "backbone.branch1.blocks.19.mlp.fc1.bias", "backbone.branch1.blocks.19.mlp.fc2.bias", "backbone.branch2.blocks.9.gamma_1", "backbone.branch2.blocks.9.gamma_2", "backbone.branch2.blocks.9.norm1.weight", "backbone.branch2.blocks.9.norm1.bias", "backbone.branch2.blocks.9.attn.qkv.bias", "backbone.branch2.blocks.9.attn.proj.bias", "backbone.branch2.blocks.9.norm2.weight", "backbone.branch2.blocks.9.norm2.bias", "backbone.branch2.blocks.9.mlp.fc1.bias", "backbone.branch2.blocks.9.mlp.fc2.bias", "backbone.branch3.blocks.9.gamma_1", "backbone.branch3.blocks.9.gamma_2", "backbone.branch3.blocks.9.norm1.weight", "backbone.branch3.blocks.9.norm1.bias", "backbone.branch3.blocks.9.attn.qkv.bias", "backbone.branch3.blocks.9.attn.proj.bias", "backbone.branch3.blocks.9.norm2.weight", "backbone.branch3.blocks.9.norm2.bias", "backbone.branch3.blocks.9.mlp.fc1.bias", "backbone.branch3.blocks.9.mlp.fc2.bias", "backbone.branch4.blocks.9.gamma_1", "backbone.branch4.blocks.9.gamma_2", "backbone.branch4.blocks.9.norm1.weight", "backbone.branch4.blocks.9.norm1.bias", "backbone.branch4.blocks.9.attn.qkv.bias", "backbone.branch4.blocks.9.attn.proj.bias", "backbone.branch4.blocks.9.norm2.weight", "backbone.branch4.blocks.9.norm2.bias", "backbone.branch4.blocks.9.mlp.fc1.bias", "backbone.branch4.blocks.9.mlp.fc2.bias" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.0 }, "layer_20_decay": { "param_names": [ "backbone.branch1.blocks.19.attn.qkv.weight", "backbone.branch1.blocks.19.attn.proj.weight", "backbone.branch1.blocks.19.mlp.fc1.weight", "backbone.branch1.blocks.19.mlp.fc2.weight", "backbone.branch2.blocks.9.attn.qkv.weight", "backbone.branch2.blocks.9.attn.proj.weight", "backbone.branch2.blocks.9.mlp.fc1.weight", "backbone.branch2.blocks.9.mlp.fc2.weight", "backbone.branch3.blocks.9.attn.qkv.weight", "backbone.branch3.blocks.9.attn.proj.weight", "backbone.branch3.blocks.9.mlp.fc1.weight", "backbone.branch3.blocks.9.mlp.fc2.weight", "backbone.branch4.blocks.9.attn.qkv.weight", "backbone.branch4.blocks.9.attn.proj.weight", "backbone.branch4.blocks.9.mlp.fc1.weight", "backbone.branch4.blocks.9.mlp.fc2.weight" ], "lr_scale": 0.44370531249999995, "lr": 4.4370531249999995e-05, "weight_decay": 0.05 }, "layer_21_no_decay": { "param_names": [ "backbone.branch1.blocks.20.gamma_1", "backbone.branch1.blocks.20.gamma_2", "backbone.branch1.blocks.20.norm1.weight", "backbone.branch1.blocks.20.norm1.bias", "backbone.branch1.blocks.20.attn.qkv.bias", 
"backbone.branch1.blocks.20.attn.proj.bias", "backbone.branch1.blocks.20.norm2.weight", "backbone.branch1.blocks.20.norm2.bias", "backbone.branch1.blocks.20.mlp.fc1.bias", "backbone.branch1.blocks.20.mlp.fc2.bias" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.0 }, "layer_21_decay": { "param_names": [ "backbone.branch1.blocks.20.attn.qkv.weight", "backbone.branch1.blocks.20.attn.proj.weight", "backbone.branch1.blocks.20.mlp.fc1.weight", "backbone.branch1.blocks.20.mlp.fc2.weight" ], "lr_scale": 0.5220062499999999, "lr": 5.220062499999999e-05, "weight_decay": 0.05 }, "layer_22_no_decay": { "param_names": [ "backbone.branch1.blocks.21.gamma_1", "backbone.branch1.blocks.21.gamma_2", "backbone.branch1.blocks.21.norm1.weight", "backbone.branch1.blocks.21.norm1.bias", "backbone.branch1.blocks.21.attn.qkv.bias", "backbone.branch1.blocks.21.attn.proj.bias", "backbone.branch1.blocks.21.norm2.weight", "backbone.branch1.blocks.21.norm2.bias", "backbone.branch1.blocks.21.mlp.fc1.bias", "backbone.branch1.blocks.21.mlp.fc2.bias", "backbone.branch2.blocks.10.gamma_1", "backbone.branch2.blocks.10.gamma_2", "backbone.branch2.blocks.10.norm1.weight", "backbone.branch2.blocks.10.norm1.bias", "backbone.branch2.blocks.10.attn.qkv.bias", "backbone.branch2.blocks.10.attn.proj.bias", "backbone.branch2.blocks.10.norm2.weight", "backbone.branch2.blocks.10.norm2.bias", "backbone.branch2.blocks.10.mlp.fc1.bias", "backbone.branch2.blocks.10.mlp.fc2.bias", "backbone.branch3.blocks.10.gamma_1", "backbone.branch3.blocks.10.gamma_2", "backbone.branch3.blocks.10.norm1.weight", "backbone.branch3.blocks.10.norm1.bias", "backbone.branch3.blocks.10.attn.qkv.bias", "backbone.branch3.blocks.10.attn.proj.bias", "backbone.branch3.blocks.10.norm2.weight", "backbone.branch3.blocks.10.norm2.bias", "backbone.branch3.blocks.10.mlp.fc1.bias", "backbone.branch3.blocks.10.mlp.fc2.bias", "backbone.branch4.blocks.10.gamma_1", "backbone.branch4.blocks.10.gamma_2", "backbone.branch4.blocks.10.norm1.weight", "backbone.branch4.blocks.10.norm1.bias", "backbone.branch4.blocks.10.attn.qkv.bias", "backbone.branch4.blocks.10.attn.proj.bias", "backbone.branch4.blocks.10.norm2.weight", "backbone.branch4.blocks.10.norm2.bias", "backbone.branch4.blocks.10.mlp.fc1.bias", "backbone.branch4.blocks.10.mlp.fc2.bias" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.0 }, "layer_22_decay": { "param_names": [ "backbone.branch1.blocks.21.attn.qkv.weight", "backbone.branch1.blocks.21.attn.proj.weight", "backbone.branch1.blocks.21.mlp.fc1.weight", "backbone.branch1.blocks.21.mlp.fc2.weight", "backbone.branch2.blocks.10.attn.qkv.weight", "backbone.branch2.blocks.10.attn.proj.weight", "backbone.branch2.blocks.10.mlp.fc1.weight", "backbone.branch2.blocks.10.mlp.fc2.weight", "backbone.branch3.blocks.10.attn.qkv.weight", "backbone.branch3.blocks.10.attn.proj.weight", "backbone.branch3.blocks.10.mlp.fc1.weight", "backbone.branch3.blocks.10.mlp.fc2.weight", "backbone.branch4.blocks.10.attn.qkv.weight", "backbone.branch4.blocks.10.attn.proj.weight", "backbone.branch4.blocks.10.mlp.fc1.weight", "backbone.branch4.blocks.10.mlp.fc2.weight" ], "lr_scale": 0.6141249999999999, "lr": 6.14125e-05, "weight_decay": 0.05 }, "layer_23_no_decay": { "param_names": [ "backbone.branch1.blocks.22.gamma_1", "backbone.branch1.blocks.22.gamma_2", "backbone.branch1.blocks.22.norm1.weight", "backbone.branch1.blocks.22.norm1.bias", "backbone.branch1.blocks.22.attn.qkv.bias", "backbone.branch1.blocks.22.attn.proj.bias", 
"backbone.branch1.blocks.22.norm2.weight", "backbone.branch1.blocks.22.norm2.bias", "backbone.branch1.blocks.22.mlp.fc1.bias", "backbone.branch1.blocks.22.mlp.fc2.bias" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.0 }, "layer_23_decay": { "param_names": [ "backbone.branch1.blocks.22.attn.qkv.weight", "backbone.branch1.blocks.22.attn.proj.weight", "backbone.branch1.blocks.22.mlp.fc1.weight", "backbone.branch1.blocks.22.mlp.fc2.weight" ], "lr_scale": 0.7224999999999999, "lr": 7.225e-05, "weight_decay": 0.05 }, "layer_24_no_decay": { "param_names": [ "backbone.branch1.blocks.23.gamma_1", "backbone.branch1.blocks.23.gamma_2", "backbone.branch1.blocks.23.norm1.weight", "backbone.branch1.blocks.23.norm1.bias", "backbone.branch1.blocks.23.attn.qkv.bias", "backbone.branch1.blocks.23.attn.proj.bias", "backbone.branch1.blocks.23.norm2.weight", "backbone.branch1.blocks.23.norm2.bias", "backbone.branch1.blocks.23.mlp.fc1.bias", "backbone.branch1.blocks.23.mlp.fc2.bias", "backbone.branch2.blocks.11.gamma_1", "backbone.branch2.blocks.11.gamma_2", "backbone.branch2.blocks.11.norm1.weight", "backbone.branch2.blocks.11.norm1.bias", "backbone.branch2.blocks.11.attn.qkv.bias", "backbone.branch2.blocks.11.attn.proj.bias", "backbone.branch2.blocks.11.norm2.weight", "backbone.branch2.blocks.11.norm2.bias", "backbone.branch2.blocks.11.mlp.fc1.bias", "backbone.branch2.blocks.11.mlp.fc2.bias", "backbone.branch3.blocks.11.gamma_1", "backbone.branch3.blocks.11.gamma_2", "backbone.branch3.blocks.11.norm1.weight", "backbone.branch3.blocks.11.norm1.bias", "backbone.branch3.blocks.11.attn.qkv.bias", "backbone.branch3.blocks.11.attn.proj.bias", "backbone.branch3.blocks.11.norm2.weight", "backbone.branch3.blocks.11.norm2.bias", "backbone.branch3.blocks.11.mlp.fc1.bias", "backbone.branch3.blocks.11.mlp.fc2.bias", "backbone.branch4.blocks.11.gamma_1", "backbone.branch4.blocks.11.gamma_2", "backbone.branch4.blocks.11.norm1.weight", "backbone.branch4.blocks.11.norm1.bias", "backbone.branch4.blocks.11.attn.qkv.bias", "backbone.branch4.blocks.11.attn.proj.bias", "backbone.branch4.blocks.11.norm2.weight", "backbone.branch4.blocks.11.norm2.bias", "backbone.branch4.blocks.11.mlp.fc1.bias", "backbone.branch4.blocks.11.mlp.fc2.bias" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.0 }, "layer_24_decay": { "param_names": [ "backbone.branch1.blocks.23.attn.qkv.weight", "backbone.branch1.blocks.23.attn.proj.weight", "backbone.branch1.blocks.23.mlp.fc1.weight", "backbone.branch1.blocks.23.mlp.fc2.weight", "backbone.branch2.blocks.11.attn.qkv.weight", "backbone.branch2.blocks.11.attn.proj.weight", "backbone.branch2.blocks.11.mlp.fc1.weight", "backbone.branch2.blocks.11.mlp.fc2.weight", "backbone.branch3.blocks.11.attn.qkv.weight", "backbone.branch3.blocks.11.attn.proj.weight", "backbone.branch3.blocks.11.mlp.fc1.weight", "backbone.branch3.blocks.11.mlp.fc2.weight", "backbone.branch4.blocks.11.attn.qkv.weight", "backbone.branch4.blocks.11.attn.proj.weight", "backbone.branch4.blocks.11.mlp.fc1.weight", "backbone.branch4.blocks.11.mlp.fc2.weight" ], "lr_scale": 0.85, "lr": 8.5e-05, "weight_decay": 0.05 }, "layer_25_no_decay": { "param_names": [ "backbone.interactions.0.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.weight", 
"backbone.interactions.0.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.0.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.0.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.0.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.feat_norm.bias", 
"backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.0.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_12.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.1.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.1.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.1.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.1.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.1.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.1.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.1.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.1.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.weight", 
"backbone.interactions.2.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.2.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.2.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.2.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.2.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.2.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.2.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.feat_norm.bias", 
"backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch2to1_injector.ffn_norm.bias", 
"backbone.interactions.3.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.3.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.3.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.3.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.3.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.3.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.value_proj.bias", 
"backbone.interactions.3.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.3.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.weight", 
"backbone.interactions.4.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.4.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.4.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.4.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.4.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.feat_norm.bias", 
"backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_23.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.5.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.5.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.5.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.5.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.5.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.value_proj.bias", 
"backbone.interactions.6.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.weight", 
"backbone.interactions.6.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.6.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.6.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.6.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.6.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.6.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.6.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.feat_norm.bias", 
"backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.7.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.7.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.7.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_34.branch2to1_injector.ffn_norm.bias", 
"backbone.interactions.7.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.7.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.7.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.7.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.value_proj.bias", 
"backbone.interactions.8.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.8.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.8.interaction_units_34.branch2to1_injector.query_norm.weight", 
"backbone.interactions.8.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.8.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.8.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.8.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.8.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", 
"backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.feat_norm.bias", 
"backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.9.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.9.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.9.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.9.interaction_units_34.branch1to2_injector.ffn_norm.bias", 
"backbone.interactions.10.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", 
"backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.10.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.10.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ca_gamma", 
"backbone.interactions.10.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.10.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.attn.output_proj.bias", 
"backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_12.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_23.branch1to2_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_34.branch2to1_proj.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ca_gamma", "backbone.interactions.11.interaction_units_34.branch2to1_injector.cffn_gamma", "backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.weight", 
"backbone.interactions.11.interaction_units_34.branch2to1_injector.query_norm.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_34.branch2to1_injector.ffn_norm.bias", "backbone.interactions.11.interaction_units_34.branch1to2_proj.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ca_gamma", "backbone.interactions.11.interaction_units_34.branch1to2_injector.cffn_gamma", "backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.query_norm.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.feat_norm.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.sampling_offsets.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.attention_weights.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.value_proj.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.attn.output_proj.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc1.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.dwconv.dwconv.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn.fc2.bias", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.weight", "backbone.interactions.11.interaction_units_34.branch1to2_injector.ffn_norm.bias", "backbone.merge_branch1.1.weight", "backbone.merge_branch1.1.bias", "backbone.merge_branch1.4.weight", "backbone.merge_branch1.4.bias", "backbone.merge_branch2.1.weight", "backbone.merge_branch2.1.bias", "backbone.merge_branch2.4.weight", "backbone.merge_branch2.4.bias", "backbone.merge_branch3.1.weight", "backbone.merge_branch3.1.bias", "backbone.merge_branch3.4.weight", "backbone.merge_branch3.4.bias", "backbone.merge_branch4.1.weight", "backbone.merge_branch4.1.bias", "backbone.merge_branch4.4.weight", "backbone.merge_branch4.4.bias", "backbone.fpn1.0.bias", "backbone.fpn1.1.weight", "backbone.fpn1.1.bias", "backbone.fpn1.3.bias", "backbone.fpn2.0.bias", "neck.lateral_convs.0.conv.bias", "neck.lateral_convs.1.conv.bias", "neck.lateral_convs.2.conv.bias", "neck.lateral_convs.3.conv.bias", "neck.fpn_convs.0.conv.bias", "neck.fpn_convs.1.conv.bias", "neck.fpn_convs.2.conv.bias", "neck.fpn_convs.3.conv.bias", "rpn_head.rpn_conv.bias", "rpn_head.rpn_cls.bias", "rpn_head.rpn_reg.bias", "roi_head.bbox_head.fc_cls.bias", "roi_head.bbox_head.fc_reg.bias", "roi_head.bbox_head.shared_fcs.0.bias", 
"roi_head.bbox_head.shared_fcs.1.bias", "roi_head.mask_head.convs.0.conv.bias", "roi_head.mask_head.convs.1.conv.bias", "roi_head.mask_head.convs.2.conv.bias", "roi_head.mask_head.convs.3.conv.bias", "roi_head.mask_head.upsample.bias", "roi_head.mask_head.conv_logits.bias" ], "lr_scale": 1.0, "lr": 0.0001, "weight_decay": 0.0 } } 2024-05-30 12:04:15,065 - mmdet - INFO - Automatic scaling of learning rate (LR) has been disabled. 2024-05-30 12:04:15,481 - mmdet - INFO - Start running, work_dir: /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16 2024-05-30 12:04:15,481 - mmdet - INFO - Hooks will be executed in the following order: before_run: (VERY_HIGH ) StepLrUpdaterHook (49 ) ToBFloat16HookMMDet (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_epoch: (VERY_HIGH ) StepLrUpdaterHook (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_train_iter: (VERY_HIGH ) StepLrUpdaterHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook -------------------- after_train_iter: (ABOVE_NORMAL) OptimizerHook (NORMAL ) DeepspeedCheckpointHook (LOW ) IterTimerHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- after_train_epoch: (NORMAL ) DeepspeedCheckpointHook (LOW ) DeepspeedDistEvalHook (VERY_LOW ) TextLoggerHook -------------------- before_val_epoch: (NORMAL ) DistSamplerSeedHook (LOW ) IterTimerHook (VERY_LOW ) TextLoggerHook -------------------- before_val_iter: (LOW ) IterTimerHook -------------------- after_val_iter: (LOW ) IterTimerHook -------------------- after_val_epoch: (VERY_LOW ) TextLoggerHook -------------------- after_run: (VERY_LOW ) TextLoggerHook -------------------- 2024-05-30 12:04:15,481 - mmdet - INFO - workflow: [('train', 1)], max: 12 epochs 2024-05-30 12:04:15,497 - mmdet - INFO - Checkpoints will be saved to /mnt/petrelfs/PIIP/mmdetection/work_dirs/mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16 by HardDiskBackend. 
2024-05-30 12:05:18,910 - mmdet - INFO - Epoch [1][50/7330] lr: 9.890e-06, eta: 1 day, 6:57:48, time: 1.268, data_time: 0.156, memory: 23981, loss_rpn_cls: 0.5680, loss_rpn_bbox: 0.1502, loss_cls: 1.7020, acc: 63.5959, loss_bbox: 0.0333, loss_mask: 1.4041, loss: 3.8576 2024-05-30 12:06:12,010 - mmdet - INFO - Epoch [1][100/7330] lr: 1.988e-05, eta: 1 day, 4:25:54, time: 1.062, data_time: 0.065, memory: 23981, loss_rpn_cls: 0.2903, loss_rpn_bbox: 0.1113, loss_cls: 0.3091, acc: 96.0645, loss_bbox: 0.1130, loss_mask: 0.7698, loss: 1.5936 2024-05-30 12:07:04,801 - mmdet - INFO - Epoch [1][150/7330] lr: 2.987e-05, eta: 1 day, 3:31:40, time: 1.056, data_time: 0.059, memory: 24117, loss_rpn_cls: 0.2348, loss_rpn_bbox: 0.0993, loss_cls: 0.3237, acc: 95.3987, loss_bbox: 0.1452, loss_mask: 0.7064, loss: 1.5094 2024-05-30 12:07:57,547 - mmdet - INFO - Epoch [1][200/7330] lr: 3.986e-05, eta: 1 day, 3:03:49, time: 1.055, data_time: 0.059, memory: 24117, loss_rpn_cls: 0.2166, loss_rpn_bbox: 0.1007, loss_cls: 0.2959, acc: 95.5745, loss_bbox: 0.1404, loss_mask: 0.6900, loss: 1.4436 2024-05-30 12:08:50,015 - mmdet - INFO - Epoch [1][250/7330] lr: 4.985e-05, eta: 1 day, 2:45:06, time: 1.049, data_time: 0.058, memory: 24289, loss_rpn_cls: 0.1983, loss_rpn_bbox: 0.1136, loss_cls: 0.3064, acc: 95.2397, loss_bbox: 0.1546, loss_mask: 0.6721, loss: 1.4449 2024-05-30 12:09:42,254 - mmdet - INFO - Epoch [1][300/7330] lr: 5.984e-05, eta: 1 day, 2:31:12, time: 1.045, data_time: 0.054, memory: 24289, loss_rpn_cls: 0.1537, loss_rpn_bbox: 0.0978, loss_cls: 0.3807, acc: 93.7185, loss_bbox: 0.2171, loss_mask: 0.6485, loss: 1.4978 2024-05-30 12:10:39,620 - mmdet - INFO - Epoch [1][350/7330] lr: 6.983e-05, eta: 1 day, 2:42:26, time: 1.147, data_time: 0.064, memory: 24289, loss_rpn_cls: 0.1287, loss_rpn_bbox: 0.0947, loss_cls: 0.4227, acc: 92.8149, loss_bbox: 0.2610, loss_mask: 0.6273, loss: 1.5344 2024-05-30 12:11:43,943 - mmdet - INFO - Epoch [1][400/7330] lr: 7.982e-05, eta: 1 day, 3:16:01, time: 1.287, data_time: 0.065, memory: 24289, loss_rpn_cls: 0.1268, loss_rpn_bbox: 0.1034, loss_cls: 0.4639, acc: 92.1003, loss_bbox: 0.2851, loss_mask: 0.6058, loss: 1.5851 2024-05-30 12:12:40,038 - mmdet - INFO - Epoch [1][450/7330] lr: 8.981e-05, eta: 1 day, 3:15:10, time: 1.122, data_time: 0.076, memory: 24289, loss_rpn_cls: 0.1156, loss_rpn_bbox: 0.1017, loss_cls: 0.4526, acc: 91.6694, loss_bbox: 0.3013, loss_mask: 0.5771, loss: 1.5483 2024-05-30 12:13:32,933 - mmdet - INFO - Epoch [1][500/7330] lr: 9.980e-05, eta: 1 day, 3:05:03, time: 1.058, data_time: 0.053, memory: 24289, loss_rpn_cls: 0.1034, loss_rpn_bbox: 0.0981, loss_cls: 0.4378, acc: 91.4822, loss_bbox: 0.3081, loss_mask: 0.5576, loss: 1.5051 2024-05-30 12:14:26,536 - mmdet - INFO - Epoch [1][550/7330] lr: 1.000e-04, eta: 1 day, 2:58:27, time: 1.072, data_time: 0.055, memory: 24289, loss_rpn_cls: 0.1003, loss_rpn_bbox: 0.0920, loss_cls: 0.4101, acc: 91.6868, loss_bbox: 0.2983, loss_mask: 0.5346, loss: 1.4353 2024-05-30 12:15:19,351 - mmdet - INFO - Epoch [1][600/7330] lr: 1.000e-04, eta: 1 day, 2:50:54, time: 1.056, data_time: 0.054, memory: 24289, loss_rpn_cls: 0.0937, loss_rpn_bbox: 0.0882, loss_cls: 0.4077, acc: 91.3015, loss_bbox: 0.3189, loss_mask: 0.5191, loss: 1.4276 2024-05-30 12:16:12,329 - mmdet - INFO - Epoch [1][650/7330] lr: 1.000e-04, eta: 1 day, 2:44:44, time: 1.060, data_time: 0.051, memory: 24289, loss_rpn_cls: 0.0863, loss_rpn_bbox: 0.0908, loss_cls: 0.3920, acc: 91.1548, loss_bbox: 0.3174, loss_mask: 0.5018, loss: 1.3884 2024-05-30 12:17:05,271 - mmdet 
- INFO - Epoch [1][700/7330] lr: 1.000e-04, eta: 1 day, 2:39:15, time: 1.059, data_time: 0.056, memory: 24289, loss_rpn_cls: 0.0804, loss_rpn_bbox: 0.0868, loss_cls: 0.3816, acc: 91.3035, loss_bbox: 0.3117, loss_mask: 0.4818, loss: 1.3424 2024-05-30 12:17:58,060 - mmdet - INFO - Epoch [1][750/7330] lr: 1.000e-04, eta: 1 day, 2:34:04, time: 1.056, data_time: 0.065, memory: 24289, loss_rpn_cls: 0.0879, loss_rpn_bbox: 0.0892, loss_cls: 0.3701, acc: 91.0164, loss_bbox: 0.3135, loss_mask: 0.4742, loss: 1.3349 2024-05-30 12:18:51,476 - mmdet - INFO - Epoch [1][800/7330] lr: 1.000e-04, eta: 1 day, 2:30:35, time: 1.068, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0811, loss_rpn_bbox: 0.0940, loss_cls: 0.3971, acc: 90.2947, loss_bbox: 0.3442, loss_mask: 0.4672, loss: 1.3835 2024-05-30 12:20:04,225 - mmdet - INFO - Epoch [1][850/7330] lr: 1.000e-04, eta: 1 day, 3:00:24, time: 1.455, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0760, loss_rpn_bbox: 0.0858, loss_cls: 0.3550, acc: 90.7258, loss_bbox: 0.3340, loss_mask: 0.4592, loss: 1.3100 2024-05-30 12:20:57,393 - mmdet - INFO - Epoch [1][900/7330] lr: 1.000e-04, eta: 1 day, 2:55:14, time: 1.064, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0741, loss_rpn_bbox: 0.0891, loss_cls: 0.3622, acc: 90.7244, loss_bbox: 0.3351, loss_mask: 0.4429, loss: 1.3035 2024-05-30 12:21:50,134 - mmdet - INFO - Epoch [1][950/7330] lr: 1.000e-04, eta: 1 day, 2:49:51, time: 1.055, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0753, loss_rpn_bbox: 0.0883, loss_cls: 0.3531, acc: 90.7783, loss_bbox: 0.3378, loss_mask: 0.4441, loss: 1.2986 2024-05-30 12:22:44,202 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-30 12:22:44,202 - mmdet - INFO - Epoch [1][1000/7330] lr: 1.000e-04, eta: 1 day, 2:46:51, time: 1.081, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0784, loss_rpn_bbox: 0.0924, loss_cls: 0.3644, acc: 90.0461, loss_bbox: 0.3527, loss_mask: 0.4362, loss: 1.3241 2024-05-30 12:23:38,789 - mmdet - INFO - Epoch [1][1050/7330] lr: 1.000e-04, eta: 1 day, 2:44:45, time: 1.092, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0779, loss_rpn_bbox: 0.0896, loss_cls: 0.3543, acc: 90.3154, loss_bbox: 0.3436, loss_mask: 0.4394, loss: 1.3048 2024-05-30 12:24:32,076 - mmdet - INFO - Epoch [1][1100/7330] lr: 1.000e-04, eta: 1 day, 2:41:03, time: 1.066, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0706, loss_rpn_bbox: 0.0849, loss_cls: 0.3398, acc: 90.5906, loss_bbox: 0.3388, loss_mask: 0.4250, loss: 1.2592 2024-05-30 12:25:26,070 - mmdet - INFO - Epoch [1][1150/7330] lr: 1.000e-04, eta: 1 day, 2:38:29, time: 1.080, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0643, loss_rpn_bbox: 0.0844, loss_cls: 0.3373, acc: 90.5107, loss_bbox: 0.3433, loss_mask: 0.4134, loss: 1.2426 2024-05-30 12:26:19,556 - mmdet - INFO - Epoch [1][1200/7330] lr: 1.000e-04, eta: 1 day, 2:35:27, time: 1.070, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0662, loss_rpn_bbox: 0.0828, loss_cls: 0.3593, acc: 89.9768, loss_bbox: 0.3612, loss_mask: 0.4268, loss: 1.2964 2024-05-30 12:27:13,294 - mmdet - INFO - Epoch [1][1250/7330] lr: 1.000e-04, eta: 1 day, 2:32:53, time: 1.075, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0667, loss_rpn_bbox: 0.0847, loss_cls: 0.3487, acc: 90.2371, loss_bbox: 0.3541, loss_mask: 0.4176, loss: 1.2718 2024-05-30 12:28:12,059 - mmdet - INFO - Epoch [1][1300/7330] lr: 1.000e-04, eta: 1 day, 2:36:01, time: 1.175, data_time: 0.047, memory: 24290, loss_rpn_cls: 0.0671, loss_rpn_bbox: 0.0794, loss_cls: 0.3237, acc: 
90.8176, loss_bbox: 0.3318, loss_mask: 0.4113, loss: 1.2133 2024-05-30 12:29:09,438 - mmdet - INFO - Epoch [1][1350/7330] lr: 1.000e-04, eta: 1 day, 2:37:22, time: 1.148, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0623, loss_rpn_bbox: 0.0838, loss_cls: 0.3308, acc: 90.4290, loss_bbox: 0.3484, loss_mask: 0.4069, loss: 1.2322 2024-05-30 12:30:06,808 - mmdet - INFO - Epoch [1][1400/7330] lr: 1.000e-04, eta: 1 day, 2:38:32, time: 1.147, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0634, loss_rpn_bbox: 0.0827, loss_cls: 0.3497, acc: 90.0496, loss_bbox: 0.3557, loss_mask: 0.4087, loss: 1.2602 2024-05-30 12:31:00,218 - mmdet - INFO - Epoch [1][1450/7330] lr: 1.000e-04, eta: 1 day, 2:35:39, time: 1.068, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0583, loss_rpn_bbox: 0.0796, loss_cls: 0.3403, acc: 90.0544, loss_bbox: 0.3569, loss_mask: 0.3975, loss: 1.2327 2024-05-30 12:31:53,081 - mmdet - INFO - Epoch [1][1500/7330] lr: 1.000e-04, eta: 1 day, 2:32:21, time: 1.057, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0600, loss_rpn_bbox: 0.0777, loss_cls: 0.3156, acc: 90.7334, loss_bbox: 0.3320, loss_mask: 0.3958, loss: 1.1811 2024-05-30 12:32:46,770 - mmdet - INFO - Epoch [1][1550/7330] lr: 1.000e-04, eta: 1 day, 2:29:58, time: 1.074, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0649, loss_rpn_bbox: 0.0847, loss_cls: 0.3371, acc: 90.0278, loss_bbox: 0.3544, loss_mask: 0.3938, loss: 1.2349 2024-05-30 12:33:39,661 - mmdet - INFO - Epoch [1][1600/7330] lr: 1.000e-04, eta: 1 day, 2:26:59, time: 1.058, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0585, loss_rpn_bbox: 0.0787, loss_cls: 0.3223, acc: 90.4902, loss_bbox: 0.3420, loss_mask: 0.3951, loss: 1.1966 2024-05-30 12:34:34,449 - mmdet - INFO - Epoch [1][1650/7330] lr: 1.000e-04, eta: 1 day, 2:25:46, time: 1.096, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0615, loss_rpn_bbox: 0.0864, loss_cls: 0.3506, acc: 89.5081, loss_bbox: 0.3776, loss_mask: 0.4007, loss: 1.2768 2024-05-30 12:35:28,418 - mmdet - INFO - Epoch [1][1700/7330] lr: 1.000e-04, eta: 1 day, 2:23:52, time: 1.079, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0571, loss_rpn_bbox: 0.0828, loss_cls: 0.3387, acc: 90.0520, loss_bbox: 0.3594, loss_mask: 0.3873, loss: 1.2253 2024-05-30 12:36:26,099 - mmdet - INFO - Epoch [1][1750/7330] lr: 1.000e-04, eta: 1 day, 2:25:05, time: 1.154, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0547, loss_rpn_bbox: 0.0799, loss_cls: 0.3286, acc: 90.1638, loss_bbox: 0.3554, loss_mask: 0.3856, loss: 1.2043 2024-05-30 12:37:27,164 - mmdet - INFO - Epoch [1][1800/7330] lr: 1.000e-04, eta: 1 day, 2:28:52, time: 1.221, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0631, loss_rpn_bbox: 0.0854, loss_cls: 0.3274, acc: 90.1533, loss_bbox: 0.3530, loss_mask: 0.3806, loss: 1.2095 2024-05-30 12:38:23,769 - mmdet - INFO - Epoch [1][1850/7330] lr: 1.000e-04, eta: 1 day, 2:28:57, time: 1.132, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0569, loss_rpn_bbox: 0.0784, loss_cls: 0.3249, acc: 90.4175, loss_bbox: 0.3425, loss_mask: 0.3735, loss: 1.1762 2024-05-30 12:39:17,008 - mmdet - INFO - Epoch [1][1900/7330] lr: 1.000e-04, eta: 1 day, 2:26:25, time: 1.065, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0632, loss_rpn_bbox: 0.0810, loss_cls: 0.3226, acc: 90.2898, loss_bbox: 0.3486, loss_mask: 0.3806, loss: 1.1959 2024-05-30 12:40:10,988 - mmdet - INFO - Epoch [1][1950/7330] lr: 1.000e-04, eta: 1 day, 2:24:32, time: 1.080, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0535, loss_rpn_bbox: 0.0800, loss_cls: 0.3116, acc: 90.5527, 
loss_bbox: 0.3360, loss_mask: 0.3702, loss: 1.1513 2024-05-30 12:41:04,029 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-30 12:41:04,029 - mmdet - INFO - Epoch [1][2000/7330] lr: 1.000e-04, eta: 1 day, 2:22:01, time: 1.061, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0567, loss_rpn_bbox: 0.0777, loss_cls: 0.3213, acc: 90.2651, loss_bbox: 0.3493, loss_mask: 0.3674, loss: 1.1724 2024-05-30 12:41:57,019 - mmdet - INFO - Epoch [1][2050/7330] lr: 1.000e-04, eta: 1 day, 2:19:32, time: 1.060, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0529, loss_rpn_bbox: 0.0753, loss_cls: 0.3197, acc: 89.9805, loss_bbox: 0.3577, loss_mask: 0.3692, loss: 1.1748 2024-05-30 12:42:50,660 - mmdet - INFO - Epoch [1][2100/7330] lr: 1.000e-04, eta: 1 day, 2:17:35, time: 1.073, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0535, loss_rpn_bbox: 0.0793, loss_cls: 0.3165, acc: 90.2456, loss_bbox: 0.3497, loss_mask: 0.3702, loss: 1.1691 2024-05-30 12:43:44,863 - mmdet - INFO - Epoch [1][2150/7330] lr: 1.000e-04, eta: 1 day, 2:16:03, time: 1.084, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0525, loss_rpn_bbox: 0.0771, loss_cls: 0.3253, acc: 90.0120, loss_bbox: 0.3565, loss_mask: 0.3737, loss: 1.1852 2024-05-30 12:44:39,989 - mmdet - INFO - Epoch [1][2200/7330] lr: 1.000e-04, eta: 1 day, 2:15:09, time: 1.103, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0544, loss_rpn_bbox: 0.0816, loss_cls: 0.3295, acc: 89.8872, loss_bbox: 0.3600, loss_mask: 0.3743, loss: 1.1999 2024-05-30 12:45:42,699 - mmdet - INFO - Epoch [1][2250/7330] lr: 1.000e-04, eta: 1 day, 2:19:04, time: 1.254, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0518, loss_rpn_bbox: 0.0767, loss_cls: 0.3115, acc: 90.2729, loss_bbox: 0.3514, loss_mask: 0.3639, loss: 1.1555 2024-05-30 12:46:38,092 - mmdet - INFO - Epoch [1][2300/7330] lr: 1.000e-04, eta: 1 day, 2:18:13, time: 1.108, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0508, loss_rpn_bbox: 0.0741, loss_cls: 0.3079, acc: 90.6121, loss_bbox: 0.3341, loss_mask: 0.3576, loss: 1.1245 2024-05-30 12:47:33,355 - mmdet - INFO - Epoch [1][2350/7330] lr: 1.000e-04, eta: 1 day, 2:17:18, time: 1.105, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0531, loss_rpn_bbox: 0.0743, loss_cls: 0.3148, acc: 90.1724, loss_bbox: 0.3500, loss_mask: 0.3588, loss: 1.1510 2024-05-30 12:48:25,453 - mmdet - INFO - Epoch [1][2400/7330] lr: 1.000e-04, eta: 1 day, 2:14:29, time: 1.042, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0531, loss_rpn_bbox: 0.0771, loss_cls: 0.3088, acc: 90.3486, loss_bbox: 0.3458, loss_mask: 0.3555, loss: 1.1402 2024-05-30 12:49:18,725 - mmdet - INFO - Epoch [1][2450/7330] lr: 1.000e-04, eta: 1 day, 2:12:26, time: 1.065, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0514, loss_rpn_bbox: 0.0788, loss_cls: 0.3217, acc: 90.0549, loss_bbox: 0.3489, loss_mask: 0.3594, loss: 1.1601 2024-05-30 12:50:11,750 - mmdet - INFO - Epoch [1][2500/7330] lr: 1.000e-04, eta: 1 day, 2:10:18, time: 1.061, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0516, loss_rpn_bbox: 0.0748, loss_cls: 0.3166, acc: 90.0852, loss_bbox: 0.3482, loss_mask: 0.3530, loss: 1.1441 2024-05-30 12:51:05,112 - mmdet - INFO - Epoch [1][2550/7330] lr: 1.000e-04, eta: 1 day, 2:08:23, time: 1.067, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0473, loss_rpn_bbox: 0.0746, loss_cls: 0.2998, acc: 90.4597, loss_bbox: 0.3420, loss_mask: 0.3521, loss: 1.1157 2024-05-30 12:51:58,646 - mmdet - INFO - Epoch [1][2600/7330] lr: 1.000e-04, eta: 1 day, 2:06:38, time: 1.071, data_time: 0.070, 
memory: 24290, loss_rpn_cls: 0.0530, loss_rpn_bbox: 0.0735, loss_cls: 0.3012, acc: 90.5254, loss_bbox: 0.3412, loss_mask: 0.3500, loss: 1.1189 2024-05-30 12:52:53,866 - mmdet - INFO - Epoch [1][2650/7330] lr: 1.000e-04, eta: 1 day, 2:05:48, time: 1.104, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0482, loss_rpn_bbox: 0.0696, loss_cls: 0.3013, acc: 90.5669, loss_bbox: 0.3392, loss_mask: 0.3525, loss: 1.1108 2024-05-30 12:53:51,021 - mmdet - INFO - Epoch [1][2700/7330] lr: 1.000e-04, eta: 1 day, 2:05:59, time: 1.143, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0454, loss_rpn_bbox: 0.0722, loss_cls: 0.2872, acc: 90.6494, loss_bbox: 0.3342, loss_mask: 0.3432, loss: 1.0822 2024-05-30 12:54:51,768 - mmdet - INFO - Epoch [1][2750/7330] lr: 1.000e-04, eta: 1 day, 2:07:58, time: 1.215, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0468, loss_rpn_bbox: 0.0761, loss_cls: 0.3145, acc: 89.7532, loss_bbox: 0.3681, loss_mask: 0.3530, loss: 1.1585 2024-05-30 12:55:47,521 - mmdet - INFO - Epoch [1][2800/7330] lr: 1.000e-04, eta: 1 day, 2:07:20, time: 1.115, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0491, loss_rpn_bbox: 0.0744, loss_cls: 0.3077, acc: 90.2822, loss_bbox: 0.3480, loss_mask: 0.3484, loss: 1.1276 2024-05-30 12:56:40,980 - mmdet - INFO - Epoch [1][2850/7330] lr: 1.000e-04, eta: 1 day, 2:05:32, time: 1.069, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0508, loss_rpn_bbox: 0.0722, loss_cls: 0.3082, acc: 90.2908, loss_bbox: 0.3426, loss_mask: 0.3470, loss: 1.1207 2024-05-30 12:57:34,212 - mmdet - INFO - Epoch [1][2900/7330] lr: 1.000e-04, eta: 1 day, 2:03:40, time: 1.065, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0503, loss_rpn_bbox: 0.0744, loss_cls: 0.3099, acc: 90.2661, loss_bbox: 0.3452, loss_mask: 0.3450, loss: 1.1249 2024-05-30 12:58:27,020 - mmdet - INFO - Epoch [1][2950/7330] lr: 1.000e-04, eta: 1 day, 2:01:37, time: 1.056, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0479, loss_rpn_bbox: 0.0720, loss_cls: 0.3042, acc: 90.2935, loss_bbox: 0.3463, loss_mask: 0.3439, loss: 1.1144 2024-05-30 12:59:20,861 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-30 12:59:20,861 - mmdet - INFO - Epoch [1][3000/7330] lr: 1.000e-04, eta: 1 day, 2:00:06, time: 1.077, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0528, loss_rpn_bbox: 0.0723, loss_cls: 0.2930, acc: 90.4834, loss_bbox: 0.3405, loss_mask: 0.3475, loss: 1.1061 2024-05-30 13:00:13,626 - mmdet - INFO - Epoch [1][3050/7330] lr: 1.000e-04, eta: 1 day, 1:58:06, time: 1.055, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0449, loss_rpn_bbox: 0.0716, loss_cls: 0.2890, acc: 90.8357, loss_bbox: 0.3337, loss_mask: 0.3356, loss: 1.0748 2024-05-30 13:01:06,940 - mmdet - INFO - Epoch [1][3100/7330] lr: 1.000e-04, eta: 1 day, 1:56:24, time: 1.066, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0468, loss_rpn_bbox: 0.0716, loss_cls: 0.2988, acc: 90.4514, loss_bbox: 0.3376, loss_mask: 0.3352, loss: 1.0900 2024-05-30 13:02:00,689 - mmdet - INFO - Epoch [1][3150/7330] lr: 1.000e-04, eta: 1 day, 1:54:55, time: 1.075, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0463, loss_rpn_bbox: 0.0749, loss_cls: 0.3063, acc: 90.2405, loss_bbox: 0.3450, loss_mask: 0.3391, loss: 1.1116 2024-05-30 13:03:04,997 - mmdet - INFO - Epoch [1][3200/7330] lr: 1.000e-04, eta: 1 day, 1:58:06, time: 1.286, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0469, loss_rpn_bbox: 0.0756, loss_cls: 0.3124, acc: 90.1184, loss_bbox: 0.3456, loss_mask: 0.3407, loss: 1.1212 2024-05-30 13:03:58,396 - mmdet - 
INFO - Epoch [1][3250/7330] lr: 1.000e-04, eta: 1 day, 1:56:25, time: 1.068, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0707, loss_cls: 0.3023, acc: 90.4910, loss_bbox: 0.3382, loss_mask: 0.3428, loss: 1.0992 2024-05-30 13:04:53,814 - mmdet - INFO - Epoch [1][3300/7330] lr: 1.000e-04, eta: 1 day, 1:55:38, time: 1.108, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0455, loss_rpn_bbox: 0.0757, loss_cls: 0.3131, acc: 89.9448, loss_bbox: 0.3587, loss_mask: 0.3386, loss: 1.1316 2024-05-30 13:05:49,069 - mmdet - INFO - Epoch [1][3350/7330] lr: 1.000e-04, eta: 1 day, 1:54:46, time: 1.105, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0474, loss_rpn_bbox: 0.0708, loss_cls: 0.2965, acc: 90.5427, loss_bbox: 0.3372, loss_mask: 0.3325, loss: 1.0845 2024-05-30 13:06:42,330 - mmdet - INFO - Epoch [1][3400/7330] lr: 1.000e-04, eta: 1 day, 1:53:04, time: 1.065, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0457, loss_rpn_bbox: 0.0722, loss_cls: 0.2905, acc: 90.6892, loss_bbox: 0.3299, loss_mask: 0.3291, loss: 1.0673 2024-05-30 13:07:36,141 - mmdet - INFO - Epoch [1][3450/7330] lr: 1.000e-04, eta: 1 day, 1:51:38, time: 1.076, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0438, loss_rpn_bbox: 0.0723, loss_cls: 0.2938, acc: 90.5137, loss_bbox: 0.3371, loss_mask: 0.3417, loss: 1.0888 2024-05-30 13:08:29,278 - mmdet - INFO - Epoch [1][3500/7330] lr: 1.000e-04, eta: 1 day, 1:49:56, time: 1.063, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0454, loss_rpn_bbox: 0.0701, loss_cls: 0.2884, acc: 90.7571, loss_bbox: 0.3318, loss_mask: 0.3270, loss: 1.0628 2024-05-30 13:09:22,734 - mmdet - INFO - Epoch [1][3550/7330] lr: 1.000e-04, eta: 1 day, 1:48:23, time: 1.069, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0490, loss_rpn_bbox: 0.0725, loss_cls: 0.2814, acc: 90.6323, loss_bbox: 0.3361, loss_mask: 0.3296, loss: 1.0687 2024-05-30 13:10:15,960 - mmdet - INFO - Epoch [1][3600/7330] lr: 1.000e-04, eta: 1 day, 1:46:45, time: 1.065, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0485, loss_rpn_bbox: 0.0736, loss_cls: 0.2961, acc: 90.4561, loss_bbox: 0.3338, loss_mask: 0.3320, loss: 1.0840 2024-05-30 13:11:16,267 - mmdet - INFO - Epoch [1][3650/7330] lr: 1.000e-04, eta: 1 day, 1:47:53, time: 1.206, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0423, loss_rpn_bbox: 0.0639, loss_cls: 0.2848, acc: 90.9270, loss_bbox: 0.3270, loss_mask: 0.3259, loss: 1.0439 2024-05-30 13:12:15,187 - mmdet - INFO - Epoch [1][3700/7330] lr: 1.000e-04, eta: 1 day, 1:47:44, time: 1.142, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0442, loss_rpn_bbox: 0.0717, loss_cls: 0.2922, acc: 90.3955, loss_bbox: 0.3398, loss_mask: 0.3344, loss: 1.0824 2024-05-30 13:13:08,548 - mmdet - INFO - Epoch [1][3750/7330] lr: 1.000e-04, eta: 1 day, 1:46:50, time: 1.103, data_time: 0.095, memory: 24290, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0686, loss_cls: 0.2728, acc: 91.0679, loss_bbox: 0.3238, loss_mask: 0.3308, loss: 1.0391 2024-05-30 13:14:04,004 - mmdet - INFO - Epoch [1][3800/7330] lr: 1.000e-04, eta: 1 day, 1:46:03, time: 1.109, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0444, loss_rpn_bbox: 0.0721, loss_cls: 0.2896, acc: 90.6021, loss_bbox: 0.3335, loss_mask: 0.3257, loss: 1.0654 2024-05-30 13:14:56,708 - mmdet - INFO - Epoch [1][3850/7330] lr: 1.000e-04, eta: 1 day, 1:44:15, time: 1.054, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0440, loss_rpn_bbox: 0.0707, loss_cls: 0.2943, acc: 90.5122, loss_bbox: 0.3338, loss_mask: 0.3260, loss: 1.0686 2024-05-30 13:15:49,782 - mmdet - INFO - Epoch 
[1][3900/7330] lr: 1.000e-04, eta: 1 day, 1:42:37, time: 1.061, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0433, loss_rpn_bbox: 0.0759, loss_cls: 0.3096, acc: 89.9238, loss_bbox: 0.3492, loss_mask: 0.3386, loss: 1.1166 2024-05-30 13:16:44,504 - mmdet - INFO - Epoch [1][3950/7330] lr: 1.000e-04, eta: 1 day, 1:41:35, time: 1.094, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0450, loss_rpn_bbox: 0.0707, loss_cls: 0.2865, acc: 90.6812, loss_bbox: 0.3293, loss_mask: 0.3271, loss: 1.0587 2024-05-30 13:17:38,241 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-30 13:17:38,241 - mmdet - INFO - Epoch [1][4000/7330] lr: 1.000e-04, eta: 1 day, 1:40:12, time: 1.075, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0472, loss_rpn_bbox: 0.0749, loss_cls: 0.3127, acc: 89.8577, loss_bbox: 0.3640, loss_mask: 0.3368, loss: 1.1356 2024-05-30 13:18:31,193 - mmdet - INFO - Epoch [1][4050/7330] lr: 1.000e-04, eta: 1 day, 1:38:34, time: 1.059, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0394, loss_rpn_bbox: 0.0681, loss_cls: 0.2788, acc: 90.9370, loss_bbox: 0.3242, loss_mask: 0.3224, loss: 1.0329 2024-05-30 13:19:28,172 - mmdet - INFO - Epoch [1][4100/7330] lr: 1.000e-04, eta: 1 day, 1:38:19, time: 1.140, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0451, loss_rpn_bbox: 0.0711, loss_cls: 0.2854, acc: 90.7854, loss_bbox: 0.3259, loss_mask: 0.3256, loss: 1.0531 2024-05-30 13:20:26,420 - mmdet - INFO - Epoch [1][4150/7330] lr: 1.000e-04, eta: 1 day, 1:38:29, time: 1.165, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0424, loss_rpn_bbox: 0.0717, loss_cls: 0.2870, acc: 90.5200, loss_bbox: 0.3341, loss_mask: 0.3274, loss: 1.0625 2024-05-30 13:21:19,127 - mmdet - INFO - Epoch [1][4200/7330] lr: 1.000e-04, eta: 1 day, 1:36:47, time: 1.054, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0437, loss_rpn_bbox: 0.0694, loss_cls: 0.2858, acc: 90.8066, loss_bbox: 0.3290, loss_mask: 0.3252, loss: 1.0531 2024-05-30 13:22:18,125 - mmdet - INFO - Epoch [1][4250/7330] lr: 1.000e-04, eta: 1 day, 1:37:10, time: 1.180, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0490, loss_rpn_bbox: 0.0727, loss_cls: 0.2969, acc: 90.4363, loss_bbox: 0.3416, loss_mask: 0.3283, loss: 1.0884 2024-05-30 13:23:13,958 - mmdet - INFO - Epoch [1][4300/7330] lr: 1.000e-04, eta: 1 day, 1:36:29, time: 1.117, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0448, loss_rpn_bbox: 0.0762, loss_cls: 0.2852, acc: 90.6294, loss_bbox: 0.3309, loss_mask: 0.3206, loss: 1.0578 2024-05-30 13:24:07,249 - mmdet - INFO - Epoch [1][4350/7330] lr: 1.000e-04, eta: 1 day, 1:34:59, time: 1.066, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0426, loss_rpn_bbox: 0.0701, loss_cls: 0.2867, acc: 90.6240, loss_bbox: 0.3305, loss_mask: 0.3264, loss: 1.0562 2024-05-30 13:25:01,283 - mmdet - INFO - Epoch [1][4400/7330] lr: 1.000e-04, eta: 1 day, 1:33:44, time: 1.081, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0430, loss_rpn_bbox: 0.0685, loss_cls: 0.2861, acc: 90.7651, loss_bbox: 0.3312, loss_mask: 0.3257, loss: 1.0545 2024-05-30 13:25:54,223 - mmdet - INFO - Epoch [1][4450/7330] lr: 1.000e-04, eta: 1 day, 1:32:09, time: 1.059, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0456, loss_rpn_bbox: 0.0689, loss_cls: 0.2839, acc: 90.7839, loss_bbox: 0.3289, loss_mask: 0.3183, loss: 1.0457 2024-05-30 13:26:47,401 - mmdet - INFO - Epoch [1][4500/7330] lr: 1.000e-04, eta: 1 day, 1:30:40, time: 1.063, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0426, loss_rpn_bbox: 0.0691, loss_cls: 0.2874, acc: 90.4622, 
loss_bbox: 0.3353, loss_mask: 0.3294, loss: 1.0638 2024-05-30 13:27:40,749 - mmdet - INFO - Epoch [1][4550/7330] lr: 1.000e-04, eta: 1 day, 1:29:14, time: 1.067, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0395, loss_rpn_bbox: 0.0677, loss_cls: 0.2868, acc: 90.5833, loss_bbox: 0.3342, loss_mask: 0.3248, loss: 1.0531 2024-05-30 13:28:43,661 - mmdet - INFO - Epoch [1][4600/7330] lr: 1.000e-04, eta: 1 day, 1:30:43, time: 1.258, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0393, loss_rpn_bbox: 0.0647, loss_cls: 0.2744, acc: 90.9351, loss_bbox: 0.3203, loss_mask: 0.3160, loss: 1.0148 2024-05-30 13:29:37,078 - mmdet - INFO - Epoch [1][4650/7330] lr: 1.000e-04, eta: 1 day, 1:29:17, time: 1.068, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0664, loss_cls: 0.2854, acc: 90.5835, loss_bbox: 0.3342, loss_mask: 0.3210, loss: 1.0472 2024-05-30 13:30:32,032 - mmdet - INFO - Epoch [1][4700/7330] lr: 1.000e-04, eta: 1 day, 1:28:20, time: 1.099, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0417, loss_rpn_bbox: 0.0662, loss_cls: 0.2800, acc: 90.9373, loss_bbox: 0.3254, loss_mask: 0.3118, loss: 1.0251 2024-05-30 13:31:28,106 - mmdet - INFO - Epoch [1][4750/7330] lr: 1.000e-04, eta: 1 day, 1:27:43, time: 1.122, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0418, loss_rpn_bbox: 0.0708, loss_cls: 0.2842, acc: 90.4878, loss_bbox: 0.3388, loss_mask: 0.3163, loss: 1.0518 2024-05-30 13:32:21,577 - mmdet - INFO - Epoch [1][4800/7330] lr: 1.000e-04, eta: 1 day, 1:26:20, time: 1.069, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0446, loss_rpn_bbox: 0.0692, loss_cls: 0.2856, acc: 90.6499, loss_bbox: 0.3277, loss_mask: 0.3257, loss: 1.0527 2024-05-30 13:33:16,936 - mmdet - INFO - Epoch [1][4850/7330] lr: 1.000e-04, eta: 1 day, 1:25:30, time: 1.107, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0667, loss_cls: 0.2793, acc: 90.8140, loss_bbox: 0.3226, loss_mask: 0.3098, loss: 1.0195 2024-05-30 13:34:11,383 - mmdet - INFO - Epoch [1][4900/7330] lr: 1.000e-04, eta: 1 day, 1:24:24, time: 1.089, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0417, loss_rpn_bbox: 0.0730, loss_cls: 0.2847, acc: 90.4641, loss_bbox: 0.3394, loss_mask: 0.3207, loss: 1.0596 2024-05-30 13:35:04,547 - mmdet - INFO - Epoch [1][4950/7330] lr: 1.000e-04, eta: 1 day, 1:22:58, time: 1.063, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0419, loss_rpn_bbox: 0.0656, loss_cls: 0.2692, acc: 91.1074, loss_bbox: 0.3149, loss_mask: 0.3114, loss: 1.0030 2024-05-30 13:35:58,635 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-30 13:35:58,635 - mmdet - INFO - Epoch [1][5000/7330] lr: 1.000e-04, eta: 1 day, 1:21:47, time: 1.082, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0411, loss_rpn_bbox: 0.0687, loss_cls: 0.2819, acc: 90.7935, loss_bbox: 0.3275, loss_mask: 0.3226, loss: 1.0419 2024-05-30 13:37:00,193 - mmdet - INFO - Epoch [1][5050/7330] lr: 1.000e-04, eta: 1 day, 1:22:39, time: 1.231, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0411, loss_rpn_bbox: 0.0671, loss_cls: 0.2784, acc: 90.9236, loss_bbox: 0.3173, loss_mask: 0.3128, loss: 1.0166 2024-05-30 13:37:54,328 - mmdet - INFO - Epoch [1][5100/7330] lr: 1.000e-04, eta: 1 day, 1:21:28, time: 1.083, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0379, loss_rpn_bbox: 0.0630, loss_cls: 0.2581, acc: 91.6609, loss_bbox: 0.2973, loss_mask: 0.3044, loss: 0.9607 2024-05-30 13:38:47,087 - mmdet - INFO - Epoch [1][5150/7330] lr: 1.000e-04, eta: 1 day, 1:19:56, time: 1.055, data_time: 0.056, 
memory: 24290, loss_rpn_cls: 0.0402, loss_rpn_bbox: 0.0673, loss_cls: 0.2709, acc: 91.1802, loss_bbox: 0.3176, loss_mask: 0.3097, loss: 1.0057 2024-05-30 13:39:42,344 - mmdet - INFO - Epoch [1][5200/7330] lr: 1.000e-04, eta: 1 day, 1:19:04, time: 1.105, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0440, loss_rpn_bbox: 0.0679, loss_cls: 0.2674, acc: 90.9998, loss_bbox: 0.3224, loss_mask: 0.3120, loss: 1.0138 2024-05-30 13:40:35,833 - mmdet - INFO - Epoch [1][5250/7330] lr: 1.000e-04, eta: 1 day, 1:17:44, time: 1.070, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0389, loss_rpn_bbox: 0.0673, loss_cls: 0.2669, acc: 91.0144, loss_bbox: 0.3157, loss_mask: 0.3109, loss: 0.9997 2024-05-30 13:41:31,427 - mmdet - INFO - Epoch [1][5300/7330] lr: 1.000e-04, eta: 1 day, 1:16:57, time: 1.112, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0401, loss_rpn_bbox: 0.0663, loss_cls: 0.2725, acc: 91.0137, loss_bbox: 0.3216, loss_mask: 0.3172, loss: 1.0177 2024-05-30 13:42:27,551 - mmdet - INFO - Epoch [1][5350/7330] lr: 1.000e-04, eta: 1 day, 1:16:19, time: 1.123, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0420, loss_rpn_bbox: 0.0689, loss_cls: 0.2834, acc: 90.5879, loss_bbox: 0.3207, loss_mask: 0.3105, loss: 1.0255 2024-05-30 13:43:21,638 - mmdet - INFO - Epoch [1][5400/7330] lr: 1.000e-04, eta: 1 day, 1:15:09, time: 1.082, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0398, loss_rpn_bbox: 0.0685, loss_cls: 0.2806, acc: 90.7429, loss_bbox: 0.3282, loss_mask: 0.3172, loss: 1.0342 2024-05-30 13:44:14,248 - mmdet - INFO - Epoch [1][5450/7330] lr: 1.000e-04, eta: 1 day, 1:13:36, time: 1.052, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0645, loss_cls: 0.2684, acc: 91.1199, loss_bbox: 0.3143, loss_mask: 0.3103, loss: 0.9951 2024-05-30 13:45:09,635 - mmdet - INFO - Epoch [1][5500/7330] lr: 1.000e-04, eta: 1 day, 1:12:47, time: 1.108, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0407, loss_rpn_bbox: 0.0655, loss_cls: 0.2618, acc: 91.3108, loss_bbox: 0.3140, loss_mask: 0.3052, loss: 0.9872 2024-05-30 13:46:10,706 - mmdet - INFO - Epoch [1][5550/7330] lr: 1.000e-04, eta: 1 day, 1:13:21, time: 1.221, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0414, loss_rpn_bbox: 0.0660, loss_cls: 0.2757, acc: 90.9197, loss_bbox: 0.3180, loss_mask: 0.3075, loss: 1.0086 2024-05-30 13:47:03,682 - mmdet - INFO - Epoch [1][5600/7330] lr: 1.000e-04, eta: 1 day, 1:11:55, time: 1.059, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0389, loss_rpn_bbox: 0.0643, loss_cls: 0.2705, acc: 91.1760, loss_bbox: 0.3071, loss_mask: 0.2995, loss: 0.9802 2024-05-30 13:47:58,959 - mmdet - INFO - Epoch [1][5650/7330] lr: 1.000e-04, eta: 1 day, 1:11:03, time: 1.105, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0372, loss_rpn_bbox: 0.0667, loss_cls: 0.2748, acc: 90.7451, loss_bbox: 0.3270, loss_mask: 0.3150, loss: 1.0208 2024-05-30 13:48:53,555 - mmdet - INFO - Epoch [1][5700/7330] lr: 1.000e-04, eta: 1 day, 1:10:01, time: 1.092, data_time: 0.050, memory: 24290, loss_rpn_cls: 0.0396, loss_rpn_bbox: 0.0691, loss_cls: 0.2769, acc: 90.7800, loss_bbox: 0.3238, loss_mask: 0.3135, loss: 1.0229 2024-05-30 13:49:46,645 - mmdet - INFO - Epoch [1][5750/7330] lr: 1.000e-04, eta: 1 day, 1:08:38, time: 1.062, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0629, loss_cls: 0.2733, acc: 90.9138, loss_bbox: 0.3168, loss_mask: 0.3040, loss: 0.9957 2024-05-30 13:50:42,341 - mmdet - INFO - Epoch [1][5800/7330] lr: 1.000e-04, eta: 1 day, 1:07:52, time: 1.114, data_time: 0.078, memory: 24290, 
loss_rpn_cls: 0.0358, loss_rpn_bbox: 0.0617, loss_cls: 0.2772, acc: 90.9021, loss_bbox: 0.3174, loss_mask: 0.3089, loss: 1.0010 2024-05-30 13:51:35,705 - mmdet - INFO - Epoch [1][5850/7330] lr: 1.000e-04, eta: 1 day, 1:06:33, time: 1.067, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0391, loss_rpn_bbox: 0.0665, loss_cls: 0.2786, acc: 90.7915, loss_bbox: 0.3263, loss_mask: 0.3067, loss: 1.0173 2024-05-30 13:52:31,075 - mmdet - INFO - Epoch [1][5900/7330] lr: 1.000e-04, eta: 1 day, 1:05:42, time: 1.107, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0397, loss_rpn_bbox: 0.0660, loss_cls: 0.2596, acc: 91.4929, loss_bbox: 0.3046, loss_mask: 0.3049, loss: 0.9748 2024-05-30 13:53:26,845 - mmdet - INFO - Epoch [1][5950/7330] lr: 1.000e-04, eta: 1 day, 1:04:57, time: 1.115, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0397, loss_rpn_bbox: 0.0680, loss_cls: 0.2757, acc: 90.7571, loss_bbox: 0.3276, loss_mask: 0.3121, loss: 1.0231 2024-05-30 13:54:28,236 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-30 13:54:28,236 - mmdet - INFO - Epoch [1][6000/7330] lr: 1.000e-04, eta: 1 day, 1:05:28, time: 1.227, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0370, loss_rpn_bbox: 0.0635, loss_cls: 0.2775, acc: 90.8079, loss_bbox: 0.3192, loss_mask: 0.3078, loss: 1.0051 2024-05-30 13:55:21,135 - mmdet - INFO - Epoch [1][6050/7330] lr: 1.000e-04, eta: 1 day, 1:04:04, time: 1.058, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0599, loss_cls: 0.2587, acc: 91.4683, loss_bbox: 0.3013, loss_mask: 0.3028, loss: 0.9606 2024-05-30 13:56:16,603 - mmdet - INFO - Epoch [1][6100/7330] lr: 1.000e-04, eta: 1 day, 1:03:14, time: 1.109, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0687, loss_cls: 0.2686, acc: 91.0613, loss_bbox: 0.3160, loss_mask: 0.3091, loss: 0.9988 2024-05-30 13:57:10,932 - mmdet - INFO - Epoch [1][6150/7330] lr: 1.000e-04, eta: 1 day, 1:02:08, time: 1.087, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0410, loss_rpn_bbox: 0.0700, loss_cls: 0.2792, acc: 90.6704, loss_bbox: 0.3272, loss_mask: 0.3153, loss: 1.0326 2024-05-30 13:58:03,888 - mmdet - INFO - Epoch [1][6200/7330] lr: 1.000e-04, eta: 1 day, 1:00:45, time: 1.059, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0375, loss_rpn_bbox: 0.0662, loss_cls: 0.2606, acc: 91.3857, loss_bbox: 0.3065, loss_mask: 0.3018, loss: 0.9726 2024-05-30 13:58:57,669 - mmdet - INFO - Epoch [1][6250/7330] lr: 1.000e-04, eta: 1 day, 0:59:33, time: 1.076, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0386, loss_rpn_bbox: 0.0659, loss_cls: 0.2746, acc: 90.7407, loss_bbox: 0.3296, loss_mask: 0.3096, loss: 1.0183 2024-05-30 13:59:50,515 - mmdet - INFO - Epoch [1][6300/7330] lr: 1.000e-04, eta: 1 day, 0:58:10, time: 1.057, data_time: 0.048, memory: 24290, loss_rpn_cls: 0.0371, loss_rpn_bbox: 0.0611, loss_cls: 0.2615, acc: 91.2168, loss_bbox: 0.3110, loss_mask: 0.2995, loss: 0.9701 2024-05-30 14:00:45,732 - mmdet - INFO - Epoch [1][6350/7330] lr: 1.000e-04, eta: 1 day, 0:57:17, time: 1.104, data_time: 0.044, memory: 24290, loss_rpn_cls: 0.0379, loss_rpn_bbox: 0.0645, loss_cls: 0.2634, acc: 91.2341, loss_bbox: 0.3108, loss_mask: 0.3010, loss: 0.9776 2024-05-30 14:01:42,568 - mmdet - INFO - Epoch [1][6400/7330] lr: 1.000e-04, eta: 1 day, 0:56:45, time: 1.137, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0587, loss_cls: 0.2474, acc: 91.7698, loss_bbox: 0.2889, loss_mask: 0.2896, loss: 0.9186 2024-05-30 14:02:39,687 - mmdet - INFO - Epoch 
[1][6450/7330] lr: 1.000e-04, eta: 1 day, 0:56:16, time: 1.142, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0602, loss_cls: 0.2625, acc: 91.1726, loss_bbox: 0.3131, loss_mask: 0.2989, loss: 0.9715 2024-05-30 14:03:37,366 - mmdet - INFO - Epoch [1][6500/7330] lr: 1.000e-04, eta: 1 day, 0:55:53, time: 1.154, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0377, loss_rpn_bbox: 0.0652, loss_cls: 0.2661, acc: 91.0173, loss_bbox: 0.3126, loss_mask: 0.2962, loss: 0.9779 2024-05-30 14:04:31,935 - mmdet - INFO - Epoch [1][6550/7330] lr: 1.000e-04, eta: 1 day, 0:54:52, time: 1.091, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0378, loss_rpn_bbox: 0.0618, loss_cls: 0.2695, acc: 90.8870, loss_bbox: 0.3134, loss_mask: 0.3000, loss: 0.9825 2024-05-30 14:05:24,876 - mmdet - INFO - Epoch [1][6600/7330] lr: 1.000e-04, eta: 1 day, 0:53:30, time: 1.059, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0344, loss_rpn_bbox: 0.0613, loss_cls: 0.2681, acc: 90.9800, loss_bbox: 0.3167, loss_mask: 0.3031, loss: 0.9837 2024-05-30 14:06:18,596 - mmdet - INFO - Epoch [1][6650/7330] lr: 1.000e-04, eta: 1 day, 0:52:19, time: 1.074, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0653, loss_cls: 0.2632, acc: 91.0630, loss_bbox: 0.3132, loss_mask: 0.2969, loss: 0.9752 2024-05-30 14:07:11,399 - mmdet - INFO - Epoch [1][6700/7330] lr: 1.000e-04, eta: 1 day, 0:50:56, time: 1.056, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0397, loss_rpn_bbox: 0.0641, loss_cls: 0.2648, acc: 91.2126, loss_bbox: 0.3078, loss_mask: 0.2994, loss: 0.9758 2024-05-30 14:08:05,596 - mmdet - INFO - Epoch [1][6750/7330] lr: 1.000e-04, eta: 1 day, 0:49:51, time: 1.084, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0376, loss_rpn_bbox: 0.0635, loss_cls: 0.2627, acc: 91.1838, loss_bbox: 0.3146, loss_mask: 0.3033, loss: 0.9817 2024-05-30 14:08:59,113 - mmdet - INFO - Epoch [1][6800/7330] lr: 1.000e-04, eta: 1 day, 0:48:38, time: 1.070, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0395, loss_rpn_bbox: 0.0632, loss_cls: 0.2653, acc: 91.2212, loss_bbox: 0.3078, loss_mask: 0.3016, loss: 0.9774 2024-05-30 14:09:56,371 - mmdet - INFO - Epoch [1][6850/7330] lr: 1.000e-04, eta: 1 day, 0:48:09, time: 1.145, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0383, loss_rpn_bbox: 0.0689, loss_cls: 0.2629, acc: 91.2310, loss_bbox: 0.3148, loss_mask: 0.3049, loss: 0.9897 2024-05-30 14:10:51,397 - mmdet - INFO - Epoch [1][6900/7330] lr: 1.000e-04, eta: 1 day, 0:47:14, time: 1.101, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0622, loss_cls: 0.2634, acc: 91.2812, loss_bbox: 0.3059, loss_mask: 0.2993, loss: 0.9663 2024-05-30 14:11:51,667 - mmdet - INFO - Epoch [1][6950/7330] lr: 1.000e-04, eta: 1 day, 0:47:20, time: 1.205, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0359, loss_rpn_bbox: 0.0634, loss_cls: 0.2601, acc: 91.3005, loss_bbox: 0.3096, loss_mask: 0.2983, loss: 0.9672 2024-05-30 14:12:46,480 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-30 14:12:46,480 - mmdet - INFO - Epoch [1][7000/7330] lr: 1.000e-04, eta: 1 day, 0:46:22, time: 1.096, data_time: 0.043, memory: 24290, loss_rpn_cls: 0.0360, loss_rpn_bbox: 0.0606, loss_cls: 0.2573, acc: 91.3416, loss_bbox: 0.3038, loss_mask: 0.3004, loss: 0.9581 2024-05-30 14:13:39,999 - mmdet - INFO - Epoch [1][7050/7330] lr: 1.000e-04, eta: 1 day, 0:45:09, time: 1.070, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0392, loss_rpn_bbox: 0.0656, loss_cls: 0.2630, acc: 91.1790, 
loss_bbox: 0.3090, loss_mask: 0.3023, loss: 0.9791
2024-05-30 14:14:33,138 - mmdet - INFO - Epoch [1][7100/7330] lr: 1.000e-04, eta: 1 day, 0:43:52, time: 1.063, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0374, loss_rpn_bbox: 0.0654, loss_cls: 0.2666, acc: 91.0969, loss_bbox: 0.3119, loss_mask: 0.2952, loss: 0.9765
2024-05-30 14:15:26,463 - mmdet - INFO - Epoch [1][7150/7330] lr: 1.000e-04, eta: 1 day, 0:42:37, time: 1.067, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0362, loss_rpn_bbox: 0.0655, loss_cls: 0.2655, acc: 91.0974, loss_bbox: 0.3130, loss_mask: 0.3054, loss: 0.9856
2024-05-30 14:16:20,400 - mmdet - INFO - Epoch [1][7200/7330] lr: 1.000e-04, eta: 1 day, 0:41:30, time: 1.079, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0393, loss_rpn_bbox: 0.0655, loss_cls: 0.2667, acc: 91.1313, loss_bbox: 0.3067, loss_mask: 0.3014, loss: 0.9796
2024-05-30 14:17:14,638 - mmdet - INFO - Epoch [1][7250/7330] lr: 1.000e-04, eta: 1 day, 0:40:26, time: 1.085, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0374, loss_rpn_bbox: 0.0656, loss_cls: 0.2626, acc: 91.1140, loss_bbox: 0.3109, loss_mask: 0.2994, loss: 0.9760
2024-05-30 14:18:11,957 - mmdet - INFO - Epoch [1][7300/7330] lr: 1.000e-04, eta: 1 day, 0:39:56, time: 1.146, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0368, loss_rpn_bbox: 0.0689, loss_cls: 0.2713, acc: 91.0396, loss_bbox: 0.3140, loss_mask: 0.3016, loss: 0.9925
2024-05-30 14:18:45,681 - mmdet - INFO - Saving checkpoint at 1 epochs
2024-05-30 14:21:14,073 - mmdet - INFO - Evaluating bbox...
2024-05-30 14:21:42,765 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.309
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.568
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.307
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.171
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.343
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.439
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.436
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.436
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.436
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.251
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.482
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.588
2024-05-30 14:21:42,765 - mmdet - INFO - Evaluating segm...
2024-05-30 14:22:13,622 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.305
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.533
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.310
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.120
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.333
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.496
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.419
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ] = 0.419
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.419
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.211
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.469
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.596
2024-05-30 14:22:14,131 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-30 14:22:14,132 - mmdet - INFO - Epoch(val) [1][625] bbox_mAP: 0.3090, bbox_mAP_50: 0.5680, bbox_mAP_75: 0.3070, bbox_mAP_s: 0.1710, bbox_mAP_m: 0.3430, bbox_mAP_l: 0.4390, bbox_mAP_copypaste: 0.309 0.568 0.307 0.171 0.343 0.439, segm_mAP: 0.3050, segm_mAP_50: 0.5330, segm_mAP_75: 0.3100, segm_mAP_s: 0.1200, segm_mAP_m: 0.3330, segm_mAP_l: 0.4960, segm_mAP_copypaste: 0.305 0.533 0.310 0.120 0.333 0.496
2024-05-30 14:23:14,131 - mmdet - INFO - Epoch [2][50/7330] lr: 1.000e-04, eta: 1 day, 0:33:21, time: 1.199, data_time: 0.133, memory: 24290, loss_rpn_cls: 0.0365, loss_rpn_bbox: 0.0623, loss_cls: 0.2515, acc: 91.3225, loss_bbox: 0.3113, loss_mask: 0.2930, loss: 0.9547
2024-05-30 14:24:10,752 - mmdet - INFO - Epoch [2][100/7330] lr: 1.000e-04, eta: 1 day, 0:32:46, time: 1.132, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0638, loss_cls: 0.2589, acc: 91.1443, loss_bbox: 0.3077, loss_mask: 0.2939, loss: 0.9598
2024-05-30 14:25:06,706 - mmdet - INFO - Epoch [2][150/7330] lr: 1.000e-04, eta: 1 day, 0:32:02, time: 1.119, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0595, loss_cls: 0.2494, acc: 91.4995, loss_bbox: 0.2974, loss_mask: 0.2944, loss: 0.9324
2024-05-30 14:26:00,848 - mmdet - INFO - Epoch [2][200/7330] lr: 1.000e-04, eta: 1 day, 0:31:00, time: 1.083, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0345, loss_rpn_bbox: 0.0632, loss_cls: 0.2503, acc: 91.4021, loss_bbox: 0.3048, loss_mask: 0.2949, loss: 0.9478
2024-05-30 14:26:55,387 - mmdet - INFO - Epoch [2][250/7330] lr: 1.000e-04, eta: 1 day, 0:30:01, time: 1.091, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0593, loss_cls: 0.2419, acc: 91.7046, loss_bbox: 0.2937, loss_mask: 0.2873, loss: 0.9152
2024-05-30 14:27:49,564 - mmdet - INFO - Epoch [2][300/7330] lr: 1.000e-04, eta: 1 day, 0:28:59, time: 1.084, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0641, loss_cls: 0.2593, acc: 91.1443, loss_bbox: 0.3136, loss_mask: 0.2939, loss: 0.9641
2024-05-30 14:28:47,744 - mmdet - INFO - Epoch [2][350/7330] lr: 1.000e-04, eta: 1 day, 0:28:39, time: 1.164, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0621, loss_cls: 0.2593, acc: 91.0562, loss_bbox: 0.3115, loss_mask: 0.2964, loss: 0.9620
2024-05-30 14:29:43,767 - mmdet - INFO - Epoch [2][400/7330] lr: 1.000e-04, eta: 1 day, 0:27:56, time: 1.120, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0611,
loss_cls: 0.2534, acc: 91.4805, loss_bbox: 0.3025, loss_mask: 0.2945, loss: 0.9471 2024-05-30 14:30:40,431 - mmdet - INFO - Epoch [2][450/7330] lr: 1.000e-04, eta: 1 day, 0:27:19, time: 1.133, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0356, loss_rpn_bbox: 0.0633, loss_cls: 0.2572, acc: 91.3521, loss_bbox: 0.3040, loss_mask: 0.2952, loss: 0.9553 2024-05-30 14:31:34,458 - mmdet - INFO - Epoch [2][500/7330] lr: 1.000e-04, eta: 1 day, 0:26:16, time: 1.081, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0351, loss_rpn_bbox: 0.0619, loss_cls: 0.2438, acc: 91.5654, loss_bbox: 0.2997, loss_mask: 0.2898, loss: 0.9302 2024-05-30 14:32:29,338 - mmdet - INFO - Epoch [2][550/7330] lr: 1.000e-04, eta: 1 day, 0:25:21, time: 1.098, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0342, loss_rpn_bbox: 0.0645, loss_cls: 0.2481, acc: 91.5449, loss_bbox: 0.3017, loss_mask: 0.2869, loss: 0.9354 2024-05-30 14:33:28,789 - mmdet - INFO - Epoch [2][600/7330] lr: 1.000e-04, eta: 1 day, 0:25:12, time: 1.189, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0638, loss_cls: 0.2480, acc: 91.4783, loss_bbox: 0.3047, loss_mask: 0.2916, loss: 0.9415 2024-05-30 14:34:27,178 - mmdet - INFO - Epoch [2][650/7330] lr: 1.000e-04, eta: 1 day, 0:24:51, time: 1.168, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0647, loss_cls: 0.2576, acc: 91.2725, loss_bbox: 0.3073, loss_mask: 0.3016, loss: 0.9647 2024-05-30 14:35:21,758 - mmdet - INFO - Epoch [2][700/7330] lr: 1.000e-04, eta: 1 day, 0:23:53, time: 1.092, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0336, loss_rpn_bbox: 0.0595, loss_cls: 0.2530, acc: 91.5696, loss_bbox: 0.2935, loss_mask: 0.2860, loss: 0.9256 2024-05-30 14:36:15,949 - mmdet - INFO - Epoch [2][750/7330] lr: 1.000e-04, eta: 1 day, 0:22:50, time: 1.084, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0353, loss_rpn_bbox: 0.0604, loss_cls: 0.2538, acc: 91.3745, loss_bbox: 0.3047, loss_mask: 0.2947, loss: 0.9489 2024-05-30 14:37:10,097 - mmdet - INFO - Epoch [2][800/7330] lr: 1.000e-04, eta: 1 day, 0:21:48, time: 1.083, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0327, loss_rpn_bbox: 0.0619, loss_cls: 0.2492, acc: 91.4180, loss_bbox: 0.3068, loss_mask: 0.2944, loss: 0.9448 2024-05-30 14:38:03,963 - mmdet - INFO - Epoch [2][850/7330] lr: 1.000e-04, eta: 1 day, 0:20:42, time: 1.077, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0623, loss_cls: 0.2500, acc: 91.5225, loss_bbox: 0.3012, loss_mask: 0.2933, loss: 0.9373 2024-05-30 14:39:03,520 - mmdet - INFO - Epoch [2][900/7330] lr: 1.000e-04, eta: 1 day, 0:20:32, time: 1.191, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0348, loss_rpn_bbox: 0.0612, loss_cls: 0.2508, acc: 91.6201, loss_bbox: 0.2983, loss_mask: 0.3015, loss: 0.9467 2024-05-30 14:39:57,090 - mmdet - INFO - Epoch [2][950/7330] lr: 1.000e-04, eta: 1 day, 0:19:24, time: 1.071, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0578, loss_cls: 0.2458, acc: 91.6160, loss_bbox: 0.2994, loss_mask: 0.2875, loss: 0.9217 2024-05-30 14:40:52,627 - mmdet - INFO - Epoch [2][1000/7330] lr: 1.000e-04, eta: 1 day, 0:18:35, time: 1.111, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0350, loss_rpn_bbox: 0.0635, loss_cls: 0.2606, acc: 91.0876, loss_bbox: 0.3131, loss_mask: 0.2970, loss: 0.9692 2024-05-30 14:41:48,844 - mmdet - INFO - Epoch [2][1050/7330] lr: 1.000e-04, eta: 1 day, 0:17:52, time: 1.124, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0346, loss_rpn_bbox: 0.0631, loss_cls: 0.2512, acc: 
91.5806, loss_bbox: 0.3030, loss_mask: 0.2903, loss: 0.9422 2024-05-30 14:42:45,953 - mmdet - INFO - Epoch [2][1100/7330] lr: 1.000e-04, eta: 1 day, 0:17:17, time: 1.142, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0586, loss_cls: 0.2459, acc: 91.6458, loss_bbox: 0.2996, loss_mask: 0.2838, loss: 0.9212 2024-05-30 14:43:43,479 - mmdet - INFO - Epoch [2][1150/7330] lr: 1.000e-04, eta: 1 day, 0:16:46, time: 1.150, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0555, loss_cls: 0.2301, acc: 92.2903, loss_bbox: 0.2773, loss_mask: 0.2776, loss: 0.8701 2024-05-30 14:44:39,773 - mmdet - INFO - Epoch [2][1200/7330] lr: 1.000e-04, eta: 1 day, 0:16:03, time: 1.126, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0601, loss_cls: 0.2521, acc: 91.4666, loss_bbox: 0.3048, loss_mask: 0.2976, loss: 0.9479 2024-05-30 14:45:33,941 - mmdet - INFO - Epoch [2][1250/7330] lr: 1.000e-04, eta: 1 day, 0:15:01, time: 1.083, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0355, loss_rpn_bbox: 0.0614, loss_cls: 0.2542, acc: 91.2292, loss_bbox: 0.3032, loss_mask: 0.2929, loss: 0.9472 2024-05-30 14:46:30,121 - mmdet - INFO - Epoch [2][1300/7330] lr: 1.000e-04, eta: 1 day, 0:14:16, time: 1.123, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0621, loss_cls: 0.2505, acc: 91.5449, loss_bbox: 0.3002, loss_mask: 0.2947, loss: 0.9392 2024-05-30 14:47:24,034 - mmdet - INFO - Epoch [2][1350/7330] lr: 1.000e-04, eta: 1 day, 0:13:11, time: 1.078, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0620, loss_cls: 0.2500, acc: 91.3447, loss_bbox: 0.3003, loss_mask: 0.2925, loss: 0.9363 2024-05-30 14:48:19,931 - mmdet - INFO - Epoch [2][1400/7330] lr: 1.000e-04, eta: 1 day, 0:12:25, time: 1.118, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0353, loss_rpn_bbox: 0.0581, loss_cls: 0.2438, acc: 91.6697, loss_bbox: 0.2969, loss_mask: 0.2829, loss: 0.9170 2024-05-30 14:49:16,168 - mmdet - INFO - Epoch [2][1450/7330] lr: 1.000e-04, eta: 1 day, 0:11:41, time: 1.125, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0333, loss_rpn_bbox: 0.0593, loss_cls: 0.2405, acc: 91.9590, loss_bbox: 0.2878, loss_mask: 0.2896, loss: 0.9106 2024-05-30 14:50:10,032 - mmdet - INFO - Epoch [2][1500/7330] lr: 1.000e-04, eta: 1 day, 0:10:36, time: 1.077, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0584, loss_cls: 0.2534, acc: 91.4204, loss_bbox: 0.3000, loss_mask: 0.2882, loss: 0.9323 2024-05-30 14:51:03,512 - mmdet - INFO - Epoch [2][1550/7330] lr: 1.000e-04, eta: 1 day, 0:09:27, time: 1.070, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0610, loss_cls: 0.2423, acc: 91.5540, loss_bbox: 0.2972, loss_mask: 0.2865, loss: 0.9199 2024-05-30 14:51:57,749 - mmdet - INFO - Epoch [2][1600/7330] lr: 1.000e-04, eta: 1 day, 0:08:26, time: 1.085, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0344, loss_rpn_bbox: 0.0614, loss_cls: 0.2543, acc: 91.3613, loss_bbox: 0.3072, loss_mask: 0.2964, loss: 0.9536 2024-05-30 14:52:54,061 - mmdet - INFO - Epoch [2][1650/7330] lr: 1.000e-04, eta: 1 day, 0:07:42, time: 1.126, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0605, loss_cls: 0.2490, acc: 91.6846, loss_bbox: 0.2960, loss_mask: 0.2929, loss: 0.9303 2024-05-30 14:53:51,368 - mmdet - INFO - Epoch [2][1700/7330] lr: 1.000e-04, eta: 1 day, 0:07:08, time: 1.146, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0319, loss_rpn_bbox: 0.0615, loss_cls: 0.2575, acc: 91.1611, 
loss_bbox: 0.3132, loss_mask: 0.2877, loss: 0.9518 2024-05-30 14:54:52,949 - mmdet - INFO - Epoch [2][1750/7330] lr: 1.000e-04, eta: 1 day, 0:07:10, time: 1.232, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0604, loss_cls: 0.2551, acc: 91.2686, loss_bbox: 0.3013, loss_mask: 0.2908, loss: 0.9394 2024-05-30 14:55:46,101 - mmdet - INFO - Epoch [2][1800/7330] lr: 1.000e-04, eta: 1 day, 0:05:58, time: 1.063, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0304, loss_rpn_bbox: 0.0601, loss_cls: 0.2326, acc: 91.9727, loss_bbox: 0.2857, loss_mask: 0.2828, loss: 0.8917 2024-05-30 14:56:42,134 - mmdet - INFO - Epoch [2][1850/7330] lr: 1.000e-04, eta: 1 day, 0:05:12, time: 1.121, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0565, loss_cls: 0.2414, acc: 91.8652, loss_bbox: 0.2851, loss_mask: 0.2758, loss: 0.8878 2024-05-30 14:57:36,200 - mmdet - INFO - Epoch [2][1900/7330] lr: 1.000e-04, eta: 1 day, 0:04:09, time: 1.081, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0623, loss_cls: 0.2488, acc: 91.6582, loss_bbox: 0.2987, loss_mask: 0.2928, loss: 0.9355 2024-05-30 14:58:30,205 - mmdet - INFO - Epoch [2][1950/7330] lr: 1.000e-04, eta: 1 day, 0:03:05, time: 1.080, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0340, loss_rpn_bbox: 0.0624, loss_cls: 0.2466, acc: 91.3794, loss_bbox: 0.3042, loss_mask: 0.2962, loss: 0.9435 2024-05-30 14:59:26,671 - mmdet - INFO - Epoch [2][2000/7330] lr: 1.000e-04, eta: 1 day, 0:02:22, time: 1.129, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0339, loss_rpn_bbox: 0.0623, loss_cls: 0.2425, acc: 91.6406, loss_bbox: 0.2885, loss_mask: 0.2889, loss: 0.9161 2024-05-30 15:00:22,486 - mmdet - INFO - Epoch [2][2050/7330] lr: 1.000e-04, eta: 1 day, 0:01:34, time: 1.117, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0595, loss_cls: 0.2394, acc: 91.8989, loss_bbox: 0.2881, loss_mask: 0.2801, loss: 0.8983 2024-05-30 15:01:15,878 - mmdet - INFO - Epoch [2][2100/7330] lr: 1.000e-04, eta: 1 day, 0:00:25, time: 1.068, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0560, loss_cls: 0.2349, acc: 91.9600, loss_bbox: 0.2821, loss_mask: 0.2798, loss: 0.8818 2024-05-30 15:02:09,872 - mmdet - INFO - Epoch [2][2150/7330] lr: 1.000e-04, eta: 23:59:21, time: 1.080, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0329, loss_rpn_bbox: 0.0583, loss_cls: 0.2362, acc: 91.9519, loss_bbox: 0.2916, loss_mask: 0.2846, loss: 0.9037 2024-05-30 15:03:03,873 - mmdet - INFO - Epoch [2][2200/7330] lr: 1.000e-04, eta: 23:58:18, time: 1.080, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0616, loss_cls: 0.2508, acc: 91.2290, loss_bbox: 0.3103, loss_mask: 0.2803, loss: 0.9350 2024-05-30 15:03:58,478 - mmdet - INFO - Epoch [2][2250/7330] lr: 1.000e-04, eta: 23:57:20, time: 1.092, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0347, loss_rpn_bbox: 0.0592, loss_cls: 0.2454, acc: 91.5591, loss_bbox: 0.2970, loss_mask: 0.2879, loss: 0.9242 2024-05-30 15:04:56,789 - mmdet - INFO - Epoch [2][2300/7330] lr: 1.000e-04, eta: 23:56:51, time: 1.166, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0320, loss_rpn_bbox: 0.0591, loss_cls: 0.2482, acc: 91.6035, loss_bbox: 0.2937, loss_mask: 0.2884, loss: 0.9215 2024-05-30 15:05:55,681 - mmdet - INFO - Epoch [2][2350/7330] lr: 1.000e-04, eta: 23:56:28, time: 1.178, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0334, loss_rpn_bbox: 0.0612, loss_cls: 0.2470, acc: 91.4160, loss_bbox: 0.3021, loss_mask: 0.2917, loss: 
0.9355 2024-05-30 15:06:52,186 - mmdet - INFO - Epoch [2][2400/7330] lr: 1.000e-04, eta: 23:55:44, time: 1.130, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0326, loss_rpn_bbox: 0.0560, loss_cls: 0.2348, acc: 91.9089, loss_bbox: 0.2833, loss_mask: 0.2857, loss: 0.8924 2024-05-30 15:07:48,694 - mmdet - INFO - Epoch [2][2450/7330] lr: 1.000e-04, eta: 23:55:01, time: 1.130, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0335, loss_rpn_bbox: 0.0614, loss_cls: 0.2397, acc: 91.6545, loss_bbox: 0.2907, loss_mask: 0.2812, loss: 0.9066 2024-05-30 15:08:42,836 - mmdet - INFO - Epoch [2][2500/7330] lr: 1.000e-04, eta: 23:53:58, time: 1.083, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0566, loss_cls: 0.2381, acc: 91.9651, loss_bbox: 0.2838, loss_mask: 0.2800, loss: 0.8884 2024-05-30 15:09:39,523 - mmdet - INFO - Epoch [2][2550/7330] lr: 1.000e-04, eta: 23:53:16, time: 1.134, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0592, loss_cls: 0.2502, acc: 91.4285, loss_bbox: 0.3021, loss_mask: 0.2813, loss: 0.9227 2024-05-30 15:10:33,279 - mmdet - INFO - Epoch [2][2600/7330] lr: 1.000e-04, eta: 23:52:11, time: 1.075, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0578, loss_cls: 0.2481, acc: 91.5098, loss_bbox: 0.2981, loss_mask: 0.2827, loss: 0.9159 2024-05-30 15:11:29,777 - mmdet - INFO - Epoch [2][2650/7330] lr: 1.000e-04, eta: 23:51:27, time: 1.130, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0576, loss_cls: 0.2400, acc: 91.8220, loss_bbox: 0.2914, loss_mask: 0.2796, loss: 0.9002 2024-05-30 15:12:23,076 - mmdet - INFO - Epoch [2][2700/7330] lr: 1.000e-04, eta: 23:50:18, time: 1.066, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0597, loss_cls: 0.2391, acc: 91.8740, loss_bbox: 0.2899, loss_mask: 0.2811, loss: 0.9026 2024-05-30 15:13:16,158 - mmdet - INFO - Epoch [2][2750/7330] lr: 1.000e-04, eta: 23:49:08, time: 1.062, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0331, loss_rpn_bbox: 0.0573, loss_cls: 0.2436, acc: 91.7310, loss_bbox: 0.2875, loss_mask: 0.2795, loss: 0.9009 2024-05-30 15:14:09,689 - mmdet - INFO - Epoch [2][2800/7330] lr: 1.000e-04, eta: 23:48:01, time: 1.071, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0537, loss_cls: 0.2294, acc: 92.2026, loss_bbox: 0.2743, loss_mask: 0.2720, loss: 0.8588 2024-05-30 15:15:02,908 - mmdet - INFO - Epoch [2][2850/7330] lr: 1.000e-04, eta: 23:46:52, time: 1.064, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0611, loss_cls: 0.2333, acc: 91.9500, loss_bbox: 0.2886, loss_mask: 0.2807, loss: 0.8950 2024-05-30 15:16:01,355 - mmdet - INFO - Epoch [2][2900/7330] lr: 1.000e-04, eta: 23:46:23, time: 1.169, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0627, loss_cls: 0.2513, acc: 91.3354, loss_bbox: 0.3017, loss_mask: 0.2873, loss: 0.9355 2024-05-30 15:16:57,414 - mmdet - INFO - Epoch [2][2950/7330] lr: 1.000e-04, eta: 23:45:35, time: 1.121, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0310, loss_rpn_bbox: 0.0615, loss_cls: 0.2440, acc: 91.6475, loss_bbox: 0.2970, loss_mask: 0.2884, loss: 0.9220 2024-05-30 15:17:55,842 - mmdet - INFO - Epoch [2][3000/7330] lr: 1.000e-04, eta: 23:45:06, time: 1.169, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0609, loss_cls: 0.2446, acc: 91.5557, loss_bbox: 0.2944, loss_mask: 0.2867, loss: 0.9194 2024-05-30 15:18:51,394 - mmdet - INFO - Epoch [2][3050/7330] lr: 1.000e-04, eta: 
23:44:14, time: 1.111, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0321, loss_rpn_bbox: 0.0578, loss_cls: 0.2519, acc: 91.3721, loss_bbox: 0.3010, loss_mask: 0.2860, loss: 0.9287 2024-05-30 15:19:47,692 - mmdet - INFO - Epoch [2][3100/7330] lr: 1.000e-04, eta: 23:43:28, time: 1.126, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0581, loss_cls: 0.2377, acc: 91.8022, loss_bbox: 0.2915, loss_mask: 0.2806, loss: 0.8963 2024-05-30 15:20:41,129 - mmdet - INFO - Epoch [2][3150/7330] lr: 1.000e-04, eta: 23:42:21, time: 1.069, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0585, loss_cls: 0.2418, acc: 91.7576, loss_bbox: 0.2869, loss_mask: 0.2809, loss: 0.8989 2024-05-30 15:21:33,944 - mmdet - INFO - Epoch [2][3200/7330] lr: 1.000e-04, eta: 23:41:09, time: 1.056, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0290, loss_rpn_bbox: 0.0560, loss_cls: 0.2371, acc: 91.8711, loss_bbox: 0.2833, loss_mask: 0.2817, loss: 0.8872 2024-05-30 15:22:27,493 - mmdet - INFO - Epoch [2][3250/7330] lr: 1.000e-04, eta: 23:40:03, time: 1.071, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0572, loss_cls: 0.2387, acc: 91.7573, loss_bbox: 0.2968, loss_mask: 0.2786, loss: 0.9001 2024-05-30 15:23:23,651 - mmdet - INFO - Epoch [2][3300/7330] lr: 1.000e-04, eta: 23:39:16, time: 1.123, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0289, loss_rpn_bbox: 0.0575, loss_cls: 0.2423, acc: 91.7761, loss_bbox: 0.2872, loss_mask: 0.2779, loss: 0.8939 2024-05-30 15:24:18,257 - mmdet - INFO - Epoch [2][3350/7330] lr: 1.000e-04, eta: 23:38:18, time: 1.092, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0612, loss_cls: 0.2451, acc: 91.5608, loss_bbox: 0.2990, loss_mask: 0.2887, loss: 0.9240 2024-05-30 15:25:11,391 - mmdet - INFO - Epoch [2][3400/7330] lr: 1.000e-04, eta: 23:37:09, time: 1.063, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0313, loss_rpn_bbox: 0.0580, loss_cls: 0.2393, acc: 91.7996, loss_bbox: 0.2906, loss_mask: 0.2862, loss: 0.9055 2024-05-30 15:26:07,235 - mmdet - INFO - Epoch [2][3450/7330] lr: 1.000e-04, eta: 23:36:19, time: 1.117, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0567, loss_cls: 0.2416, acc: 91.6887, loss_bbox: 0.2891, loss_mask: 0.2744, loss: 0.8913 2024-05-30 15:27:01,130 - mmdet - INFO - Epoch [2][3500/7330] lr: 1.000e-04, eta: 23:35:16, time: 1.078, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0567, loss_cls: 0.2349, acc: 91.9487, loss_bbox: 0.2885, loss_mask: 0.2790, loss: 0.8896 2024-05-30 15:27:57,033 - mmdet - INFO - Epoch [2][3550/7330] lr: 1.000e-04, eta: 23:34:27, time: 1.118, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0615, loss_cls: 0.2424, acc: 91.6692, loss_bbox: 0.2902, loss_mask: 0.2837, loss: 0.9102 2024-05-30 15:28:57,245 - mmdet - INFO - Epoch [2][3600/7330] lr: 1.000e-04, eta: 23:34:08, time: 1.204, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0611, loss_cls: 0.2434, acc: 91.5842, loss_bbox: 0.2919, loss_mask: 0.2865, loss: 0.9122 2024-05-30 15:29:54,744 - mmdet - INFO - Epoch [2][3650/7330] lr: 1.000e-04, eta: 23:33:30, time: 1.150, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0286, loss_rpn_bbox: 0.0566, loss_cls: 0.2371, acc: 91.8630, loss_bbox: 0.2902, loss_mask: 0.2777, loss: 0.8903 2024-05-30 15:30:48,368 - mmdet - INFO - Epoch [2][3700/7330] lr: 1.000e-04, eta: 23:32:25, time: 1.073, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0298, loss_rpn_bbox: 
0.0583, loss_cls: 0.2367, acc: 91.8545, loss_bbox: 0.2841, loss_mask: 0.2718, loss: 0.8807 2024-05-30 15:31:41,994 - mmdet - INFO - Epoch [2][3750/7330] lr: 1.000e-04, eta: 23:31:20, time: 1.072, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0586, loss_cls: 0.2410, acc: 91.7576, loss_bbox: 0.2921, loss_mask: 0.2834, loss: 0.9028 2024-05-30 15:32:36,090 - mmdet - INFO - Epoch [2][3800/7330] lr: 1.000e-04, eta: 23:30:18, time: 1.082, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0311, loss_rpn_bbox: 0.0580, loss_cls: 0.2391, acc: 91.8938, loss_bbox: 0.2887, loss_mask: 0.2802, loss: 0.8971 2024-05-30 15:33:30,231 - mmdet - INFO - Epoch [2][3850/7330] lr: 1.000e-04, eta: 23:29:17, time: 1.083, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0318, loss_rpn_bbox: 0.0633, loss_cls: 0.2445, acc: 91.6355, loss_bbox: 0.2943, loss_mask: 0.2843, loss: 0.9182 2024-05-30 15:34:26,527 - mmdet - INFO - Epoch [2][3900/7330] lr: 1.000e-04, eta: 23:28:30, time: 1.126, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0317, loss_rpn_bbox: 0.0602, loss_cls: 0.2475, acc: 91.6145, loss_bbox: 0.2926, loss_mask: 0.2782, loss: 0.9101 2024-05-30 15:35:20,588 - mmdet - INFO - Epoch [2][3950/7330] lr: 1.000e-04, eta: 23:27:28, time: 1.081, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0275, loss_rpn_bbox: 0.0544, loss_cls: 0.2297, acc: 92.0479, loss_bbox: 0.2834, loss_mask: 0.2812, loss: 0.8762 2024-05-30 15:36:16,548 - mmdet - INFO - Epoch [2][4000/7330] lr: 1.000e-04, eta: 23:26:39, time: 1.119, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0589, loss_cls: 0.2430, acc: 91.6340, loss_bbox: 0.2911, loss_mask: 0.2744, loss: 0.8952 2024-05-30 15:37:10,175 - mmdet - INFO - Epoch [2][4050/7330] lr: 1.000e-04, eta: 23:25:34, time: 1.072, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0592, loss_cls: 0.2469, acc: 91.6138, loss_bbox: 0.2948, loss_mask: 0.2765, loss: 0.9068 2024-05-30 15:38:03,977 - mmdet - INFO - Epoch [2][4100/7330] lr: 1.000e-04, eta: 23:24:31, time: 1.076, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0556, loss_cls: 0.2339, acc: 91.9409, loss_bbox: 0.2856, loss_mask: 0.2742, loss: 0.8808 2024-05-30 15:39:01,655 - mmdet - INFO - Epoch [2][4150/7330] lr: 1.000e-04, eta: 23:23:53, time: 1.153, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0560, loss_cls: 0.2338, acc: 91.8958, loss_bbox: 0.2842, loss_mask: 0.2758, loss: 0.8807 2024-05-30 15:39:59,296 - mmdet - INFO - Epoch [2][4200/7330] lr: 1.000e-04, eta: 23:23:15, time: 1.153, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0530, loss_cls: 0.2340, acc: 92.1460, loss_bbox: 0.2754, loss_mask: 0.2756, loss: 0.8671 2024-05-30 15:40:57,735 - mmdet - INFO - Epoch [2][4250/7330] lr: 1.000e-04, eta: 23:22:42, time: 1.169, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0328, loss_rpn_bbox: 0.0604, loss_cls: 0.2447, acc: 91.6833, loss_bbox: 0.2919, loss_mask: 0.2798, loss: 0.9096 2024-05-30 15:41:51,781 - mmdet - INFO - Epoch [2][4300/7330] lr: 1.000e-04, eta: 23:21:40, time: 1.081, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0323, loss_rpn_bbox: 0.0604, loss_cls: 0.2401, acc: 91.6008, loss_bbox: 0.2962, loss_mask: 0.2901, loss: 0.9191 2024-05-30 15:42:45,584 - mmdet - INFO - Epoch [2][4350/7330] lr: 1.000e-04, eta: 23:20:37, time: 1.076, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0595, loss_cls: 0.2371, acc: 91.9824, loss_bbox: 0.2840, loss_mask: 0.2820, loss: 0.8957 
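Note on reading the training records above: each entry reports five loss terms and their sum, and up to rounding loss = loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask (for the Epoch [2][4350/7330] entry just above, 0.0330 + 0.0595 + 0.2371 + 0.2840 + 0.2820 = 0.8956, matching the reported 0.8957 to within rounding). The snippet below is a minimal, illustrative sketch, not part of MMDetection; the regex and the helper name parse_train_line are assumptions made here, showing one way such text-log entries could be pulled into Python, for example to plot loss curves from this file.

import re

# Illustrative pattern for records like:
#   Epoch [2][4350/7330] lr: 1.000e-04, ... loss_rpn_cls: 0.0330, ..., loss: 0.8957
ITER_RE = re.compile(
    r"Epoch \[(?P<epoch>\d+)\]\[(?P<it>\d+)/(?P<max_it>\d+)\]"
    r".*?lr: (?P<lr>[\d.e+-]+)"
    r".*?loss_rpn_cls: (?P<loss_rpn_cls>[\d.]+)"
    r".*?loss_rpn_bbox: (?P<loss_rpn_bbox>[\d.]+)"
    r".*?loss_cls: (?P<loss_cls>[\d.]+)"
    r".*?loss_bbox: (?P<loss_bbox>[\d.]+)"
    r".*?loss_mask: (?P<loss_mask>[\d.]+)"
    r".*?loss: (?P<loss>[\d.]+)"
)

def parse_train_line(line):
    """Return the scalars of one training record as a dict, or None if the line has none."""
    m = ITER_RE.search(line)
    if m is None:
        return None
    rec = {k: float(v) for k, v in m.groupdict().items()}
    # epoch / iteration counters are integers
    rec["epoch"], rec["it"], rec["max_it"] = int(rec["epoch"]), int(rec["it"]), int(rec["max_it"])
    return rec

# Example using the (abbreviated) record logged just above:
rec = parse_train_line(
    "Epoch [2][4350/7330] lr: 1.000e-04, loss_rpn_cls: 0.0330, loss_rpn_bbox: 0.0595, "
    "loss_cls: 0.2371, acc: 91.9824, loss_bbox: 0.2840, loss_mask: 0.2820, loss: 0.8957"
)
parts = ("loss_rpn_cls", "loss_rpn_bbox", "loss_cls", "loss_bbox", "loss_mask")
# The total is the sum of the five components, up to logging precision.
assert rec is not None and abs(sum(rec[p] for p in parts) - rec["loss"]) < 5e-4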
2024-05-30 15:43:39,110 - mmdet - INFO - Epoch [2][4400/7330] lr: 1.000e-04, eta: 23:19:31, time: 1.071, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0584, loss_cls: 0.2423, acc: 91.6877, loss_bbox: 0.2878, loss_mask: 0.2820, loss: 0.8999 2024-05-30 15:44:35,915 - mmdet - INFO - Epoch [2][4450/7330] lr: 1.000e-04, eta: 23:18:48, time: 1.136, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0298, loss_rpn_bbox: 0.0569, loss_cls: 0.2475, acc: 91.3865, loss_bbox: 0.2979, loss_mask: 0.2815, loss: 0.9136 2024-05-30 15:45:30,068 - mmdet - INFO - Epoch [2][4500/7330] lr: 1.000e-04, eta: 23:17:46, time: 1.083, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0299, loss_rpn_bbox: 0.0598, loss_cls: 0.2413, acc: 91.6602, loss_bbox: 0.2920, loss_mask: 0.2833, loss: 0.9064 2024-05-30 15:46:24,020 - mmdet - INFO - Epoch [2][4550/7330] lr: 1.000e-04, eta: 23:16:44, time: 1.079, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0553, loss_cls: 0.2339, acc: 91.8481, loss_bbox: 0.2865, loss_mask: 0.2741, loss: 0.8773 2024-05-30 15:47:20,398 - mmdet - INFO - Epoch [2][4600/7330] lr: 1.000e-04, eta: 23:15:57, time: 1.128, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0305, loss_rpn_bbox: 0.0593, loss_cls: 0.2460, acc: 91.5854, loss_bbox: 0.2951, loss_mask: 0.2833, loss: 0.9143 2024-05-30 15:48:13,927 - mmdet - INFO - Epoch [2][4650/7330] lr: 1.000e-04, eta: 23:14:52, time: 1.071, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0315, loss_rpn_bbox: 0.0616, loss_cls: 0.2484, acc: 91.4333, loss_bbox: 0.2964, loss_mask: 0.2798, loss: 0.9177 2024-05-30 15:49:07,450 - mmdet - INFO - Epoch [2][4700/7330] lr: 1.000e-04, eta: 23:13:48, time: 1.070, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0616, loss_cls: 0.2406, acc: 91.6758, loss_bbox: 0.2940, loss_mask: 0.2849, loss: 0.9108 2024-05-30 15:50:07,689 - mmdet - INFO - Epoch [2][4750/7330] lr: 1.000e-04, eta: 23:13:25, time: 1.205, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0633, loss_cls: 0.2403, acc: 91.5635, loss_bbox: 0.2986, loss_mask: 0.2762, loss: 0.9077 2024-05-30 15:51:01,128 - mmdet - INFO - Epoch [2][4800/7330] lr: 1.000e-04, eta: 23:12:20, time: 1.069, data_time: 0.094, memory: 24290, loss_rpn_cls: 0.0306, loss_rpn_bbox: 0.0571, loss_cls: 0.2367, acc: 91.7515, loss_bbox: 0.2865, loss_mask: 0.2725, loss: 0.8834 2024-05-30 15:51:59,303 - mmdet - INFO - Epoch [2][4850/7330] lr: 1.000e-04, eta: 23:11:44, time: 1.163, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0312, loss_rpn_bbox: 0.0587, loss_cls: 0.2323, acc: 91.9236, loss_bbox: 0.2850, loss_mask: 0.2786, loss: 0.8858 2024-05-30 15:52:54,885 - mmdet - INFO - Epoch [2][4900/7330] lr: 1.000e-04, eta: 23:10:52, time: 1.112, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0587, loss_cls: 0.2315, acc: 91.9080, loss_bbox: 0.2856, loss_mask: 0.2758, loss: 0.8818 2024-05-30 15:53:49,575 - mmdet - INFO - Epoch [2][4950/7330] lr: 1.000e-04, eta: 23:09:54, time: 1.094, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0332, loss_rpn_bbox: 0.0633, loss_cls: 0.2439, acc: 91.5005, loss_bbox: 0.2974, loss_mask: 0.2840, loss: 0.9218 2024-05-30 15:54:43,045 - mmdet - INFO - Epoch [2][5000/7330] lr: 1.000e-04, eta: 23:08:49, time: 1.069, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0576, loss_cls: 0.2303, acc: 92.0210, loss_bbox: 0.2782, loss_mask: 0.2676, loss: 0.8611 2024-05-30 15:55:39,525 - mmdet - INFO - Epoch [2][5050/7330] lr: 1.000e-04, eta: 23:08:02, 
time: 1.130, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0316, loss_rpn_bbox: 0.0552, loss_cls: 0.2300, acc: 92.0308, loss_bbox: 0.2818, loss_mask: 0.2725, loss: 0.8711 2024-05-30 15:56:32,407 - mmdet - INFO - Epoch [2][5100/7330] lr: 1.000e-04, eta: 23:06:54, time: 1.058, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0294, loss_rpn_bbox: 0.0603, loss_cls: 0.2377, acc: 91.8269, loss_bbox: 0.2886, loss_mask: 0.2740, loss: 0.8901 2024-05-30 15:57:27,273 - mmdet - INFO - Epoch [2][5150/7330] lr: 1.000e-04, eta: 23:05:57, time: 1.097, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0616, loss_cls: 0.2492, acc: 91.2998, loss_bbox: 0.2951, loss_mask: 0.2782, loss: 0.9151 2024-05-30 15:58:23,218 - mmdet - INFO - Epoch [2][5200/7330] lr: 1.000e-04, eta: 23:05:07, time: 1.119, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0554, loss_cls: 0.2281, acc: 92.0896, loss_bbox: 0.2802, loss_mask: 0.2698, loss: 0.8657 2024-05-30 15:59:16,040 - mmdet - INFO - Epoch [2][5250/7330] lr: 1.000e-04, eta: 23:03:59, time: 1.056, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0547, loss_cls: 0.2360, acc: 91.7432, loss_bbox: 0.2842, loss_mask: 0.2735, loss: 0.8779 2024-05-30 16:00:14,134 - mmdet - INFO - Epoch [2][5300/7330] lr: 1.000e-04, eta: 23:03:22, time: 1.162, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0308, loss_rpn_bbox: 0.0611, loss_cls: 0.2411, acc: 91.6960, loss_bbox: 0.2893, loss_mask: 0.2819, loss: 0.9041 2024-05-30 16:01:07,568 - mmdet - INFO - Epoch [2][5350/7330] lr: 1.000e-04, eta: 23:02:17, time: 1.069, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0565, loss_cls: 0.2352, acc: 91.8911, loss_bbox: 0.2845, loss_mask: 0.2818, loss: 0.8883 2024-05-30 16:02:03,532 - mmdet - INFO - Epoch [2][5400/7330] lr: 1.000e-04, eta: 23:01:27, time: 1.119, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0568, loss_cls: 0.2379, acc: 91.6885, loss_bbox: 0.2915, loss_mask: 0.2737, loss: 0.8866 2024-05-30 16:02:59,215 - mmdet - INFO - Epoch [2][5450/7330] lr: 1.000e-04, eta: 23:00:35, time: 1.114, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0301, loss_rpn_bbox: 0.0556, loss_cls: 0.2349, acc: 91.8931, loss_bbox: 0.2869, loss_mask: 0.2744, loss: 0.8820 2024-05-30 16:03:57,577 - mmdet - INFO - Epoch [2][5500/7330] lr: 1.000e-04, eta: 22:59:59, time: 1.167, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0295, loss_rpn_bbox: 0.0592, loss_cls: 0.2306, acc: 91.9878, loss_bbox: 0.2816, loss_mask: 0.2690, loss: 0.8699 2024-05-30 16:04:52,305 - mmdet - INFO - Epoch [2][5550/7330] lr: 1.000e-04, eta: 22:59:02, time: 1.095, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0324, loss_rpn_bbox: 0.0605, loss_cls: 0.2418, acc: 91.5356, loss_bbox: 0.2949, loss_mask: 0.2786, loss: 0.9082 2024-05-30 16:05:45,528 - mmdet - INFO - Epoch [2][5600/7330] lr: 1.000e-04, eta: 22:57:56, time: 1.064, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0548, loss_cls: 0.2269, acc: 92.0938, loss_bbox: 0.2764, loss_mask: 0.2755, loss: 0.8602 2024-05-30 16:06:39,043 - mmdet - INFO - Epoch [2][5650/7330] lr: 1.000e-04, eta: 22:56:52, time: 1.070, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0544, loss_cls: 0.2317, acc: 92.0269, loss_bbox: 0.2806, loss_mask: 0.2710, loss: 0.8673 2024-05-30 16:07:34,668 - mmdet - INFO - Epoch [2][5700/7330] lr: 1.000e-04, eta: 22:56:00, time: 1.113, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0543, 
loss_cls: 0.2342, acc: 91.8867, loss_bbox: 0.2829, loss_mask: 0.2741, loss: 0.8752 2024-05-30 16:08:30,382 - mmdet - INFO - Epoch [2][5750/7330] lr: 1.000e-04, eta: 22:55:08, time: 1.114, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0554, loss_cls: 0.2336, acc: 91.8989, loss_bbox: 0.2812, loss_mask: 0.2676, loss: 0.8652 2024-05-30 16:09:24,192 - mmdet - INFO - Epoch [2][5800/7330] lr: 1.000e-04, eta: 22:54:06, time: 1.076, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0572, loss_cls: 0.2275, acc: 92.1570, loss_bbox: 0.2773, loss_mask: 0.2744, loss: 0.8656 2024-05-30 16:10:23,047 - mmdet - INFO - Epoch [2][5850/7330] lr: 1.000e-04, eta: 22:53:32, time: 1.177, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0325, loss_rpn_bbox: 0.0617, loss_cls: 0.2435, acc: 91.6262, loss_bbox: 0.2863, loss_mask: 0.2790, loss: 0.9030 2024-05-30 16:11:17,760 - mmdet - INFO - Epoch [2][5900/7330] lr: 1.000e-04, eta: 22:52:35, time: 1.094, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0291, loss_rpn_bbox: 0.0604, loss_cls: 0.2324, acc: 91.9031, loss_bbox: 0.2833, loss_mask: 0.2754, loss: 0.8806 2024-05-30 16:12:11,262 - mmdet - INFO - Epoch [2][5950/7330] lr: 1.000e-04, eta: 22:51:31, time: 1.070, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0578, loss_cls: 0.2355, acc: 91.7776, loss_bbox: 0.2882, loss_mask: 0.2799, loss: 0.8892 2024-05-30 16:13:07,149 - mmdet - INFO - Epoch [2][6000/7330] lr: 1.000e-04, eta: 22:50:40, time: 1.118, data_time: 0.095, memory: 24290, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0548, loss_cls: 0.2280, acc: 92.0999, loss_bbox: 0.2717, loss_mask: 0.2689, loss: 0.8507 2024-05-30 16:14:00,661 - mmdet - INFO - Epoch [2][6050/7330] lr: 1.000e-04, eta: 22:49:36, time: 1.070, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0279, loss_rpn_bbox: 0.0557, loss_cls: 0.2321, acc: 91.9934, loss_bbox: 0.2803, loss_mask: 0.2701, loss: 0.8661 2024-05-30 16:14:59,964 - mmdet - INFO - Epoch [2][6100/7330] lr: 1.000e-04, eta: 22:49:04, time: 1.186, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0287, loss_rpn_bbox: 0.0594, loss_cls: 0.2357, acc: 91.6167, loss_bbox: 0.2918, loss_mask: 0.2736, loss: 0.8893 2024-05-30 16:15:56,155 - mmdet - INFO - Epoch [2][6150/7330] lr: 1.000e-04, eta: 22:48:15, time: 1.124, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0278, loss_rpn_bbox: 0.0572, loss_cls: 0.2371, acc: 91.8118, loss_bbox: 0.2915, loss_mask: 0.2714, loss: 0.8850 2024-05-30 16:16:50,051 - mmdet - INFO - Epoch [2][6200/7330] lr: 1.000e-04, eta: 22:47:13, time: 1.078, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0550, loss_cls: 0.2327, acc: 91.9856, loss_bbox: 0.2769, loss_mask: 0.2667, loss: 0.8591 2024-05-30 16:17:43,581 - mmdet - INFO - Epoch [2][6250/7330] lr: 1.000e-04, eta: 22:46:10, time: 1.070, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0296, loss_rpn_bbox: 0.0564, loss_cls: 0.2321, acc: 92.0139, loss_bbox: 0.2802, loss_mask: 0.2751, loss: 0.8734 2024-05-30 16:18:41,436 - mmdet - INFO - Epoch [2][6300/7330] lr: 1.000e-04, eta: 22:45:29, time: 1.157, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0550, loss_cls: 0.2278, acc: 92.1841, loss_bbox: 0.2750, loss_mask: 0.2730, loss: 0.8580 2024-05-30 16:19:35,365 - mmdet - INFO - Epoch [2][6350/7330] lr: 1.000e-04, eta: 22:44:28, time: 1.079, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0307, loss_rpn_bbox: 0.0572, loss_cls: 0.2301, acc: 91.9719, loss_bbox: 0.2771, loss_mask: 0.2690, loss: 0.8641 2024-05-30 
16:20:29,815 - mmdet - INFO - Epoch [2][6400/7330] lr: 1.000e-04, eta: 22:43:29, time: 1.089, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0577, loss_cls: 0.2361, acc: 91.9414, loss_bbox: 0.2793, loss_mask: 0.2685, loss: 0.8699 2024-05-30 16:21:28,119 - mmdet - INFO - Epoch [2][6450/7330] lr: 1.000e-04, eta: 22:42:51, time: 1.166, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0288, loss_rpn_bbox: 0.0577, loss_cls: 0.2339, acc: 92.0461, loss_bbox: 0.2804, loss_mask: 0.2704, loss: 0.8712 2024-05-30 16:22:22,410 - mmdet - INFO - Epoch [2][6500/7330] lr: 1.000e-04, eta: 22:41:52, time: 1.086, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0271, loss_rpn_bbox: 0.0571, loss_cls: 0.2300, acc: 92.0386, loss_bbox: 0.2785, loss_mask: 0.2663, loss: 0.8590 2024-05-30 16:23:16,662 - mmdet - INFO - Epoch [2][6550/7330] lr: 1.000e-04, eta: 22:40:52, time: 1.085, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0562, loss_cls: 0.2347, acc: 91.8071, loss_bbox: 0.2852, loss_mask: 0.2758, loss: 0.8799 2024-05-30 16:24:09,868 - mmdet - INFO - Epoch [2][6600/7330] lr: 1.000e-04, eta: 22:39:47, time: 1.064, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0529, loss_cls: 0.2252, acc: 92.1128, loss_bbox: 0.2756, loss_mask: 0.2654, loss: 0.8463 2024-05-30 16:25:05,713 - mmdet - INFO - Epoch [2][6650/7330] lr: 1.000e-04, eta: 22:38:56, time: 1.117, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0322, loss_rpn_bbox: 0.0579, loss_cls: 0.2317, acc: 91.8831, loss_bbox: 0.2836, loss_mask: 0.2715, loss: 0.8769 2024-05-30 16:26:03,936 - mmdet - INFO - Epoch [2][6700/7330] lr: 1.000e-04, eta: 22:38:17, time: 1.164, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0517, loss_cls: 0.2195, acc: 92.3914, loss_bbox: 0.2635, loss_mask: 0.2613, loss: 0.8218 2024-05-30 16:26:59,639 - mmdet - INFO - Epoch [2][6750/7330] lr: 1.000e-04, eta: 22:37:25, time: 1.114, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0280, loss_rpn_bbox: 0.0584, loss_cls: 0.2320, acc: 91.9001, loss_bbox: 0.2778, loss_mask: 0.2701, loss: 0.8663 2024-05-30 16:27:52,957 - mmdet - INFO - Epoch [2][6800/7330] lr: 1.000e-04, eta: 22:36:20, time: 1.066, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0273, loss_rpn_bbox: 0.0533, loss_cls: 0.2282, acc: 92.1409, loss_bbox: 0.2784, loss_mask: 0.2699, loss: 0.8572 2024-05-30 16:28:48,337 - mmdet - INFO - Epoch [2][6850/7330] lr: 1.000e-04, eta: 22:35:27, time: 1.108, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0568, loss_cls: 0.2323, acc: 91.8257, loss_bbox: 0.2803, loss_mask: 0.2726, loss: 0.8679 2024-05-30 16:29:44,409 - mmdet - INFO - Epoch [2][6900/7330] lr: 1.000e-04, eta: 22:34:37, time: 1.121, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0302, loss_rpn_bbox: 0.0552, loss_cls: 0.2365, acc: 91.7122, loss_bbox: 0.2880, loss_mask: 0.2736, loss: 0.8835 2024-05-30 16:30:38,284 - mmdet - INFO - Epoch [2][6950/7330] lr: 1.000e-04, eta: 22:33:35, time: 1.077, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0276, loss_rpn_bbox: 0.0593, loss_cls: 0.2305, acc: 91.8884, loss_bbox: 0.2813, loss_mask: 0.2750, loss: 0.8737 2024-05-30 16:31:34,171 - mmdet - INFO - Epoch [2][7000/7330] lr: 1.000e-04, eta: 22:32:44, time: 1.118, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0285, loss_rpn_bbox: 0.0573, loss_cls: 0.2308, acc: 91.8477, loss_bbox: 0.2837, loss_mask: 0.2715, loss: 0.8718 2024-05-30 16:32:27,149 - mmdet - INFO - Epoch [2][7050/7330] lr: 1.000e-04, eta: 22:31:38, time: 1.060, 
data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0538, loss_cls: 0.2197, acc: 92.4944, loss_bbox: 0.2699, loss_mask: 0.2648, loss: 0.8367
2024-05-30 16:33:23,622 - mmdet - INFO - Epoch [2][7100/7330] lr: 1.000e-04, eta: 22:30:50, time: 1.129, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0309, loss_rpn_bbox: 0.0583, loss_cls: 0.2383, acc: 91.8291, loss_bbox: 0.2814, loss_mask: 0.2681, loss: 0.8770
2024-05-30 16:34:16,862 - mmdet - INFO - Epoch [2][7150/7330] lr: 1.000e-04, eta: 22:29:45, time: 1.065, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0551, loss_cls: 0.2279, acc: 91.9629, loss_bbox: 0.2802, loss_mask: 0.2695, loss: 0.8598
2024-05-30 16:35:10,665 - mmdet - INFO - Epoch [2][7200/7330] lr: 1.000e-04, eta: 22:28:43, time: 1.076, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0565, loss_cls: 0.2306, acc: 92.0393, loss_bbox: 0.2766, loss_mask: 0.2616, loss: 0.8526
2024-05-30 16:36:06,319 - mmdet - INFO - Epoch [2][7250/7330] lr: 1.000e-04, eta: 22:27:51, time: 1.113, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0293, loss_rpn_bbox: 0.0552, loss_cls: 0.2276, acc: 92.0220, loss_bbox: 0.2782, loss_mask: 0.2706, loss: 0.8609
2024-05-30 16:36:59,969 - mmdet - INFO - Epoch [2][7300/7330] lr: 1.000e-04, eta: 22:26:49, time: 1.073, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0292, loss_rpn_bbox: 0.0554, loss_cls: 0.2283, acc: 92.0378, loss_bbox: 0.2761, loss_mask: 0.2700, loss: 0.8590
2024-05-30 16:37:37,071 - mmdet - INFO - Saving checkpoint at 2 epochs
2024-05-30 16:40:02,623 - mmdet - INFO - Evaluating bbox...
2024-05-30 16:40:26,459 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.383
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.638
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.407
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.229
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.427
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.523
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.509
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ]  = 0.509
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.509
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.325
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.552
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.663
2024-05-30 16:40:26,459 - mmdet - INFO - Evaluating segm...
2024-05-30 16:40:56,371 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.360
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.596
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.376
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.156
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.394
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.567
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.473
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ]  = 0.473
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.473
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.261
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.521
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.666
2024-05-30 16:40:56,811 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-30 16:40:56,813 - mmdet - INFO - Epoch(val) [2][625] bbox_mAP: 0.3830, bbox_mAP_50: 0.6380, bbox_mAP_75: 0.4070, bbox_mAP_s: 0.2290, bbox_mAP_m: 0.4270, bbox_mAP_l: 0.5230, bbox_mAP_copypaste: 0.383 0.638 0.407 0.229 0.427 0.523, segm_mAP: 0.3600, segm_mAP_50: 0.5960, segm_mAP_75: 0.3760, segm_mAP_s: 0.1560, segm_mAP_m: 0.3940, segm_mAP_l: 0.5670, segm_mAP_copypaste: 0.360 0.596 0.376 0.156 0.394 0.567
2024-05-30 16:41:59,608 - mmdet - INFO - Epoch [3][50/7330] lr: 1.000e-04, eta: 22:23:14, time: 1.256, data_time: 0.139, memory: 24290, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0549, loss_cls: 0.2279, acc: 91.9390, loss_bbox: 0.2849, loss_mask: 0.2751, loss: 0.8687
2024-05-30 16:42:53,871 - mmdet - INFO - Epoch [3][100/7330] lr: 1.000e-04, eta: 22:22:15, time: 1.085, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0571, loss_cls: 0.2219, acc: 92.1748, loss_bbox: 0.2769, loss_mask: 0.2719, loss: 0.8549
2024-05-30 16:43:48,437 - mmdet - INFO - Epoch [3][150/7330] lr: 1.000e-04, eta: 22:21:18, time: 1.091, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0550, loss_cls: 0.2180, acc: 92.2795, loss_bbox: 0.2747, loss_mask: 0.2646, loss: 0.8383
2024-05-30 16:44:42,392 - mmdet - INFO - Epoch [3][200/7330] lr: 1.000e-04, eta: 22:20:18, time: 1.079, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0566, loss_cls: 0.2244, acc: 92.0190, loss_bbox: 0.2848, loss_mask: 0.2673, loss: 0.8562
2024-05-30 16:45:36,106 - mmdet - INFO - Epoch [3][250/7330] lr: 1.000e-04, eta: 22:19:16, time: 1.074, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0540, loss_cls: 0.2160, acc: 92.3313, loss_bbox: 0.2699, loss_mask: 0.2699, loss: 0.8351
2024-05-30 16:46:30,453 - mmdet - INFO - Epoch [3][300/7330] lr: 1.000e-04, eta: 22:18:18, time: 1.087, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0525, loss_cls: 0.2231, acc: 92.2188, loss_bbox: 0.2734, loss_mask: 0.2699, loss: 0.8460
2024-05-30 16:47:24,512 - mmdet - INFO - Epoch [3][350/7330] lr: 1.000e-04, eta: 22:17:19, time: 1.081, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0541, loss_cls: 0.2297, acc: 91.7524, loss_bbox: 0.2814, loss_mask: 0.2646, loss: 0.8549
2024-05-30 16:48:21,939 - mmdet - INFO - Epoch [3][400/7330] lr: 1.000e-04, eta: 22:16:35, time: 1.148, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0267, loss_rpn_bbox: 0.0574, loss_cls: 0.2289, acc: 91.8735, loss_bbox:
0.2773, loss_mask: 0.2694, loss: 0.8596 2024-05-30 16:49:15,975 - mmdet - INFO - Epoch [3][450/7330] lr: 1.000e-04, eta: 22:15:36, time: 1.081, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0546, loss_cls: 0.2231, acc: 92.0627, loss_bbox: 0.2776, loss_mask: 0.2630, loss: 0.8435 2024-05-30 16:50:11,640 - mmdet - INFO - Epoch [3][500/7330] lr: 1.000e-04, eta: 22:14:44, time: 1.113, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0518, loss_cls: 0.2164, acc: 92.3252, loss_bbox: 0.2657, loss_mask: 0.2616, loss: 0.8192 2024-05-30 16:51:05,757 - mmdet - INFO - Epoch [3][550/7330] lr: 1.000e-04, eta: 22:13:45, time: 1.082, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0534, loss_cls: 0.2155, acc: 92.3792, loss_bbox: 0.2708, loss_mask: 0.2603, loss: 0.8258 2024-05-30 16:51:59,386 - mmdet - INFO - Epoch [3][600/7330] lr: 1.000e-04, eta: 22:12:43, time: 1.073, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0529, loss_cls: 0.2170, acc: 92.3838, loss_bbox: 0.2641, loss_mask: 0.2603, loss: 0.8203 2024-05-30 16:52:55,798 - mmdet - INFO - Epoch [3][650/7330] lr: 1.000e-04, eta: 22:11:55, time: 1.128, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0523, loss_cls: 0.2181, acc: 92.2512, loss_bbox: 0.2709, loss_mask: 0.2581, loss: 0.8231 2024-05-30 16:53:52,192 - mmdet - INFO - Epoch [3][700/7330] lr: 1.000e-04, eta: 22:11:06, time: 1.128, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0559, loss_cls: 0.2186, acc: 92.1775, loss_bbox: 0.2715, loss_mask: 0.2675, loss: 0.8404 2024-05-30 16:54:49,135 - mmdet - INFO - Epoch [3][750/7330] lr: 1.000e-04, eta: 22:10:21, time: 1.139, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0553, loss_cls: 0.2142, acc: 92.3003, loss_bbox: 0.2658, loss_mask: 0.2658, loss: 0.8267 2024-05-30 16:55:42,725 - mmdet - INFO - Epoch [3][800/7330] lr: 1.000e-04, eta: 22:09:19, time: 1.072, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0555, loss_cls: 0.2144, acc: 92.4028, loss_bbox: 0.2696, loss_mask: 0.2605, loss: 0.8234 2024-05-30 16:56:38,933 - mmdet - INFO - Epoch [3][850/7330] lr: 1.000e-04, eta: 22:08:30, time: 1.124, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0502, loss_cls: 0.2086, acc: 92.6257, loss_bbox: 0.2625, loss_mask: 0.2618, loss: 0.8078 2024-05-30 16:57:40,263 - mmdet - INFO - Epoch [3][900/7330] lr: 1.000e-04, eta: 22:08:04, time: 1.227, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0563, loss_cls: 0.2230, acc: 92.1912, loss_bbox: 0.2731, loss_mask: 0.2622, loss: 0.8390 2024-05-30 16:58:34,601 - mmdet - INFO - Epoch [3][950/7330] lr: 1.000e-04, eta: 22:07:06, time: 1.087, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0282, loss_rpn_bbox: 0.0582, loss_cls: 0.2376, acc: 91.5439, loss_bbox: 0.2935, loss_mask: 0.2669, loss: 0.8845 2024-05-30 16:59:29,902 - mmdet - INFO - Epoch [3][1000/7330] lr: 1.000e-04, eta: 22:06:12, time: 1.106, data_time: 0.093, memory: 24290, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0576, loss_cls: 0.2302, acc: 91.8516, loss_bbox: 0.2855, loss_mask: 0.2669, loss: 0.8678 2024-05-30 17:00:23,708 - mmdet - INFO - Epoch [3][1050/7330] lr: 1.000e-04, eta: 22:05:11, time: 1.076, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0544, loss_cls: 0.2185, acc: 92.2273, loss_bbox: 0.2767, loss_mask: 0.2688, loss: 0.8445 2024-05-30 17:01:20,366 - mmdet - INFO - Epoch [3][1100/7330] lr: 
1.000e-04, eta: 22:04:24, time: 1.133, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0551, loss_cls: 0.2305, acc: 91.7986, loss_bbox: 0.2872, loss_mask: 0.2622, loss: 0.8602 2024-05-30 17:02:14,330 - mmdet - INFO - Epoch [3][1150/7330] lr: 1.000e-04, eta: 22:03:24, time: 1.079, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0537, loss_cls: 0.2205, acc: 92.2971, loss_bbox: 0.2699, loss_mask: 0.2602, loss: 0.8291 2024-05-30 17:03:08,402 - mmdet - INFO - Epoch [3][1200/7330] lr: 1.000e-04, eta: 22:02:24, time: 1.081, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0560, loss_cls: 0.2242, acc: 92.1538, loss_bbox: 0.2800, loss_mask: 0.2637, loss: 0.8484 2024-05-30 17:04:04,547 - mmdet - INFO - Epoch [3][1250/7330] lr: 1.000e-04, eta: 22:01:34, time: 1.123, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0261, loss_rpn_bbox: 0.0527, loss_cls: 0.2173, acc: 92.2451, loss_bbox: 0.2699, loss_mask: 0.2577, loss: 0.8237 2024-05-30 17:05:04,141 - mmdet - INFO - Epoch [3][1300/7330] lr: 1.000e-04, eta: 22:01:00, time: 1.192, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0524, loss_cls: 0.2244, acc: 92.1777, loss_bbox: 0.2706, loss_mask: 0.2642, loss: 0.8359 2024-05-30 17:05:57,596 - mmdet - INFO - Epoch [3][1350/7330] lr: 1.000e-04, eta: 21:59:58, time: 1.069, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0522, loss_cls: 0.2210, acc: 92.2227, loss_bbox: 0.2720, loss_mask: 0.2672, loss: 0.8364 2024-05-30 17:06:54,049 - mmdet - INFO - Epoch [3][1400/7330] lr: 1.000e-04, eta: 21:59:09, time: 1.129, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0556, loss_cls: 0.2268, acc: 91.9656, loss_bbox: 0.2796, loss_mask: 0.2666, loss: 0.8542 2024-05-30 17:07:50,961 - mmdet - INFO - Epoch [3][1450/7330] lr: 1.000e-04, eta: 21:58:22, time: 1.138, data_time: 0.094, memory: 24290, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0527, loss_cls: 0.2132, acc: 92.5190, loss_bbox: 0.2598, loss_mask: 0.2563, loss: 0.8059 2024-05-30 17:08:49,841 - mmdet - INFO - Epoch [3][1500/7330] lr: 1.000e-04, eta: 21:57:44, time: 1.178, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0527, loss_cls: 0.2184, acc: 92.3752, loss_bbox: 0.2680, loss_mask: 0.2617, loss: 0.8253 2024-05-30 17:09:44,900 - mmdet - INFO - Epoch [3][1550/7330] lr: 1.000e-04, eta: 21:56:49, time: 1.101, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0269, loss_rpn_bbox: 0.0559, loss_cls: 0.2239, acc: 92.0654, loss_bbox: 0.2746, loss_mask: 0.2625, loss: 0.8437 2024-05-30 17:10:39,465 - mmdet - INFO - Epoch [3][1600/7330] lr: 1.000e-04, eta: 21:55:52, time: 1.091, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0554, loss_cls: 0.2216, acc: 92.2166, loss_bbox: 0.2726, loss_mask: 0.2602, loss: 0.8356 2024-05-30 17:11:37,311 - mmdet - INFO - Epoch [3][1650/7330] lr: 1.000e-04, eta: 21:55:09, time: 1.157, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0541, loss_cls: 0.2148, acc: 92.4626, loss_bbox: 0.2644, loss_mask: 0.2648, loss: 0.8247 2024-05-30 17:12:31,735 - mmdet - INFO - Epoch [3][1700/7330] lr: 1.000e-04, eta: 21:54:11, time: 1.088, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0513, loss_cls: 0.2216, acc: 92.2275, loss_bbox: 0.2714, loss_mask: 0.2569, loss: 0.8265 2024-05-30 17:13:27,039 - mmdet - INFO - Epoch [3][1750/7330] lr: 1.000e-04, eta: 21:53:17, time: 1.106, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0270, 
loss_rpn_bbox: 0.0574, loss_cls: 0.2247, acc: 91.9580, loss_bbox: 0.2801, loss_mask: 0.2602, loss: 0.8493 2024-05-30 17:14:24,970 - mmdet - INFO - Epoch [3][1800/7330] lr: 1.000e-04, eta: 21:52:35, time: 1.158, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0297, loss_rpn_bbox: 0.0525, loss_cls: 0.2204, acc: 92.1125, loss_bbox: 0.2710, loss_mask: 0.2670, loss: 0.8406 2024-05-30 17:15:22,043 - mmdet - INFO - Epoch [3][1850/7330] lr: 1.000e-04, eta: 21:51:48, time: 1.142, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0564, loss_cls: 0.2116, acc: 92.4890, loss_bbox: 0.2681, loss_mask: 0.2596, loss: 0.8220 2024-05-30 17:16:15,604 - mmdet - INFO - Epoch [3][1900/7330] lr: 1.000e-04, eta: 21:50:47, time: 1.071, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0526, loss_cls: 0.2219, acc: 92.1504, loss_bbox: 0.2766, loss_mask: 0.2610, loss: 0.8374 2024-05-30 17:17:09,138 - mmdet - INFO - Epoch [3][1950/7330] lr: 1.000e-04, eta: 21:49:45, time: 1.071, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0508, loss_cls: 0.2145, acc: 92.4094, loss_bbox: 0.2649, loss_mask: 0.2568, loss: 0.8094 2024-05-30 17:18:08,210 - mmdet - INFO - Epoch [3][2000/7330] lr: 1.000e-04, eta: 21:49:07, time: 1.181, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0520, loss_cls: 0.2144, acc: 92.4358, loss_bbox: 0.2622, loss_mask: 0.2562, loss: 0.8076 2024-05-30 17:19:02,328 - mmdet - INFO - Epoch [3][2050/7330] lr: 1.000e-04, eta: 21:48:08, time: 1.082, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0558, loss_cls: 0.2274, acc: 91.8508, loss_bbox: 0.2799, loss_mask: 0.2633, loss: 0.8520 2024-05-30 17:19:58,932 - mmdet - INFO - Epoch [3][2100/7330] lr: 1.000e-04, eta: 21:47:19, time: 1.132, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0509, loss_cls: 0.2130, acc: 92.4321, loss_bbox: 0.2621, loss_mask: 0.2607, loss: 0.8099 2024-05-30 17:20:54,910 - mmdet - INFO - Epoch [3][2150/7330] lr: 1.000e-04, eta: 21:46:28, time: 1.120, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0245, loss_rpn_bbox: 0.0529, loss_cls: 0.2219, acc: 92.2593, loss_bbox: 0.2727, loss_mask: 0.2633, loss: 0.8353 2024-05-30 17:21:48,437 - mmdet - INFO - Epoch [3][2200/7330] lr: 1.000e-04, eta: 21:45:26, time: 1.070, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0530, loss_cls: 0.2212, acc: 92.1887, loss_bbox: 0.2701, loss_mask: 0.2584, loss: 0.8261 2024-05-30 17:22:45,409 - mmdet - INFO - Epoch [3][2250/7330] lr: 1.000e-04, eta: 21:44:39, time: 1.139, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0565, loss_cls: 0.2262, acc: 91.9814, loss_bbox: 0.2825, loss_mask: 0.2624, loss: 0.8527 2024-05-30 17:23:39,790 - mmdet - INFO - Epoch [3][2300/7330] lr: 1.000e-04, eta: 21:43:41, time: 1.088, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0284, loss_rpn_bbox: 0.0566, loss_cls: 0.2227, acc: 92.2510, loss_bbox: 0.2707, loss_mask: 0.2526, loss: 0.8310 2024-05-30 17:24:34,452 - mmdet - INFO - Epoch [3][2350/7330] lr: 1.000e-04, eta: 21:42:44, time: 1.093, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0268, loss_rpn_bbox: 0.0559, loss_cls: 0.2273, acc: 91.8818, loss_bbox: 0.2852, loss_mask: 0.2671, loss: 0.8623 2024-05-30 17:25:33,454 - mmdet - INFO - Epoch [3][2400/7330] lr: 1.000e-04, eta: 21:42:06, time: 1.180, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0274, loss_rpn_bbox: 0.0552, loss_cls: 0.2192, acc: 92.3069, loss_bbox: 0.2696, loss_mask: 0.2635, 
loss: 0.8349 2024-05-30 17:26:30,085 - mmdet - INFO - Epoch [3][2450/7330] lr: 1.000e-04, eta: 21:41:17, time: 1.133, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0553, loss_cls: 0.2249, acc: 92.0989, loss_bbox: 0.2794, loss_mask: 0.2616, loss: 0.8484 2024-05-30 17:27:25,005 - mmdet - INFO - Epoch [3][2500/7330] lr: 1.000e-04, eta: 21:40:21, time: 1.098, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0259, loss_rpn_bbox: 0.0557, loss_cls: 0.2338, acc: 91.6047, loss_bbox: 0.2892, loss_mask: 0.2665, loss: 0.8710 2024-05-30 17:28:19,616 - mmdet - INFO - Epoch [3][2550/7330] lr: 1.000e-04, eta: 21:39:24, time: 1.092, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0559, loss_cls: 0.2277, acc: 91.8044, loss_bbox: 0.2851, loss_mask: 0.2720, loss: 0.8684 2024-05-30 17:29:16,340 - mmdet - INFO - Epoch [3][2600/7330] lr: 1.000e-04, eta: 21:38:35, time: 1.134, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0573, loss_cls: 0.2141, acc: 92.2224, loss_bbox: 0.2738, loss_mask: 0.2593, loss: 0.8299 2024-05-30 17:30:16,577 - mmdet - INFO - Epoch [3][2650/7330] lr: 1.000e-04, eta: 21:38:01, time: 1.205, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0522, loss_cls: 0.2145, acc: 92.3901, loss_bbox: 0.2706, loss_mask: 0.2561, loss: 0.8175 2024-05-30 17:31:11,843 - mmdet - INFO - Epoch [3][2700/7330] lr: 1.000e-04, eta: 21:37:07, time: 1.105, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0536, loss_cls: 0.2172, acc: 92.3726, loss_bbox: 0.2631, loss_mask: 0.2639, loss: 0.8234 2024-05-30 17:32:08,574 - mmdet - INFO - Epoch [3][2750/7330] lr: 1.000e-04, eta: 21:36:18, time: 1.135, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0555, loss_cls: 0.2274, acc: 91.9929, loss_bbox: 0.2765, loss_mask: 0.2628, loss: 0.8473 2024-05-30 17:33:04,952 - mmdet - INFO - Epoch [3][2800/7330] lr: 1.000e-04, eta: 21:35:28, time: 1.128, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0512, loss_cls: 0.2106, acc: 92.4546, loss_bbox: 0.2631, loss_mask: 0.2554, loss: 0.8045 2024-05-30 17:33:58,731 - mmdet - INFO - Epoch [3][2850/7330] lr: 1.000e-04, eta: 21:34:28, time: 1.076, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0522, loss_cls: 0.2121, acc: 92.4395, loss_bbox: 0.2688, loss_mask: 0.2638, loss: 0.8221 2024-05-30 17:34:52,527 - mmdet - INFO - Epoch [3][2900/7330] lr: 1.000e-04, eta: 21:33:27, time: 1.076, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0529, loss_cls: 0.2192, acc: 92.2700, loss_bbox: 0.2671, loss_mask: 0.2650, loss: 0.8297 2024-05-30 17:35:53,823 - mmdet - INFO - Epoch [3][2950/7330] lr: 1.000e-04, eta: 21:32:57, time: 1.226, data_time: 0.092, memory: 24290, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0545, loss_cls: 0.2234, acc: 92.1389, loss_bbox: 0.2710, loss_mask: 0.2570, loss: 0.8315 2024-05-30 17:36:47,494 - mmdet - INFO - Epoch [3][3000/7330] lr: 1.000e-04, eta: 21:31:56, time: 1.073, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0512, loss_cls: 0.2168, acc: 92.5413, loss_bbox: 0.2563, loss_mask: 0.2574, loss: 0.8059 2024-05-30 17:37:41,488 - mmdet - INFO - Epoch [3][3050/7330] lr: 1.000e-04, eta: 21:30:56, time: 1.080, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0536, loss_cls: 0.2201, acc: 92.1470, loss_bbox: 0.2692, loss_mask: 0.2612, loss: 0.8284 2024-05-30 17:38:35,380 - mmdet - INFO - Epoch [3][3100/7330] lr: 1.000e-04, eta: 
21:29:56, time: 1.078, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0515, loss_cls: 0.2091, acc: 92.5549, loss_bbox: 0.2608, loss_mask: 0.2524, loss: 0.7966 2024-05-30 17:39:34,837 - mmdet - INFO - Epoch [3][3150/7330] lr: 1.000e-04, eta: 21:29:18, time: 1.189, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0266, loss_rpn_bbox: 0.0540, loss_cls: 0.2149, acc: 92.3896, loss_bbox: 0.2638, loss_mask: 0.2579, loss: 0.8171 2024-05-30 17:40:29,049 - mmdet - INFO - Epoch [3][3200/7330] lr: 1.000e-04, eta: 21:28:19, time: 1.084, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0538, loss_cls: 0.2130, acc: 92.4553, loss_bbox: 0.2557, loss_mask: 0.2571, loss: 0.8041 2024-05-30 17:41:24,973 - mmdet - INFO - Epoch [3][3250/7330] lr: 1.000e-04, eta: 21:27:27, time: 1.118, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0486, loss_cls: 0.2076, acc: 92.6311, loss_bbox: 0.2539, loss_mask: 0.2490, loss: 0.7827 2024-05-30 17:42:21,009 - mmdet - INFO - Epoch [3][3300/7330] lr: 1.000e-04, eta: 21:26:35, time: 1.121, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0513, loss_cls: 0.2147, acc: 92.4004, loss_bbox: 0.2681, loss_mask: 0.2605, loss: 0.8184 2024-05-30 17:43:15,956 - mmdet - INFO - Epoch [3][3350/7330] lr: 1.000e-04, eta: 21:25:40, time: 1.099, data_time: 0.092, memory: 24290, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0597, loss_cls: 0.2229, acc: 92.0054, loss_bbox: 0.2818, loss_mask: 0.2594, loss: 0.8494 2024-05-30 17:44:13,563 - mmdet - INFO - Epoch [3][3400/7330] lr: 1.000e-04, eta: 21:24:54, time: 1.152, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0277, loss_rpn_bbox: 0.0581, loss_cls: 0.2239, acc: 92.1567, loss_bbox: 0.2746, loss_mask: 0.2691, loss: 0.8535 2024-05-30 17:45:09,861 - mmdet - INFO - Epoch [3][3450/7330] lr: 1.000e-04, eta: 21:24:03, time: 1.126, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0263, loss_rpn_bbox: 0.0533, loss_cls: 0.2263, acc: 91.9448, loss_bbox: 0.2826, loss_mask: 0.2604, loss: 0.8489 2024-05-30 17:46:03,919 - mmdet - INFO - Epoch [3][3500/7330] lr: 1.000e-04, eta: 21:23:04, time: 1.081, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0539, loss_cls: 0.2082, acc: 92.6243, loss_bbox: 0.2548, loss_mask: 0.2510, loss: 0.7932 2024-05-30 17:47:00,424 - mmdet - INFO - Epoch [3][3550/7330] lr: 1.000e-04, eta: 21:22:14, time: 1.130, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0550, loss_cls: 0.2149, acc: 92.2681, loss_bbox: 0.2734, loss_mask: 0.2577, loss: 0.8263 2024-05-30 17:47:56,502 - mmdet - INFO - Epoch [3][3600/7330] lr: 1.000e-04, eta: 21:21:22, time: 1.122, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0514, loss_cls: 0.2106, acc: 92.6252, loss_bbox: 0.2536, loss_mask: 0.2560, loss: 0.7953 2024-05-30 17:48:54,262 - mmdet - INFO - Epoch [3][3650/7330] lr: 1.000e-04, eta: 21:20:37, time: 1.155, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0537, loss_cls: 0.2287, acc: 91.8093, loss_bbox: 0.2756, loss_mask: 0.2550, loss: 0.8379 2024-05-30 17:49:50,572 - mmdet - INFO - Epoch [3][3700/7330] lr: 1.000e-04, eta: 21:19:46, time: 1.126, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0559, loss_cls: 0.2244, acc: 92.0552, loss_bbox: 0.2767, loss_mask: 0.2579, loss: 0.8380 2024-05-30 17:50:44,615 - mmdet - INFO - Epoch [3][3750/7330] lr: 1.000e-04, eta: 21:18:47, time: 1.081, data_time: 0.095, memory: 24290, loss_rpn_cls: 0.0245, loss_rpn_bbox: 
0.0546, loss_cls: 0.2253, acc: 91.9832, loss_bbox: 0.2798, loss_mask: 0.2596, loss: 0.8437 2024-05-30 17:51:38,335 - mmdet - INFO - Epoch [3][3800/7330] lr: 1.000e-04, eta: 21:17:46, time: 1.074, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0524, loss_cls: 0.2164, acc: 92.2659, loss_bbox: 0.2679, loss_mask: 0.2610, loss: 0.8222 2024-05-30 17:52:34,636 - mmdet - INFO - Epoch [3][3850/7330] lr: 1.000e-04, eta: 21:16:55, time: 1.126, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0504, loss_cls: 0.2088, acc: 92.5620, loss_bbox: 0.2568, loss_mask: 0.2522, loss: 0.7897 2024-05-30 17:53:32,911 - mmdet - INFO - Epoch [3][3900/7330] lr: 1.000e-04, eta: 21:16:12, time: 1.166, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0525, loss_cls: 0.2183, acc: 92.2974, loss_bbox: 0.2664, loss_mask: 0.2591, loss: 0.8206 2024-05-30 17:54:27,009 - mmdet - INFO - Epoch [3][3950/7330] lr: 1.000e-04, eta: 21:15:13, time: 1.082, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0237, loss_rpn_bbox: 0.0513, loss_cls: 0.2155, acc: 92.4922, loss_bbox: 0.2618, loss_mask: 0.2587, loss: 0.8109 2024-05-30 17:55:21,603 - mmdet - INFO - Epoch [3][4000/7330] lr: 1.000e-04, eta: 21:14:16, time: 1.092, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0542, loss_cls: 0.2234, acc: 92.1184, loss_bbox: 0.2717, loss_mask: 0.2646, loss: 0.8389 2024-05-30 17:56:19,579 - mmdet - INFO - Epoch [3][4050/7330] lr: 1.000e-04, eta: 21:13:31, time: 1.160, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0548, loss_cls: 0.2297, acc: 91.9189, loss_bbox: 0.2817, loss_mask: 0.2639, loss: 0.8550 2024-05-30 17:57:16,348 - mmdet - INFO - Epoch [3][4100/7330] lr: 1.000e-04, eta: 21:12:41, time: 1.135, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0552, loss_cls: 0.2227, acc: 92.2139, loss_bbox: 0.2739, loss_mask: 0.2606, loss: 0.8378 2024-05-30 17:58:09,899 - mmdet - INFO - Epoch [3][4150/7330] lr: 1.000e-04, eta: 21:11:40, time: 1.071, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0516, loss_cls: 0.2109, acc: 92.5571, loss_bbox: 0.2568, loss_mask: 0.2579, loss: 0.7999 2024-05-30 17:59:06,920 - mmdet - INFO - Epoch [3][4200/7330] lr: 1.000e-04, eta: 21:10:52, time: 1.140, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0539, loss_cls: 0.2117, acc: 92.4834, loss_bbox: 0.2638, loss_mask: 0.2556, loss: 0.8068 2024-05-30 18:00:00,852 - mmdet - INFO - Epoch [3][4250/7330] lr: 1.000e-04, eta: 21:09:52, time: 1.079, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0248, loss_rpn_bbox: 0.0520, loss_cls: 0.2183, acc: 92.3289, loss_bbox: 0.2664, loss_mask: 0.2610, loss: 0.8226 2024-05-30 18:00:56,859 - mmdet - INFO - Epoch [3][4300/7330] lr: 1.000e-04, eta: 21:09:00, time: 1.120, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0247, loss_rpn_bbox: 0.0534, loss_cls: 0.2193, acc: 92.1638, loss_bbox: 0.2718, loss_mask: 0.2604, loss: 0.8296 2024-05-30 18:01:51,021 - mmdet - INFO - Epoch [3][4350/7330] lr: 1.000e-04, eta: 21:08:01, time: 1.083, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0527, loss_cls: 0.2093, acc: 92.5400, loss_bbox: 0.2605, loss_mask: 0.2566, loss: 0.8049 2024-05-30 18:02:50,319 - mmdet - INFO - Epoch [3][4400/7330] lr: 1.000e-04, eta: 21:07:21, time: 1.186, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0535, loss_cls: 0.2192, acc: 92.1270, loss_bbox: 0.2682, loss_mask: 0.2542, loss: 0.8194 
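Each iteration record above reports five component losses and a total; the total is simply their sum (loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask, up to log rounding). A minimal Python sketch for checking that relationship when post-processing a log in this format (the regex and helper name below are illustrative, not part of MMDetection):

import re

# Pattern for one training-iteration record as printed in this log
# (illustrative; not an MMDetection API).
ITER_RE = re.compile(
    r"Epoch \[\d+\]\[\d+/\d+\].*?"
    r"loss_rpn_cls: (?P<rpn_cls>[\d.]+), loss_rpn_bbox: (?P<rpn_bbox>[\d.]+), "
    r"loss_cls: (?P<cls>[\d.]+), acc: [\d.]+, loss_bbox: (?P<bbox>[\d.]+), "
    r"loss_mask: (?P<mask>[\d.]+), loss: (?P<total>[\d.]+)"
)

def loss_components_match(line, tol=5e-4):
    """Return True if the reported total loss equals the sum of its parts."""
    m = ITER_RE.search(line)
    if m is None:
        return False
    parts = sum(float(m.group(k)) for k in ("rpn_cls", "rpn_bbox", "cls", "bbox", "mask"))
    return abs(parts - float(m.group("total"))) <= tol

# Record copied from the Epoch [3][4400/7330] line above.
sample = ("Epoch [3][4400/7330] lr: 1.000e-04, eta: 21:07:21, time: 1.186, "
          "data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0242, "
          "loss_rpn_bbox: 0.0535, loss_cls: 0.2192, acc: 92.1270, "
          "loss_bbox: 0.2682, loss_mask: 0.2542, loss: 0.8194")
print(loss_components_match(sample))  # True: 0.0242+0.0535+0.2192+0.2682+0.2542 = 0.8193 ~ 0.8194

For the Epoch [3][4400/7330] record, the components sum to 0.8193 against a reported total of 0.8194, i.e. they agree to rounding; the same check applied across entries is a quick sanity test when parsing logs like this one.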
2024-05-30 18:03:47,516 - mmdet - INFO - Epoch [3][4450/7330] lr: 1.000e-04, eta: 21:06:33, time: 1.144, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0520, loss_cls: 0.2207, acc: 92.1899, loss_bbox: 0.2684, loss_mask: 0.2558, loss: 0.8208 2024-05-30 18:04:41,244 - mmdet - INFO - Epoch [3][4500/7330] lr: 1.000e-04, eta: 21:05:33, time: 1.074, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0508, loss_cls: 0.2183, acc: 92.1875, loss_bbox: 0.2715, loss_mask: 0.2563, loss: 0.8209 2024-05-30 18:05:35,034 - mmdet - INFO - Epoch [3][4550/7330] lr: 1.000e-04, eta: 21:04:32, time: 1.076, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0540, loss_cls: 0.2210, acc: 92.1877, loss_bbox: 0.2728, loss_mask: 0.2618, loss: 0.8338 2024-05-30 18:06:33,879 - mmdet - INFO - Epoch [3][4600/7330] lr: 1.000e-04, eta: 21:03:50, time: 1.177, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0540, loss_cls: 0.2225, acc: 92.1694, loss_bbox: 0.2753, loss_mask: 0.2593, loss: 0.8361 2024-05-30 18:07:27,442 - mmdet - INFO - Epoch [3][4650/7330] lr: 1.000e-04, eta: 21:02:49, time: 1.071, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0530, loss_cls: 0.2101, acc: 92.6023, loss_bbox: 0.2605, loss_mask: 0.2530, loss: 0.8022 2024-05-30 18:08:23,678 - mmdet - INFO - Epoch [3][4700/7330] lr: 1.000e-04, eta: 21:01:58, time: 1.125, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0511, loss_cls: 0.2031, acc: 92.7476, loss_bbox: 0.2564, loss_mask: 0.2547, loss: 0.7887 2024-05-30 18:09:18,053 - mmdet - INFO - Epoch [3][4750/7330] lr: 1.000e-04, eta: 21:01:00, time: 1.088, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0258, loss_rpn_bbox: 0.0532, loss_cls: 0.2158, acc: 92.4343, loss_bbox: 0.2608, loss_mask: 0.2613, loss: 0.8170 2024-05-30 18:10:14,196 - mmdet - INFO - Epoch [3][4800/7330] lr: 1.000e-04, eta: 21:00:08, time: 1.123, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0257, loss_rpn_bbox: 0.0519, loss_cls: 0.2042, acc: 92.8137, loss_bbox: 0.2542, loss_mask: 0.2571, loss: 0.7931 2024-05-30 18:11:10,236 - mmdet - INFO - Epoch [3][4850/7330] lr: 1.000e-04, eta: 20:59:16, time: 1.121, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0521, loss_cls: 0.2145, acc: 92.3560, loss_bbox: 0.2617, loss_mask: 0.2600, loss: 0.8143 2024-05-30 18:12:06,215 - mmdet - INFO - Epoch [3][4900/7330] lr: 1.000e-04, eta: 20:58:23, time: 1.120, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0522, loss_cls: 0.2025, acc: 92.8450, loss_bbox: 0.2527, loss_mask: 0.2560, loss: 0.7897 2024-05-30 18:13:05,447 - mmdet - INFO - Epoch [3][4950/7330] lr: 1.000e-04, eta: 20:57:42, time: 1.185, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0256, loss_rpn_bbox: 0.0538, loss_cls: 0.2136, acc: 92.4905, loss_bbox: 0.2642, loss_mask: 0.2609, loss: 0.8181 2024-05-30 18:13:59,841 - mmdet - INFO - Epoch [3][5000/7330] lr: 1.000e-04, eta: 20:56:44, time: 1.088, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0504, loss_cls: 0.2154, acc: 92.4143, loss_bbox: 0.2635, loss_mask: 0.2556, loss: 0.8076 2024-05-30 18:14:54,169 - mmdet - INFO - Epoch [3][5050/7330] lr: 1.000e-04, eta: 20:55:46, time: 1.087, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0530, loss_cls: 0.2168, acc: 92.4424, loss_bbox: 0.2582, loss_mask: 0.2610, loss: 0.8144 2024-05-30 18:15:50,319 - mmdet - INFO - Epoch [3][5100/7330] lr: 1.000e-04, eta: 20:54:54, 
time: 1.123, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0506, loss_cls: 0.2084, acc: 92.6858, loss_bbox: 0.2603, loss_mask: 0.2505, loss: 0.7920 2024-05-30 18:16:47,190 - mmdet - INFO - Epoch [3][5150/7330] lr: 1.000e-04, eta: 20:54:05, time: 1.137, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0534, loss_cls: 0.2084, acc: 92.5339, loss_bbox: 0.2553, loss_mask: 0.2525, loss: 0.7957 2024-05-30 18:17:41,444 - mmdet - INFO - Epoch [3][5200/7330] lr: 1.000e-04, eta: 20:53:06, time: 1.085, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0517, loss_cls: 0.2136, acc: 92.4172, loss_bbox: 0.2619, loss_mask: 0.2552, loss: 0.8060 2024-05-30 18:18:35,366 - mmdet - INFO - Epoch [3][5250/7330] lr: 1.000e-04, eta: 20:52:07, time: 1.078, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0536, loss_cls: 0.2192, acc: 92.3584, loss_bbox: 0.2617, loss_mask: 0.2591, loss: 0.8198 2024-05-30 18:19:30,001 - mmdet - INFO - Epoch [3][5300/7330] lr: 1.000e-04, eta: 20:51:09, time: 1.093, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0235, loss_rpn_bbox: 0.0503, loss_cls: 0.2110, acc: 92.4836, loss_bbox: 0.2603, loss_mask: 0.2555, loss: 0.8006 2024-05-30 18:20:28,856 - mmdet - INFO - Epoch [3][5350/7330] lr: 1.000e-04, eta: 20:50:27, time: 1.177, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0553, loss_cls: 0.2252, acc: 92.0334, loss_bbox: 0.2710, loss_mask: 0.2610, loss: 0.8367 2024-05-30 18:21:22,276 - mmdet - INFO - Epoch [3][5400/7330] lr: 1.000e-04, eta: 20:49:25, time: 1.068, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0493, loss_cls: 0.2065, acc: 92.6267, loss_bbox: 0.2571, loss_mask: 0.2529, loss: 0.7878 2024-05-30 18:22:22,444 - mmdet - INFO - Epoch [3][5450/7330] lr: 1.000e-04, eta: 20:48:47, time: 1.203, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0500, loss_cls: 0.2218, acc: 92.1667, loss_bbox: 0.2710, loss_mask: 0.2602, loss: 0.8244 2024-05-30 18:23:20,215 - mmdet - INFO - Epoch [3][5500/7330] lr: 1.000e-04, eta: 20:48:00, time: 1.155, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0251, loss_rpn_bbox: 0.0516, loss_cls: 0.2099, acc: 92.5984, loss_bbox: 0.2501, loss_mask: 0.2520, loss: 0.7887 2024-05-30 18:24:14,138 - mmdet - INFO - Epoch [3][5550/7330] lr: 1.000e-04, eta: 20:47:01, time: 1.078, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0512, loss_cls: 0.2141, acc: 92.4597, loss_bbox: 0.2569, loss_mask: 0.2498, loss: 0.7963 2024-05-30 18:25:08,112 - mmdet - INFO - Epoch [3][5600/7330] lr: 1.000e-04, eta: 20:46:01, time: 1.080, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0512, loss_cls: 0.2174, acc: 92.2737, loss_bbox: 0.2722, loss_mask: 0.2595, loss: 0.8226 2024-05-30 18:26:03,851 - mmdet - INFO - Epoch [3][5650/7330] lr: 1.000e-04, eta: 20:45:08, time: 1.115, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0254, loss_rpn_bbox: 0.0529, loss_cls: 0.2143, acc: 92.4111, loss_bbox: 0.2616, loss_mask: 0.2581, loss: 0.8124 2024-05-30 18:27:00,187 - mmdet - INFO - Epoch [3][5700/7330] lr: 1.000e-04, eta: 20:44:16, time: 1.127, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0500, loss_cls: 0.2095, acc: 92.5056, loss_bbox: 0.2610, loss_mask: 0.2561, loss: 0.8017 2024-05-30 18:27:54,849 - mmdet - INFO - Epoch [3][5750/7330] lr: 1.000e-04, eta: 20:43:19, time: 1.093, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0523, 
loss_cls: 0.2170, acc: 92.2751, loss_bbox: 0.2686, loss_mask: 0.2595, loss: 0.8216 2024-05-30 18:28:48,966 - mmdet - INFO - Epoch [3][5800/7330] lr: 1.000e-04, eta: 20:42:21, time: 1.082, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0495, loss_cls: 0.2096, acc: 92.5508, loss_bbox: 0.2534, loss_mask: 0.2548, loss: 0.7915 2024-05-30 18:29:42,936 - mmdet - INFO - Epoch [3][5850/7330] lr: 1.000e-04, eta: 20:41:21, time: 1.079, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0508, loss_cls: 0.2113, acc: 92.4717, loss_bbox: 0.2625, loss_mask: 0.2521, loss: 0.7989 2024-05-30 18:30:36,747 - mmdet - INFO - Epoch [3][5900/7330] lr: 1.000e-04, eta: 20:40:21, time: 1.076, data_time: 0.050, memory: 24290, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0532, loss_cls: 0.2105, acc: 92.5632, loss_bbox: 0.2555, loss_mask: 0.2548, loss: 0.7980 2024-05-30 18:31:37,025 - mmdet - INFO - Epoch [3][5950/7330] lr: 1.000e-04, eta: 20:39:43, time: 1.206, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0543, loss_cls: 0.2172, acc: 92.2358, loss_bbox: 0.2685, loss_mask: 0.2593, loss: 0.8244 2024-05-30 18:32:35,357 - mmdet - INFO - Epoch [3][6000/7330] lr: 1.000e-04, eta: 20:38:58, time: 1.167, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0488, loss_cls: 0.2176, acc: 92.4946, loss_bbox: 0.2597, loss_mask: 0.2551, loss: 0.8038 2024-05-30 18:33:29,136 - mmdet - INFO - Epoch [3][6050/7330] lr: 1.000e-04, eta: 20:37:58, time: 1.076, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0533, loss_cls: 0.2070, acc: 92.6665, loss_bbox: 0.2583, loss_mask: 0.2519, loss: 0.7948 2024-05-30 18:34:25,280 - mmdet - INFO - Epoch [3][6100/7330] lr: 1.000e-04, eta: 20:37:05, time: 1.123, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0519, loss_cls: 0.2230, acc: 92.0581, loss_bbox: 0.2674, loss_mask: 0.2549, loss: 0.8232 2024-05-30 18:35:20,865 - mmdet - INFO - Epoch [3][6150/7330] lr: 1.000e-04, eta: 20:36:05, time: 1.071, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0511, loss_cls: 0.2093, acc: 92.5515, loss_bbox: 0.2536, loss_mask: 0.2508, loss: 0.7862 2024-05-30 18:36:17,373 - mmdet - INFO - Epoch [3][6200/7330] lr: 1.000e-04, eta: 20:35:13, time: 1.128, data_time: 0.121, memory: 24290, loss_rpn_cls: 0.0262, loss_rpn_bbox: 0.0527, loss_cls: 0.2040, acc: 92.6892, loss_bbox: 0.2537, loss_mask: 0.2521, loss: 0.7888 2024-05-30 18:37:11,229 - mmdet - INFO - Epoch [3][6250/7330] lr: 1.000e-04, eta: 20:34:21, time: 1.121, data_time: 0.106, memory: 24290, loss_rpn_cls: 0.0242, loss_rpn_bbox: 0.0530, loss_cls: 0.2087, acc: 92.4307, loss_bbox: 0.2643, loss_mask: 0.2627, loss: 0.8129 2024-05-30 18:38:05,627 - mmdet - INFO - Epoch [3][6300/7330] lr: 1.000e-04, eta: 20:33:23, time: 1.088, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0556, loss_cls: 0.2164, acc: 92.3005, loss_bbox: 0.2634, loss_mask: 0.2599, loss: 0.8193 2024-05-30 18:38:59,282 - mmdet - INFO - Epoch [3][6350/7330] lr: 1.000e-04, eta: 20:32:23, time: 1.073, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0511, loss_cls: 0.2132, acc: 92.3892, loss_bbox: 0.2642, loss_mask: 0.2540, loss: 0.8058 2024-05-30 18:39:54,333 - mmdet - INFO - Epoch [3][6400/7330] lr: 1.000e-04, eta: 20:31:27, time: 1.101, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0559, loss_cls: 0.2161, acc: 92.2080, loss_bbox: 0.2651, loss_mask: 0.2602, loss: 0.8225 2024-05-30 
18:40:48,944 - mmdet - INFO - Epoch [3][6450/7330] lr: 1.000e-04, eta: 20:30:30, time: 1.092, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0574, loss_cls: 0.2252, acc: 92.0349, loss_bbox: 0.2708, loss_mask: 0.2603, loss: 0.8386 2024-05-30 18:41:44,726 - mmdet - INFO - Epoch [3][6500/7330] lr: 1.000e-04, eta: 20:29:36, time: 1.116, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0497, loss_cls: 0.2007, acc: 92.8752, loss_bbox: 0.2535, loss_mask: 0.2478, loss: 0.7755 2024-05-30 18:42:40,849 - mmdet - INFO - Epoch [3][6550/7330] lr: 1.000e-04, eta: 20:28:44, time: 1.123, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0228, loss_rpn_bbox: 0.0530, loss_cls: 0.2146, acc: 92.3828, loss_bbox: 0.2645, loss_mask: 0.2488, loss: 0.8037 2024-05-30 18:43:41,228 - mmdet - INFO - Epoch [3][6600/7330] lr: 1.000e-04, eta: 20:28:05, time: 1.208, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0515, loss_cls: 0.2144, acc: 92.3748, loss_bbox: 0.2579, loss_mask: 0.2555, loss: 0.8016 2024-05-30 18:44:38,728 - mmdet - INFO - Epoch [3][6650/7330] lr: 1.000e-04, eta: 20:27:17, time: 1.150, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0549, loss_cls: 0.2243, acc: 92.0330, loss_bbox: 0.2666, loss_mask: 0.2637, loss: 0.8346 2024-05-30 18:45:32,051 - mmdet - INFO - Epoch [3][6700/7330] lr: 1.000e-04, eta: 20:26:15, time: 1.066, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0249, loss_rpn_bbox: 0.0513, loss_cls: 0.2119, acc: 92.5659, loss_bbox: 0.2570, loss_mask: 0.2580, loss: 0.8030 2024-05-30 18:46:27,327 - mmdet - INFO - Epoch [3][6750/7330] lr: 1.000e-04, eta: 20:25:20, time: 1.106, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0519, loss_cls: 0.2168, acc: 92.4165, loss_bbox: 0.2629, loss_mask: 0.2561, loss: 0.8094 2024-05-30 18:47:23,052 - mmdet - INFO - Epoch [3][6800/7330] lr: 1.000e-04, eta: 20:24:26, time: 1.114, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0515, loss_cls: 0.2084, acc: 92.6128, loss_bbox: 0.2563, loss_mask: 0.2547, loss: 0.7942 2024-05-30 18:48:17,839 - mmdet - INFO - Epoch [3][6850/7330] lr: 1.000e-04, eta: 20:23:30, time: 1.096, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0255, loss_rpn_bbox: 0.0519, loss_cls: 0.2215, acc: 92.1685, loss_bbox: 0.2664, loss_mask: 0.2628, loss: 0.8281 2024-05-30 18:49:11,036 - mmdet - INFO - Epoch [3][6900/7330] lr: 1.000e-04, eta: 20:22:28, time: 1.064, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0224, loss_rpn_bbox: 0.0495, loss_cls: 0.2110, acc: 92.5503, loss_bbox: 0.2610, loss_mask: 0.2526, loss: 0.7964 2024-05-30 18:50:05,764 - mmdet - INFO - Epoch [3][6950/7330] lr: 1.000e-04, eta: 20:21:32, time: 1.095, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0260, loss_rpn_bbox: 0.0555, loss_cls: 0.2193, acc: 92.1697, loss_bbox: 0.2710, loss_mask: 0.2588, loss: 0.8306 2024-05-30 18:50:59,696 - mmdet - INFO - Epoch [3][7000/7330] lr: 1.000e-04, eta: 20:20:32, time: 1.079, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0487, loss_cls: 0.2137, acc: 92.4285, loss_bbox: 0.2598, loss_mask: 0.2501, loss: 0.7939 2024-05-30 18:51:54,539 - mmdet - INFO - Epoch [3][7050/7330] lr: 1.000e-04, eta: 20:19:36, time: 1.097, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0272, loss_rpn_bbox: 0.0583, loss_cls: 0.2238, acc: 92.0132, loss_bbox: 0.2730, loss_mask: 0.2597, loss: 0.8420 2024-05-30 18:52:55,213 - mmdet - INFO - Epoch [3][7100/7330] lr: 1.000e-04, eta: 20:18:57, time: 1.213, 
data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0250, loss_rpn_bbox: 0.0537, loss_cls: 0.2188, acc: 92.2161, loss_bbox: 0.2660, loss_mask: 0.2540, loss: 0.8175
2024-05-30 18:53:52,403 - mmdet - INFO - Epoch [3][7150/7330] lr: 1.000e-04, eta: 20:18:08, time: 1.144, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0243, loss_rpn_bbox: 0.0532, loss_cls: 0.2153, acc: 92.3804, loss_bbox: 0.2634, loss_mask: 0.2546, loss: 0.8109
2024-05-30 18:54:48,479 - mmdet - INFO - Epoch [3][7200/7330] lr: 1.000e-04, eta: 20:17:15, time: 1.122, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0253, loss_rpn_bbox: 0.0512, loss_cls: 0.2077, acc: 92.5771, loss_bbox: 0.2538, loss_mask: 0.2490, loss: 0.7870
2024-05-30 18:55:45,021 - mmdet - INFO - Epoch [3][7250/7330] lr: 1.000e-04, eta: 20:16:24, time: 1.131, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0497, loss_cls: 0.2115, acc: 92.3750, loss_bbox: 0.2644, loss_mask: 0.2549, loss: 0.8031
2024-05-30 18:56:42,558 - mmdet - INFO - Epoch [3][7300/7330] lr: 1.000e-04, eta: 20:15:35, time: 1.151, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0241, loss_rpn_bbox: 0.0538, loss_cls: 0.2173, acc: 92.3716, loss_bbox: 0.2595, loss_mask: 0.2561, loss: 0.8109
2024-05-30 18:57:15,418 - mmdet - INFO - Saving checkpoint at 3 epochs
2024-05-30 18:59:38,730 - mmdet - INFO - Evaluating bbox...
2024-05-30 19:00:04,441 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.417
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.668
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.452
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.256
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.453
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.561
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ]  = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.372
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.589
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.698
2024-05-30 19:00:04,441 - mmdet - INFO - Evaluating segm...
2024-05-30 19:00:31,530 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.382
 Average Precision (AP) @[ IoU=0.50      | area=   all | maxDets=1000 ] = 0.631
 Average Precision (AP) @[ IoU=0.75      | area=   all | maxDets=1000 ] = 0.400
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.178
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.418
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.587
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ]  = 0.504
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=300 ]  = 0.504
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=1000 ] = 0.504
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.304
 Average Recall    (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.548
 Average Recall    (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.683
2024-05-30 19:00:31,872 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-30 19:00:31,874 - mmdet - INFO - Epoch(val) [3][625] bbox_mAP: 0.4170, bbox_mAP_50: 0.6680, bbox_mAP_75: 0.4520, bbox_mAP_s: 0.2560, bbox_mAP_m: 0.4530, bbox_mAP_l: 0.5610, bbox_mAP_copypaste: 0.417 0.668 0.452 0.256 0.453 0.561, segm_mAP: 0.3820, segm_mAP_50: 0.6310, segm_mAP_75: 0.4000, segm_mAP_s: 0.1780, segm_mAP_m: 0.4180, segm_mAP_l: 0.5870, segm_mAP_copypaste: 0.382 0.631 0.400 0.178 0.418 0.587
2024-05-30 19:01:41,614 - mmdet - INFO - Epoch [4][50/7330] lr: 1.000e-04, eta: 20:13:11, time: 1.394, data_time: 0.155, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0515, loss_cls: 0.2046, acc: 92.6965, loss_bbox: 0.2551, loss_mask: 0.2460, loss: 0.7789
2024-05-30 19:02:35,552 - mmdet - INFO - Epoch [4][100/7330] lr: 1.000e-04, eta: 20:12:12, time: 1.079, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0494, loss_cls: 0.2025, acc: 92.6807, loss_bbox: 0.2570, loss_mask: 0.2450, loss: 0.7746
2024-05-30 19:03:29,526 - mmdet - INFO - Epoch [4][150/7330] lr: 1.000e-04, eta: 20:11:13, time: 1.079, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0498, loss_cls: 0.1991, acc: 92.7917, loss_bbox: 0.2562, loss_mask: 0.2428, loss: 0.7682
2024-05-30 19:04:26,432 - mmdet - INFO - Epoch [4][200/7330] lr: 1.000e-04, eta: 20:10:23, time: 1.138, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0473, loss_cls: 0.1953, acc: 92.8818, loss_bbox: 0.2465, loss_mask: 0.2455, loss: 0.7566
2024-05-30 19:05:26,150 - mmdet - INFO - Epoch [4][250/7330] lr: 1.000e-04, eta: 20:09:41, time: 1.194, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0491, loss_cls: 0.1990, acc: 92.6267, loss_bbox: 0.2546, loss_mask: 0.2436, loss: 0.7681
2024-05-30 19:06:19,548 - mmdet - INFO - Epoch [4][300/7330] lr: 1.000e-04, eta: 20:08:41, time: 1.068, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0463, loss_cls: 0.2018, acc: 92.7051, loss_bbox: 0.2536, loss_mask: 0.2498, loss: 0.7722
2024-05-30 19:07:21,310 - mmdet - INFO - Epoch [4][350/7330] lr: 1.000e-04, eta: 20:08:05, time: 1.235, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0505, loss_cls: 0.2018, acc: 92.7444, loss_bbox: 0.2551, loss_mask: 0.2503, loss: 0.7799
2024-05-30 19:08:15,035 - mmdet - INFO - Epoch [4][400/7330] lr: 1.000e-04, eta: 20:07:05, time: 1.075, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0492, loss_cls: 0.2017, acc: 92.7019, loss_bbox:
0.2531, loss_mask: 0.2475, loss: 0.7736 2024-05-30 19:09:08,774 - mmdet - INFO - Epoch [4][450/7330] lr: 1.000e-04, eta: 20:06:05, time: 1.075, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0509, loss_cls: 0.1947, acc: 92.9666, loss_bbox: 0.2457, loss_mask: 0.2416, loss: 0.7540 2024-05-30 19:10:02,713 - mmdet - INFO - Epoch [4][500/7330] lr: 1.000e-04, eta: 20:05:06, time: 1.079, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0477, loss_cls: 0.2026, acc: 92.7063, loss_bbox: 0.2522, loss_mask: 0.2513, loss: 0.7753 2024-05-30 19:11:00,177 - mmdet - INFO - Epoch [4][550/7330] lr: 1.000e-04, eta: 20:04:18, time: 1.149, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0252, loss_rpn_bbox: 0.0558, loss_cls: 0.2246, acc: 91.8037, loss_bbox: 0.2761, loss_mask: 0.2612, loss: 0.8428 2024-05-30 19:11:56,264 - mmdet - INFO - Epoch [4][600/7330] lr: 1.000e-04, eta: 20:03:25, time: 1.122, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0504, loss_cls: 0.2095, acc: 92.3596, loss_bbox: 0.2567, loss_mask: 0.2465, loss: 0.7839 2024-05-30 19:12:50,416 - mmdet - INFO - Epoch [4][650/7330] lr: 1.000e-04, eta: 20:02:27, time: 1.083, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0502, loss_cls: 0.2038, acc: 92.7251, loss_bbox: 0.2514, loss_mask: 0.2474, loss: 0.7733 2024-05-30 19:13:47,255 - mmdet - INFO - Epoch [4][700/7330] lr: 1.000e-04, eta: 20:01:36, time: 1.136, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0519, loss_cls: 0.2128, acc: 92.3833, loss_bbox: 0.2628, loss_mask: 0.2559, loss: 0.8049 2024-05-30 19:14:41,850 - mmdet - INFO - Epoch [4][750/7330] lr: 1.000e-04, eta: 20:00:39, time: 1.092, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0510, loss_cls: 0.2044, acc: 92.6956, loss_bbox: 0.2522, loss_mask: 0.2470, loss: 0.7760 2024-05-30 19:15:41,342 - mmdet - INFO - Epoch [4][800/7330] lr: 1.000e-04, eta: 19:59:56, time: 1.190, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0523, loss_cls: 0.2073, acc: 92.6299, loss_bbox: 0.2555, loss_mask: 0.2549, loss: 0.7918 2024-05-30 19:16:35,931 - mmdet - INFO - Epoch [4][850/7330] lr: 1.000e-04, eta: 19:58:59, time: 1.092, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0500, loss_cls: 0.2023, acc: 92.6909, loss_bbox: 0.2516, loss_mask: 0.2499, loss: 0.7751 2024-05-30 19:17:32,491 - mmdet - INFO - Epoch [4][900/7330] lr: 1.000e-04, eta: 19:58:07, time: 1.131, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0518, loss_cls: 0.2109, acc: 92.4832, loss_bbox: 0.2612, loss_mask: 0.2510, loss: 0.7952 2024-05-30 19:18:28,801 - mmdet - INFO - Epoch [4][950/7330] lr: 1.000e-04, eta: 19:57:15, time: 1.126, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0493, loss_cls: 0.2057, acc: 92.8447, loss_bbox: 0.2538, loss_mask: 0.2426, loss: 0.7731 2024-05-30 19:19:25,246 - mmdet - INFO - Epoch [4][1000/7330] lr: 1.000e-04, eta: 19:56:23, time: 1.129, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0505, loss_cls: 0.2045, acc: 92.7048, loss_bbox: 0.2563, loss_mask: 0.2507, loss: 0.7838 2024-05-30 19:20:19,202 - mmdet - INFO - Epoch [4][1050/7330] lr: 1.000e-04, eta: 19:55:24, time: 1.079, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0516, loss_cls: 0.1944, acc: 92.8638, loss_bbox: 0.2459, loss_mask: 0.2444, loss: 0.7583 2024-05-30 19:21:15,367 - mmdet - INFO - Epoch [4][1100/7330] lr: 
1.000e-04, eta: 19:54:32, time: 1.123, data_time: 0.050, memory: 24290, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0473, loss_cls: 0.2024, acc: 92.8267, loss_bbox: 0.2467, loss_mask: 0.2423, loss: 0.7597 2024-05-30 19:22:09,869 - mmdet - INFO - Epoch [4][1150/7330] lr: 1.000e-04, eta: 19:53:35, time: 1.090, data_time: 0.099, memory: 24290, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0528, loss_cls: 0.2049, acc: 92.5715, loss_bbox: 0.2590, loss_mask: 0.2486, loss: 0.7882 2024-05-30 19:23:04,778 - mmdet - INFO - Epoch [4][1200/7330] lr: 1.000e-04, eta: 19:52:38, time: 1.098, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0519, loss_cls: 0.2072, acc: 92.5637, loss_bbox: 0.2537, loss_mask: 0.2467, loss: 0.7814 2024-05-30 19:24:02,841 - mmdet - INFO - Epoch [4][1250/7330] lr: 1.000e-04, eta: 19:51:51, time: 1.161, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0495, loss_cls: 0.2037, acc: 92.6025, loss_bbox: 0.2555, loss_mask: 0.2531, loss: 0.7824 2024-05-30 19:24:57,183 - mmdet - INFO - Epoch [4][1300/7330] lr: 1.000e-04, eta: 19:50:53, time: 1.087, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0510, loss_cls: 0.2049, acc: 92.5950, loss_bbox: 0.2571, loss_mask: 0.2483, loss: 0.7846 2024-05-30 19:25:55,897 - mmdet - INFO - Epoch [4][1350/7330] lr: 1.000e-04, eta: 19:50:08, time: 1.174, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0499, loss_cls: 0.1992, acc: 92.8486, loss_bbox: 0.2464, loss_mask: 0.2466, loss: 0.7654 2024-05-30 19:26:51,444 - mmdet - INFO - Epoch [4][1400/7330] lr: 1.000e-04, eta: 19:49:13, time: 1.111, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0490, loss_cls: 0.2054, acc: 92.5864, loss_bbox: 0.2552, loss_mask: 0.2513, loss: 0.7830 2024-05-30 19:27:47,846 - mmdet - INFO - Epoch [4][1450/7330] lr: 1.000e-04, eta: 19:48:21, time: 1.128, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0496, loss_cls: 0.2045, acc: 92.6028, loss_bbox: 0.2514, loss_mask: 0.2453, loss: 0.7723 2024-05-30 19:28:46,969 - mmdet - INFO - Epoch [4][1500/7330] lr: 1.000e-04, eta: 19:47:36, time: 1.182, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0507, loss_cls: 0.2008, acc: 92.7432, loss_bbox: 0.2505, loss_mask: 0.2495, loss: 0.7731 2024-05-30 19:29:41,378 - mmdet - INFO - Epoch [4][1550/7330] lr: 1.000e-04, eta: 19:46:39, time: 1.088, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0505, loss_cls: 0.2047, acc: 92.6306, loss_bbox: 0.2595, loss_mask: 0.2487, loss: 0.7855 2024-05-30 19:30:35,586 - mmdet - INFO - Epoch [4][1600/7330] lr: 1.000e-04, eta: 19:45:41, time: 1.084, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0239, loss_rpn_bbox: 0.0516, loss_cls: 0.2075, acc: 92.4858, loss_bbox: 0.2573, loss_mask: 0.2517, loss: 0.7919 2024-05-30 19:31:30,050 - mmdet - INFO - Epoch [4][1650/7330] lr: 1.000e-04, eta: 19:44:43, time: 1.089, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0525, loss_cls: 0.2020, acc: 92.6213, loss_bbox: 0.2547, loss_mask: 0.2465, loss: 0.7786 2024-05-30 19:32:26,567 - mmdet - INFO - Epoch [4][1700/7330] lr: 1.000e-04, eta: 19:43:51, time: 1.130, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0497, loss_cls: 0.1969, acc: 92.7634, loss_bbox: 0.2532, loss_mask: 0.2499, loss: 0.7701 2024-05-30 19:33:21,641 - mmdet - INFO - Epoch [4][1750/7330] lr: 1.000e-04, eta: 19:42:56, time: 1.102, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0230, 
loss_rpn_bbox: 0.0521, loss_cls: 0.2213, acc: 92.1541, loss_bbox: 0.2703, loss_mask: 0.2551, loss: 0.8218 2024-05-30 19:34:16,045 - mmdet - INFO - Epoch [4][1800/7330] lr: 1.000e-04, eta: 19:41:58, time: 1.088, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0236, loss_rpn_bbox: 0.0511, loss_cls: 0.2094, acc: 92.5032, loss_bbox: 0.2549, loss_mask: 0.2445, loss: 0.7835 2024-05-30 19:35:15,387 - mmdet - INFO - Epoch [4][1850/7330] lr: 1.000e-04, eta: 19:41:14, time: 1.187, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0517, loss_cls: 0.2015, acc: 92.7063, loss_bbox: 0.2570, loss_mask: 0.2469, loss: 0.7791 2024-05-30 19:36:14,797 - mmdet - INFO - Epoch [4][1900/7330] lr: 1.000e-04, eta: 19:40:30, time: 1.188, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0522, loss_cls: 0.2122, acc: 92.1252, loss_bbox: 0.2673, loss_mask: 0.2551, loss: 0.8077 2024-05-30 19:37:08,793 - mmdet - INFO - Epoch [4][1950/7330] lr: 1.000e-04, eta: 19:39:31, time: 1.080, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0505, loss_cls: 0.2005, acc: 92.6292, loss_bbox: 0.2599, loss_mask: 0.2454, loss: 0.7770 2024-05-30 19:38:05,434 - mmdet - INFO - Epoch [4][2000/7330] lr: 1.000e-04, eta: 19:38:39, time: 1.133, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0534, loss_cls: 0.2087, acc: 92.5083, loss_bbox: 0.2658, loss_mask: 0.2518, loss: 0.8017 2024-05-30 19:39:04,305 - mmdet - INFO - Epoch [4][2050/7330] lr: 1.000e-04, eta: 19:37:54, time: 1.177, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0563, loss_cls: 0.2129, acc: 92.2305, loss_bbox: 0.2714, loss_mask: 0.2556, loss: 0.8173 2024-05-30 19:39:59,009 - mmdet - INFO - Epoch [4][2100/7330] lr: 1.000e-04, eta: 19:36:57, time: 1.094, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0527, loss_cls: 0.2065, acc: 92.5144, loss_bbox: 0.2607, loss_mask: 0.2528, loss: 0.7938 2024-05-30 19:40:53,468 - mmdet - INFO - Epoch [4][2150/7330] lr: 1.000e-04, eta: 19:35:59, time: 1.089, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0492, loss_cls: 0.2065, acc: 92.6787, loss_bbox: 0.2485, loss_mask: 0.2457, loss: 0.7715 2024-05-30 19:41:48,393 - mmdet - INFO - Epoch [4][2200/7330] lr: 1.000e-04, eta: 19:35:03, time: 1.099, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0511, loss_cls: 0.2059, acc: 92.5686, loss_bbox: 0.2637, loss_mask: 0.2522, loss: 0.7960 2024-05-30 19:42:42,413 - mmdet - INFO - Epoch [4][2250/7330] lr: 1.000e-04, eta: 19:34:04, time: 1.080, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0485, loss_cls: 0.2063, acc: 92.5525, loss_bbox: 0.2548, loss_mask: 0.2540, loss: 0.7849 2024-05-30 19:43:36,750 - mmdet - INFO - Epoch [4][2300/7330] lr: 1.000e-04, eta: 19:33:07, time: 1.087, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0494, loss_cls: 0.1984, acc: 92.8057, loss_bbox: 0.2503, loss_mask: 0.2436, loss: 0.7612 2024-05-30 19:44:33,261 - mmdet - INFO - Epoch [4][2350/7330] lr: 1.000e-04, eta: 19:32:15, time: 1.130, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0531, loss_cls: 0.2123, acc: 92.3547, loss_bbox: 0.2627, loss_mask: 0.2504, loss: 0.8019 2024-05-30 19:45:28,158 - mmdet - INFO - Epoch [4][2400/7330] lr: 1.000e-04, eta: 19:31:18, time: 1.098, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0485, loss_cls: 0.2011, acc: 92.6724, loss_bbox: 0.2496, loss_mask: 0.2440, 
loss: 0.7645 2024-05-30 19:46:30,991 - mmdet - INFO - Epoch [4][2450/7330] lr: 1.000e-04, eta: 19:30:43, time: 1.257, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0461, loss_cls: 0.1951, acc: 93.0137, loss_bbox: 0.2423, loss_mask: 0.2405, loss: 0.7456 2024-05-30 19:47:28,584 - mmdet - INFO - Epoch [4][2500/7330] lr: 1.000e-04, eta: 19:29:53, time: 1.152, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0529, loss_cls: 0.2066, acc: 92.4314, loss_bbox: 0.2601, loss_mask: 0.2505, loss: 0.7927 2024-05-30 19:48:28,732 - mmdet - INFO - Epoch [4][2550/7330] lr: 1.000e-04, eta: 19:29:11, time: 1.203, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0490, loss_cls: 0.1961, acc: 92.8027, loss_bbox: 0.2465, loss_mask: 0.2452, loss: 0.7574 2024-05-30 19:49:23,108 - mmdet - INFO - Epoch [4][2600/7330] lr: 1.000e-04, eta: 19:28:13, time: 1.087, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0223, loss_rpn_bbox: 0.0516, loss_cls: 0.2062, acc: 92.5249, loss_bbox: 0.2592, loss_mask: 0.2453, loss: 0.7846 2024-05-30 19:50:19,340 - mmdet - INFO - Epoch [4][2650/7330] lr: 1.000e-04, eta: 19:27:20, time: 1.125, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0500, loss_cls: 0.2059, acc: 92.6880, loss_bbox: 0.2492, loss_mask: 0.2444, loss: 0.7711 2024-05-30 19:51:13,661 - mmdet - INFO - Epoch [4][2700/7330] lr: 1.000e-04, eta: 19:26:22, time: 1.086, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0514, loss_cls: 0.2058, acc: 92.5781, loss_bbox: 0.2489, loss_mask: 0.2461, loss: 0.7727 2024-05-30 19:52:09,060 - mmdet - INFO - Epoch [4][2750/7330] lr: 1.000e-04, eta: 19:25:27, time: 1.108, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0244, loss_rpn_bbox: 0.0518, loss_cls: 0.1999, acc: 92.8564, loss_bbox: 0.2487, loss_mask: 0.2450, loss: 0.7698 2024-05-30 19:53:03,192 - mmdet - INFO - Epoch [4][2800/7330] lr: 1.000e-04, eta: 19:24:29, time: 1.083, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0507, loss_cls: 0.2088, acc: 92.5520, loss_bbox: 0.2527, loss_mask: 0.2451, loss: 0.7784 2024-05-30 19:53:57,113 - mmdet - INFO - Epoch [4][2850/7330] lr: 1.000e-04, eta: 19:23:30, time: 1.078, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0456, loss_cls: 0.1968, acc: 92.9673, loss_bbox: 0.2470, loss_mask: 0.2500, loss: 0.7601 2024-05-30 19:54:51,219 - mmdet - INFO - Epoch [4][2900/7330] lr: 1.000e-04, eta: 19:22:32, time: 1.082, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0511, loss_cls: 0.2051, acc: 92.5183, loss_bbox: 0.2610, loss_mask: 0.2547, loss: 0.7949 2024-05-30 19:55:48,244 - mmdet - INFO - Epoch [4][2950/7330] lr: 1.000e-04, eta: 19:21:41, time: 1.141, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0513, loss_cls: 0.2089, acc: 92.5396, loss_bbox: 0.2539, loss_mask: 0.2509, loss: 0.7868 2024-05-30 19:56:49,374 - mmdet - INFO - Epoch [4][3000/7330] lr: 1.000e-04, eta: 19:21:00, time: 1.223, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0485, loss_cls: 0.2054, acc: 92.5312, loss_bbox: 0.2583, loss_mask: 0.2502, loss: 0.7832 2024-05-30 19:57:43,799 - mmdet - INFO - Epoch [4][3050/7330] lr: 1.000e-04, eta: 19:20:03, time: 1.088, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0496, loss_cls: 0.2045, acc: 92.6235, loss_bbox: 0.2596, loss_mask: 0.2464, loss: 0.7816 2024-05-30 19:58:42,724 - mmdet - INFO - Epoch [4][3100/7330] lr: 1.000e-04, eta: 
19:19:16, time: 1.179, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0467, loss_cls: 0.2008, acc: 92.7852, loss_bbox: 0.2475, loss_mask: 0.2476, loss: 0.7639 2024-05-30 19:59:41,245 - mmdet - INFO - Epoch [4][3150/7330] lr: 1.000e-04, eta: 19:18:29, time: 1.170, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0490, loss_cls: 0.2077, acc: 92.5308, loss_bbox: 0.2563, loss_mask: 0.2484, loss: 0.7833 2024-05-30 20:00:37,030 - mmdet - INFO - Epoch [4][3200/7330] lr: 1.000e-04, eta: 19:17:35, time: 1.116, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0480, loss_cls: 0.1967, acc: 92.8936, loss_bbox: 0.2483, loss_mask: 0.2440, loss: 0.7575 2024-05-30 20:01:31,017 - mmdet - INFO - Epoch [4][3250/7330] lr: 1.000e-04, eta: 19:16:36, time: 1.080, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0492, loss_cls: 0.2060, acc: 92.6177, loss_bbox: 0.2550, loss_mask: 0.2506, loss: 0.7839 2024-05-30 20:02:25,751 - mmdet - INFO - Epoch [4][3300/7330] lr: 1.000e-04, eta: 19:15:39, time: 1.095, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0521, loss_cls: 0.2102, acc: 92.3499, loss_bbox: 0.2625, loss_mask: 0.2515, loss: 0.7977 2024-05-30 20:03:19,685 - mmdet - INFO - Epoch [4][3350/7330] lr: 1.000e-04, eta: 19:14:41, time: 1.079, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0500, loss_cls: 0.2053, acc: 92.7446, loss_bbox: 0.2463, loss_mask: 0.2410, loss: 0.7648 2024-05-30 20:04:15,023 - mmdet - INFO - Epoch [4][3400/7330] lr: 1.000e-04, eta: 19:13:45, time: 1.107, data_time: 0.092, memory: 24290, loss_rpn_cls: 0.0234, loss_rpn_bbox: 0.0531, loss_cls: 0.2095, acc: 92.3940, loss_bbox: 0.2597, loss_mask: 0.2504, loss: 0.7961 2024-05-30 20:05:09,473 - mmdet - INFO - Epoch [4][3450/7330] lr: 1.000e-04, eta: 19:12:48, time: 1.089, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0518, loss_cls: 0.2111, acc: 92.5137, loss_bbox: 0.2527, loss_mask: 0.2489, loss: 0.7874 2024-05-30 20:06:03,007 - mmdet - INFO - Epoch [4][3500/7330] lr: 1.000e-04, eta: 19:11:48, time: 1.071, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0467, loss_cls: 0.1893, acc: 93.1335, loss_bbox: 0.2436, loss_mask: 0.2446, loss: 0.7424 2024-05-30 20:07:05,525 - mmdet - INFO - Epoch [4][3550/7330] lr: 1.000e-04, eta: 19:11:11, time: 1.250, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0506, loss_cls: 0.2030, acc: 92.5857, loss_bbox: 0.2587, loss_mask: 0.2485, loss: 0.7833 2024-05-30 20:08:02,241 - mmdet - INFO - Epoch [4][3600/7330] lr: 1.000e-04, eta: 19:10:19, time: 1.134, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0227, loss_rpn_bbox: 0.0494, loss_cls: 0.2014, acc: 92.7769, loss_bbox: 0.2489, loss_mask: 0.2433, loss: 0.7656 2024-05-30 20:08:58,698 - mmdet - INFO - Epoch [4][3650/7330] lr: 1.000e-04, eta: 19:09:26, time: 1.129, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0509, loss_cls: 0.2086, acc: 92.5115, loss_bbox: 0.2562, loss_mask: 0.2485, loss: 0.7860 2024-05-30 20:09:57,561 - mmdet - INFO - Epoch [4][3700/7330] lr: 1.000e-04, eta: 19:08:39, time: 1.177, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0231, loss_rpn_bbox: 0.0514, loss_cls: 0.2149, acc: 92.4026, loss_bbox: 0.2595, loss_mask: 0.2484, loss: 0.7975 2024-05-30 20:10:51,682 - mmdet - INFO - Epoch [4][3750/7330] lr: 1.000e-04, eta: 19:07:41, time: 1.082, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0218, loss_rpn_bbox: 
0.0481, loss_cls: 0.1963, acc: 92.9333, loss_bbox: 0.2485, loss_mask: 0.2443, loss: 0.7590 2024-05-30 20:11:47,947 - mmdet - INFO - Epoch [4][3800/7330] lr: 1.000e-04, eta: 19:06:48, time: 1.125, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0504, loss_cls: 0.2014, acc: 92.6743, loss_bbox: 0.2576, loss_mask: 0.2462, loss: 0.7762 2024-05-30 20:12:42,584 - mmdet - INFO - Epoch [4][3850/7330] lr: 1.000e-04, eta: 19:05:51, time: 1.093, data_time: 0.104, memory: 24290, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0503, loss_cls: 0.1986, acc: 92.8550, loss_bbox: 0.2482, loss_mask: 0.2502, loss: 0.7676 2024-05-30 20:13:37,627 - mmdet - INFO - Epoch [4][3900/7330] lr: 1.000e-04, eta: 19:04:55, time: 1.101, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0512, loss_cls: 0.2165, acc: 92.2417, loss_bbox: 0.2709, loss_mask: 0.2551, loss: 0.8166 2024-05-30 20:14:32,194 - mmdet - INFO - Epoch [4][3950/7330] lr: 1.000e-04, eta: 19:03:58, time: 1.091, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0508, loss_cls: 0.2140, acc: 92.3540, loss_bbox: 0.2585, loss_mask: 0.2527, loss: 0.7982 2024-05-30 20:15:26,531 - mmdet - INFO - Epoch [4][4000/7330] lr: 1.000e-04, eta: 19:03:00, time: 1.087, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0502, loss_cls: 0.2028, acc: 92.5940, loss_bbox: 0.2542, loss_mask: 0.2454, loss: 0.7739 2024-05-30 20:16:23,448 - mmdet - INFO - Epoch [4][4050/7330] lr: 1.000e-04, eta: 19:02:08, time: 1.138, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0495, loss_cls: 0.2061, acc: 92.6338, loss_bbox: 0.2490, loss_mask: 0.2535, loss: 0.7797 2024-05-30 20:17:22,174 - mmdet - INFO - Epoch [4][4100/7330] lr: 1.000e-04, eta: 19:01:21, time: 1.174, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0503, loss_cls: 0.1981, acc: 92.9712, loss_bbox: 0.2426, loss_mask: 0.2440, loss: 0.7556 2024-05-30 20:18:18,731 - mmdet - INFO - Epoch [4][4150/7330] lr: 1.000e-04, eta: 19:00:29, time: 1.131, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0488, loss_cls: 0.1929, acc: 93.0215, loss_bbox: 0.2444, loss_mask: 0.2449, loss: 0.7521 2024-05-30 20:19:18,052 - mmdet - INFO - Epoch [4][4200/7330] lr: 1.000e-04, eta: 18:59:43, time: 1.187, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0518, loss_cls: 0.2025, acc: 92.8181, loss_bbox: 0.2460, loss_mask: 0.2458, loss: 0.7678 2024-05-30 20:20:13,147 - mmdet - INFO - Epoch [4][4250/7330] lr: 1.000e-04, eta: 18:58:47, time: 1.102, data_time: 0.092, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0513, loss_cls: 0.2092, acc: 92.4316, loss_bbox: 0.2616, loss_mask: 0.2511, loss: 0.7951 2024-05-30 20:21:07,702 - mmdet - INFO - Epoch [4][4300/7330] lr: 1.000e-04, eta: 18:57:49, time: 1.091, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0469, loss_cls: 0.1918, acc: 93.0508, loss_bbox: 0.2402, loss_mask: 0.2366, loss: 0.7359 2024-05-30 20:22:04,839 - mmdet - INFO - Epoch [4][4350/7330] lr: 1.000e-04, eta: 18:56:58, time: 1.143, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0479, loss_cls: 0.1967, acc: 92.7903, loss_bbox: 0.2485, loss_mask: 0.2445, loss: 0.7571 2024-05-30 20:23:01,544 - mmdet - INFO - Epoch [4][4400/7330] lr: 1.000e-04, eta: 18:56:06, time: 1.134, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0518, loss_cls: 0.2135, acc: 92.3938, loss_bbox: 0.2615, loss_mask: 0.2560, loss: 0.8047 
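Note on reading these iteration records: each of the five heads is logged separately, and the final loss field is consistent with a plain sum of loss_rpn_cls + loss_rpn_bbox + loss_cls + loss_bbox + loss_mask (acc is the box-head classification accuracy, not a loss term). A minimal sanity check in Python, using the values from the Epoch [4][4400/7330] record directly above:

    # Sanity check: the logged total appears to be the plain sum of the five component losses.
    components = dict(loss_rpn_cls=0.0219, loss_rpn_bbox=0.0518,
                      loss_cls=0.2135, loss_bbox=0.2615, loss_mask=0.2560)
    print(round(sum(components.values()), 4))  # 0.8047 -> matches the logged "loss" field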
2024-05-30 20:23:55,630 - mmdet - INFO - Epoch [4][4450/7330] lr: 1.000e-04, eta: 18:55:08, time: 1.082, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0230, loss_rpn_bbox: 0.0509, loss_cls: 0.2025, acc: 92.7417, loss_bbox: 0.2517, loss_mask: 0.2477, loss: 0.7758 2024-05-30 20:24:49,337 - mmdet - INFO - Epoch [4][4500/7330] lr: 1.000e-04, eta: 18:54:09, time: 1.074, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0485, loss_cls: 0.2005, acc: 92.8076, loss_bbox: 0.2491, loss_mask: 0.2464, loss: 0.7655 2024-05-30 20:25:44,053 - mmdet - INFO - Epoch [4][4550/7330] lr: 1.000e-04, eta: 18:53:12, time: 1.094, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0471, loss_cls: 0.1971, acc: 92.9456, loss_bbox: 0.2489, loss_mask: 0.2390, loss: 0.7512 2024-05-30 20:26:42,568 - mmdet - INFO - Epoch [4][4600/7330] lr: 1.000e-04, eta: 18:52:24, time: 1.170, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0459, loss_cls: 0.1961, acc: 93.0098, loss_bbox: 0.2436, loss_mask: 0.2427, loss: 0.7478 2024-05-30 20:27:39,925 - mmdet - INFO - Epoch [4][4650/7330] lr: 1.000e-04, eta: 18:51:33, time: 1.147, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0529, loss_cls: 0.2086, acc: 92.5232, loss_bbox: 0.2620, loss_mask: 0.2481, loss: 0.7927 2024-05-30 20:28:33,450 - mmdet - INFO - Epoch [4][4700/7330] lr: 1.000e-04, eta: 18:50:33, time: 1.071, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0482, loss_cls: 0.1996, acc: 92.7881, loss_bbox: 0.2458, loss_mask: 0.2464, loss: 0.7607 2024-05-30 20:29:32,301 - mmdet - INFO - Epoch [4][4750/7330] lr: 1.000e-04, eta: 18:49:46, time: 1.177, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0490, loss_cls: 0.2040, acc: 92.6392, loss_bbox: 0.2548, loss_mask: 0.2428, loss: 0.7723 2024-05-30 20:30:26,842 - mmdet - INFO - Epoch [4][4800/7330] lr: 1.000e-04, eta: 18:48:49, time: 1.091, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0505, loss_cls: 0.1990, acc: 92.8154, loss_bbox: 0.2477, loss_mask: 0.2481, loss: 0.7664 2024-05-30 20:31:24,243 - mmdet - INFO - Epoch [4][4850/7330] lr: 1.000e-04, eta: 18:47:58, time: 1.148, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0508, loss_cls: 0.2040, acc: 92.6472, loss_bbox: 0.2524, loss_mask: 0.2460, loss: 0.7731 2024-05-30 20:32:18,764 - mmdet - INFO - Epoch [4][4900/7330] lr: 1.000e-04, eta: 18:47:01, time: 1.090, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0232, loss_rpn_bbox: 0.0529, loss_cls: 0.2011, acc: 92.7795, loss_bbox: 0.2505, loss_mask: 0.2477, loss: 0.7753 2024-05-30 20:33:17,360 - mmdet - INFO - Epoch [4][4950/7330] lr: 1.000e-04, eta: 18:46:13, time: 1.172, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0463, loss_cls: 0.1978, acc: 92.8257, loss_bbox: 0.2480, loss_mask: 0.2525, loss: 0.7651 2024-05-30 20:34:11,556 - mmdet - INFO - Epoch [4][5000/7330] lr: 1.000e-04, eta: 18:45:15, time: 1.084, data_time: 0.098, memory: 24290, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0506, loss_cls: 0.1943, acc: 93.0093, loss_bbox: 0.2445, loss_mask: 0.2550, loss: 0.7647 2024-05-30 20:35:06,101 - mmdet - INFO - Epoch [4][5050/7330] lr: 1.000e-04, eta: 18:44:17, time: 1.091, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0482, loss_cls: 0.2086, acc: 92.6421, loss_bbox: 0.2512, loss_mask: 0.2452, loss: 0.7751 2024-05-30 20:36:00,121 - mmdet - INFO - Epoch [4][5100/7330] lr: 1.000e-04, eta: 18:43:19, 
time: 1.080, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0463, loss_cls: 0.1913, acc: 93.0300, loss_bbox: 0.2385, loss_mask: 0.2372, loss: 0.7321 2024-05-30 20:36:57,363 - mmdet - INFO - Epoch [4][5150/7330] lr: 1.000e-04, eta: 18:42:28, time: 1.145, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0512, loss_cls: 0.1968, acc: 92.7666, loss_bbox: 0.2456, loss_mask: 0.2438, loss: 0.7576 2024-05-30 20:37:53,585 - mmdet - INFO - Epoch [4][5200/7330] lr: 1.000e-04, eta: 18:41:34, time: 1.124, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0470, loss_cls: 0.2046, acc: 92.6077, loss_bbox: 0.2519, loss_mask: 0.2492, loss: 0.7735 2024-05-30 20:38:49,281 - mmdet - INFO - Epoch [4][5250/7330] lr: 1.000e-04, eta: 18:40:40, time: 1.114, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0219, loss_rpn_bbox: 0.0470, loss_cls: 0.1977, acc: 92.8123, loss_bbox: 0.2503, loss_mask: 0.2414, loss: 0.7583 2024-05-30 20:39:45,865 - mmdet - INFO - Epoch [4][5300/7330] lr: 1.000e-04, eta: 18:39:47, time: 1.132, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0518, loss_cls: 0.2037, acc: 92.6455, loss_bbox: 0.2526, loss_mask: 0.2435, loss: 0.7719 2024-05-30 20:40:39,953 - mmdet - INFO - Epoch [4][5350/7330] lr: 1.000e-04, eta: 18:38:49, time: 1.082, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0492, loss_cls: 0.1937, acc: 92.9290, loss_bbox: 0.2458, loss_mask: 0.2453, loss: 0.7557 2024-05-30 20:41:36,931 - mmdet - INFO - Epoch [4][5400/7330] lr: 1.000e-04, eta: 18:37:57, time: 1.140, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0233, loss_rpn_bbox: 0.0510, loss_cls: 0.2011, acc: 92.6973, loss_bbox: 0.2533, loss_mask: 0.2468, loss: 0.7755 2024-05-30 20:42:35,634 - mmdet - INFO - Epoch [4][5450/7330] lr: 1.000e-04, eta: 18:37:09, time: 1.174, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0503, loss_cls: 0.2011, acc: 92.8071, loss_bbox: 0.2484, loss_mask: 0.2448, loss: 0.7656 2024-05-30 20:43:29,742 - mmdet - INFO - Epoch [4][5500/7330] lr: 1.000e-04, eta: 18:36:11, time: 1.082, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0462, loss_cls: 0.1940, acc: 93.0173, loss_bbox: 0.2407, loss_mask: 0.2466, loss: 0.7494 2024-05-30 20:44:25,729 - mmdet - INFO - Epoch [4][5550/7330] lr: 1.000e-04, eta: 18:35:17, time: 1.120, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0482, loss_cls: 0.1991, acc: 92.7400, loss_bbox: 0.2488, loss_mask: 0.2458, loss: 0.7642 2024-05-30 20:45:19,723 - mmdet - INFO - Epoch [4][5600/7330] lr: 1.000e-04, eta: 18:34:18, time: 1.080, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0210, loss_rpn_bbox: 0.0484, loss_cls: 0.2027, acc: 92.6870, loss_bbox: 0.2496, loss_mask: 0.2450, loss: 0.7667 2024-05-30 20:46:13,310 - mmdet - INFO - Epoch [4][5650/7330] lr: 1.000e-04, eta: 18:33:19, time: 1.072, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0483, loss_cls: 0.2070, acc: 92.6162, loss_bbox: 0.2555, loss_mask: 0.2456, loss: 0.7770 2024-05-30 20:47:07,891 - mmdet - INFO - Epoch [4][5700/7330] lr: 1.000e-04, eta: 18:32:22, time: 1.092, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0486, loss_cls: 0.1935, acc: 92.9663, loss_bbox: 0.2414, loss_mask: 0.2484, loss: 0.7523 2024-05-30 20:48:07,126 - mmdet - INFO - Epoch [4][5750/7330] lr: 1.000e-04, eta: 18:31:35, time: 1.185, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0221, loss_rpn_bbox: 0.0496, 
loss_cls: 0.2036, acc: 92.5569, loss_bbox: 0.2539, loss_mask: 0.2474, loss: 0.7766 2024-05-30 20:49:00,325 - mmdet - INFO - Epoch [4][5800/7330] lr: 1.000e-04, eta: 18:30:35, time: 1.064, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0500, loss_cls: 0.2091, acc: 92.4297, loss_bbox: 0.2623, loss_mask: 0.2556, loss: 0.7978 2024-05-30 20:49:58,659 - mmdet - INFO - Epoch [4][5850/7330] lr: 1.000e-04, eta: 18:29:46, time: 1.167, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0506, loss_cls: 0.2071, acc: 92.5554, loss_bbox: 0.2500, loss_mask: 0.2491, loss: 0.7776 2024-05-30 20:50:53,332 - mmdet - INFO - Epoch [4][5900/7330] lr: 1.000e-04, eta: 18:28:49, time: 1.093, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0511, loss_cls: 0.2061, acc: 92.6113, loss_bbox: 0.2560, loss_mask: 0.2457, loss: 0.7800 2024-05-30 20:51:47,408 - mmdet - INFO - Epoch [4][5950/7330] lr: 1.000e-04, eta: 18:27:51, time: 1.081, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0222, loss_rpn_bbox: 0.0513, loss_cls: 0.2018, acc: 92.6638, loss_bbox: 0.2514, loss_mask: 0.2464, loss: 0.7732 2024-05-30 20:52:43,696 - mmdet - INFO - Epoch [4][6000/7330] lr: 1.000e-04, eta: 18:26:58, time: 1.126, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0495, loss_cls: 0.2021, acc: 92.6682, loss_bbox: 0.2526, loss_mask: 0.2484, loss: 0.7728 2024-05-30 20:53:38,107 - mmdet - INFO - Epoch [4][6050/7330] lr: 1.000e-04, eta: 18:26:00, time: 1.088, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0509, loss_cls: 0.2004, acc: 92.8530, loss_bbox: 0.2490, loss_mask: 0.2446, loss: 0.7657 2024-05-30 20:54:38,384 - mmdet - INFO - Epoch [4][6100/7330] lr: 1.000e-04, eta: 18:25:15, time: 1.206, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0476, loss_cls: 0.1941, acc: 93.0442, loss_bbox: 0.2385, loss_mask: 0.2442, loss: 0.7443 2024-05-30 20:55:33,510 - mmdet - INFO - Epoch [4][6150/7330] lr: 1.000e-04, eta: 18:24:19, time: 1.103, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0520, loss_cls: 0.2138, acc: 92.4146, loss_bbox: 0.2590, loss_mask: 0.2490, loss: 0.7952 2024-05-30 20:56:27,474 - mmdet - INFO - Epoch [4][6200/7330] lr: 1.000e-04, eta: 18:23:21, time: 1.079, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0483, loss_cls: 0.2054, acc: 92.5588, loss_bbox: 0.2529, loss_mask: 0.2452, loss: 0.7716 2024-05-30 20:57:21,589 - mmdet - INFO - Epoch [4][6250/7330] lr: 1.000e-04, eta: 18:22:23, time: 1.082, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0474, loss_cls: 0.1975, acc: 92.9065, loss_bbox: 0.2478, loss_mask: 0.2341, loss: 0.7451 2024-05-30 20:58:21,347 - mmdet - INFO - Epoch [4][6300/7330] lr: 1.000e-04, eta: 18:21:37, time: 1.195, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0521, loss_cls: 0.2030, acc: 92.6130, loss_bbox: 0.2566, loss_mask: 0.2455, loss: 0.7775 2024-05-30 20:59:15,602 - mmdet - INFO - Epoch [4][6350/7330] lr: 1.000e-04, eta: 18:20:39, time: 1.085, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0217, loss_rpn_bbox: 0.0485, loss_cls: 0.1954, acc: 92.9092, loss_bbox: 0.2442, loss_mask: 0.2494, loss: 0.7591 2024-05-30 21:00:11,732 - mmdet - INFO - Epoch [4][6400/7330] lr: 1.000e-04, eta: 18:19:45, time: 1.123, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0220, loss_rpn_bbox: 0.0495, loss_cls: 0.2068, acc: 92.4817, loss_bbox: 0.2526, loss_mask: 0.2458, loss: 0.7768 2024-05-30 
21:01:06,751 - mmdet - INFO - Epoch [4][6450/7330] lr: 1.000e-04, eta: 18:18:49, time: 1.100, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0508, loss_cls: 0.2042, acc: 92.7334, loss_bbox: 0.2520, loss_mask: 0.2461, loss: 0.7731 2024-05-30 21:02:02,821 - mmdet - INFO - Epoch [4][6500/7330] lr: 1.000e-04, eta: 18:17:55, time: 1.121, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0480, loss_cls: 0.1983, acc: 92.7805, loss_bbox: 0.2477, loss_mask: 0.2452, loss: 0.7599 2024-05-30 21:02:57,547 - mmdet - INFO - Epoch [4][6550/7330] lr: 1.000e-04, eta: 18:16:58, time: 1.095, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0482, loss_cls: 0.1984, acc: 92.8354, loss_bbox: 0.2433, loss_mask: 0.2390, loss: 0.7505 2024-05-30 21:03:53,691 - mmdet - INFO - Epoch [4][6600/7330] lr: 1.000e-04, eta: 18:16:04, time: 1.123, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0479, loss_cls: 0.2044, acc: 92.5667, loss_bbox: 0.2515, loss_mask: 0.2501, loss: 0.7758 2024-05-30 21:04:47,294 - mmdet - INFO - Epoch [4][6650/7330] lr: 1.000e-04, eta: 18:15:05, time: 1.072, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0455, loss_cls: 0.1984, acc: 93.0405, loss_bbox: 0.2406, loss_mask: 0.2407, loss: 0.7450 2024-05-30 21:05:48,468 - mmdet - INFO - Epoch [4][6700/7330] lr: 1.000e-04, eta: 18:14:22, time: 1.223, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0509, loss_cls: 0.2106, acc: 92.2781, loss_bbox: 0.2638, loss_mask: 0.2524, loss: 0.8015 2024-05-30 21:06:42,763 - mmdet - INFO - Epoch [4][6750/7330] lr: 1.000e-04, eta: 18:13:24, time: 1.086, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0226, loss_rpn_bbox: 0.0503, loss_cls: 0.1993, acc: 92.8315, loss_bbox: 0.2436, loss_mask: 0.2497, loss: 0.7656 2024-05-30 21:07:36,683 - mmdet - INFO - Epoch [4][6800/7330] lr: 1.000e-04, eta: 18:12:26, time: 1.078, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0476, loss_cls: 0.1972, acc: 92.7620, loss_bbox: 0.2475, loss_mask: 0.2425, loss: 0.7562 2024-05-30 21:08:35,725 - mmdet - INFO - Epoch [4][6850/7330] lr: 1.000e-04, eta: 18:11:38, time: 1.181, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0454, loss_cls: 0.1985, acc: 92.8562, loss_bbox: 0.2436, loss_mask: 0.2448, loss: 0.7526 2024-05-30 21:09:29,956 - mmdet - INFO - Epoch [4][6900/7330] lr: 1.000e-04, eta: 18:10:40, time: 1.085, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0486, loss_cls: 0.2047, acc: 92.5667, loss_bbox: 0.2557, loss_mask: 0.2458, loss: 0.7753 2024-05-30 21:10:24,318 - mmdet - INFO - Epoch [4][6950/7330] lr: 1.000e-04, eta: 18:09:43, time: 1.087, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0473, loss_cls: 0.1935, acc: 92.9368, loss_bbox: 0.2390, loss_mask: 0.2430, loss: 0.7444 2024-05-30 21:11:21,147 - mmdet - INFO - Epoch [4][7000/7330] lr: 1.000e-04, eta: 18:08:50, time: 1.137, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0506, loss_cls: 0.2053, acc: 92.6479, loss_bbox: 0.2549, loss_mask: 0.2509, loss: 0.7825 2024-05-30 21:12:15,065 - mmdet - INFO - Epoch [4][7050/7330] lr: 1.000e-04, eta: 18:07:52, time: 1.078, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0496, loss_cls: 0.2011, acc: 92.7957, loss_bbox: 0.2491, loss_mask: 0.2444, loss: 0.7641 2024-05-30 21:13:11,378 - mmdet - INFO - Epoch [4][7100/7330] lr: 1.000e-04, eta: 18:06:58, time: 1.126, 
data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0238, loss_rpn_bbox: 0.0470, loss_cls: 0.2094, acc: 92.6230, loss_bbox: 0.2540, loss_mask: 0.2462, loss: 0.7805
2024-05-30 21:14:08,705 - mmdet - INFO - Epoch [4][7150/7330] lr: 1.000e-04, eta: 18:06:07, time: 1.147, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0240, loss_rpn_bbox: 0.0498, loss_cls: 0.2020, acc: 92.7383, loss_bbox: 0.2481, loss_mask: 0.2515, loss: 0.7754
2024-05-30 21:15:03,463 - mmdet - INFO - Epoch [4][7200/7330] lr: 1.000e-04, eta: 18:05:10, time: 1.095, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0462, loss_cls: 0.1963, acc: 92.8508, loss_bbox: 0.2491, loss_mask: 0.2437, loss: 0.7560
2024-05-30 21:15:58,223 - mmdet - INFO - Epoch [4][7250/7330] lr: 1.000e-04, eta: 18:04:13, time: 1.095, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0229, loss_rpn_bbox: 0.0523, loss_cls: 0.2096, acc: 92.3992, loss_bbox: 0.2615, loss_mask: 0.2431, loss: 0.7894
2024-05-30 21:16:56,440 - mmdet - INFO - Epoch [4][7300/7330] lr: 1.000e-04, eta: 18:03:24, time: 1.164, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0482, loss_cls: 0.1998, acc: 92.8381, loss_bbox: 0.2471, loss_mask: 0.2410, loss: 0.7574
2024-05-30 21:17:32,080 - mmdet - INFO - Saving checkpoint at 4 epochs
2024-05-30 21:19:55,710 - mmdet - INFO - Evaluating bbox...
2024-05-30 21:20:17,510 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.439
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.684
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.481
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.271
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.477
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.593
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.562
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.562
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.562
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.385
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.602
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.714
2024-05-30 21:20:17,511 - mmdet - INFO - Evaluating segm...
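The bbox table above (and the segm table that follows) is the standard COCO summary: the headline AP averages IoU thresholds 0.50:0.95, AP50/AP75 fix a single threshold, and the small/medium/large rows split results by object area. The same six AP values reappear in the Epoch(val) line below as a *_mAP_copypaste string; a small sketch for labelling such a string (the helper name and key names are illustrative, not an mmdet API):

    # Hypothetical helper: attach labels to the six numbers of a "*_mAP_copypaste" string.
    def parse_copypaste(s):
        keys = ("mAP", "mAP_50", "mAP_75", "mAP_s", "mAP_m", "mAP_l")
        return dict(zip(keys, map(float, s.split())))

    print(parse_copypaste("0.439 0.684 0.481 0.271 0.477 0.593"))  # epoch-4 bbox numbers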
2024-05-30 21:20:42,666 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.404
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.650
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.430
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.194
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.437
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.613
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.517
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.517
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.517
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.322
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.560
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.699
2024-05-30 21:20:42,990 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-30 21:20:42,991 - mmdet - INFO - Epoch(val) [4][625] bbox_mAP: 0.4390, bbox_mAP_50: 0.6840, bbox_mAP_75: 0.4810, bbox_mAP_s: 0.2710, bbox_mAP_m: 0.4770, bbox_mAP_l: 0.5930, bbox_mAP_copypaste: 0.439 0.684 0.481 0.271 0.477 0.593, segm_mAP: 0.4040, segm_mAP_50: 0.6500, segm_mAP_75: 0.4300, segm_mAP_s: 0.1940, segm_mAP_m: 0.4370, segm_mAP_l: 0.6130, segm_mAP_copypaste: 0.404 0.650 0.430 0.194 0.437 0.613
2024-05-30 21:21:53,256 - mmdet - INFO - Epoch [5][50/7330] lr: 1.000e-04, eta: 18:01:18, time: 1.405, data_time: 0.187, memory: 24290, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0448, loss_cls: 0.1908, acc: 93.0469, loss_bbox: 0.2400, loss_mask: 0.2371, loss: 0.7307
2024-05-30 21:22:49,927 - mmdet - INFO - Epoch [5][100/7330] lr: 1.000e-04, eta: 18:00:26, time: 1.133, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0469, loss_cls: 0.1903, acc: 93.0857, loss_bbox: 0.2407, loss_mask: 0.2389, loss: 0.7371
2024-05-30 21:23:46,816 - mmdet - INFO - Epoch [5][150/7330] lr: 1.000e-04, eta: 17:59:33, time: 1.138, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0446, loss_cls: 0.1863, acc: 93.2502, loss_bbox: 0.2349, loss_mask: 0.2317, loss: 0.7168
2024-05-30 21:24:41,277 - mmdet - INFO - Epoch [5][200/7330] lr: 1.000e-04, eta: 17:58:36, time: 1.089, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0465, loss_cls: 0.1808, acc: 93.3765, loss_bbox: 0.2356, loss_mask: 0.2313, loss: 0.7124
2024-05-30 21:25:37,117 - mmdet - INFO - Epoch [5][250/7330] lr: 1.000e-04, eta: 17:57:42, time: 1.117, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0506, loss_cls: 0.2000, acc: 92.7046, loss_bbox: 0.2539, loss_mask: 0.2410, loss: 0.7645
2024-05-30 21:26:31,714 - mmdet - INFO - Epoch [5][300/7330] lr: 1.000e-04, eta: 17:56:45, time: 1.092, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0486, loss_cls: 0.1825, acc: 93.3733, loss_bbox: 0.2331, loss_mask: 0.2390, loss: 0.7207
2024-05-30 21:27:26,602 - mmdet - INFO - Epoch [5][350/7330] lr: 1.000e-04, eta: 17:55:48, time: 1.098, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0494, loss_cls: 0.1984, acc: 92.7349, loss_bbox: 0.2563, loss_mask: 0.2474, loss: 0.7713
2024-05-30 21:28:20,942 - mmdet - INFO - Epoch [5][400/7330] lr: 1.000e-04, eta: 17:54:51, time: 1.087, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0470, loss_cls: 0.1992, acc: 92.7415, loss_bbox:
0.2495, loss_mask: 0.2493, loss: 0.7648 2024-05-30 21:29:17,417 - mmdet - INFO - Epoch [5][450/7330] lr: 1.000e-04, eta: 17:53:58, time: 1.129, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0497, loss_cls: 0.1869, acc: 93.1489, loss_bbox: 0.2350, loss_mask: 0.2381, loss: 0.7290 2024-05-30 21:30:15,309 - mmdet - INFO - Epoch [5][500/7330] lr: 1.000e-04, eta: 17:53:07, time: 1.158, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0486, loss_cls: 0.1981, acc: 92.7266, loss_bbox: 0.2502, loss_mask: 0.2402, loss: 0.7579 2024-05-30 21:31:11,746 - mmdet - INFO - Epoch [5][550/7330] lr: 1.000e-04, eta: 17:52:14, time: 1.129, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0426, loss_cls: 0.1778, acc: 93.3655, loss_bbox: 0.2311, loss_mask: 0.2304, loss: 0.6993 2024-05-30 21:32:10,480 - mmdet - INFO - Epoch [5][600/7330] lr: 1.000e-04, eta: 17:51:25, time: 1.175, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0460, loss_cls: 0.1908, acc: 93.0027, loss_bbox: 0.2416, loss_mask: 0.2429, loss: 0.7402 2024-05-30 21:33:09,907 - mmdet - INFO - Epoch [5][650/7330] lr: 1.000e-04, eta: 17:50:38, time: 1.189, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0467, loss_cls: 0.1892, acc: 93.0176, loss_bbox: 0.2419, loss_mask: 0.2379, loss: 0.7345 2024-05-30 21:34:04,912 - mmdet - INFO - Epoch [5][700/7330] lr: 1.000e-04, eta: 17:49:42, time: 1.100, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0486, loss_cls: 0.1974, acc: 92.8125, loss_bbox: 0.2490, loss_mask: 0.2413, loss: 0.7549 2024-05-30 21:35:01,041 - mmdet - INFO - Epoch [5][750/7330] lr: 1.000e-04, eta: 17:48:48, time: 1.123, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0465, loss_cls: 0.1902, acc: 93.0605, loss_bbox: 0.2398, loss_mask: 0.2407, loss: 0.7364 2024-05-30 21:35:55,046 - mmdet - INFO - Epoch [5][800/7330] lr: 1.000e-04, eta: 17:47:50, time: 1.080, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0453, loss_cls: 0.1868, acc: 93.1521, loss_bbox: 0.2373, loss_mask: 0.2374, loss: 0.7255 2024-05-30 21:36:49,511 - mmdet - INFO - Epoch [5][850/7330] lr: 1.000e-04, eta: 17:46:52, time: 1.090, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0499, loss_cls: 0.1915, acc: 92.9949, loss_bbox: 0.2423, loss_mask: 0.2427, loss: 0.7452 2024-05-30 21:37:44,241 - mmdet - INFO - Epoch [5][900/7330] lr: 1.000e-04, eta: 17:45:56, time: 1.094, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0493, loss_cls: 0.1899, acc: 93.0535, loss_bbox: 0.2458, loss_mask: 0.2415, loss: 0.7451 2024-05-30 21:38:38,393 - mmdet - INFO - Epoch [5][950/7330] lr: 1.000e-04, eta: 17:44:58, time: 1.083, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0475, loss_cls: 0.1911, acc: 92.9739, loss_bbox: 0.2465, loss_mask: 0.2348, loss: 0.7401 2024-05-30 21:39:35,333 - mmdet - INFO - Epoch [5][1000/7330] lr: 1.000e-04, eta: 17:44:06, time: 1.139, data_time: 0.096, memory: 24290, loss_rpn_cls: 0.0225, loss_rpn_bbox: 0.0505, loss_cls: 0.1992, acc: 92.6526, loss_bbox: 0.2541, loss_mask: 0.2430, loss: 0.7693 2024-05-30 21:40:29,951 - mmdet - INFO - Epoch [5][1050/7330] lr: 1.000e-04, eta: 17:43:09, time: 1.092, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0458, loss_cls: 0.1878, acc: 93.1321, loss_bbox: 0.2419, loss_mask: 0.2371, loss: 0.7315 2024-05-30 21:41:30,675 - mmdet - INFO - Epoch [5][1100/7330] lr: 
1.000e-04, eta: 17:42:24, time: 1.214, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0466, loss_cls: 0.2060, acc: 92.6589, loss_bbox: 0.2483, loss_mask: 0.2375, loss: 0.7561 2024-05-30 21:42:27,507 - mmdet - INFO - Epoch [5][1150/7330] lr: 1.000e-04, eta: 17:41:31, time: 1.137, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0476, loss_cls: 0.1922, acc: 92.9106, loss_bbox: 0.2423, loss_mask: 0.2411, loss: 0.7437 2024-05-30 21:43:25,754 - mmdet - INFO - Epoch [5][1200/7330] lr: 1.000e-04, eta: 17:40:41, time: 1.165, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0485, loss_cls: 0.1944, acc: 92.9060, loss_bbox: 0.2418, loss_mask: 0.2384, loss: 0.7416 2024-05-30 21:44:22,307 - mmdet - INFO - Epoch [5][1250/7330] lr: 1.000e-04, eta: 17:39:48, time: 1.131, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0464, loss_cls: 0.1935, acc: 92.9529, loss_bbox: 0.2430, loss_mask: 0.2382, loss: 0.7401 2024-05-30 21:45:17,123 - mmdet - INFO - Epoch [5][1300/7330] lr: 1.000e-04, eta: 17:38:51, time: 1.096, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0484, loss_cls: 0.1935, acc: 92.9531, loss_bbox: 0.2428, loss_mask: 0.2398, loss: 0.7452 2024-05-30 21:46:14,418 - mmdet - INFO - Epoch [5][1350/7330] lr: 1.000e-04, eta: 17:37:59, time: 1.146, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0510, loss_cls: 0.1968, acc: 92.8230, loss_bbox: 0.2495, loss_mask: 0.2397, loss: 0.7570 2024-05-30 21:47:09,371 - mmdet - INFO - Epoch [5][1400/7330] lr: 1.000e-04, eta: 17:37:03, time: 1.099, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0476, loss_cls: 0.1897, acc: 93.0144, loss_bbox: 0.2416, loss_mask: 0.2341, loss: 0.7316 2024-05-30 21:48:04,458 - mmdet - INFO - Epoch [5][1450/7330] lr: 1.000e-04, eta: 17:36:07, time: 1.102, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0491, loss_cls: 0.2022, acc: 92.6389, loss_bbox: 0.2534, loss_mask: 0.2389, loss: 0.7630 2024-05-30 21:49:01,295 - mmdet - INFO - Epoch [5][1500/7330] lr: 1.000e-04, eta: 17:35:14, time: 1.137, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0464, loss_cls: 0.1856, acc: 93.1860, loss_bbox: 0.2362, loss_mask: 0.2396, loss: 0.7261 2024-05-30 21:49:56,067 - mmdet - INFO - Epoch [5][1550/7330] lr: 1.000e-04, eta: 17:34:18, time: 1.095, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0455, loss_cls: 0.1841, acc: 93.3076, loss_bbox: 0.2313, loss_mask: 0.2329, loss: 0.7117 2024-05-30 21:50:50,180 - mmdet - INFO - Epoch [5][1600/7330] lr: 1.000e-04, eta: 17:33:20, time: 1.082, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0451, loss_cls: 0.1874, acc: 93.2754, loss_bbox: 0.2350, loss_mask: 0.2391, loss: 0.7241 2024-05-30 21:51:46,315 - mmdet - INFO - Epoch [5][1650/7330] lr: 1.000e-04, eta: 17:32:26, time: 1.123, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0476, loss_cls: 0.1981, acc: 92.8879, loss_bbox: 0.2453, loss_mask: 0.2393, loss: 0.7488 2024-05-30 21:52:45,580 - mmdet - INFO - Epoch [5][1700/7330] lr: 1.000e-04, eta: 17:31:38, time: 1.185, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0485, loss_cls: 0.1937, acc: 92.9570, loss_bbox: 0.2470, loss_mask: 0.2451, loss: 0.7532 2024-05-30 21:53:44,963 - mmdet - INFO - Epoch [5][1750/7330] lr: 1.000e-04, eta: 17:30:49, time: 1.188, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0203, 
loss_rpn_bbox: 0.0494, loss_cls: 0.2016, acc: 92.6565, loss_bbox: 0.2553, loss_mask: 0.2448, loss: 0.7714 2024-05-30 21:54:39,480 - mmdet - INFO - Epoch [5][1800/7330] lr: 1.000e-04, eta: 17:29:52, time: 1.090, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0461, loss_cls: 0.1992, acc: 92.7854, loss_bbox: 0.2476, loss_mask: 0.2443, loss: 0.7556 2024-05-30 21:55:35,979 - mmdet - INFO - Epoch [5][1850/7330] lr: 1.000e-04, eta: 17:28:59, time: 1.130, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0471, loss_cls: 0.1902, acc: 93.0083, loss_bbox: 0.2392, loss_mask: 0.2372, loss: 0.7336 2024-05-30 21:56:32,631 - mmdet - INFO - Epoch [5][1900/7330] lr: 1.000e-04, eta: 17:28:06, time: 1.133, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0480, loss_cls: 0.1962, acc: 92.8647, loss_bbox: 0.2411, loss_mask: 0.2443, loss: 0.7497 2024-05-30 21:57:26,729 - mmdet - INFO - Epoch [5][1950/7330] lr: 1.000e-04, eta: 17:27:08, time: 1.082, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0477, loss_cls: 0.1949, acc: 92.8586, loss_bbox: 0.2483, loss_mask: 0.2418, loss: 0.7523 2024-05-30 21:58:20,668 - mmdet - INFO - Epoch [5][2000/7330] lr: 1.000e-04, eta: 17:26:10, time: 1.079, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0483, loss_cls: 0.1885, acc: 93.0925, loss_bbox: 0.2389, loss_mask: 0.2408, loss: 0.7365 2024-05-30 21:59:14,966 - mmdet - INFO - Epoch [5][2050/7330] lr: 1.000e-04, eta: 17:25:12, time: 1.086, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0461, loss_cls: 0.1916, acc: 92.8953, loss_bbox: 0.2449, loss_mask: 0.2402, loss: 0.7416 2024-05-30 22:00:09,962 - mmdet - INFO - Epoch [5][2100/7330] lr: 1.000e-04, eta: 17:24:16, time: 1.100, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0493, loss_cls: 0.1949, acc: 92.9563, loss_bbox: 0.2438, loss_mask: 0.2368, loss: 0.7432 2024-05-30 22:01:07,422 - mmdet - INFO - Epoch [5][2150/7330] lr: 1.000e-04, eta: 17:23:25, time: 1.149, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0498, loss_cls: 0.1986, acc: 92.7878, loss_bbox: 0.2457, loss_mask: 0.2410, loss: 0.7560 2024-05-30 22:02:04,382 - mmdet - INFO - Epoch [5][2200/7330] lr: 1.000e-04, eta: 17:22:32, time: 1.139, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0487, loss_cls: 0.1982, acc: 92.9038, loss_bbox: 0.2393, loss_mask: 0.2335, loss: 0.7376 2024-05-30 22:03:01,541 - mmdet - INFO - Epoch [5][2250/7330] lr: 1.000e-04, eta: 17:21:40, time: 1.143, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0509, loss_cls: 0.1930, acc: 92.9175, loss_bbox: 0.2456, loss_mask: 0.2440, loss: 0.7522 2024-05-30 22:04:02,481 - mmdet - INFO - Epoch [5][2300/7330] lr: 1.000e-04, eta: 17:20:54, time: 1.219, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0485, loss_cls: 0.1924, acc: 92.9700, loss_bbox: 0.2440, loss_mask: 0.2445, loss: 0.7494 2024-05-30 22:04:56,814 - mmdet - INFO - Epoch [5][2350/7330] lr: 1.000e-04, eta: 17:19:57, time: 1.087, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0469, loss_cls: 0.1875, acc: 93.1521, loss_bbox: 0.2381, loss_mask: 0.2385, loss: 0.7300 2024-05-30 22:05:51,094 - mmdet - INFO - Epoch [5][2400/7330] lr: 1.000e-04, eta: 17:18:59, time: 1.086, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0475, loss_cls: 0.1917, acc: 92.9932, loss_bbox: 0.2437, loss_mask: 0.2430, 
loss: 0.7458 2024-05-30 22:06:47,108 - mmdet - INFO - Epoch [5][2450/7330] lr: 1.000e-04, eta: 17:18:05, time: 1.120, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0464, loss_cls: 0.1933, acc: 92.9888, loss_bbox: 0.2422, loss_mask: 0.2391, loss: 0.7400 2024-05-30 22:07:44,062 - mmdet - INFO - Epoch [5][2500/7330] lr: 1.000e-04, eta: 17:17:12, time: 1.139, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0215, loss_rpn_bbox: 0.0496, loss_cls: 0.2016, acc: 92.5862, loss_bbox: 0.2549, loss_mask: 0.2463, loss: 0.7739 2024-05-30 22:08:38,276 - mmdet - INFO - Epoch [5][2550/7330] lr: 1.000e-04, eta: 17:16:15, time: 1.084, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0472, loss_cls: 0.1871, acc: 93.0400, loss_bbox: 0.2364, loss_mask: 0.2405, loss: 0.7310 2024-05-30 22:09:32,301 - mmdet - INFO - Epoch [5][2600/7330] lr: 1.000e-04, eta: 17:15:17, time: 1.081, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0476, loss_cls: 0.1915, acc: 92.9204, loss_bbox: 0.2438, loss_mask: 0.2408, loss: 0.7421 2024-05-30 22:10:27,391 - mmdet - INFO - Epoch [5][2650/7330] lr: 1.000e-04, eta: 17:14:21, time: 1.102, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0478, loss_cls: 0.1936, acc: 93.0403, loss_bbox: 0.2388, loss_mask: 0.2357, loss: 0.7352 2024-05-30 22:11:22,224 - mmdet - INFO - Epoch [5][2700/7330] lr: 1.000e-04, eta: 17:13:24, time: 1.097, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0464, loss_cls: 0.1871, acc: 93.1587, loss_bbox: 0.2385, loss_mask: 0.2357, loss: 0.7274 2024-05-30 22:12:21,597 - mmdet - INFO - Epoch [5][2750/7330] lr: 1.000e-04, eta: 17:12:36, time: 1.188, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0193, loss_rpn_bbox: 0.0481, loss_cls: 0.1915, acc: 93.0015, loss_bbox: 0.2434, loss_mask: 0.2367, loss: 0.7390 2024-05-30 22:13:15,733 - mmdet - INFO - Epoch [5][2800/7330] lr: 1.000e-04, eta: 17:11:38, time: 1.083, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0464, loss_cls: 0.1876, acc: 93.2102, loss_bbox: 0.2373, loss_mask: 0.2374, loss: 0.7287 2024-05-30 22:14:19,125 - mmdet - INFO - Epoch [5][2850/7330] lr: 1.000e-04, eta: 17:10:56, time: 1.268, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0469, loss_cls: 0.1919, acc: 92.9490, loss_bbox: 0.2500, loss_mask: 0.2444, loss: 0.7525 2024-05-30 22:15:13,246 - mmdet - INFO - Epoch [5][2900/7330] lr: 1.000e-04, eta: 17:09:59, time: 1.082, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0464, loss_cls: 0.1915, acc: 92.9592, loss_bbox: 0.2413, loss_mask: 0.2354, loss: 0.7342 2024-05-30 22:16:08,394 - mmdet - INFO - Epoch [5][2950/7330] lr: 1.000e-04, eta: 17:09:03, time: 1.103, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0489, loss_cls: 0.1960, acc: 92.5703, loss_bbox: 0.2566, loss_mask: 0.2461, loss: 0.7661 2024-05-30 22:17:03,684 - mmdet - INFO - Epoch [5][3000/7330] lr: 1.000e-04, eta: 17:08:07, time: 1.106, data_time: 0.097, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0521, loss_cls: 0.2033, acc: 92.5852, loss_bbox: 0.2514, loss_mask: 0.2438, loss: 0.7713 2024-05-30 22:17:59,465 - mmdet - INFO - Epoch [5][3050/7330] lr: 1.000e-04, eta: 17:07:12, time: 1.116, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0443, loss_cls: 0.1940, acc: 92.9170, loss_bbox: 0.2439, loss_mask: 0.2361, loss: 0.7367 2024-05-30 22:18:56,058 - mmdet - INFO - Epoch [5][3100/7330] lr: 1.000e-04, eta: 
17:06:19, time: 1.132, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0466, loss_cls: 0.1929, acc: 92.8535, loss_bbox: 0.2454, loss_mask: 0.2389, loss: 0.7429 2024-05-30 22:19:50,396 - mmdet - INFO - Epoch [5][3150/7330] lr: 1.000e-04, eta: 17:05:21, time: 1.087, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0213, loss_rpn_bbox: 0.0485, loss_cls: 0.1887, acc: 93.1006, loss_bbox: 0.2411, loss_mask: 0.2373, loss: 0.7370 2024-05-30 22:20:44,678 - mmdet - INFO - Epoch [5][3200/7330] lr: 1.000e-04, eta: 17:04:24, time: 1.086, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0492, loss_cls: 0.1969, acc: 92.8635, loss_bbox: 0.2472, loss_mask: 0.2460, loss: 0.7582 2024-05-30 22:21:38,907 - mmdet - INFO - Epoch [5][3250/7330] lr: 1.000e-04, eta: 17:03:27, time: 1.085, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0463, loss_cls: 0.1918, acc: 93.0029, loss_bbox: 0.2429, loss_mask: 0.2428, loss: 0.7432 2024-05-30 22:22:35,178 - mmdet - INFO - Epoch [5][3300/7330] lr: 1.000e-04, eta: 17:02:33, time: 1.125, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0446, loss_cls: 0.1818, acc: 93.4082, loss_bbox: 0.2313, loss_mask: 0.2368, loss: 0.7132 2024-05-30 22:23:33,016 - mmdet - INFO - Epoch [5][3350/7330] lr: 1.000e-04, eta: 17:01:41, time: 1.157, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0451, loss_cls: 0.1832, acc: 93.2456, loss_bbox: 0.2334, loss_mask: 0.2400, loss: 0.7199 2024-05-30 22:24:32,038 - mmdet - INFO - Epoch [5][3400/7330] lr: 1.000e-04, eta: 17:00:52, time: 1.180, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0481, loss_cls: 0.1930, acc: 92.9924, loss_bbox: 0.2363, loss_mask: 0.2390, loss: 0.7378 2024-05-30 22:25:27,905 - mmdet - INFO - Epoch [5][3450/7330] lr: 1.000e-04, eta: 16:59:57, time: 1.117, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0463, loss_cls: 0.1886, acc: 93.1509, loss_bbox: 0.2291, loss_mask: 0.2402, loss: 0.7246 2024-05-30 22:26:23,282 - mmdet - INFO - Epoch [5][3500/7330] lr: 1.000e-04, eta: 16:59:02, time: 1.108, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0200, loss_rpn_bbox: 0.0474, loss_cls: 0.1995, acc: 92.6472, loss_bbox: 0.2469, loss_mask: 0.2416, loss: 0.7554 2024-05-30 22:27:19,009 - mmdet - INFO - Epoch [5][3550/7330] lr: 1.000e-04, eta: 16:58:07, time: 1.114, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0206, loss_rpn_bbox: 0.0482, loss_cls: 0.2051, acc: 92.4985, loss_bbox: 0.2541, loss_mask: 0.2449, loss: 0.7728 2024-05-30 22:28:14,757 - mmdet - INFO - Epoch [5][3600/7330] lr: 1.000e-04, eta: 16:57:12, time: 1.115, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0511, loss_cls: 0.1975, acc: 92.6895, loss_bbox: 0.2526, loss_mask: 0.2455, loss: 0.7658 2024-05-30 22:29:10,988 - mmdet - INFO - Epoch [5][3650/7330] lr: 1.000e-04, eta: 16:56:18, time: 1.125, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0486, loss_cls: 0.1859, acc: 93.1184, loss_bbox: 0.2402, loss_mask: 0.2414, loss: 0.7347 2024-05-30 22:30:06,229 - mmdet - INFO - Epoch [5][3700/7330] lr: 1.000e-04, eta: 16:55:22, time: 1.105, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0472, loss_cls: 0.1929, acc: 92.8818, loss_bbox: 0.2449, loss_mask: 0.2443, loss: 0.7479 2024-05-30 22:31:00,123 - mmdet - INFO - Epoch [5][3750/7330] lr: 1.000e-04, eta: 16:54:24, time: 1.078, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 
0.0447, loss_cls: 0.1886, acc: 93.0256, loss_bbox: 0.2394, loss_mask: 0.2372, loss: 0.7284 2024-05-30 22:31:55,439 - mmdet - INFO - Epoch [5][3800/7330] lr: 1.000e-04, eta: 16:53:28, time: 1.106, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0467, loss_cls: 0.1914, acc: 93.0122, loss_bbox: 0.2408, loss_mask: 0.2388, loss: 0.7365 2024-05-30 22:32:51,933 - mmdet - INFO - Epoch [5][3850/7330] lr: 1.000e-04, eta: 16:52:34, time: 1.130, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0462, loss_cls: 0.1889, acc: 93.2129, loss_bbox: 0.2349, loss_mask: 0.2391, loss: 0.7286 2024-05-30 22:33:49,877 - mmdet - INFO - Epoch [5][3900/7330] lr: 1.000e-04, eta: 16:51:43, time: 1.159, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0477, loss_cls: 0.1922, acc: 92.9514, loss_bbox: 0.2377, loss_mask: 0.2405, loss: 0.7385 2024-05-30 22:34:47,675 - mmdet - INFO - Epoch [5][3950/7330] lr: 1.000e-04, eta: 16:50:51, time: 1.156, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0457, loss_cls: 0.1853, acc: 93.1899, loss_bbox: 0.2329, loss_mask: 0.2366, loss: 0.7188 2024-05-30 22:35:42,999 - mmdet - INFO - Epoch [5][4000/7330] lr: 1.000e-04, eta: 16:49:56, time: 1.106, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0211, loss_rpn_bbox: 0.0479, loss_cls: 0.1928, acc: 93.0027, loss_bbox: 0.2413, loss_mask: 0.2384, loss: 0.7415 2024-05-30 22:36:39,939 - mmdet - INFO - Epoch [5][4050/7330] lr: 1.000e-04, eta: 16:49:03, time: 1.139, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0455, loss_cls: 0.1858, acc: 93.2866, loss_bbox: 0.2329, loss_mask: 0.2376, loss: 0.7191 2024-05-30 22:37:36,722 - mmdet - INFO - Epoch [5][4100/7330] lr: 1.000e-04, eta: 16:48:10, time: 1.136, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0462, loss_cls: 0.1900, acc: 93.0623, loss_bbox: 0.2375, loss_mask: 0.2388, loss: 0.7296 2024-05-30 22:38:30,644 - mmdet - INFO - Epoch [5][4150/7330] lr: 1.000e-04, eta: 16:47:12, time: 1.078, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0437, loss_cls: 0.1827, acc: 93.3645, loss_bbox: 0.2276, loss_mask: 0.2358, loss: 0.7076 2024-05-30 22:39:24,601 - mmdet - INFO - Epoch [5][4200/7330] lr: 1.000e-04, eta: 16:46:14, time: 1.079, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0463, loss_cls: 0.1865, acc: 93.1675, loss_bbox: 0.2373, loss_mask: 0.2402, loss: 0.7299 2024-05-30 22:40:21,337 - mmdet - INFO - Epoch [5][4250/7330] lr: 1.000e-04, eta: 16:45:20, time: 1.135, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0491, loss_cls: 0.2026, acc: 92.6143, loss_bbox: 0.2527, loss_mask: 0.2408, loss: 0.7661 2024-05-30 22:41:16,327 - mmdet - INFO - Epoch [5][4300/7330] lr: 1.000e-04, eta: 16:44:24, time: 1.100, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0470, loss_cls: 0.1981, acc: 92.7849, loss_bbox: 0.2455, loss_mask: 0.2406, loss: 0.7490 2024-05-30 22:42:11,103 - mmdet - INFO - Epoch [5][4350/7330] lr: 1.000e-04, eta: 16:43:28, time: 1.096, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0489, loss_cls: 0.1974, acc: 92.7449, loss_bbox: 0.2513, loss_mask: 0.2444, loss: 0.7622 2024-05-30 22:43:07,110 - mmdet - INFO - Epoch [5][4400/7330] lr: 1.000e-04, eta: 16:42:33, time: 1.120, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0460, loss_cls: 0.1894, acc: 93.1418, loss_bbox: 0.2358, loss_mask: 0.2368, loss: 0.7281 
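The eta field in these records is consistent with the remaining iteration count multiplied by the smoothed per-iteration time, assuming the standard 1x schedule of 12 epochs that the config name suggests. A rough check against the Epoch [5][4400/7330] record directly above, which reports eta: 16:42:33 with recent iteration times around 1.1 s:

    # Rough ETA check (assumes a 12-epoch 1x schedule; 1.11 s/iter approximates the
    # smoothed iteration time reported around this point in epoch 5).
    iters_per_epoch = 7330
    remaining_iters = (12 - 5) * iters_per_epoch + (iters_per_epoch - 4400)
    print(remaining_iters * 1.11 / 3600)  # ~16.7 hours, close to the logged eta of 16:42:33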
2024-05-30 22:44:03,474 - mmdet - INFO - Epoch [5][4450/7330] lr: 1.000e-04, eta: 16:41:39, time: 1.127, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0481, loss_cls: 0.1949, acc: 92.8633, loss_bbox: 0.2471, loss_mask: 0.2347, loss: 0.7439 2024-05-30 22:45:03,768 - mmdet - INFO - Epoch [5][4500/7330] lr: 1.000e-04, eta: 16:40:51, time: 1.206, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0478, loss_cls: 0.1976, acc: 92.8184, loss_bbox: 0.2468, loss_mask: 0.2394, loss: 0.7508 2024-05-30 22:45:58,287 - mmdet - INFO - Epoch [5][4550/7330] lr: 1.000e-04, eta: 16:39:54, time: 1.090, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0468, loss_cls: 0.1917, acc: 93.0603, loss_bbox: 0.2351, loss_mask: 0.2345, loss: 0.7259 2024-05-30 22:46:55,438 - mmdet - INFO - Epoch [5][4600/7330] lr: 1.000e-04, eta: 16:39:02, time: 1.143, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0484, loss_cls: 0.1997, acc: 92.6719, loss_bbox: 0.2487, loss_mask: 0.2441, loss: 0.7611 2024-05-30 22:47:50,398 - mmdet - INFO - Epoch [5][4650/7330] lr: 1.000e-04, eta: 16:38:05, time: 1.099, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0457, loss_cls: 0.1907, acc: 92.9224, loss_bbox: 0.2430, loss_mask: 0.2385, loss: 0.7362 2024-05-30 22:48:45,116 - mmdet - INFO - Epoch [5][4700/7330] lr: 1.000e-04, eta: 16:37:09, time: 1.094, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0467, loss_cls: 0.1878, acc: 93.1431, loss_bbox: 0.2425, loss_mask: 0.2340, loss: 0.7289 2024-05-30 22:49:42,190 - mmdet - INFO - Epoch [5][4750/7330] lr: 1.000e-04, eta: 16:36:16, time: 1.141, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0467, loss_cls: 0.1860, acc: 93.2256, loss_bbox: 0.2359, loss_mask: 0.2324, loss: 0.7194 2024-05-30 22:50:36,860 - mmdet - INFO - Epoch [5][4800/7330] lr: 1.000e-04, eta: 16:35:19, time: 1.093, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0459, loss_cls: 0.1840, acc: 93.1963, loss_bbox: 0.2345, loss_mask: 0.2342, loss: 0.7163 2024-05-30 22:51:33,073 - mmdet - INFO - Epoch [5][4850/7330] lr: 1.000e-04, eta: 16:34:25, time: 1.124, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0456, loss_cls: 0.1884, acc: 93.1467, loss_bbox: 0.2416, loss_mask: 0.2435, loss: 0.7393 2024-05-30 22:52:27,231 - mmdet - INFO - Epoch [5][4900/7330] lr: 1.000e-04, eta: 16:33:27, time: 1.083, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0474, loss_cls: 0.1906, acc: 92.8687, loss_bbox: 0.2437, loss_mask: 0.2444, loss: 0.7447 2024-05-30 22:53:24,118 - mmdet - INFO - Epoch [5][4950/7330] lr: 1.000e-04, eta: 16:32:34, time: 1.138, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0203, loss_rpn_bbox: 0.0474, loss_cls: 0.2012, acc: 92.6697, loss_bbox: 0.2474, loss_mask: 0.2411, loss: 0.7574 2024-05-30 22:54:22,360 - mmdet - INFO - Epoch [5][5000/7330] lr: 1.000e-04, eta: 16:31:43, time: 1.165, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0441, loss_cls: 0.1934, acc: 92.9717, loss_bbox: 0.2405, loss_mask: 0.2360, loss: 0.7319 2024-05-30 22:55:18,898 - mmdet - INFO - Epoch [5][5050/7330] lr: 1.000e-04, eta: 16:30:49, time: 1.131, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0499, loss_cls: 0.1964, acc: 92.7910, loss_bbox: 0.2473, loss_mask: 0.2417, loss: 0.7548 2024-05-30 22:56:14,587 - mmdet - INFO - Epoch [5][5100/7330] lr: 1.000e-04, eta: 16:29:54, 
time: 1.114, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0471, loss_cls: 0.1874, acc: 93.2380, loss_bbox: 0.2367, loss_mask: 0.2427, loss: 0.7333 2024-05-30 22:57:09,176 - mmdet - INFO - Epoch [5][5150/7330] lr: 1.000e-04, eta: 16:28:57, time: 1.092, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0460, loss_cls: 0.1893, acc: 93.0706, loss_bbox: 0.2370, loss_mask: 0.2393, loss: 0.7307 2024-05-30 22:58:05,201 - mmdet - INFO - Epoch [5][5200/7330] lr: 1.000e-04, eta: 16:28:03, time: 1.120, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0462, loss_cls: 0.1897, acc: 93.2112, loss_bbox: 0.2338, loss_mask: 0.2339, loss: 0.7237 2024-05-30 22:59:00,134 - mmdet - INFO - Epoch [5][5250/7330] lr: 1.000e-04, eta: 16:27:06, time: 1.099, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0486, loss_cls: 0.1897, acc: 93.1221, loss_bbox: 0.2353, loss_mask: 0.2421, loss: 0.7358 2024-05-30 22:59:53,900 - mmdet - INFO - Epoch [5][5300/7330] lr: 1.000e-04, eta: 16:26:08, time: 1.075, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0201, loss_rpn_bbox: 0.0478, loss_cls: 0.1924, acc: 93.0178, loss_bbox: 0.2356, loss_mask: 0.2296, loss: 0.7255 2024-05-30 23:00:49,684 - mmdet - INFO - Epoch [5][5350/7330] lr: 1.000e-04, eta: 16:25:13, time: 1.115, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0471, loss_cls: 0.1939, acc: 93.0181, loss_bbox: 0.2410, loss_mask: 0.2390, loss: 0.7391 2024-05-30 23:01:43,256 - mmdet - INFO - Epoch [5][5400/7330] lr: 1.000e-04, eta: 16:24:15, time: 1.072, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0461, loss_cls: 0.1862, acc: 93.1875, loss_bbox: 0.2420, loss_mask: 0.2395, loss: 0.7326 2024-05-30 23:02:38,012 - mmdet - INFO - Epoch [5][5450/7330] lr: 1.000e-04, eta: 16:23:18, time: 1.095, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0212, loss_rpn_bbox: 0.0496, loss_cls: 0.1934, acc: 92.9204, loss_bbox: 0.2406, loss_mask: 0.2292, loss: 0.7339 2024-05-30 23:03:38,664 - mmdet - INFO - Epoch [5][5500/7330] lr: 1.000e-04, eta: 16:22:31, time: 1.212, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0482, loss_cls: 0.1944, acc: 92.8552, loss_bbox: 0.2486, loss_mask: 0.2375, loss: 0.7466 2024-05-30 23:04:38,037 - mmdet - INFO - Epoch [5][5550/7330] lr: 1.000e-04, eta: 16:21:41, time: 1.188, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0453, loss_cls: 0.1813, acc: 93.3066, loss_bbox: 0.2312, loss_mask: 0.2330, loss: 0.7081 2024-05-30 23:05:34,099 - mmdet - INFO - Epoch [5][5600/7330] lr: 1.000e-04, eta: 16:20:47, time: 1.121, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0477, loss_cls: 0.1908, acc: 92.9653, loss_bbox: 0.2445, loss_mask: 0.2431, loss: 0.7455 2024-05-30 23:06:28,592 - mmdet - INFO - Epoch [5][5650/7330] lr: 1.000e-04, eta: 16:19:50, time: 1.090, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0214, loss_rpn_bbox: 0.0475, loss_cls: 0.1830, acc: 93.2871, loss_bbox: 0.2366, loss_mask: 0.2382, loss: 0.7267 2024-05-30 23:07:25,372 - mmdet - INFO - Epoch [5][5700/7330] lr: 1.000e-04, eta: 16:18:56, time: 1.136, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0207, loss_rpn_bbox: 0.0475, loss_cls: 0.1840, acc: 93.2371, loss_bbox: 0.2359, loss_mask: 0.2371, loss: 0.7253 2024-05-30 23:08:19,096 - mmdet - INFO - Epoch [5][5750/7330] lr: 1.000e-04, eta: 16:17:58, time: 1.074, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0449, 
loss_cls: 0.1901, acc: 93.1360, loss_bbox: 0.2420, loss_mask: 0.2405, loss: 0.7359 2024-05-30 23:09:16,873 - mmdet - INFO - Epoch [5][5800/7330] lr: 1.000e-04, eta: 16:17:06, time: 1.155, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0505, loss_cls: 0.1917, acc: 92.9133, loss_bbox: 0.2427, loss_mask: 0.2353, loss: 0.7412 2024-05-30 23:10:11,767 - mmdet - INFO - Epoch [5][5850/7330] lr: 1.000e-04, eta: 16:16:10, time: 1.098, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0454, loss_cls: 0.1854, acc: 93.1519, loss_bbox: 0.2365, loss_mask: 0.2385, loss: 0.7248 2024-05-30 23:11:06,305 - mmdet - INFO - Epoch [5][5900/7330] lr: 1.000e-04, eta: 16:15:13, time: 1.091, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0469, loss_cls: 0.1908, acc: 93.0073, loss_bbox: 0.2399, loss_mask: 0.2351, loss: 0.7324 2024-05-30 23:12:02,609 - mmdet - INFO - Epoch [5][5950/7330] lr: 1.000e-04, eta: 16:14:19, time: 1.126, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0504, loss_cls: 0.1932, acc: 92.8757, loss_bbox: 0.2476, loss_mask: 0.2440, loss: 0.7557 2024-05-30 23:12:57,468 - mmdet - INFO - Epoch [5][6000/7330] lr: 1.000e-04, eta: 16:13:22, time: 1.097, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0190, loss_rpn_bbox: 0.0462, loss_cls: 0.1907, acc: 93.1392, loss_bbox: 0.2423, loss_mask: 0.2426, loss: 0.7409 2024-05-30 23:13:53,689 - mmdet - INFO - Epoch [5][6050/7330] lr: 1.000e-04, eta: 16:12:28, time: 1.124, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0464, loss_cls: 0.1883, acc: 93.0984, loss_bbox: 0.2370, loss_mask: 0.2391, loss: 0.7290 2024-05-30 23:14:55,763 - mmdet - INFO - Epoch [5][6100/7330] lr: 1.000e-04, eta: 16:11:42, time: 1.241, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0447, loss_cls: 0.1840, acc: 93.3264, loss_bbox: 0.2361, loss_mask: 0.2354, loss: 0.7171 2024-05-30 23:15:52,120 - mmdet - INFO - Epoch [5][6150/7330] lr: 1.000e-04, eta: 16:10:48, time: 1.127, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0472, loss_cls: 0.1904, acc: 92.9924, loss_bbox: 0.2368, loss_mask: 0.2381, loss: 0.7329 2024-05-30 23:16:46,085 - mmdet - INFO - Epoch [5][6200/7330] lr: 1.000e-04, eta: 16:09:50, time: 1.079, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0462, loss_cls: 0.1922, acc: 92.9944, loss_bbox: 0.2433, loss_mask: 0.2422, loss: 0.7426 2024-05-30 23:17:42,652 - mmdet - INFO - Epoch [5][6250/7330] lr: 1.000e-04, eta: 16:08:56, time: 1.131, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0478, loss_cls: 0.1989, acc: 92.8364, loss_bbox: 0.2434, loss_mask: 0.2373, loss: 0.7460 2024-05-30 23:18:36,484 - mmdet - INFO - Epoch [5][6300/7330] lr: 1.000e-04, eta: 16:07:58, time: 1.077, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0465, loss_cls: 0.1883, acc: 93.2505, loss_bbox: 0.2361, loss_mask: 0.2394, loss: 0.7285 2024-05-30 23:19:33,846 - mmdet - INFO - Epoch [5][6350/7330] lr: 1.000e-04, eta: 16:07:06, time: 1.147, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0475, loss_cls: 0.1948, acc: 92.8845, loss_bbox: 0.2418, loss_mask: 0.2392, loss: 0.7420 2024-05-30 23:20:28,352 - mmdet - INFO - Epoch [5][6400/7330] lr: 1.000e-04, eta: 16:06:09, time: 1.090, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0473, loss_cls: 0.1912, acc: 93.0896, loss_bbox: 0.2409, loss_mask: 0.2365, loss: 0.7351 2024-05-30 
23:21:22,588 - mmdet - INFO - Epoch [5][6450/7330] lr: 1.000e-04, eta: 16:05:12, time: 1.085, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0438, loss_cls: 0.1804, acc: 93.3428, loss_bbox: 0.2297, loss_mask: 0.2312, loss: 0.7042 2024-05-30 23:22:16,961 - mmdet - INFO - Epoch [5][6500/7330] lr: 1.000e-04, eta: 16:04:14, time: 1.087, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0460, loss_cls: 0.1871, acc: 93.1321, loss_bbox: 0.2371, loss_mask: 0.2394, loss: 0.7287 2024-05-30 23:23:15,635 - mmdet - INFO - Epoch [5][6550/7330] lr: 1.000e-04, eta: 16:03:24, time: 1.174, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0444, loss_cls: 0.1866, acc: 93.1912, loss_bbox: 0.2299, loss_mask: 0.2345, loss: 0.7140 2024-05-30 23:24:10,524 - mmdet - INFO - Epoch [5][6600/7330] lr: 1.000e-04, eta: 16:02:27, time: 1.098, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0208, loss_rpn_bbox: 0.0474, loss_cls: 0.1977, acc: 92.6772, loss_bbox: 0.2472, loss_mask: 0.2425, loss: 0.7556 2024-05-30 23:25:07,406 - mmdet - INFO - Epoch [5][6650/7330] lr: 1.000e-04, eta: 16:01:34, time: 1.138, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0495, loss_cls: 0.1951, acc: 92.8416, loss_bbox: 0.2401, loss_mask: 0.2359, loss: 0.7395 2024-05-30 23:26:08,553 - mmdet - INFO - Epoch [5][6700/7330] lr: 1.000e-04, eta: 16:00:46, time: 1.223, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0434, loss_cls: 0.1844, acc: 93.2688, loss_bbox: 0.2349, loss_mask: 0.2367, loss: 0.7181 2024-05-30 23:27:03,183 - mmdet - INFO - Epoch [5][6750/7330] lr: 1.000e-04, eta: 15:59:50, time: 1.093, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0216, loss_rpn_bbox: 0.0480, loss_cls: 0.1997, acc: 92.7415, loss_bbox: 0.2454, loss_mask: 0.2431, loss: 0.7579 2024-05-30 23:27:58,699 - mmdet - INFO - Epoch [5][6800/7330] lr: 1.000e-04, eta: 15:58:54, time: 1.110, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0499, loss_cls: 0.1944, acc: 92.8215, loss_bbox: 0.2467, loss_mask: 0.2468, loss: 0.7587 2024-05-30 23:28:54,523 - mmdet - INFO - Epoch [5][6850/7330] lr: 1.000e-04, eta: 15:57:59, time: 1.116, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0469, loss_cls: 0.1941, acc: 92.9507, loss_bbox: 0.2490, loss_mask: 0.2392, loss: 0.7484 2024-05-30 23:29:48,780 - mmdet - INFO - Epoch [5][6900/7330] lr: 1.000e-04, eta: 15:57:02, time: 1.085, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0452, loss_cls: 0.1884, acc: 93.0676, loss_bbox: 0.2375, loss_mask: 0.2386, loss: 0.7276 2024-05-30 23:30:46,756 - mmdet - INFO - Epoch [5][6950/7330] lr: 1.000e-04, eta: 15:56:10, time: 1.160, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0474, loss_cls: 0.1933, acc: 92.9988, loss_bbox: 0.2440, loss_mask: 0.2355, loss: 0.7387 2024-05-30 23:31:40,828 - mmdet - INFO - Epoch [5][7000/7330] lr: 1.000e-04, eta: 15:55:13, time: 1.081, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0445, loss_cls: 0.1868, acc: 93.2021, loss_bbox: 0.2324, loss_mask: 0.2383, loss: 0.7196 2024-05-30 23:32:34,751 - mmdet - INFO - Epoch [5][7050/7330] lr: 1.000e-04, eta: 15:54:15, time: 1.078, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0458, loss_cls: 0.1840, acc: 93.3547, loss_bbox: 0.2291, loss_mask: 0.2343, loss: 0.7117 2024-05-30 23:33:30,985 - mmdet - INFO - Epoch [5][7100/7330] lr: 1.000e-04, eta: 15:53:20, time: 1.125, 
data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0205, loss_rpn_bbox: 0.0440, loss_cls: 0.1888, acc: 93.1614, loss_bbox: 0.2338, loss_mask: 0.2369, loss: 0.7240
2024-05-30 23:34:27,789 - mmdet - INFO - Epoch [5][7150/7330] lr: 1.000e-04, eta: 15:52:27, time: 1.136, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0218, loss_rpn_bbox: 0.0502, loss_cls: 0.2067, acc: 92.5359, loss_bbox: 0.2516, loss_mask: 0.2496, loss: 0.7800
2024-05-30 23:35:25,952 - mmdet - INFO - Epoch [5][7200/7330] lr: 1.000e-04, eta: 15:51:35, time: 1.163, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0209, loss_rpn_bbox: 0.0495, loss_cls: 0.1888, acc: 93.0664, loss_bbox: 0.2429, loss_mask: 0.2448, loss: 0.7470
2024-05-30 23:36:22,434 - mmdet - INFO - Epoch [5][7250/7330] lr: 1.000e-04, eta: 15:50:41, time: 1.130, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0468, loss_cls: 0.1907, acc: 93.0320, loss_bbox: 0.2375, loss_mask: 0.2335, loss: 0.7266
2024-05-30 23:37:19,401 - mmdet - INFO - Epoch [5][7300/7330] lr: 1.000e-04, eta: 15:49:47, time: 1.139, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0194, loss_rpn_bbox: 0.0468, loss_cls: 0.1939, acc: 93.0178, loss_bbox: 0.2401, loss_mask: 0.2406, loss: 0.7408
2024-05-30 23:37:52,182 - mmdet - INFO - Saving checkpoint at 5 epochs
2024-05-30 23:40:16,527 - mmdet - INFO - Evaluating bbox...
2024-05-30 23:40:41,918 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.451
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.695
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.495
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.286
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.490
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.602
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.578
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.397
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.623
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.723
2024-05-30 23:40:41,918 - mmdet - INFO - Evaluating segm...
2024-05-30 23:41:07,915 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.411
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.661
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.438
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.203
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.449
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.618
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.528
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.528
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.528
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.330
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.576
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.702
2024-05-30 23:41:08,331 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-30 23:41:08,333 - mmdet - INFO - Epoch(val) [5][625] bbox_mAP: 0.4510, bbox_mAP_50: 0.6950, bbox_mAP_75: 0.4950, bbox_mAP_s: 0.2860, bbox_mAP_m: 0.4900, bbox_mAP_l: 0.6020, bbox_mAP_copypaste: 0.451 0.695 0.495 0.286 0.490 0.602, segm_mAP: 0.4110, segm_mAP_50: 0.6610, segm_mAP_75: 0.4380, segm_mAP_s: 0.2030, segm_mAP_m: 0.4490, segm_mAP_l: 0.6180, segm_mAP_copypaste: 0.411 0.661 0.438 0.203 0.449 0.618
2024-05-30 23:42:11,782 - mmdet - INFO - Epoch [6][50/7330] lr: 1.000e-04, eta: 15:47:43, time: 1.269, data_time: 0.140, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0433, loss_cls: 0.1795, acc: 93.3848, loss_bbox: 0.2274, loss_mask: 0.2308, loss: 0.6983
2024-05-30 23:43:06,983 - mmdet - INFO - Epoch [6][100/7330] lr: 1.000e-04, eta: 15:46:47, time: 1.104, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0467, loss_cls: 0.1823, acc: 93.1772, loss_bbox: 0.2350, loss_mask: 0.2308, loss: 0.7120
2024-05-30 23:44:01,614 - mmdet - INFO - Epoch [6][150/7330] lr: 1.000e-04, eta: 15:45:51, time: 1.093, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0433, loss_cls: 0.1773, acc: 93.4775, loss_bbox: 0.2278, loss_mask: 0.2273, loss: 0.6924
2024-05-30 23:44:55,881 - mmdet - INFO - Epoch [6][200/7330] lr: 1.000e-04, eta: 15:44:54, time: 1.085, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0446, loss_cls: 0.1818, acc: 93.3296, loss_bbox: 0.2316, loss_mask: 0.2322, loss: 0.7088
2024-05-30 23:45:49,254 - mmdet - INFO - Epoch [6][250/7330] lr: 1.000e-04, eta: 15:43:55, time: 1.067, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0428, loss_cls: 0.1723, acc: 93.6101, loss_bbox: 0.2270, loss_mask: 0.2321, loss: 0.6909
2024-05-30 23:46:43,003 - mmdet - INFO - Epoch [6][300/7330] lr: 1.000e-04, eta: 15:42:57, time: 1.075, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0470, loss_cls: 0.1870, acc: 93.1370, loss_bbox: 0.2365, loss_mask: 0.2363, loss: 0.7244
2024-05-30 23:47:42,423 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-30 23:47:42,423 - mmdet - INFO - Epoch [6][350/7330] lr: 1.000e-04, eta: 15:42:07, time: 1.188, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0476, loss_cls: 0.1840, acc: 93.2300, loss_bbox: 0.2347, loss_mask: 0.2355, loss: 0.7194
2024-05-30 23:48:38,698 - mmdet - INFO - Epoch [6][400/7330] lr: 1.000e-04, eta: 15:41:13, time: 1.126, data_time:
0.063, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0458, loss_cls: 0.1785, acc: 93.3982, loss_bbox: 0.2293, loss_mask: 0.2336, loss: 0.7042 2024-05-30 23:49:37,913 - mmdet - INFO - Epoch [6][450/7330] lr: 1.000e-04, eta: 15:40:23, time: 1.184, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0456, loss_cls: 0.1740, acc: 93.4961, loss_bbox: 0.2258, loss_mask: 0.2265, loss: 0.6882 2024-05-30 23:50:32,701 - mmdet - INFO - Epoch [6][500/7330] lr: 1.000e-04, eta: 15:39:26, time: 1.096, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0465, loss_cls: 0.1796, acc: 93.3794, loss_bbox: 0.2290, loss_mask: 0.2294, loss: 0.7024 2024-05-30 23:51:30,195 - mmdet - INFO - Epoch [6][550/7330] lr: 1.000e-04, eta: 15:38:34, time: 1.150, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0469, loss_cls: 0.1923, acc: 92.8879, loss_bbox: 0.2423, loss_mask: 0.2388, loss: 0.7380 2024-05-30 23:52:27,393 - mmdet - INFO - Epoch [6][600/7330] lr: 1.000e-04, eta: 15:37:41, time: 1.144, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0476, loss_cls: 0.1921, acc: 92.9773, loss_bbox: 0.2444, loss_mask: 0.2328, loss: 0.7350 2024-05-30 23:53:21,505 - mmdet - INFO - Epoch [6][650/7330] lr: 1.000e-04, eta: 15:36:43, time: 1.082, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0462, loss_cls: 0.1871, acc: 93.1282, loss_bbox: 0.2371, loss_mask: 0.2363, loss: 0.7256 2024-05-30 23:54:15,677 - mmdet - INFO - Epoch [6][700/7330] lr: 1.000e-04, eta: 15:35:46, time: 1.083, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0476, loss_cls: 0.1916, acc: 92.9019, loss_bbox: 0.2451, loss_mask: 0.2326, loss: 0.7348 2024-05-30 23:55:11,867 - mmdet - INFO - Epoch [6][750/7330] lr: 1.000e-04, eta: 15:34:51, time: 1.123, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0464, loss_cls: 0.1887, acc: 93.0405, loss_bbox: 0.2403, loss_mask: 0.2341, loss: 0.7264 2024-05-30 23:56:06,406 - mmdet - INFO - Epoch [6][800/7330] lr: 1.000e-04, eta: 15:33:55, time: 1.091, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0468, loss_cls: 0.1899, acc: 92.9409, loss_bbox: 0.2455, loss_mask: 0.2374, loss: 0.7375 2024-05-30 23:57:00,309 - mmdet - INFO - Epoch [6][850/7330] lr: 1.000e-04, eta: 15:32:57, time: 1.078, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0440, loss_cls: 0.1883, acc: 93.2219, loss_bbox: 0.2296, loss_mask: 0.2346, loss: 0.7128 2024-05-30 23:57:57,693 - mmdet - INFO - Epoch [6][900/7330] lr: 1.000e-04, eta: 15:32:04, time: 1.148, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0424, loss_cls: 0.1741, acc: 93.4729, loss_bbox: 0.2248, loss_mask: 0.2334, loss: 0.6902 2024-05-30 23:58:54,449 - mmdet - INFO - Epoch [6][950/7330] lr: 1.000e-04, eta: 15:31:11, time: 1.135, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0456, loss_cls: 0.1862, acc: 93.1409, loss_bbox: 0.2346, loss_mask: 0.2314, loss: 0.7148 2024-05-30 23:59:50,233 - mmdet - INFO - Epoch [6][1000/7330] lr: 1.000e-04, eta: 15:30:15, time: 1.116, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0439, loss_cls: 0.1800, acc: 93.3416, loss_bbox: 0.2297, loss_mask: 0.2309, loss: 0.7008 2024-05-31 00:00:44,913 - mmdet - INFO - Epoch [6][1050/7330] lr: 1.000e-04, eta: 15:29:19, time: 1.094, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0197, loss_rpn_bbox: 0.0492, loss_cls: 0.1886, acc: 93.0662, 
loss_bbox: 0.2423, loss_mask: 0.2406, loss: 0.7403 2024-05-31 00:01:44,703 - mmdet - INFO - Epoch [6][1100/7330] lr: 1.000e-04, eta: 15:28:29, time: 1.196, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0438, loss_cls: 0.1798, acc: 93.4463, loss_bbox: 0.2260, loss_mask: 0.2315, loss: 0.6986 2024-05-31 00:02:40,828 - mmdet - INFO - Epoch [6][1150/7330] lr: 1.000e-04, eta: 15:27:35, time: 1.123, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0439, loss_cls: 0.1810, acc: 93.3625, loss_bbox: 0.2328, loss_mask: 0.2290, loss: 0.7022 2024-05-31 00:03:35,192 - mmdet - INFO - Epoch [6][1200/7330] lr: 1.000e-04, eta: 15:26:38, time: 1.087, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0445, loss_cls: 0.1787, acc: 93.3577, loss_bbox: 0.2257, loss_mask: 0.2308, loss: 0.6972 2024-05-31 00:04:29,784 - mmdet - INFO - Epoch [6][1250/7330] lr: 1.000e-04, eta: 15:25:41, time: 1.092, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0481, loss_cls: 0.1813, acc: 93.3062, loss_bbox: 0.2356, loss_mask: 0.2315, loss: 0.7145 2024-05-31 00:05:23,975 - mmdet - INFO - Epoch [6][1300/7330] lr: 1.000e-04, eta: 15:24:44, time: 1.084, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0450, loss_cls: 0.1837, acc: 93.2136, loss_bbox: 0.2348, loss_mask: 0.2354, loss: 0.7161 2024-05-31 00:06:20,504 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 00:06:20,504 - mmdet - INFO - Epoch [6][1350/7330] lr: 1.000e-04, eta: 15:23:50, time: 1.131, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0434, loss_cls: 0.1845, acc: 93.2036, loss_bbox: 0.2355, loss_mask: 0.2360, loss: 0.7171 2024-05-31 00:07:14,672 - mmdet - INFO - Epoch [6][1400/7330] lr: 1.000e-04, eta: 15:22:53, time: 1.083, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0422, loss_cls: 0.1764, acc: 93.4370, loss_bbox: 0.2243, loss_mask: 0.2297, loss: 0.6903 2024-05-31 00:08:15,350 - mmdet - INFO - Epoch [6][1450/7330] lr: 1.000e-04, eta: 15:22:04, time: 1.214, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0473, loss_cls: 0.1841, acc: 93.2319, loss_bbox: 0.2355, loss_mask: 0.2324, loss: 0.7180 2024-05-31 00:09:09,172 - mmdet - INFO - Epoch [6][1500/7330] lr: 1.000e-04, eta: 15:21:06, time: 1.076, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0462, loss_cls: 0.1822, acc: 93.2856, loss_bbox: 0.2315, loss_mask: 0.2309, loss: 0.7088 2024-05-31 00:10:05,969 - mmdet - INFO - Epoch [6][1550/7330] lr: 1.000e-04, eta: 15:20:12, time: 1.136, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0467, loss_cls: 0.1860, acc: 93.0496, loss_bbox: 0.2374, loss_mask: 0.2422, loss: 0.7292 2024-05-31 00:11:00,095 - mmdet - INFO - Epoch [6][1600/7330] lr: 1.000e-04, eta: 15:19:15, time: 1.082, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0442, loss_cls: 0.1867, acc: 93.0742, loss_bbox: 0.2356, loss_mask: 0.2295, loss: 0.7119 2024-05-31 00:11:54,744 - mmdet - INFO - Epoch [6][1650/7330] lr: 1.000e-04, eta: 15:18:19, time: 1.093, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0463, loss_cls: 0.1894, acc: 93.0247, loss_bbox: 0.2391, loss_mask: 0.2377, loss: 0.7303 2024-05-31 00:12:56,317 - mmdet - INFO - Epoch [6][1700/7330] lr: 1.000e-04, eta: 15:17:31, time: 1.231, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0440, loss_cls: 0.1826, 
acc: 93.2461, loss_bbox: 0.2371, loss_mask: 0.2343, loss: 0.7147 2024-05-31 00:13:50,404 - mmdet - INFO - Epoch [6][1750/7330] lr: 1.000e-04, eta: 15:16:34, time: 1.082, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0436, loss_cls: 0.1770, acc: 93.3713, loss_bbox: 0.2306, loss_mask: 0.2350, loss: 0.7021 2024-05-31 00:14:44,824 - mmdet - INFO - Epoch [6][1800/7330] lr: 1.000e-04, eta: 15:15:37, time: 1.088, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0479, loss_cls: 0.1858, acc: 93.1392, loss_bbox: 0.2373, loss_mask: 0.2399, loss: 0.7296 2024-05-31 00:15:38,940 - mmdet - INFO - Epoch [6][1850/7330] lr: 1.000e-04, eta: 15:14:40, time: 1.082, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0457, loss_cls: 0.1844, acc: 93.2073, loss_bbox: 0.2395, loss_mask: 0.2356, loss: 0.7225 2024-05-31 00:16:33,170 - mmdet - INFO - Epoch [6][1900/7330] lr: 1.000e-04, eta: 15:13:43, time: 1.085, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0454, loss_cls: 0.1818, acc: 93.2720, loss_bbox: 0.2314, loss_mask: 0.2369, loss: 0.7139 2024-05-31 00:17:28,480 - mmdet - INFO - Epoch [6][1950/7330] lr: 1.000e-04, eta: 15:12:47, time: 1.106, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0455, loss_cls: 0.1838, acc: 93.1753, loss_bbox: 0.2368, loss_mask: 0.2299, loss: 0.7151 2024-05-31 00:18:29,934 - mmdet - INFO - Epoch [6][2000/7330] lr: 1.000e-04, eta: 15:11:59, time: 1.229, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0443, loss_cls: 0.1819, acc: 93.3276, loss_bbox: 0.2255, loss_mask: 0.2292, loss: 0.6969 2024-05-31 00:19:24,228 - mmdet - INFO - Epoch [6][2050/7330] lr: 1.000e-04, eta: 15:11:02, time: 1.086, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0456, loss_cls: 0.1841, acc: 93.2124, loss_bbox: 0.2356, loss_mask: 0.2345, loss: 0.7182 2024-05-31 00:20:23,737 - mmdet - INFO - Epoch [6][2100/7330] lr: 1.000e-04, eta: 15:10:12, time: 1.190, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0481, loss_cls: 0.1925, acc: 92.8843, loss_bbox: 0.2470, loss_mask: 0.2363, loss: 0.7441 2024-05-31 00:21:18,541 - mmdet - INFO - Epoch [6][2150/7330] lr: 1.000e-04, eta: 15:09:15, time: 1.096, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0456, loss_cls: 0.1870, acc: 93.0386, loss_bbox: 0.2369, loss_mask: 0.2347, loss: 0.7213 2024-05-31 00:22:12,531 - mmdet - INFO - Epoch [6][2200/7330] lr: 1.000e-04, eta: 15:08:18, time: 1.080, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0457, loss_cls: 0.1831, acc: 93.2856, loss_bbox: 0.2362, loss_mask: 0.2342, loss: 0.7169 2024-05-31 00:23:11,549 - mmdet - INFO - Epoch [6][2250/7330] lr: 1.000e-04, eta: 15:07:27, time: 1.180, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0453, loss_cls: 0.1812, acc: 93.2903, loss_bbox: 0.2318, loss_mask: 0.2310, loss: 0.7077 2024-05-31 00:24:05,354 - mmdet - INFO - Epoch [6][2300/7330] lr: 1.000e-04, eta: 15:06:29, time: 1.076, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0432, loss_cls: 0.1742, acc: 93.6008, loss_bbox: 0.2262, loss_mask: 0.2264, loss: 0.6866 2024-05-31 00:25:02,166 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 00:25:02,167 - mmdet - INFO - Epoch [6][2350/7330] lr: 1.000e-04, eta: 15:05:36, time: 1.136, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0463, 
loss_cls: 0.1891, acc: 92.9929, loss_bbox: 0.2403, loss_mask: 0.2371, loss: 0.7317 2024-05-31 00:25:55,827 - mmdet - INFO - Epoch [6][2400/7330] lr: 1.000e-04, eta: 15:04:38, time: 1.073, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0432, loss_cls: 0.1784, acc: 93.3513, loss_bbox: 0.2308, loss_mask: 0.2288, loss: 0.6974 2024-05-31 00:26:49,742 - mmdet - INFO - Epoch [6][2450/7330] lr: 1.000e-04, eta: 15:03:40, time: 1.078, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0437, loss_cls: 0.1777, acc: 93.3967, loss_bbox: 0.2274, loss_mask: 0.2350, loss: 0.7002 2024-05-31 00:27:43,879 - mmdet - INFO - Epoch [6][2500/7330] lr: 1.000e-04, eta: 15:02:43, time: 1.083, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0467, loss_cls: 0.1862, acc: 93.1548, loss_bbox: 0.2377, loss_mask: 0.2359, loss: 0.7246 2024-05-31 00:28:42,548 - mmdet - INFO - Epoch [6][2550/7330] lr: 1.000e-04, eta: 15:01:52, time: 1.173, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0472, loss_cls: 0.1865, acc: 93.0278, loss_bbox: 0.2379, loss_mask: 0.2346, loss: 0.7236 2024-05-31 00:29:40,855 - mmdet - INFO - Epoch [6][2600/7330] lr: 1.000e-04, eta: 15:01:00, time: 1.166, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0419, loss_cls: 0.1735, acc: 93.4631, loss_bbox: 0.2258, loss_mask: 0.2297, loss: 0.6866 2024-05-31 00:30:39,059 - mmdet - INFO - Epoch [6][2650/7330] lr: 1.000e-04, eta: 15:00:07, time: 1.164, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0446, loss_cls: 0.1836, acc: 93.2600, loss_bbox: 0.2317, loss_mask: 0.2361, loss: 0.7125 2024-05-31 00:31:33,584 - mmdet - INFO - Epoch [6][2700/7330] lr: 1.000e-04, eta: 14:59:11, time: 1.090, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0458, loss_cls: 0.1836, acc: 93.1619, loss_bbox: 0.2327, loss_mask: 0.2310, loss: 0.7116 2024-05-31 00:32:30,171 - mmdet - INFO - Epoch [6][2750/7330] lr: 1.000e-04, eta: 14:58:17, time: 1.132, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0488, loss_cls: 0.1872, acc: 93.0886, loss_bbox: 0.2405, loss_mask: 0.2378, loss: 0.7326 2024-05-31 00:33:26,537 - mmdet - INFO - Epoch [6][2800/7330] lr: 1.000e-04, eta: 14:57:22, time: 1.127, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0195, loss_rpn_bbox: 0.0469, loss_cls: 0.1880, acc: 93.0283, loss_bbox: 0.2364, loss_mask: 0.2389, loss: 0.7296 2024-05-31 00:34:24,183 - mmdet - INFO - Epoch [6][2850/7330] lr: 1.000e-04, eta: 14:56:29, time: 1.153, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0466, loss_cls: 0.1858, acc: 93.1055, loss_bbox: 0.2369, loss_mask: 0.2332, loss: 0.7210 2024-05-31 00:35:18,413 - mmdet - INFO - Epoch [6][2900/7330] lr: 1.000e-04, eta: 14:55:32, time: 1.085, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0414, loss_cls: 0.1798, acc: 93.3027, loss_bbox: 0.2271, loss_mask: 0.2303, loss: 0.6955 2024-05-31 00:36:12,208 - mmdet - INFO - Epoch [6][2950/7330] lr: 1.000e-04, eta: 14:54:35, time: 1.076, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0435, loss_cls: 0.1861, acc: 93.0823, loss_bbox: 0.2343, loss_mask: 0.2293, loss: 0.7100 2024-05-31 00:37:05,951 - mmdet - INFO - Epoch [6][3000/7330] lr: 1.000e-04, eta: 14:53:37, time: 1.075, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0417, loss_cls: 0.1806, acc: 93.2454, loss_bbox: 0.2245, loss_mask: 0.2271, loss: 0.6905 2024-05-31 
00:38:00,292 - mmdet - INFO - Epoch [6][3050/7330] lr: 1.000e-04, eta: 14:52:40, time: 1.087, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0426, loss_cls: 0.1774, acc: 93.5056, loss_bbox: 0.2260, loss_mask: 0.2231, loss: 0.6840 2024-05-31 00:39:00,186 - mmdet - INFO - Epoch [6][3100/7330] lr: 1.000e-04, eta: 14:51:50, time: 1.198, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0457, loss_cls: 0.1812, acc: 93.2141, loss_bbox: 0.2317, loss_mask: 0.2365, loss: 0.7128 2024-05-31 00:39:57,258 - mmdet - INFO - Epoch [6][3150/7330] lr: 1.000e-04, eta: 14:50:56, time: 1.141, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0459, loss_cls: 0.1819, acc: 93.2783, loss_bbox: 0.2314, loss_mask: 0.2324, loss: 0.7081 2024-05-31 00:40:54,622 - mmdet - INFO - Epoch [6][3200/7330] lr: 1.000e-04, eta: 14:50:03, time: 1.147, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0202, loss_rpn_bbox: 0.0461, loss_cls: 0.1872, acc: 93.1301, loss_bbox: 0.2400, loss_mask: 0.2368, loss: 0.7303 2024-05-31 00:41:51,727 - mmdet - INFO - Epoch [6][3250/7330] lr: 1.000e-04, eta: 14:49:10, time: 1.142, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0488, loss_cls: 0.1826, acc: 93.1436, loss_bbox: 0.2391, loss_mask: 0.2360, loss: 0.7232 2024-05-31 00:42:45,834 - mmdet - INFO - Epoch [6][3300/7330] lr: 1.000e-04, eta: 14:48:12, time: 1.082, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0453, loss_cls: 0.1796, acc: 93.4221, loss_bbox: 0.2294, loss_mask: 0.2340, loss: 0.7059 2024-05-31 00:43:44,864 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 00:43:44,864 - mmdet - INFO - Epoch [6][3350/7330] lr: 1.000e-04, eta: 14:47:21, time: 1.181, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0445, loss_cls: 0.1739, acc: 93.6382, loss_bbox: 0.2221, loss_mask: 0.2283, loss: 0.6844 2024-05-31 00:44:39,304 - mmdet - INFO - Epoch [6][3400/7330] lr: 1.000e-04, eta: 14:46:24, time: 1.089, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0452, loss_cls: 0.1837, acc: 93.0813, loss_bbox: 0.2380, loss_mask: 0.2340, loss: 0.7164 2024-05-31 00:45:36,008 - mmdet - INFO - Epoch [6][3450/7330] lr: 1.000e-04, eta: 14:45:30, time: 1.134, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0456, loss_cls: 0.1807, acc: 93.2874, loss_bbox: 0.2406, loss_mask: 0.2390, loss: 0.7245 2024-05-31 00:46:29,906 - mmdet - INFO - Epoch [6][3500/7330] lr: 1.000e-04, eta: 14:44:33, time: 1.078, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0459, loss_cls: 0.1837, acc: 93.2378, loss_bbox: 0.2337, loss_mask: 0.2354, loss: 0.7156 2024-05-31 00:47:23,779 - mmdet - INFO - Epoch [6][3550/7330] lr: 1.000e-04, eta: 14:43:35, time: 1.077, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0443, loss_cls: 0.1766, acc: 93.4041, loss_bbox: 0.2306, loss_mask: 0.2291, loss: 0.6975 2024-05-31 00:48:17,806 - mmdet - INFO - Epoch [6][3600/7330] lr: 1.000e-04, eta: 14:42:38, time: 1.081, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0448, loss_cls: 0.1876, acc: 93.1589, loss_bbox: 0.2323, loss_mask: 0.2345, loss: 0.7166 2024-05-31 00:49:17,111 - mmdet - INFO - Epoch [6][3650/7330] lr: 1.000e-04, eta: 14:41:47, time: 1.186, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0439, loss_cls: 0.1799, acc: 93.4282, loss_bbox: 0.2279, loss_mask: 0.2326, loss: 
0.7011 2024-05-31 00:50:13,431 - mmdet - INFO - Epoch [6][3700/7330] lr: 1.000e-04, eta: 14:40:53, time: 1.126, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0461, loss_cls: 0.1778, acc: 93.4956, loss_bbox: 0.2282, loss_mask: 0.2326, loss: 0.7016 2024-05-31 00:51:12,022 - mmdet - INFO - Epoch [6][3750/7330] lr: 1.000e-04, eta: 14:40:01, time: 1.172, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0426, loss_cls: 0.1770, acc: 93.4851, loss_bbox: 0.2227, loss_mask: 0.2283, loss: 0.6873 2024-05-31 00:52:05,555 - mmdet - INFO - Epoch [6][3800/7330] lr: 1.000e-04, eta: 14:39:03, time: 1.070, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0422, loss_cls: 0.1763, acc: 93.4795, loss_bbox: 0.2269, loss_mask: 0.2337, loss: 0.6940 2024-05-31 00:53:00,272 - mmdet - INFO - Epoch [6][3850/7330] lr: 1.000e-04, eta: 14:38:06, time: 1.095, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0458, loss_cls: 0.1901, acc: 92.9519, loss_bbox: 0.2406, loss_mask: 0.2366, loss: 0.7312 2024-05-31 00:53:59,418 - mmdet - INFO - Epoch [6][3900/7330] lr: 1.000e-04, eta: 14:37:15, time: 1.183, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0458, loss_cls: 0.1808, acc: 93.3875, loss_bbox: 0.2289, loss_mask: 0.2371, loss: 0.7130 2024-05-31 00:54:53,335 - mmdet - INFO - Epoch [6][3950/7330] lr: 1.000e-04, eta: 14:36:18, time: 1.078, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0428, loss_cls: 0.1719, acc: 93.7239, loss_bbox: 0.2230, loss_mask: 0.2328, loss: 0.6870 2024-05-31 00:55:47,649 - mmdet - INFO - Epoch [6][4000/7330] lr: 1.000e-04, eta: 14:35:21, time: 1.086, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0462, loss_cls: 0.1814, acc: 93.2517, loss_bbox: 0.2359, loss_mask: 0.2374, loss: 0.7185 2024-05-31 00:56:41,938 - mmdet - INFO - Epoch [6][4050/7330] lr: 1.000e-04, eta: 14:34:24, time: 1.086, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0199, loss_rpn_bbox: 0.0466, loss_cls: 0.1853, acc: 93.2256, loss_bbox: 0.2365, loss_mask: 0.2340, loss: 0.7222 2024-05-31 00:57:38,551 - mmdet - INFO - Epoch [6][4100/7330] lr: 1.000e-04, eta: 14:33:30, time: 1.132, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0465, loss_cls: 0.1856, acc: 93.1396, loss_bbox: 0.2403, loss_mask: 0.2359, loss: 0.7262 2024-05-31 00:58:32,762 - mmdet - INFO - Epoch [6][4150/7330] lr: 1.000e-04, eta: 14:32:33, time: 1.084, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0436, loss_cls: 0.1763, acc: 93.4778, loss_bbox: 0.2300, loss_mask: 0.2280, loss: 0.6940 2024-05-31 00:59:31,754 - mmdet - INFO - Epoch [6][4200/7330] lr: 1.000e-04, eta: 14:31:41, time: 1.180, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0439, loss_cls: 0.1781, acc: 93.4866, loss_bbox: 0.2253, loss_mask: 0.2344, loss: 0.6995 2024-05-31 01:00:31,096 - mmdet - INFO - Epoch [6][4250/7330] lr: 1.000e-04, eta: 14:30:50, time: 1.187, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0465, loss_cls: 0.1871, acc: 93.0757, loss_bbox: 0.2366, loss_mask: 0.2371, loss: 0.7248 2024-05-31 01:01:25,947 - mmdet - INFO - Epoch [6][4300/7330] lr: 1.000e-04, eta: 14:29:54, time: 1.097, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0473, loss_cls: 0.1853, acc: 93.0669, loss_bbox: 0.2324, loss_mask: 0.2315, loss: 0.7139 2024-05-31 01:02:23,187 - mmdet - INFO - Exp name: 
mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 01:02:23,188 - mmdet - INFO - Epoch [6][4350/7330] lr: 1.000e-04, eta: 14:29:00, time: 1.145, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0467, loss_cls: 0.1927, acc: 92.8259, loss_bbox: 0.2442, loss_mask: 0.2385, loss: 0.7397 2024-05-31 01:03:19,404 - mmdet - INFO - Epoch [6][4400/7330] lr: 1.000e-04, eta: 14:28:06, time: 1.124, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0454, loss_cls: 0.1877, acc: 93.0869, loss_bbox: 0.2370, loss_mask: 0.2370, loss: 0.7250 2024-05-31 01:04:16,264 - mmdet - INFO - Epoch [6][4450/7330] lr: 1.000e-04, eta: 14:27:12, time: 1.137, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0451, loss_cls: 0.1864, acc: 93.0486, loss_bbox: 0.2368, loss_mask: 0.2384, loss: 0.7248 2024-05-31 01:05:10,774 - mmdet - INFO - Epoch [6][4500/7330] lr: 1.000e-04, eta: 14:26:15, time: 1.090, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0442, loss_cls: 0.1764, acc: 93.5195, loss_bbox: 0.2268, loss_mask: 0.2328, loss: 0.6980 2024-05-31 01:06:05,572 - mmdet - INFO - Epoch [6][4550/7330] lr: 1.000e-04, eta: 14:25:19, time: 1.096, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0472, loss_cls: 0.1835, acc: 93.2224, loss_bbox: 0.2340, loss_mask: 0.2343, loss: 0.7167 2024-05-31 01:07:00,010 - mmdet - INFO - Epoch [6][4600/7330] lr: 1.000e-04, eta: 14:24:22, time: 1.089, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0473, loss_cls: 0.1882, acc: 93.1353, loss_bbox: 0.2392, loss_mask: 0.2408, loss: 0.7338 2024-05-31 01:07:55,456 - mmdet - INFO - Epoch [6][4650/7330] lr: 1.000e-04, eta: 14:23:26, time: 1.109, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0500, loss_cls: 0.1982, acc: 92.7280, loss_bbox: 0.2448, loss_mask: 0.2439, loss: 0.7561 2024-05-31 01:08:54,059 - mmdet - INFO - Epoch [6][4700/7330] lr: 1.000e-04, eta: 14:22:34, time: 1.172, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0434, loss_cls: 0.1919, acc: 93.0291, loss_bbox: 0.2348, loss_mask: 0.2323, loss: 0.7196 2024-05-31 01:09:52,035 - mmdet - INFO - Epoch [6][4750/7330] lr: 1.000e-04, eta: 14:21:42, time: 1.160, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0196, loss_rpn_bbox: 0.0470, loss_cls: 0.1911, acc: 92.9014, loss_bbox: 0.2398, loss_mask: 0.2372, loss: 0.7346 2024-05-31 01:10:51,277 - mmdet - INFO - Epoch [6][4800/7330] lr: 1.000e-04, eta: 14:20:50, time: 1.185, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0466, loss_cls: 0.1795, acc: 93.4329, loss_bbox: 0.2293, loss_mask: 0.2285, loss: 0.7007 2024-05-31 01:11:45,701 - mmdet - INFO - Epoch [6][4850/7330] lr: 1.000e-04, eta: 14:19:53, time: 1.088, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0442, loss_cls: 0.1855, acc: 93.1721, loss_bbox: 0.2306, loss_mask: 0.2291, loss: 0.7065 2024-05-31 01:12:40,288 - mmdet - INFO - Epoch [6][4900/7330] lr: 1.000e-04, eta: 14:18:57, time: 1.092, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0464, loss_cls: 0.1824, acc: 93.3452, loss_bbox: 0.2306, loss_mask: 0.2307, loss: 0.7077 2024-05-31 01:13:35,739 - mmdet - INFO - Epoch [6][4950/7330] lr: 1.000e-04, eta: 14:18:01, time: 1.109, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0411, loss_cls: 0.1765, acc: 93.4265, loss_bbox: 0.2270, loss_mask: 0.2278, loss: 0.6876 2024-05-31 01:14:34,586 - mmdet - 
INFO - Epoch [6][5000/7330] lr: 1.000e-04, eta: 14:17:09, time: 1.177, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0432, loss_cls: 0.1853, acc: 93.1187, loss_bbox: 0.2331, loss_mask: 0.2325, loss: 0.7108 2024-05-31 01:15:29,369 - mmdet - INFO - Epoch [6][5050/7330] lr: 1.000e-04, eta: 14:16:13, time: 1.096, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0186, loss_rpn_bbox: 0.0476, loss_cls: 0.1892, acc: 93.0337, loss_bbox: 0.2409, loss_mask: 0.2354, loss: 0.7317 2024-05-31 01:16:23,036 - mmdet - INFO - Epoch [6][5100/7330] lr: 1.000e-04, eta: 14:15:16, time: 1.073, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0464, loss_cls: 0.1899, acc: 92.9224, loss_bbox: 0.2421, loss_mask: 0.2347, loss: 0.7304 2024-05-31 01:17:17,792 - mmdet - INFO - Epoch [6][5150/7330] lr: 1.000e-04, eta: 14:14:19, time: 1.095, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0431, loss_cls: 0.1792, acc: 93.3713, loss_bbox: 0.2294, loss_mask: 0.2307, loss: 0.7000 2024-05-31 01:18:11,960 - mmdet - INFO - Epoch [6][5200/7330] lr: 1.000e-04, eta: 14:13:22, time: 1.083, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0448, loss_cls: 0.1813, acc: 93.3157, loss_bbox: 0.2274, loss_mask: 0.2343, loss: 0.7050 2024-05-31 01:19:10,572 - mmdet - INFO - Epoch [6][5250/7330] lr: 1.000e-04, eta: 14:12:30, time: 1.172, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0421, loss_cls: 0.1820, acc: 93.2483, loss_bbox: 0.2280, loss_mask: 0.2339, loss: 0.7019 2024-05-31 01:20:10,345 - mmdet - INFO - Epoch [6][5300/7330] lr: 1.000e-04, eta: 14:11:39, time: 1.196, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0450, loss_cls: 0.1746, acc: 93.4768, loss_bbox: 0.2256, loss_mask: 0.2310, loss: 0.6931 2024-05-31 01:21:07,056 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 01:21:07,056 - mmdet - INFO - Epoch [6][5350/7330] lr: 1.000e-04, eta: 14:10:45, time: 1.134, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0429, loss_cls: 0.1779, acc: 93.5369, loss_bbox: 0.2197, loss_mask: 0.2278, loss: 0.6844 2024-05-31 01:22:01,990 - mmdet - INFO - Epoch [6][5400/7330] lr: 1.000e-04, eta: 14:09:49, time: 1.099, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0487, loss_cls: 0.1917, acc: 92.9385, loss_bbox: 0.2450, loss_mask: 0.2410, loss: 0.7448 2024-05-31 01:22:58,894 - mmdet - INFO - Epoch [6][5450/7330] lr: 1.000e-04, eta: 14:08:55, time: 1.138, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0462, loss_cls: 0.1918, acc: 92.9646, loss_bbox: 0.2409, loss_mask: 0.2375, loss: 0.7340 2024-05-31 01:23:53,414 - mmdet - INFO - Epoch [6][5500/7330] lr: 1.000e-04, eta: 14:07:58, time: 1.090, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0480, loss_cls: 0.1837, acc: 93.2546, loss_bbox: 0.2380, loss_mask: 0.2373, loss: 0.7251 2024-05-31 01:24:49,858 - mmdet - INFO - Epoch [6][5550/7330] lr: 1.000e-04, eta: 14:07:04, time: 1.129, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0435, loss_cls: 0.1828, acc: 93.3606, loss_bbox: 0.2326, loss_mask: 0.2327, loss: 0.7083 2024-05-31 01:25:46,443 - mmdet - INFO - Epoch [6][5600/7330] lr: 1.000e-04, eta: 14:06:09, time: 1.132, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0479, loss_cls: 0.1946, acc: 92.9211, loss_bbox: 0.2435, loss_mask: 0.2370, loss: 0.7434 2024-05-31 
01:26:40,769 - mmdet - INFO - Epoch [6][5650/7330] lr: 1.000e-04, eta: 14:05:12, time: 1.087, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0460, loss_cls: 0.1852, acc: 93.1902, loss_bbox: 0.2316, loss_mask: 0.2349, loss: 0.7154 2024-05-31 01:27:34,974 - mmdet - INFO - Epoch [6][5700/7330] lr: 1.000e-04, eta: 14:04:15, time: 1.084, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0462, loss_cls: 0.1860, acc: 93.1819, loss_bbox: 0.2393, loss_mask: 0.2408, loss: 0.7299 2024-05-31 01:28:28,977 - mmdet - INFO - Epoch [6][5750/7330] lr: 1.000e-04, eta: 14:03:18, time: 1.080, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0439, loss_cls: 0.1841, acc: 93.2068, loss_bbox: 0.2275, loss_mask: 0.2290, loss: 0.7007 2024-05-31 01:29:28,638 - mmdet - INFO - Epoch [6][5800/7330] lr: 1.000e-04, eta: 14:02:27, time: 1.193, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0437, loss_cls: 0.1807, acc: 93.5000, loss_bbox: 0.2249, loss_mask: 0.2250, loss: 0.6907 2024-05-31 01:30:27,192 - mmdet - INFO - Epoch [6][5850/7330] lr: 1.000e-04, eta: 14:01:35, time: 1.171, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0463, loss_cls: 0.1860, acc: 93.2495, loss_bbox: 0.2350, loss_mask: 0.2336, loss: 0.7183 2024-05-31 01:31:23,983 - mmdet - INFO - Epoch [6][5900/7330] lr: 1.000e-04, eta: 14:00:41, time: 1.136, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0433, loss_cls: 0.1731, acc: 93.6895, loss_bbox: 0.2233, loss_mask: 0.2315, loss: 0.6881 2024-05-31 01:32:18,071 - mmdet - INFO - Epoch [6][5950/7330] lr: 1.000e-04, eta: 13:59:43, time: 1.082, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0454, loss_cls: 0.1829, acc: 93.2332, loss_bbox: 0.2332, loss_mask: 0.2355, loss: 0.7145 2024-05-31 01:33:14,393 - mmdet - INFO - Epoch [6][6000/7330] lr: 1.000e-04, eta: 13:58:49, time: 1.126, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0453, loss_cls: 0.1808, acc: 93.3230, loss_bbox: 0.2300, loss_mask: 0.2290, loss: 0.7027 2024-05-31 01:34:08,329 - mmdet - INFO - Epoch [6][6050/7330] lr: 1.000e-04, eta: 13:57:51, time: 1.079, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0470, loss_cls: 0.1833, acc: 93.2039, loss_bbox: 0.2352, loss_mask: 0.2318, loss: 0.7149 2024-05-31 01:35:06,448 - mmdet - INFO - Epoch [6][6100/7330] lr: 1.000e-04, eta: 13:56:59, time: 1.162, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0467, loss_cls: 0.1838, acc: 93.2454, loss_bbox: 0.2300, loss_mask: 0.2336, loss: 0.7120 2024-05-31 01:36:00,456 - mmdet - INFO - Epoch [6][6150/7330] lr: 1.000e-04, eta: 13:56:02, time: 1.080, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0424, loss_cls: 0.1747, acc: 93.6021, loss_bbox: 0.2248, loss_mask: 0.2287, loss: 0.6855 2024-05-31 01:36:54,729 - mmdet - INFO - Epoch [6][6200/7330] lr: 1.000e-04, eta: 13:55:05, time: 1.085, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0184, loss_rpn_bbox: 0.0441, loss_cls: 0.1835, acc: 93.1482, loss_bbox: 0.2362, loss_mask: 0.2369, loss: 0.7191 2024-05-31 01:37:48,769 - mmdet - INFO - Epoch [6][6250/7330] lr: 1.000e-04, eta: 13:54:08, time: 1.081, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0435, loss_cls: 0.1849, acc: 93.2646, loss_bbox: 0.2231, loss_mask: 0.2337, loss: 0.7028 2024-05-31 01:38:43,463 - mmdet - INFO - Epoch [6][6300/7330] lr: 1.000e-04, eta: 13:53:11, time: 1.094, 
data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0191, loss_rpn_bbox: 0.0472, loss_cls: 0.1913, acc: 92.9934, loss_bbox: 0.2399, loss_mask: 0.2343, loss: 0.7317 2024-05-31 01:39:46,047 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 01:39:46,047 - mmdet - INFO - Epoch [6][6350/7330] lr: 1.000e-04, eta: 13:52:23, time: 1.252, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0470, loss_cls: 0.1859, acc: 93.1516, loss_bbox: 0.2404, loss_mask: 0.2308, loss: 0.7208 2024-05-31 01:40:42,678 - mmdet - INFO - Epoch [6][6400/7330] lr: 1.000e-04, eta: 13:51:29, time: 1.133, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0204, loss_rpn_bbox: 0.0495, loss_cls: 0.1963, acc: 92.8613, loss_bbox: 0.2380, loss_mask: 0.2370, loss: 0.7412 2024-05-31 01:41:39,063 - mmdet - INFO - Epoch [6][6450/7330] lr: 1.000e-04, eta: 13:50:34, time: 1.128, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0189, loss_rpn_bbox: 0.0431, loss_cls: 0.1899, acc: 93.0298, loss_bbox: 0.2339, loss_mask: 0.2327, loss: 0.7186 2024-05-31 01:42:35,120 - mmdet - INFO - Epoch [6][6500/7330] lr: 1.000e-04, eta: 13:49:39, time: 1.121, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0448, loss_cls: 0.1828, acc: 93.2593, loss_bbox: 0.2318, loss_mask: 0.2258, loss: 0.7015 2024-05-31 01:43:30,217 - mmdet - INFO - Epoch [6][6550/7330] lr: 1.000e-04, eta: 13:48:43, time: 1.102, data_time: 0.047, memory: 24290, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0453, loss_cls: 0.1911, acc: 92.9197, loss_bbox: 0.2438, loss_mask: 0.2356, loss: 0.7337 2024-05-31 01:44:24,458 - mmdet - INFO - Epoch [6][6600/7330] lr: 1.000e-04, eta: 13:47:46, time: 1.085, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0448, loss_cls: 0.1809, acc: 93.2253, loss_bbox: 0.2303, loss_mask: 0.2318, loss: 0.7048 2024-05-31 01:45:23,510 - mmdet - INFO - Epoch [6][6650/7330] lr: 1.000e-04, eta: 13:46:54, time: 1.181, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0435, loss_cls: 0.1816, acc: 93.3284, loss_bbox: 0.2299, loss_mask: 0.2318, loss: 0.7053 2024-05-31 01:46:18,407 - mmdet - INFO - Epoch [6][6700/7330] lr: 1.000e-04, eta: 13:45:58, time: 1.098, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0435, loss_cls: 0.1755, acc: 93.5181, loss_bbox: 0.2232, loss_mask: 0.2331, loss: 0.6933 2024-05-31 01:47:13,649 - mmdet - INFO - Epoch [6][6750/7330] lr: 1.000e-04, eta: 13:45:02, time: 1.105, data_time: 0.109, memory: 24290, loss_rpn_cls: 0.0192, loss_rpn_bbox: 0.0447, loss_cls: 0.1833, acc: 93.2654, loss_bbox: 0.2291, loss_mask: 0.2319, loss: 0.7083 2024-05-31 01:48:07,472 - mmdet - INFO - Epoch [6][6800/7330] lr: 1.000e-04, eta: 13:44:05, time: 1.076, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0437, loss_cls: 0.1740, acc: 93.6384, loss_bbox: 0.2202, loss_mask: 0.2289, loss: 0.6839 2024-05-31 01:49:02,351 - mmdet - INFO - Epoch [6][6850/7330] lr: 1.000e-04, eta: 13:43:08, time: 1.098, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0182, loss_rpn_bbox: 0.0468, loss_cls: 0.1925, acc: 92.8740, loss_bbox: 0.2419, loss_mask: 0.2390, loss: 0.7384 2024-05-31 01:50:01,044 - mmdet - INFO - Epoch [6][6900/7330] lr: 1.000e-04, eta: 13:42:16, time: 1.174, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0442, loss_cls: 0.1839, acc: 93.2307, loss_bbox: 0.2319, loss_mask: 0.2297, loss: 0.7074 2024-05-31 01:50:55,170 - mmdet - INFO - Epoch [6][6950/7330] lr: 1.000e-04, eta: 13:41:19, 
time: 1.083, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0455, loss_cls: 0.1886, acc: 93.0527, loss_bbox: 0.2380, loss_mask: 0.2354, loss: 0.7243
2024-05-31 01:51:56,598 - mmdet - INFO - Epoch [6][7000/7330] lr: 1.000e-04, eta: 13:40:30, time: 1.229, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0436, loss_cls: 0.1789, acc: 93.4270, loss_bbox: 0.2298, loss_mask: 0.2388, loss: 0.7080
2024-05-31 01:52:50,519 - mmdet - INFO - Epoch [6][7050/7330] lr: 1.000e-04, eta: 13:39:32, time: 1.078, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0459, loss_cls: 0.1750, acc: 93.5630, loss_bbox: 0.2248, loss_mask: 0.2307, loss: 0.6928
2024-05-31 01:53:45,029 - mmdet - INFO - Epoch [6][7100/7330] lr: 1.000e-04, eta: 13:38:36, time: 1.090, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0198, loss_rpn_bbox: 0.0493, loss_cls: 0.1875, acc: 93.1094, loss_bbox: 0.2376, loss_mask: 0.2384, loss: 0.7326
2024-05-31 01:54:41,245 - mmdet - INFO - Epoch [6][7150/7330] lr: 1.000e-04, eta: 13:37:41, time: 1.124, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0188, loss_rpn_bbox: 0.0472, loss_cls: 0.1887, acc: 93.1199, loss_bbox: 0.2339, loss_mask: 0.2369, loss: 0.7256
2024-05-31 01:55:37,723 - mmdet - INFO - Epoch [6][7200/7330] lr: 1.000e-04, eta: 13:36:46, time: 1.130, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0187, loss_rpn_bbox: 0.0430, loss_cls: 0.1765, acc: 93.4824, loss_bbox: 0.2248, loss_mask: 0.2251, loss: 0.6882
2024-05-31 01:56:34,268 - mmdet - INFO - Epoch [6][7250/7330] lr: 1.000e-04, eta: 13:35:52, time: 1.131, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0444, loss_cls: 0.1828, acc: 93.2410, loss_bbox: 0.2350, loss_mask: 0.2348, loss: 0.7143
2024-05-31 01:57:27,379 - mmdet - INFO - Epoch [6][7300/7330] lr: 1.000e-04, eta: 13:34:54, time: 1.062, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0183, loss_rpn_bbox: 0.0433, loss_cls: 0.1753, acc: 93.5684, loss_bbox: 0.2226, loss_mask: 0.2325, loss: 0.6919
2024-05-31 01:58:00,620 - mmdet - INFO - Saving checkpoint at 6 epochs
2024-05-31 02:00:22,393 - mmdet - INFO - Evaluating bbox...
2024-05-31 02:00:46,736 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.462
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.702
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.508
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.285
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.501
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.615
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.585
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.585
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.585
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.393
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.628
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.737
2024-05-31 02:00:46,736 - mmdet - INFO - Evaluating segm...
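Alongside the full AP/AR tables (the bbox table above and the segmentation table that follows), each validation pass is also condensed into a single Epoch(val) summary record, e.g. bbox_mAP 0.451 / segm_mAP 0.411 after epoch 5 and 0.462 / 0.420 after epoch 6. A small sketch, assuming the same saved plain-text log as in the earlier example (the helper name val_history is hypothetical), for collecting those per-epoch summaries:

# Hypothetical sketch: gather the per-epoch validation summaries from the saved log.
import re

VAL_RE = re.compile(
    r"Epoch\(val\) \[(\d+)\]\[\d+\]\s*bbox_mAP:\s*([\d.]+).*?segm_mAP:\s*([\d.]+)",
    re.DOTALL,
)

def val_history(text):
    """Return {epoch: (bbox_mAP, segm_mAP)} parsed from an mmdet training log."""
    return {int(e): (float(b), float(s)) for e, b, s in VAL_RE.findall(text)}

# With the summaries logged so far this would give
# {5: (0.451, 0.411), 6: (0.462, 0.42)} -- matching the Epoch(val) records above and below.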
2024-05-31 02:01:11,522 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.420 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.669 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.451 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.203 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.458 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.632 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.535 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.535 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.535 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.326 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.581 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.715 2024-05-31 02:01:11,917 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 02:01:11,919 - mmdet - INFO - Epoch(val) [6][625] bbox_mAP: 0.4620, bbox_mAP_50: 0.7020, bbox_mAP_75: 0.5080, bbox_mAP_s: 0.2850, bbox_mAP_m: 0.5010, bbox_mAP_l: 0.6150, bbox_mAP_copypaste: 0.462 0.702 0.508 0.285 0.501 0.615, segm_mAP: 0.4200, segm_mAP_50: 0.6690, segm_mAP_75: 0.4510, segm_mAP_s: 0.2030, segm_mAP_m: 0.4580, segm_mAP_l: 0.6320, segm_mAP_copypaste: 0.420 0.669 0.451 0.203 0.458 0.632 2024-05-31 02:02:20,266 - mmdet - INFO - Epoch [7][50/7330] lr: 1.000e-04, eta: 13:33:04, time: 1.366, data_time: 0.139, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0442, loss_cls: 0.1700, acc: 93.6377, loss_bbox: 0.2230, loss_mask: 0.2267, loss: 0.6792 2024-05-31 02:03:17,082 - mmdet - INFO - Epoch [7][100/7330] lr: 1.000e-04, eta: 13:32:10, time: 1.136, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0178, loss_rpn_bbox: 0.0450, loss_cls: 0.1770, acc: 93.3313, loss_bbox: 0.2256, loss_mask: 0.2273, loss: 0.6928 2024-05-31 02:04:11,024 - mmdet - INFO - Epoch [7][150/7330] lr: 1.000e-04, eta: 13:31:13, time: 1.079, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0422, loss_cls: 0.1773, acc: 93.4521, loss_bbox: 0.2292, loss_mask: 0.2292, loss: 0.6932 2024-05-31 02:05:07,584 - mmdet - INFO - Epoch [7][200/7330] lr: 1.000e-04, eta: 13:30:18, time: 1.131, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0450, loss_cls: 0.1713, acc: 93.6179, loss_bbox: 0.2223, loss_mask: 0.2231, loss: 0.6772 2024-05-31 02:06:03,486 - mmdet - INFO - Epoch [7][250/7330] lr: 1.000e-04, eta: 13:29:23, time: 1.118, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0419, loss_cls: 0.1714, acc: 93.5474, loss_bbox: 0.2198, loss_mask: 0.2188, loss: 0.6675 2024-05-31 02:06:57,439 - mmdet - INFO - Epoch [7][300/7330] lr: 1.000e-04, eta: 13:28:26, time: 1.079, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0435, loss_cls: 0.1704, acc: 93.5898, loss_bbox: 0.2246, loss_mask: 0.2242, loss: 0.6783 2024-05-31 02:07:51,682 - mmdet - INFO - Epoch [7][350/7330] lr: 1.000e-04, eta: 13:27:29, time: 1.085, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0440, loss_cls: 0.1759, acc: 93.4780, loss_bbox: 0.2330, loss_mask: 0.2287, loss: 0.6973 2024-05-31 02:08:46,035 - mmdet - INFO - Epoch [7][400/7330] lr: 1.000e-04, eta: 13:26:33, time: 1.087, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0421, loss_cls: 0.1758, acc: 93.3953, loss_bbox: 
0.2225, loss_mask: 0.2275, loss: 0.6826 2024-05-31 02:09:42,809 - mmdet - INFO - Epoch [7][450/7330] lr: 1.000e-04, eta: 13:25:38, time: 1.136, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0446, loss_cls: 0.1806, acc: 93.3809, loss_bbox: 0.2275, loss_mask: 0.2348, loss: 0.7034 2024-05-31 02:10:37,234 - mmdet - INFO - Epoch [7][500/7330] lr: 1.000e-04, eta: 13:24:42, time: 1.088, data_time: 0.093, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0462, loss_cls: 0.1766, acc: 93.3550, loss_bbox: 0.2305, loss_mask: 0.2335, loss: 0.7032 2024-05-31 02:11:33,670 - mmdet - INFO - Epoch [7][550/7330] lr: 1.000e-04, eta: 13:23:47, time: 1.129, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0455, loss_cls: 0.1768, acc: 93.3367, loss_bbox: 0.2311, loss_mask: 0.2293, loss: 0.6981 2024-05-31 02:12:30,106 - mmdet - INFO - Epoch [7][600/7330] lr: 1.000e-04, eta: 13:22:53, time: 1.129, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0417, loss_cls: 0.1755, acc: 93.4038, loss_bbox: 0.2287, loss_mask: 0.2298, loss: 0.6913 2024-05-31 02:13:26,729 - mmdet - INFO - Epoch [7][650/7330] lr: 1.000e-04, eta: 13:21:58, time: 1.132, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0408, loss_cls: 0.1688, acc: 93.6504, loss_bbox: 0.2195, loss_mask: 0.2266, loss: 0.6696 2024-05-31 02:14:23,876 - mmdet - INFO - Epoch [7][700/7330] lr: 1.000e-04, eta: 13:21:04, time: 1.143, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0456, loss_cls: 0.1816, acc: 93.2671, loss_bbox: 0.2311, loss_mask: 0.2315, loss: 0.7062 2024-05-31 02:15:20,692 - mmdet - INFO - Epoch [7][750/7330] lr: 1.000e-04, eta: 13:20:10, time: 1.136, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0448, loss_cls: 0.1807, acc: 93.1843, loss_bbox: 0.2274, loss_mask: 0.2270, loss: 0.6958 2024-05-31 02:16:17,125 - mmdet - INFO - Epoch [7][800/7330] lr: 1.000e-04, eta: 13:19:15, time: 1.129, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0417, loss_cls: 0.1721, acc: 93.5623, loss_bbox: 0.2258, loss_mask: 0.2238, loss: 0.6777 2024-05-31 02:17:13,951 - mmdet - INFO - Epoch [7][850/7330] lr: 1.000e-04, eta: 13:18:21, time: 1.137, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0474, loss_cls: 0.1801, acc: 93.3716, loss_bbox: 0.2317, loss_mask: 0.2363, loss: 0.7131 2024-05-31 02:18:07,677 - mmdet - INFO - Epoch [7][900/7330] lr: 1.000e-04, eta: 13:17:24, time: 1.074, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0424, loss_cls: 0.1667, acc: 93.7649, loss_bbox: 0.2164, loss_mask: 0.2220, loss: 0.6620 2024-05-31 02:19:01,758 - mmdet - INFO - Epoch [7][950/7330] lr: 1.000e-04, eta: 13:16:27, time: 1.082, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0444, loss_cls: 0.1745, acc: 93.4812, loss_bbox: 0.2327, loss_mask: 0.2320, loss: 0.7016 2024-05-31 02:19:57,762 - mmdet - INFO - Epoch [7][1000/7330] lr: 1.000e-04, eta: 13:15:32, time: 1.120, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0449, loss_cls: 0.1696, acc: 93.5999, loss_bbox: 0.2235, loss_mask: 0.2285, loss: 0.6828 2024-05-31 02:20:54,912 - mmdet - INFO - Epoch [7][1050/7330] lr: 1.000e-04, eta: 13:14:38, time: 1.143, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0434, loss_cls: 0.1791, acc: 93.2646, loss_bbox: 0.2336, loss_mask: 0.2274, loss: 0.6975 2024-05-31 02:21:49,137 - mmdet - INFO - Epoch [7][1100/7330] lr: 
1.000e-04, eta: 13:13:41, time: 1.085, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0423, loss_cls: 0.1675, acc: 93.8062, loss_bbox: 0.2151, loss_mask: 0.2216, loss: 0.6605 2024-05-31 02:22:45,405 - mmdet - INFO - Epoch [7][1150/7330] lr: 1.000e-04, eta: 13:12:46, time: 1.125, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0430, loss_cls: 0.1726, acc: 93.5854, loss_bbox: 0.2226, loss_mask: 0.2265, loss: 0.6822 2024-05-31 02:23:42,213 - mmdet - INFO - Epoch [7][1200/7330] lr: 1.000e-04, eta: 13:11:52, time: 1.136, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0416, loss_cls: 0.1803, acc: 93.3201, loss_bbox: 0.2289, loss_mask: 0.2283, loss: 0.6954 2024-05-31 02:24:38,476 - mmdet - INFO - Epoch [7][1250/7330] lr: 1.000e-04, eta: 13:10:57, time: 1.125, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0413, loss_cls: 0.1658, acc: 93.7803, loss_bbox: 0.2143, loss_mask: 0.2205, loss: 0.6566 2024-05-31 02:25:32,994 - mmdet - INFO - Epoch [7][1300/7330] lr: 1.000e-04, eta: 13:10:00, time: 1.090, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0444, loss_cls: 0.1827, acc: 93.1746, loss_bbox: 0.2386, loss_mask: 0.2405, loss: 0.7243 2024-05-31 02:26:28,984 - mmdet - INFO - Epoch [7][1350/7330] lr: 1.000e-04, eta: 13:09:05, time: 1.120, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0426, loss_cls: 0.1700, acc: 93.6223, loss_bbox: 0.2187, loss_mask: 0.2258, loss: 0.6716 2024-05-31 02:27:23,248 - mmdet - INFO - Epoch [7][1400/7330] lr: 1.000e-04, eta: 13:08:09, time: 1.085, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0448, loss_cls: 0.1726, acc: 93.6182, loss_bbox: 0.2305, loss_mask: 0.2252, loss: 0.6899 2024-05-31 02:28:22,011 - mmdet - INFO - Epoch [7][1450/7330] lr: 1.000e-04, eta: 13:07:16, time: 1.175, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0438, loss_cls: 0.1719, acc: 93.5479, loss_bbox: 0.2270, loss_mask: 0.2287, loss: 0.6880 2024-05-31 02:29:16,645 - mmdet - INFO - Epoch [7][1500/7330] lr: 1.000e-04, eta: 13:06:20, time: 1.093, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0417, loss_cls: 0.1636, acc: 93.9277, loss_bbox: 0.2120, loss_mask: 0.2192, loss: 0.6514 2024-05-31 02:30:12,812 - mmdet - INFO - Epoch [7][1550/7330] lr: 1.000e-04, eta: 13:05:25, time: 1.123, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0440, loss_cls: 0.1729, acc: 93.5132, loss_bbox: 0.2219, loss_mask: 0.2292, loss: 0.6847 2024-05-31 02:31:11,369 - mmdet - INFO - Epoch [7][1600/7330] lr: 1.000e-04, eta: 13:04:32, time: 1.171, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0439, loss_cls: 0.1750, acc: 93.5234, loss_bbox: 0.2230, loss_mask: 0.2276, loss: 0.6860 2024-05-31 02:32:05,313 - mmdet - INFO - Epoch [7][1650/7330] lr: 1.000e-04, eta: 13:03:35, time: 1.079, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0450, loss_cls: 0.1783, acc: 93.3657, loss_bbox: 0.2306, loss_mask: 0.2258, loss: 0.6954 2024-05-31 02:33:02,542 - mmdet - INFO - Epoch [7][1700/7330] lr: 1.000e-04, eta: 13:02:41, time: 1.145, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0442, loss_cls: 0.1715, acc: 93.5752, loss_bbox: 0.2229, loss_mask: 0.2255, loss: 0.6797 2024-05-31 02:34:01,762 - mmdet - INFO - Epoch [7][1750/7330] lr: 1.000e-04, eta: 13:01:49, time: 1.184, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0147, 
loss_rpn_bbox: 0.0413, loss_cls: 0.1720, acc: 93.6223, loss_bbox: 0.2211, loss_mask: 0.2255, loss: 0.6746 2024-05-31 02:34:56,490 - mmdet - INFO - Epoch [7][1800/7330] lr: 1.000e-04, eta: 13:00:53, time: 1.095, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0428, loss_cls: 0.1776, acc: 93.3784, loss_bbox: 0.2265, loss_mask: 0.2275, loss: 0.6911 2024-05-31 02:35:54,273 - mmdet - INFO - Epoch [7][1850/7330] lr: 1.000e-04, eta: 12:59:59, time: 1.156, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0430, loss_cls: 0.1751, acc: 93.4788, loss_bbox: 0.2261, loss_mask: 0.2320, loss: 0.6934 2024-05-31 02:36:48,184 - mmdet - INFO - Epoch [7][1900/7330] lr: 1.000e-04, eta: 12:59:02, time: 1.078, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0403, loss_cls: 0.1619, acc: 94.0088, loss_bbox: 0.2084, loss_mask: 0.2209, loss: 0.6473 2024-05-31 02:37:42,696 - mmdet - INFO - Epoch [7][1950/7330] lr: 1.000e-04, eta: 12:58:06, time: 1.090, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0442, loss_cls: 0.1720, acc: 93.5896, loss_bbox: 0.2247, loss_mask: 0.2300, loss: 0.6874 2024-05-31 02:38:36,563 - mmdet - INFO - Epoch [7][2000/7330] lr: 1.000e-04, eta: 12:57:09, time: 1.077, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0464, loss_cls: 0.1864, acc: 93.1396, loss_bbox: 0.2390, loss_mask: 0.2351, loss: 0.7246 2024-05-31 02:39:36,592 - mmdet - INFO - Epoch [7][2050/7330] lr: 1.000e-04, eta: 12:56:17, time: 1.200, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0436, loss_cls: 0.1735, acc: 93.6270, loss_bbox: 0.2231, loss_mask: 0.2279, loss: 0.6849 2024-05-31 02:40:36,498 - mmdet - INFO - Epoch [7][2100/7330] lr: 1.000e-04, eta: 12:55:26, time: 1.198, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0431, loss_cls: 0.1786, acc: 93.3906, loss_bbox: 0.2304, loss_mask: 0.2328, loss: 0.7017 2024-05-31 02:41:30,402 - mmdet - INFO - Epoch [7][2150/7330] lr: 1.000e-04, eta: 12:54:29, time: 1.078, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0406, loss_cls: 0.1655, acc: 93.7842, loss_bbox: 0.2151, loss_mask: 0.2269, loss: 0.6622 2024-05-31 02:42:24,295 - mmdet - INFO - Epoch [7][2200/7330] lr: 1.000e-04, eta: 12:53:32, time: 1.078, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0428, loss_cls: 0.1683, acc: 93.7957, loss_bbox: 0.2198, loss_mask: 0.2290, loss: 0.6759 2024-05-31 02:43:21,095 - mmdet - INFO - Epoch [7][2250/7330] lr: 1.000e-04, eta: 12:52:37, time: 1.136, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0434, loss_cls: 0.1723, acc: 93.6543, loss_bbox: 0.2250, loss_mask: 0.2268, loss: 0.6826 2024-05-31 02:44:19,611 - mmdet - INFO - Epoch [7][2300/7330] lr: 1.000e-04, eta: 12:51:44, time: 1.170, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0454, loss_cls: 0.1817, acc: 93.2324, loss_bbox: 0.2299, loss_mask: 0.2243, loss: 0.6965 2024-05-31 02:45:13,577 - mmdet - INFO - Epoch [7][2350/7330] lr: 1.000e-04, eta: 12:50:47, time: 1.079, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0434, loss_cls: 0.1690, acc: 93.7532, loss_bbox: 0.2157, loss_mask: 0.2189, loss: 0.6626 2024-05-31 02:46:09,592 - mmdet - INFO - Epoch [7][2400/7330] lr: 1.000e-04, eta: 12:49:52, time: 1.120, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0456, loss_cls: 0.1795, acc: 93.3535, loss_bbox: 0.2283, loss_mask: 0.2274, 
loss: 0.6960 2024-05-31 02:47:04,300 - mmdet - INFO - Epoch [7][2450/7330] lr: 1.000e-04, eta: 12:48:56, time: 1.094, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0453, loss_cls: 0.1801, acc: 93.3032, loss_bbox: 0.2277, loss_mask: 0.2277, loss: 0.6985 2024-05-31 02:47:58,749 - mmdet - INFO - Epoch [7][2500/7330] lr: 1.000e-04, eta: 12:47:59, time: 1.089, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0453, loss_cls: 0.1792, acc: 93.3125, loss_bbox: 0.2291, loss_mask: 0.2330, loss: 0.7048 2024-05-31 02:48:53,129 - mmdet - INFO - Epoch [7][2550/7330] lr: 1.000e-04, eta: 12:47:03, time: 1.088, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0429, loss_cls: 0.1690, acc: 93.7271, loss_bbox: 0.2157, loss_mask: 0.2278, loss: 0.6704 2024-05-31 02:49:49,786 - mmdet - INFO - Epoch [7][2600/7330] lr: 1.000e-04, eta: 12:46:08, time: 1.133, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0460, loss_cls: 0.1841, acc: 93.1777, loss_bbox: 0.2350, loss_mask: 0.2238, loss: 0.7063 2024-05-31 02:50:48,899 - mmdet - INFO - Epoch [7][2650/7330] lr: 1.000e-04, eta: 12:45:16, time: 1.182, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0411, loss_cls: 0.1703, acc: 93.7944, loss_bbox: 0.2116, loss_mask: 0.2227, loss: 0.6620 2024-05-31 02:51:46,013 - mmdet - INFO - Epoch [7][2700/7330] lr: 1.000e-04, eta: 12:44:22, time: 1.142, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0450, loss_cls: 0.1855, acc: 93.1650, loss_bbox: 0.2379, loss_mask: 0.2338, loss: 0.7194 2024-05-31 02:52:39,958 - mmdet - INFO - Epoch [7][2750/7330] lr: 1.000e-04, eta: 12:43:25, time: 1.079, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0447, loss_cls: 0.1751, acc: 93.4929, loss_bbox: 0.2244, loss_mask: 0.2227, loss: 0.6819 2024-05-31 02:53:38,029 - mmdet - INFO - Epoch [7][2800/7330] lr: 1.000e-04, eta: 12:42:31, time: 1.161, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0439, loss_cls: 0.1727, acc: 93.6641, loss_bbox: 0.2228, loss_mask: 0.2239, loss: 0.6782 2024-05-31 02:54:35,186 - mmdet - INFO - Epoch [7][2850/7330] lr: 1.000e-04, eta: 12:41:37, time: 1.143, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0430, loss_cls: 0.1734, acc: 93.6248, loss_bbox: 0.2266, loss_mask: 0.2271, loss: 0.6861 2024-05-31 02:55:31,551 - mmdet - INFO - Epoch [7][2900/7330] lr: 1.000e-04, eta: 12:40:42, time: 1.127, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0431, loss_cls: 0.1773, acc: 93.4612, loss_bbox: 0.2252, loss_mask: 0.2287, loss: 0.6903 2024-05-31 02:56:26,115 - mmdet - INFO - Epoch [7][2950/7330] lr: 1.000e-04, eta: 12:39:46, time: 1.091, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0452, loss_cls: 0.1842, acc: 93.0684, loss_bbox: 0.2361, loss_mask: 0.2331, loss: 0.7145 2024-05-31 02:57:20,734 - mmdet - INFO - Epoch [7][3000/7330] lr: 1.000e-04, eta: 12:38:50, time: 1.092, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0457, loss_cls: 0.1787, acc: 93.3958, loss_bbox: 0.2326, loss_mask: 0.2323, loss: 0.7065 2024-05-31 02:58:16,059 - mmdet - INFO - Epoch [7][3050/7330] lr: 1.000e-04, eta: 12:37:54, time: 1.107, data_time: 0.095, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0478, loss_cls: 0.1833, acc: 93.0586, loss_bbox: 0.2399, loss_mask: 0.2328, loss: 0.7207 2024-05-31 02:59:10,183 - mmdet - INFO - Epoch [7][3100/7330] lr: 1.000e-04, eta: 
12:36:57, time: 1.082, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0430, loss_cls: 0.1770, acc: 93.4268, loss_bbox: 0.2250, loss_mask: 0.2327, loss: 0.6941 2024-05-31 03:00:04,383 - mmdet - INFO - Epoch [7][3150/7330] lr: 1.000e-04, eta: 12:36:00, time: 1.084, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0432, loss_cls: 0.1781, acc: 93.3752, loss_bbox: 0.2284, loss_mask: 0.2280, loss: 0.6928 2024-05-31 03:01:02,761 - mmdet - INFO - Epoch [7][3200/7330] lr: 1.000e-04, eta: 12:35:07, time: 1.168, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0423, loss_cls: 0.1790, acc: 93.2432, loss_bbox: 0.2282, loss_mask: 0.2295, loss: 0.6951 2024-05-31 03:02:00,493 - mmdet - INFO - Epoch [7][3250/7330] lr: 1.000e-04, eta: 12:34:14, time: 1.155, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0430, loss_cls: 0.1757, acc: 93.5942, loss_bbox: 0.2268, loss_mask: 0.2249, loss: 0.6864 2024-05-31 03:02:55,457 - mmdet - INFO - Epoch [7][3300/7330] lr: 1.000e-04, eta: 12:33:18, time: 1.099, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0458, loss_cls: 0.1784, acc: 93.3867, loss_bbox: 0.2250, loss_mask: 0.2274, loss: 0.6930 2024-05-31 03:04:00,114 - mmdet - INFO - Epoch [7][3350/7330] lr: 1.000e-04, eta: 12:32:30, time: 1.293, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0466, loss_cls: 0.1804, acc: 93.3203, loss_bbox: 0.2320, loss_mask: 0.2346, loss: 0.7103 2024-05-31 03:04:53,928 - mmdet - INFO - Epoch [7][3400/7330] lr: 1.000e-04, eta: 12:31:33, time: 1.076, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0440, loss_cls: 0.1747, acc: 93.5359, loss_bbox: 0.2230, loss_mask: 0.2283, loss: 0.6855 2024-05-31 03:05:49,965 - mmdet - INFO - Epoch [7][3450/7330] lr: 1.000e-04, eta: 12:30:38, time: 1.121, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0442, loss_cls: 0.1810, acc: 93.3789, loss_bbox: 0.2265, loss_mask: 0.2238, loss: 0.6917 2024-05-31 03:06:44,036 - mmdet - INFO - Epoch [7][3500/7330] lr: 1.000e-04, eta: 12:29:41, time: 1.081, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0431, loss_cls: 0.1701, acc: 93.6958, loss_bbox: 0.2207, loss_mask: 0.2212, loss: 0.6724 2024-05-31 03:07:38,342 - mmdet - INFO - Epoch [7][3550/7330] lr: 1.000e-04, eta: 12:28:44, time: 1.086, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0410, loss_cls: 0.1696, acc: 93.7803, loss_bbox: 0.2174, loss_mask: 0.2224, loss: 0.6672 2024-05-31 03:08:32,961 - mmdet - INFO - Epoch [7][3600/7330] lr: 1.000e-04, eta: 12:27:48, time: 1.092, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0439, loss_cls: 0.1733, acc: 93.6016, loss_bbox: 0.2294, loss_mask: 0.2265, loss: 0.6882 2024-05-31 03:09:27,297 - mmdet - INFO - Epoch [7][3650/7330] lr: 1.000e-04, eta: 12:26:51, time: 1.087, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0447, loss_cls: 0.1735, acc: 93.5588, loss_bbox: 0.2269, loss_mask: 0.2276, loss: 0.6895 2024-05-31 03:10:22,131 - mmdet - INFO - Epoch [7][3700/7330] lr: 1.000e-04, eta: 12:25:55, time: 1.097, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0451, loss_cls: 0.1795, acc: 93.3191, loss_bbox: 0.2285, loss_mask: 0.2276, loss: 0.6983 2024-05-31 03:11:18,977 - mmdet - INFO - Epoch [7][3750/7330] lr: 1.000e-04, eta: 12:25:01, time: 1.137, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 
0.0435, loss_cls: 0.1746, acc: 93.5469, loss_bbox: 0.2244, loss_mask: 0.2223, loss: 0.6815 2024-05-31 03:12:17,892 - mmdet - INFO - Epoch [7][3800/7330] lr: 1.000e-04, eta: 12:24:08, time: 1.178, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0458, loss_cls: 0.1771, acc: 93.4209, loss_bbox: 0.2285, loss_mask: 0.2268, loss: 0.6946 2024-05-31 03:13:13,797 - mmdet - INFO - Epoch [7][3850/7330] lr: 1.000e-04, eta: 12:23:13, time: 1.118, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0411, loss_cls: 0.1744, acc: 93.6147, loss_bbox: 0.2193, loss_mask: 0.2271, loss: 0.6788 2024-05-31 03:14:12,190 - mmdet - INFO - Epoch [7][3900/7330] lr: 1.000e-04, eta: 12:22:19, time: 1.168, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0444, loss_cls: 0.1802, acc: 93.2385, loss_bbox: 0.2334, loss_mask: 0.2279, loss: 0.7018 2024-05-31 03:15:11,384 - mmdet - INFO - Epoch [7][3950/7330] lr: 1.000e-04, eta: 12:21:27, time: 1.184, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0425, loss_cls: 0.1819, acc: 93.1765, loss_bbox: 0.2287, loss_mask: 0.2338, loss: 0.7027 2024-05-31 03:16:05,509 - mmdet - INFO - Epoch [7][4000/7330] lr: 1.000e-04, eta: 12:20:30, time: 1.083, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0445, loss_cls: 0.1819, acc: 93.2927, loss_bbox: 0.2260, loss_mask: 0.2295, loss: 0.6983 2024-05-31 03:16:59,684 - mmdet - INFO - Epoch [7][4050/7330] lr: 1.000e-04, eta: 12:19:33, time: 1.083, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0414, loss_cls: 0.1730, acc: 93.5671, loss_bbox: 0.2199, loss_mask: 0.2263, loss: 0.6764 2024-05-31 03:17:53,847 - mmdet - INFO - Epoch [7][4100/7330] lr: 1.000e-04, eta: 12:18:37, time: 1.083, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0419, loss_cls: 0.1664, acc: 93.7126, loss_bbox: 0.2182, loss_mask: 0.2260, loss: 0.6682 2024-05-31 03:18:47,266 - mmdet - INFO - Epoch [7][4150/7330] lr: 1.000e-04, eta: 12:17:39, time: 1.068, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0440, loss_cls: 0.1756, acc: 93.4932, loss_bbox: 0.2216, loss_mask: 0.2290, loss: 0.6861 2024-05-31 03:19:41,748 - mmdet - INFO - Epoch [7][4200/7330] lr: 1.000e-04, eta: 12:16:43, time: 1.090, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0456, loss_cls: 0.1822, acc: 93.3125, loss_bbox: 0.2333, loss_mask: 0.2288, loss: 0.7071 2024-05-31 03:20:36,285 - mmdet - INFO - Epoch [7][4250/7330] lr: 1.000e-04, eta: 12:15:46, time: 1.091, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0442, loss_cls: 0.1787, acc: 93.3379, loss_bbox: 0.2370, loss_mask: 0.2292, loss: 0.7047 2024-05-31 03:21:34,308 - mmdet - INFO - Epoch [7][4300/7330] lr: 1.000e-04, eta: 12:14:53, time: 1.161, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0174, loss_rpn_bbox: 0.0467, loss_cls: 0.1781, acc: 93.3403, loss_bbox: 0.2243, loss_mask: 0.2298, loss: 0.6964 2024-05-31 03:22:33,261 - mmdet - INFO - Epoch [7][4350/7330] lr: 1.000e-04, eta: 12:14:00, time: 1.179, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0453, loss_cls: 0.1824, acc: 93.2637, loss_bbox: 0.2311, loss_mask: 0.2356, loss: 0.7129 2024-05-31 03:23:30,993 - mmdet - INFO - Epoch [7][4400/7330] lr: 1.000e-04, eta: 12:13:06, time: 1.154, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0422, loss_cls: 0.1788, acc: 93.3772, loss_bbox: 0.2275, loss_mask: 0.2277, loss: 0.6930 
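Every training entry above follows the same "key: value" layout (lr, eta, time, data_time, memory, the five loss terms, and the total), and the logged total matches the sum of the five components up to rounding; for example, the Epoch [7][4400/7330] entry gives 0.0168 + 0.0422 + 0.1788 + 0.2275 + 0.2277 = 0.6930. A minimal parsing sketch for pulling these numbers out of a saved copy of this log follows; the file name, regular expressions, and field selection are illustrative assumptions, not part of the original run.

    import re

    # A training entry looks like:
    #   Epoch [7][4400/7330] lr: 1.000e-04, ..., loss_cls: 0.1788, ..., loss: 0.6930
    ENTRY_RE = re.compile(r"Epoch \[(\d+)\]\[(\d+)/\d+\]")
    FIELD_RE = re.compile(r"(\w+): ([-+0-9.eE]+)")
    KEEP = ("lr", "time", "data_time", "memory", "acc")

    def parse_train_entries(text):
        """Yield (epoch, iteration, fields) for each 'Epoch [E][i/N]' entry in the log."""
        for raw in text.split("Epoch [")[1:]:
            chunk = "Epoch [" + raw
            head = ENTRY_RE.match(chunk)
            if head is None:
                continue
            # Keep only the numeric fields of interest (the loss terms plus a few stats);
            # eta timestamps and any evaluation output inside the same chunk are ignored.
            fields = {
                key: float(val)
                for key, val in FIELD_RE.findall(chunk)
                if key.startswith("loss") or key in KEEP
            }
            yield int(head.group(1)), int(head.group(2)), fields

    if __name__ == "__main__":
        with open("train.log") as f:   # hypothetical path to a saved copy of this log
            for epoch, it, fields in parse_train_entries(f.read()):
                if it % 1000 == 0:
                    print(epoch, it, fields.get("loss"))

Run over this log, the sketch would emit one (epoch, iteration, loss) triple per thousand iterations, which is enough to plot the loss curve offline without re-running anything.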
2024-05-31 03:24:31,554 - mmdet - INFO - Epoch [7][4450/7330] lr: 1.000e-04, eta: 12:12:15, time: 1.211, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0441, loss_cls: 0.1811, acc: 93.3628, loss_bbox: 0.2280, loss_mask: 0.2300, loss: 0.6987 2024-05-31 03:25:27,854 - mmdet - INFO - Epoch [7][4500/7330] lr: 1.000e-04, eta: 12:11:20, time: 1.126, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0440, loss_cls: 0.1767, acc: 93.4438, loss_bbox: 0.2237, loss_mask: 0.2250, loss: 0.6872 2024-05-31 03:26:22,765 - mmdet - INFO - Epoch [7][4550/7330] lr: 1.000e-04, eta: 12:10:24, time: 1.098, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0455, loss_cls: 0.1892, acc: 92.9805, loss_bbox: 0.2371, loss_mask: 0.2332, loss: 0.7218 2024-05-31 03:27:17,163 - mmdet - INFO - Epoch [7][4600/7330] lr: 1.000e-04, eta: 12:09:27, time: 1.088, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0455, loss_cls: 0.1812, acc: 93.3052, loss_bbox: 0.2274, loss_mask: 0.2334, loss: 0.7048 2024-05-31 03:28:11,187 - mmdet - INFO - Epoch [7][4650/7330] lr: 1.000e-04, eta: 12:08:30, time: 1.080, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0179, loss_rpn_bbox: 0.0475, loss_cls: 0.1758, acc: 93.4595, loss_bbox: 0.2278, loss_mask: 0.2317, loss: 0.7006 2024-05-31 03:29:05,205 - mmdet - INFO - Epoch [7][4700/7330] lr: 1.000e-04, eta: 12:07:34, time: 1.080, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0425, loss_cls: 0.1856, acc: 93.1382, loss_bbox: 0.2325, loss_mask: 0.2287, loss: 0.7062 2024-05-31 03:29:59,417 - mmdet - INFO - Epoch [7][4750/7330] lr: 1.000e-04, eta: 12:06:37, time: 1.084, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0419, loss_cls: 0.1725, acc: 93.5698, loss_bbox: 0.2207, loss_mask: 0.2279, loss: 0.6773 2024-05-31 03:30:53,302 - mmdet - INFO - Epoch [7][4800/7330] lr: 1.000e-04, eta: 12:05:40, time: 1.078, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0424, loss_cls: 0.1840, acc: 93.1729, loss_bbox: 0.2301, loss_mask: 0.2286, loss: 0.7010 2024-05-31 03:31:51,612 - mmdet - INFO - Epoch [7][4850/7330] lr: 1.000e-04, eta: 12:04:47, time: 1.166, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0418, loss_cls: 0.1661, acc: 93.7761, loss_bbox: 0.2141, loss_mask: 0.2234, loss: 0.6605 2024-05-31 03:32:45,155 - mmdet - INFO - Epoch [7][4900/7330] lr: 1.000e-04, eta: 12:03:49, time: 1.071, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0414, loss_cls: 0.1704, acc: 93.5811, loss_bbox: 0.2176, loss_mask: 0.2254, loss: 0.6688 2024-05-31 03:33:39,721 - mmdet - INFO - Epoch [7][4950/7330] lr: 1.000e-04, eta: 12:02:53, time: 1.091, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0443, loss_cls: 0.1804, acc: 93.3176, loss_bbox: 0.2317, loss_mask: 0.2282, loss: 0.7012 2024-05-31 03:34:45,939 - mmdet - INFO - Epoch [7][5000/7330] lr: 1.000e-04, eta: 12:02:06, time: 1.324, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0396, loss_cls: 0.1768, acc: 93.4316, loss_bbox: 0.2232, loss_mask: 0.2254, loss: 0.6810 2024-05-31 03:35:40,589 - mmdet - INFO - Epoch [7][5050/7330] lr: 1.000e-04, eta: 12:01:10, time: 1.093, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0428, loss_cls: 0.1755, acc: 93.5254, loss_bbox: 0.2226, loss_mask: 0.2277, loss: 0.6856 2024-05-31 03:36:37,232 - mmdet - INFO - Epoch [7][5100/7330] lr: 1.000e-04, eta: 12:00:15, 
time: 1.133, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0434, loss_cls: 0.1809, acc: 93.2356, loss_bbox: 0.2367, loss_mask: 0.2344, loss: 0.7098 2024-05-31 03:37:32,347 - mmdet - INFO - Epoch [7][5150/7330] lr: 1.000e-04, eta: 11:59:19, time: 1.102, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0177, loss_rpn_bbox: 0.0487, loss_cls: 0.1882, acc: 92.9915, loss_bbox: 0.2373, loss_mask: 0.2342, loss: 0.7260 2024-05-31 03:38:26,274 - mmdet - INFO - Epoch [7][5200/7330] lr: 1.000e-04, eta: 11:58:22, time: 1.079, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0444, loss_cls: 0.1784, acc: 93.4277, loss_bbox: 0.2299, loss_mask: 0.2334, loss: 0.7035 2024-05-31 03:39:20,905 - mmdet - INFO - Epoch [7][5250/7330] lr: 1.000e-04, eta: 11:57:26, time: 1.092, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0454, loss_cls: 0.1747, acc: 93.4844, loss_bbox: 0.2280, loss_mask: 0.2320, loss: 0.6981 2024-05-31 03:40:15,800 - mmdet - INFO - Epoch [7][5300/7330] lr: 1.000e-04, eta: 11:56:30, time: 1.098, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0434, loss_cls: 0.1725, acc: 93.5034, loss_bbox: 0.2250, loss_mask: 0.2276, loss: 0.6836 2024-05-31 03:41:10,029 - mmdet - INFO - Epoch [7][5350/7330] lr: 1.000e-04, eta: 11:55:33, time: 1.084, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0181, loss_rpn_bbox: 0.0455, loss_cls: 0.1735, acc: 93.6003, loss_bbox: 0.2220, loss_mask: 0.2281, loss: 0.6872 2024-05-31 03:42:06,150 - mmdet - INFO - Epoch [7][5400/7330] lr: 1.000e-04, eta: 11:54:38, time: 1.122, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0441, loss_cls: 0.1736, acc: 93.5037, loss_bbox: 0.2246, loss_mask: 0.2281, loss: 0.6869 2024-05-31 03:43:00,376 - mmdet - INFO - Epoch [7][5450/7330] lr: 1.000e-04, eta: 11:53:41, time: 1.085, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0440, loss_cls: 0.1784, acc: 93.3320, loss_bbox: 0.2297, loss_mask: 0.2296, loss: 0.6972 2024-05-31 03:44:03,026 - mmdet - INFO - Epoch [7][5500/7330] lr: 1.000e-04, eta: 11:52:51, time: 1.253, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0439, loss_cls: 0.1809, acc: 93.3787, loss_bbox: 0.2249, loss_mask: 0.2288, loss: 0.6950 2024-05-31 03:45:02,809 - mmdet - INFO - Epoch [7][5550/7330] lr: 1.000e-04, eta: 11:51:59, time: 1.195, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0434, loss_cls: 0.1818, acc: 93.3899, loss_bbox: 0.2244, loss_mask: 0.2391, loss: 0.7056 2024-05-31 03:45:59,792 - mmdet - INFO - Epoch [7][5600/7330] lr: 1.000e-04, eta: 11:51:04, time: 1.140, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0413, loss_cls: 0.1723, acc: 93.6184, loss_bbox: 0.2208, loss_mask: 0.2257, loss: 0.6749 2024-05-31 03:46:53,863 - mmdet - INFO - Epoch [7][5650/7330] lr: 1.000e-04, eta: 11:50:08, time: 1.081, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0185, loss_rpn_bbox: 0.0449, loss_cls: 0.1799, acc: 93.2500, loss_bbox: 0.2331, loss_mask: 0.2354, loss: 0.7117 2024-05-31 03:47:48,703 - mmdet - INFO - Epoch [7][5700/7330] lr: 1.000e-04, eta: 11:49:12, time: 1.097, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0429, loss_cls: 0.1797, acc: 93.4126, loss_bbox: 0.2242, loss_mask: 0.2261, loss: 0.6901 2024-05-31 03:48:42,663 - mmdet - INFO - Epoch [7][5750/7330] lr: 1.000e-04, eta: 11:48:15, time: 1.079, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0414, 
loss_cls: 0.1721, acc: 93.5198, loss_bbox: 0.2220, loss_mask: 0.2261, loss: 0.6768 2024-05-31 03:49:36,896 - mmdet - INFO - Epoch [7][5800/7330] lr: 1.000e-04, eta: 11:47:18, time: 1.085, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0421, loss_cls: 0.1743, acc: 93.5261, loss_bbox: 0.2180, loss_mask: 0.2249, loss: 0.6756 2024-05-31 03:50:31,970 - mmdet - INFO - Epoch [7][5850/7330] lr: 1.000e-04, eta: 11:46:22, time: 1.101, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0453, loss_cls: 0.1772, acc: 93.3708, loss_bbox: 0.2243, loss_mask: 0.2214, loss: 0.6848 2024-05-31 03:51:25,986 - mmdet - INFO - Epoch [7][5900/7330] lr: 1.000e-04, eta: 11:45:25, time: 1.080, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0180, loss_rpn_bbox: 0.0448, loss_cls: 0.1768, acc: 93.3223, loss_bbox: 0.2266, loss_mask: 0.2265, loss: 0.6927 2024-05-31 03:52:22,922 - mmdet - INFO - Epoch [7][5950/7330] lr: 1.000e-04, eta: 11:44:31, time: 1.139, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0441, loss_cls: 0.1813, acc: 93.2007, loss_bbox: 0.2357, loss_mask: 0.2286, loss: 0.7060 2024-05-31 03:53:17,434 - mmdet - INFO - Epoch [7][6000/7330] lr: 1.000e-04, eta: 11:43:34, time: 1.090, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0176, loss_rpn_bbox: 0.0447, loss_cls: 0.1723, acc: 93.6028, loss_bbox: 0.2210, loss_mask: 0.2270, loss: 0.6826 2024-05-31 03:54:16,151 - mmdet - INFO - Epoch [7][6050/7330] lr: 1.000e-04, eta: 11:42:41, time: 1.174, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0428, loss_cls: 0.1717, acc: 93.5613, loss_bbox: 0.2195, loss_mask: 0.2245, loss: 0.6743 2024-05-31 03:55:14,293 - mmdet - INFO - Epoch [7][6100/7330] lr: 1.000e-04, eta: 11:41:48, time: 1.163, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0423, loss_cls: 0.1701, acc: 93.6650, loss_bbox: 0.2159, loss_mask: 0.2301, loss: 0.6732 2024-05-31 03:56:15,869 - mmdet - INFO - Epoch [7][6150/7330] lr: 1.000e-04, eta: 11:40:57, time: 1.232, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0405, loss_cls: 0.1623, acc: 94.0237, loss_bbox: 0.2160, loss_mask: 0.2221, loss: 0.6566 2024-05-31 03:57:09,238 - mmdet - INFO - Epoch [7][6200/7330] lr: 1.000e-04, eta: 11:39:59, time: 1.067, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0398, loss_cls: 0.1726, acc: 93.6890, loss_bbox: 0.2208, loss_mask: 0.2253, loss: 0.6741 2024-05-31 03:58:03,751 - mmdet - INFO - Epoch [7][6250/7330] lr: 1.000e-04, eta: 11:39:03, time: 1.090, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0411, loss_cls: 0.1738, acc: 93.6167, loss_bbox: 0.2198, loss_mask: 0.2268, loss: 0.6782 2024-05-31 03:58:57,410 - mmdet - INFO - Epoch [7][6300/7330] lr: 1.000e-04, eta: 11:38:06, time: 1.073, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0419, loss_cls: 0.1686, acc: 93.6858, loss_bbox: 0.2206, loss_mask: 0.2221, loss: 0.6685 2024-05-31 03:59:52,023 - mmdet - INFO - Epoch [7][6350/7330] lr: 1.000e-04, eta: 11:37:10, time: 1.092, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0469, loss_cls: 0.1801, acc: 93.2878, loss_bbox: 0.2350, loss_mask: 0.2273, loss: 0.7056 2024-05-31 04:00:46,624 - mmdet - INFO - Epoch [7][6400/7330] lr: 1.000e-04, eta: 11:36:13, time: 1.092, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0436, loss_cls: 0.1766, acc: 93.3223, loss_bbox: 0.2305, loss_mask: 0.2287, loss: 0.6959 2024-05-31 
04:01:43,454 - mmdet - INFO - Epoch [7][6450/7330] lr: 1.000e-04, eta: 11:35:19, time: 1.137, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0424, loss_cls: 0.1730, acc: 93.5576, loss_bbox: 0.2249, loss_mask: 0.2340, loss: 0.6897 2024-05-31 04:02:38,263 - mmdet - INFO - Epoch [7][6500/7330] lr: 1.000e-04, eta: 11:34:22, time: 1.096, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0421, loss_cls: 0.1765, acc: 93.4995, loss_bbox: 0.2248, loss_mask: 0.2289, loss: 0.6885 2024-05-31 04:03:34,003 - mmdet - INFO - Epoch [7][6550/7330] lr: 1.000e-04, eta: 11:33:27, time: 1.115, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0393, loss_cls: 0.1635, acc: 93.9104, loss_bbox: 0.2119, loss_mask: 0.2201, loss: 0.6495 2024-05-31 04:04:30,738 - mmdet - INFO - Epoch [7][6600/7330] lr: 1.000e-04, eta: 11:32:32, time: 1.135, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0461, loss_cls: 0.1870, acc: 93.0908, loss_bbox: 0.2342, loss_mask: 0.2312, loss: 0.7155 2024-05-31 04:05:29,002 - mmdet - INFO - Epoch [7][6650/7330] lr: 1.000e-04, eta: 11:31:39, time: 1.165, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0442, loss_cls: 0.1722, acc: 93.6694, loss_bbox: 0.2206, loss_mask: 0.2326, loss: 0.6861 2024-05-31 04:06:25,594 - mmdet - INFO - Epoch [7][6700/7330] lr: 1.000e-04, eta: 11:30:44, time: 1.132, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0406, loss_cls: 0.1690, acc: 93.7012, loss_bbox: 0.2162, loss_mask: 0.2257, loss: 0.6667 2024-05-31 04:07:25,445 - mmdet - INFO - Epoch [7][6750/7330] lr: 1.000e-04, eta: 11:29:51, time: 1.197, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0486, loss_cls: 0.1865, acc: 93.0579, loss_bbox: 0.2402, loss_mask: 0.2345, loss: 0.7268 2024-05-31 04:08:20,520 - mmdet - INFO - Epoch [7][6800/7330] lr: 1.000e-04, eta: 11:28:55, time: 1.101, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0468, loss_cls: 0.1750, acc: 93.3992, loss_bbox: 0.2292, loss_mask: 0.2246, loss: 0.6923 2024-05-31 04:09:15,574 - mmdet - INFO - Epoch [7][6850/7330] lr: 1.000e-04, eta: 11:27:59, time: 1.101, data_time: 0.101, memory: 24290, loss_rpn_cls: 0.0173, loss_rpn_bbox: 0.0480, loss_cls: 0.1861, acc: 93.0273, loss_bbox: 0.2385, loss_mask: 0.2347, loss: 0.7245 2024-05-31 04:10:09,492 - mmdet - INFO - Epoch [7][6900/7330] lr: 1.000e-04, eta: 11:27:03, time: 1.078, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0427, loss_cls: 0.1753, acc: 93.5393, loss_bbox: 0.2252, loss_mask: 0.2291, loss: 0.6879 2024-05-31 04:11:03,711 - mmdet - INFO - Epoch [7][6950/7330] lr: 1.000e-04, eta: 11:26:06, time: 1.084, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0407, loss_cls: 0.1747, acc: 93.6453, loss_bbox: 0.2196, loss_mask: 0.2276, loss: 0.6780 2024-05-31 04:12:00,709 - mmdet - INFO - Epoch [7][7000/7330] lr: 1.000e-04, eta: 11:25:11, time: 1.140, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0416, loss_cls: 0.1747, acc: 93.4841, loss_bbox: 0.2239, loss_mask: 0.2264, loss: 0.6836 2024-05-31 04:12:57,985 - mmdet - INFO - Epoch [7][7050/7330] lr: 1.000e-04, eta: 11:24:17, time: 1.146, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0443, loss_cls: 0.1793, acc: 93.3140, loss_bbox: 0.2253, loss_mask: 0.2298, loss: 0.6937 2024-05-31 04:13:52,755 - mmdet - INFO - Epoch [7][7100/7330] lr: 1.000e-04, eta: 11:23:21, time: 1.095, 
data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0175, loss_rpn_bbox: 0.0466, loss_cls: 0.1826, acc: 93.2266, loss_bbox: 0.2323, loss_mask: 0.2324, loss: 0.7114 2024-05-31 04:14:49,507 - mmdet - INFO - Epoch [7][7150/7330] lr: 1.000e-04, eta: 11:22:26, time: 1.135, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0162, loss_rpn_bbox: 0.0445, loss_cls: 0.1798, acc: 93.3474, loss_bbox: 0.2302, loss_mask: 0.2323, loss: 0.7029 2024-05-31 04:15:48,190 - mmdet - INFO - Epoch [7][7200/7330] lr: 1.000e-04, eta: 11:21:33, time: 1.174, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0426, loss_cls: 0.1745, acc: 93.4263, loss_bbox: 0.2267, loss_mask: 0.2262, loss: 0.6850 2024-05-31 04:16:42,467 - mmdet - INFO - Epoch [7][7250/7330] lr: 1.000e-04, eta: 11:20:36, time: 1.086, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0442, loss_cls: 0.1843, acc: 93.2271, loss_bbox: 0.2308, loss_mask: 0.2304, loss: 0.7066 2024-05-31 04:17:37,724 - mmdet - INFO - Epoch [7][7300/7330] lr: 1.000e-04, eta: 11:19:40, time: 1.105, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0402, loss_cls: 0.1701, acc: 93.6719, loss_bbox: 0.2203, loss_mask: 0.2229, loss: 0.6689 2024-05-31 04:18:14,848 - mmdet - INFO - Saving checkpoint at 7 epochs 2024-05-31 04:20:39,602 - mmdet - INFO - Evaluating bbox... 2024-05-31 04:21:02,436 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.467 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.704 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.516 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.297 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.507 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.621 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.591 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.591 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.591 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.408 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.635 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.744 2024-05-31 04:21:02,436 - mmdet - INFO - Evaluating segm... 
2024-05-31 04:21:27,803 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.425 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.671 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.454 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.215 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.457 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.629 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.540 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.540 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.540 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.345 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.586 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.715 2024-05-31 04:21:28,184 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 04:21:28,185 - mmdet - INFO - Epoch(val) [7][625] bbox_mAP: 0.4670, bbox_mAP_50: 0.7040, bbox_mAP_75: 0.5160, bbox_mAP_s: 0.2970, bbox_mAP_m: 0.5070, bbox_mAP_l: 0.6210, bbox_mAP_copypaste: 0.467 0.704 0.516 0.297 0.507 0.621, segm_mAP: 0.4250, segm_mAP_50: 0.6710, segm_mAP_75: 0.4540, segm_mAP_s: 0.2150, segm_mAP_m: 0.4570, segm_mAP_l: 0.6290, segm_mAP_copypaste: 0.425 0.671 0.454 0.215 0.457 0.629 2024-05-31 04:22:34,018 - mmdet - INFO - Epoch [8][50/7330] lr: 1.000e-04, eta: 11:17:55, time: 1.316, data_time: 0.141, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0446, loss_cls: 0.1692, acc: 93.6694, loss_bbox: 0.2271, loss_mask: 0.2292, loss: 0.6862 2024-05-31 04:23:31,435 - mmdet - INFO - Epoch [8][100/7330] lr: 1.000e-04, eta: 11:17:01, time: 1.148, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0431, loss_cls: 0.1668, acc: 93.7024, loss_bbox: 0.2180, loss_mask: 0.2221, loss: 0.6647 2024-05-31 04:24:26,846 - mmdet - INFO - Epoch [8][150/7330] lr: 1.000e-04, eta: 11:16:05, time: 1.108, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0403, loss_cls: 0.1644, acc: 93.9199, loss_bbox: 0.2177, loss_mask: 0.2211, loss: 0.6564 2024-05-31 04:25:21,708 - mmdet - INFO - Epoch [8][200/7330] lr: 1.000e-04, eta: 11:15:09, time: 1.097, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0415, loss_cls: 0.1722, acc: 93.5674, loss_bbox: 0.2201, loss_mask: 0.2222, loss: 0.6691 2024-05-31 04:26:18,396 - mmdet - INFO - Epoch [8][250/7330] lr: 1.000e-04, eta: 11:14:14, time: 1.134, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0415, loss_cls: 0.1650, acc: 93.7390, loss_bbox: 0.2127, loss_mask: 0.2239, loss: 0.6572 2024-05-31 04:27:12,438 - mmdet - INFO - Epoch [8][300/7330] lr: 1.000e-04, eta: 11:13:18, time: 1.081, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0416, loss_cls: 0.1675, acc: 93.7627, loss_bbox: 0.2157, loss_mask: 0.2261, loss: 0.6647 2024-05-31 04:28:07,313 - mmdet - INFO - Epoch [8][350/7330] lr: 1.000e-04, eta: 11:12:22, time: 1.098, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0435, loss_cls: 0.1729, acc: 93.6094, loss_bbox: 0.2272, loss_mask: 0.2239, loss: 0.6818 2024-05-31 04:29:01,808 - mmdet - INFO - Epoch [8][400/7330] lr: 1.000e-04, eta: 11:11:25, time: 1.090, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0402, loss_cls: 0.1629, acc: 93.8823, loss_bbox: 
0.2118, loss_mask: 0.2199, loss: 0.6488 2024-05-31 04:29:59,250 - mmdet - INFO - Epoch [8][450/7330] lr: 1.000e-04, eta: 11:10:31, time: 1.149, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0439, loss_cls: 0.1694, acc: 93.6904, loss_bbox: 0.2215, loss_mask: 0.2229, loss: 0.6730 2024-05-31 04:30:58,165 - mmdet - INFO - Epoch [8][500/7330] lr: 1.000e-04, eta: 11:09:38, time: 1.178, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0410, loss_cls: 0.1663, acc: 93.8406, loss_bbox: 0.2145, loss_mask: 0.2192, loss: 0.6558 2024-05-31 04:31:55,755 - mmdet - INFO - Epoch [8][550/7330] lr: 1.000e-04, eta: 11:08:44, time: 1.152, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0400, loss_cls: 0.1692, acc: 93.6680, loss_bbox: 0.2220, loss_mask: 0.2272, loss: 0.6715 2024-05-31 04:32:54,459 - mmdet - INFO - Epoch [8][600/7330] lr: 1.000e-04, eta: 11:07:50, time: 1.174, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0454, loss_cls: 0.1706, acc: 93.5532, loss_bbox: 0.2225, loss_mask: 0.2244, loss: 0.6762 2024-05-31 04:33:51,487 - mmdet - INFO - Epoch [8][650/7330] lr: 1.000e-04, eta: 11:06:56, time: 1.141, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0419, loss_cls: 0.1707, acc: 93.5603, loss_bbox: 0.2229, loss_mask: 0.2220, loss: 0.6715 2024-05-31 04:34:47,147 - mmdet - INFO - Epoch [8][700/7330] lr: 1.000e-04, eta: 11:06:00, time: 1.113, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0427, loss_cls: 0.1652, acc: 93.8108, loss_bbox: 0.2194, loss_mask: 0.2267, loss: 0.6693 2024-05-31 04:35:41,945 - mmdet - INFO - Epoch [8][750/7330] lr: 1.000e-04, eta: 11:05:04, time: 1.096, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0430, loss_cls: 0.1663, acc: 93.7224, loss_bbox: 0.2173, loss_mask: 0.2270, loss: 0.6692 2024-05-31 04:36:38,930 - mmdet - INFO - Epoch [8][800/7330] lr: 1.000e-04, eta: 11:04:09, time: 1.140, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0406, loss_cls: 0.1625, acc: 93.8848, loss_bbox: 0.2132, loss_mask: 0.2219, loss: 0.6523 2024-05-31 04:37:34,590 - mmdet - INFO - Epoch [8][850/7330] lr: 1.000e-04, eta: 11:03:14, time: 1.113, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0437, loss_cls: 0.1722, acc: 93.4294, loss_bbox: 0.2268, loss_mask: 0.2252, loss: 0.6830 2024-05-31 04:38:30,325 - mmdet - INFO - Epoch [8][900/7330] lr: 1.000e-04, eta: 11:02:18, time: 1.115, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0417, loss_cls: 0.1641, acc: 93.7622, loss_bbox: 0.2167, loss_mask: 0.2207, loss: 0.6583 2024-05-31 04:39:25,528 - mmdet - INFO - Epoch [8][950/7330] lr: 1.000e-04, eta: 11:01:22, time: 1.104, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0458, loss_cls: 0.1728, acc: 93.5254, loss_bbox: 0.2252, loss_mask: 0.2277, loss: 0.6868 2024-05-31 04:40:23,568 - mmdet - INFO - Epoch [8][1000/7330] lr: 1.000e-04, eta: 11:00:29, time: 1.161, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0172, loss_rpn_bbox: 0.0437, loss_cls: 0.1746, acc: 93.4473, loss_bbox: 0.2250, loss_mask: 0.2299, loss: 0.6904 2024-05-31 04:41:21,374 - mmdet - INFO - Epoch [8][1050/7330] lr: 1.000e-04, eta: 10:59:35, time: 1.156, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0417, loss_cls: 0.1682, acc: 93.7532, loss_bbox: 0.2214, loss_mask: 0.2240, loss: 0.6702 2024-05-31 04:42:21,185 - mmdet - INFO - Epoch [8][1100/7330] lr: 
1.000e-04, eta: 10:58:42, time: 1.196, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0445, loss_cls: 0.1686, acc: 93.7410, loss_bbox: 0.2205, loss_mask: 0.2271, loss: 0.6754 2024-05-31 04:43:19,555 - mmdet - INFO - Epoch [8][1150/7330] lr: 1.000e-04, eta: 10:57:48, time: 1.167, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0368, loss_cls: 0.1524, acc: 94.2456, loss_bbox: 0.1987, loss_mask: 0.2155, loss: 0.6153 2024-05-31 04:44:16,744 - mmdet - INFO - Epoch [8][1200/7330] lr: 1.000e-04, eta: 10:56:54, time: 1.144, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0409, loss_cls: 0.1685, acc: 93.6697, loss_bbox: 0.2157, loss_mask: 0.2263, loss: 0.6657 2024-05-31 04:45:11,863 - mmdet - INFO - Epoch [8][1250/7330] lr: 1.000e-04, eta: 10:55:58, time: 1.102, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0412, loss_cls: 0.1694, acc: 93.6306, loss_bbox: 0.2184, loss_mask: 0.2212, loss: 0.6643 2024-05-31 04:46:06,311 - mmdet - INFO - Epoch [8][1300/7330] lr: 1.000e-04, eta: 10:55:01, time: 1.089, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0428, loss_cls: 0.1663, acc: 93.7542, loss_bbox: 0.2237, loss_mask: 0.2233, loss: 0.6704 2024-05-31 04:47:04,265 - mmdet - INFO - Epoch [8][1350/7330] lr: 1.000e-04, eta: 10:54:07, time: 1.159, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0463, loss_cls: 0.1745, acc: 93.3831, loss_bbox: 0.2322, loss_mask: 0.2346, loss: 0.7036 2024-05-31 04:47:59,652 - mmdet - INFO - Epoch [8][1400/7330] lr: 1.000e-04, eta: 10:53:12, time: 1.108, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0422, loss_cls: 0.1671, acc: 93.7434, loss_bbox: 0.2166, loss_mask: 0.2191, loss: 0.6602 2024-05-31 04:48:54,540 - mmdet - INFO - Epoch [8][1450/7330] lr: 1.000e-04, eta: 10:52:16, time: 1.098, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0415, loss_cls: 0.1676, acc: 93.8142, loss_bbox: 0.2122, loss_mask: 0.2240, loss: 0.6595 2024-05-31 04:49:49,498 - mmdet - INFO - Epoch [8][1500/7330] lr: 1.000e-04, eta: 10:51:20, time: 1.099, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0166, loss_rpn_bbox: 0.0434, loss_cls: 0.1700, acc: 93.7002, loss_bbox: 0.2242, loss_mask: 0.2287, loss: 0.6829 2024-05-31 04:50:50,411 - mmdet - INFO - Epoch [8][1550/7330] lr: 1.000e-04, eta: 10:50:27, time: 1.218, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0403, loss_cls: 0.1693, acc: 93.7505, loss_bbox: 0.2200, loss_mask: 0.2228, loss: 0.6665 2024-05-31 04:51:45,503 - mmdet - INFO - Epoch [8][1600/7330] lr: 1.000e-04, eta: 10:49:32, time: 1.102, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0433, loss_cls: 0.1712, acc: 93.5220, loss_bbox: 0.2236, loss_mask: 0.2229, loss: 0.6762 2024-05-31 04:52:44,840 - mmdet - INFO - Epoch [8][1650/7330] lr: 1.000e-04, eta: 10:48:38, time: 1.187, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0409, loss_cls: 0.1694, acc: 93.6416, loss_bbox: 0.2208, loss_mask: 0.2202, loss: 0.6656 2024-05-31 04:53:42,648 - mmdet - INFO - Epoch [8][1700/7330] lr: 1.000e-04, eta: 10:47:44, time: 1.156, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0446, loss_cls: 0.1721, acc: 93.5339, loss_bbox: 0.2234, loss_mask: 0.2266, loss: 0.6824 2024-05-31 04:54:41,853 - mmdet - INFO - Epoch [8][1750/7330] lr: 1.000e-04, eta: 10:46:51, time: 1.184, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0144, 
loss_rpn_bbox: 0.0430, loss_cls: 0.1708, acc: 93.6914, loss_bbox: 0.2201, loss_mask: 0.2248, loss: 0.6731 2024-05-31 04:55:37,425 - mmdet - INFO - Epoch [8][1800/7330] lr: 1.000e-04, eta: 10:45:55, time: 1.111, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0437, loss_cls: 0.1638, acc: 93.8872, loss_bbox: 0.2137, loss_mask: 0.2256, loss: 0.6623 2024-05-31 04:56:32,534 - mmdet - INFO - Epoch [8][1850/7330] lr: 1.000e-04, eta: 10:44:59, time: 1.102, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0422, loss_cls: 0.1680, acc: 93.7117, loss_bbox: 0.2165, loss_mask: 0.2207, loss: 0.6615 2024-05-31 04:57:29,093 - mmdet - INFO - Epoch [8][1900/7330] lr: 1.000e-04, eta: 10:44:04, time: 1.131, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0421, loss_cls: 0.1686, acc: 93.7490, loss_bbox: 0.2229, loss_mask: 0.2244, loss: 0.6738 2024-05-31 04:58:25,013 - mmdet - INFO - Epoch [8][1950/7330] lr: 1.000e-04, eta: 10:43:09, time: 1.118, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0437, loss_cls: 0.1758, acc: 93.3989, loss_bbox: 0.2260, loss_mask: 0.2284, loss: 0.6892 2024-05-31 04:59:19,588 - mmdet - INFO - Epoch [8][2000/7330] lr: 1.000e-04, eta: 10:42:13, time: 1.092, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0402, loss_cls: 0.1615, acc: 93.9502, loss_bbox: 0.2116, loss_mask: 0.2182, loss: 0.6453 2024-05-31 05:00:16,778 - mmdet - INFO - Epoch [8][2050/7330] lr: 1.000e-04, eta: 10:41:18, time: 1.144, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0419, loss_cls: 0.1700, acc: 93.6333, loss_bbox: 0.2192, loss_mask: 0.2201, loss: 0.6649 2024-05-31 05:01:13,323 - mmdet - INFO - Epoch [8][2100/7330] lr: 1.000e-04, eta: 10:40:23, time: 1.131, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0403, loss_cls: 0.1619, acc: 94.0464, loss_bbox: 0.2102, loss_mask: 0.2205, loss: 0.6469 2024-05-31 05:02:08,169 - mmdet - INFO - Epoch [8][2150/7330] lr: 1.000e-04, eta: 10:39:27, time: 1.097, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0419, loss_cls: 0.1638, acc: 93.8745, loss_bbox: 0.2156, loss_mask: 0.2200, loss: 0.6556 2024-05-31 05:03:05,915 - mmdet - INFO - Epoch [8][2200/7330] lr: 1.000e-04, eta: 10:38:33, time: 1.155, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0461, loss_cls: 0.1772, acc: 93.4500, loss_bbox: 0.2244, loss_mask: 0.2207, loss: 0.6838 2024-05-31 05:04:04,674 - mmdet - INFO - Epoch [8][2250/7330] lr: 1.000e-04, eta: 10:37:39, time: 1.175, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0407, loss_cls: 0.1619, acc: 93.8826, loss_bbox: 0.2115, loss_mask: 0.2228, loss: 0.6505 2024-05-31 05:05:01,365 - mmdet - INFO - Epoch [8][2300/7330] lr: 1.000e-04, eta: 10:36:44, time: 1.134, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0432, loss_cls: 0.1690, acc: 93.5562, loss_bbox: 0.2181, loss_mask: 0.2163, loss: 0.6601 2024-05-31 05:05:57,767 - mmdet - INFO - Epoch [8][2350/7330] lr: 1.000e-04, eta: 10:35:49, time: 1.128, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0413, loss_cls: 0.1674, acc: 93.6768, loss_bbox: 0.2110, loss_mask: 0.2204, loss: 0.6542 2024-05-31 05:06:52,975 - mmdet - INFO - Epoch [8][2400/7330] lr: 1.000e-04, eta: 10:34:53, time: 1.104, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0424, loss_cls: 0.1689, acc: 93.6602, loss_bbox: 0.2204, loss_mask: 0.2307, 
loss: 0.6769 2024-05-31 05:07:51,432 - mmdet - INFO - Epoch [8][2450/7330] lr: 1.000e-04, eta: 10:34:00, time: 1.169, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0169, loss_rpn_bbox: 0.0456, loss_cls: 0.1778, acc: 93.3638, loss_bbox: 0.2305, loss_mask: 0.2294, loss: 0.7002 2024-05-31 05:08:46,742 - mmdet - INFO - Epoch [8][2500/7330] lr: 1.000e-04, eta: 10:33:04, time: 1.106, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0437, loss_cls: 0.1721, acc: 93.5264, loss_bbox: 0.2228, loss_mask: 0.2288, loss: 0.6821 2024-05-31 05:09:39,828 - mmdet - INFO - Epoch [8][2550/7330] lr: 1.000e-04, eta: 10:32:07, time: 1.061, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0393, loss_cls: 0.1636, acc: 93.9749, loss_bbox: 0.2109, loss_mask: 0.2273, loss: 0.6544 2024-05-31 05:10:39,153 - mmdet - INFO - Epoch [8][2600/7330] lr: 1.000e-04, eta: 10:31:13, time: 1.187, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0421, loss_cls: 0.1726, acc: 93.5688, loss_bbox: 0.2232, loss_mask: 0.2293, loss: 0.6825 2024-05-31 05:11:34,293 - mmdet - INFO - Epoch [8][2650/7330] lr: 1.000e-04, eta: 10:30:17, time: 1.103, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0415, loss_cls: 0.1696, acc: 93.5869, loss_bbox: 0.2187, loss_mask: 0.2210, loss: 0.6653 2024-05-31 05:12:29,381 - mmdet - INFO - Epoch [8][2700/7330] lr: 1.000e-04, eta: 10:29:21, time: 1.102, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0413, loss_cls: 0.1704, acc: 93.6055, loss_bbox: 0.2188, loss_mask: 0.2258, loss: 0.6703 2024-05-31 05:13:26,633 - mmdet - INFO - Epoch [8][2750/7330] lr: 1.000e-04, eta: 10:28:27, time: 1.145, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0448, loss_cls: 0.1741, acc: 93.4521, loss_bbox: 0.2305, loss_mask: 0.2317, loss: 0.6969 2024-05-31 05:14:22,645 - mmdet - INFO - Epoch [8][2800/7330] lr: 1.000e-04, eta: 10:27:32, time: 1.120, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0414, loss_cls: 0.1673, acc: 93.8171, loss_bbox: 0.2138, loss_mask: 0.2201, loss: 0.6561 2024-05-31 05:15:22,872 - mmdet - INFO - Epoch [8][2850/7330] lr: 1.000e-04, eta: 10:26:39, time: 1.205, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0417, loss_cls: 0.1715, acc: 93.6414, loss_bbox: 0.2190, loss_mask: 0.2286, loss: 0.6738 2024-05-31 05:16:17,464 - mmdet - INFO - Epoch [8][2900/7330] lr: 1.000e-04, eta: 10:25:43, time: 1.092, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0394, loss_cls: 0.1717, acc: 93.6548, loss_bbox: 0.2140, loss_mask: 0.2211, loss: 0.6614 2024-05-31 05:17:16,847 - mmdet - INFO - Epoch [8][2950/7330] lr: 1.000e-04, eta: 10:24:49, time: 1.188, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0397, loss_cls: 0.1630, acc: 93.8506, loss_bbox: 0.2193, loss_mask: 0.2284, loss: 0.6649 2024-05-31 05:18:11,696 - mmdet - INFO - Epoch [8][3000/7330] lr: 1.000e-04, eta: 10:23:53, time: 1.097, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0409, loss_cls: 0.1665, acc: 93.7769, loss_bbox: 0.2159, loss_mask: 0.2228, loss: 0.6607 2024-05-31 05:19:06,716 - mmdet - INFO - Epoch [8][3050/7330] lr: 1.000e-04, eta: 10:22:57, time: 1.100, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0427, loss_cls: 0.1725, acc: 93.5825, loss_bbox: 0.2226, loss_mask: 0.2313, loss: 0.6838 2024-05-31 05:20:03,268 - mmdet - INFO - Epoch [8][3100/7330] lr: 1.000e-04, eta: 
10:22:02, time: 1.131, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0431, loss_cls: 0.1721, acc: 93.5503, loss_bbox: 0.2259, loss_mask: 0.2253, loss: 0.6813 2024-05-31 05:21:01,105 - mmdet - INFO - Epoch [8][3150/7330] lr: 1.000e-04, eta: 10:21:08, time: 1.157, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0442, loss_cls: 0.1751, acc: 93.4087, loss_bbox: 0.2299, loss_mask: 0.2261, loss: 0.6919 2024-05-31 05:21:56,069 - mmdet - INFO - Epoch [8][3200/7330] lr: 1.000e-04, eta: 10:20:12, time: 1.099, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0400, loss_cls: 0.1662, acc: 93.8831, loss_bbox: 0.2144, loss_mask: 0.2201, loss: 0.6556 2024-05-31 05:22:50,635 - mmdet - INFO - Epoch [8][3250/7330] lr: 1.000e-04, eta: 10:19:16, time: 1.091, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0421, loss_cls: 0.1654, acc: 93.9312, loss_bbox: 0.2107, loss_mask: 0.2211, loss: 0.6542 2024-05-31 05:23:47,683 - mmdet - INFO - Epoch [8][3300/7330] lr: 1.000e-04, eta: 10:18:21, time: 1.141, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0441, loss_cls: 0.1715, acc: 93.6191, loss_bbox: 0.2195, loss_mask: 0.2231, loss: 0.6738 2024-05-31 05:24:44,552 - mmdet - INFO - Epoch [8][3350/7330] lr: 1.000e-04, eta: 10:17:26, time: 1.137, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0411, loss_cls: 0.1710, acc: 93.6462, loss_bbox: 0.2197, loss_mask: 0.2250, loss: 0.6713 2024-05-31 05:25:44,784 - mmdet - INFO - Epoch [8][3400/7330] lr: 1.000e-04, eta: 10:16:33, time: 1.205, data_time: 0.097, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0407, loss_cls: 0.1699, acc: 93.7151, loss_bbox: 0.2179, loss_mask: 0.2224, loss: 0.6671 2024-05-31 05:26:39,731 - mmdet - INFO - Epoch [8][3450/7330] lr: 1.000e-04, eta: 10:15:37, time: 1.099, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0427, loss_cls: 0.1666, acc: 93.7444, loss_bbox: 0.2189, loss_mask: 0.2233, loss: 0.6675 2024-05-31 05:27:36,433 - mmdet - INFO - Epoch [8][3500/7330] lr: 1.000e-04, eta: 10:14:42, time: 1.134, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0415, loss_cls: 0.1703, acc: 93.6445, loss_bbox: 0.2189, loss_mask: 0.2220, loss: 0.6674 2024-05-31 05:28:33,616 - mmdet - INFO - Epoch [8][3550/7330] lr: 1.000e-04, eta: 10:13:47, time: 1.144, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0424, loss_cls: 0.1696, acc: 93.7310, loss_bbox: 0.2177, loss_mask: 0.2228, loss: 0.6678 2024-05-31 05:29:28,193 - mmdet - INFO - Epoch [8][3600/7330] lr: 1.000e-04, eta: 10:12:51, time: 1.092, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0416, loss_cls: 0.1701, acc: 93.5923, loss_bbox: 0.2225, loss_mask: 0.2207, loss: 0.6699 2024-05-31 05:30:24,298 - mmdet - INFO - Epoch [8][3650/7330] lr: 1.000e-04, eta: 10:11:56, time: 1.122, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0456, loss_cls: 0.1714, acc: 93.6575, loss_bbox: 0.2216, loss_mask: 0.2192, loss: 0.6746 2024-05-31 05:31:23,986 - mmdet - INFO - Epoch [8][3700/7330] lr: 1.000e-04, eta: 10:11:03, time: 1.194, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0408, loss_cls: 0.1669, acc: 93.7595, loss_bbox: 0.2091, loss_mask: 0.2215, loss: 0.6540 2024-05-31 05:32:19,195 - mmdet - INFO - Epoch [8][3750/7330] lr: 1.000e-04, eta: 10:10:07, time: 1.104, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 
0.0417, loss_cls: 0.1744, acc: 93.5623, loss_bbox: 0.2208, loss_mask: 0.2213, loss: 0.6739 2024-05-31 05:33:13,636 - mmdet - INFO - Epoch [8][3800/7330] lr: 1.000e-04, eta: 10:09:10, time: 1.089, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0421, loss_cls: 0.1666, acc: 93.7292, loss_bbox: 0.2174, loss_mask: 0.2211, loss: 0.6616 2024-05-31 05:34:10,533 - mmdet - INFO - Epoch [8][3850/7330] lr: 1.000e-04, eta: 10:08:15, time: 1.138, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0455, loss_cls: 0.1708, acc: 93.5920, loss_bbox: 0.2183, loss_mask: 0.2303, loss: 0.6808 2024-05-31 05:35:07,208 - mmdet - INFO - Epoch [8][3900/7330] lr: 1.000e-04, eta: 10:07:20, time: 1.133, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0420, loss_cls: 0.1655, acc: 93.8735, loss_bbox: 0.2118, loss_mask: 0.2263, loss: 0.6599 2024-05-31 05:36:04,728 - mmdet - INFO - Epoch [8][3950/7330] lr: 1.000e-04, eta: 10:06:26, time: 1.150, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0458, loss_cls: 0.1731, acc: 93.5132, loss_bbox: 0.2249, loss_mask: 0.2299, loss: 0.6901 2024-05-31 05:37:02,578 - mmdet - INFO - Epoch [8][4000/7330] lr: 1.000e-04, eta: 10:05:32, time: 1.157, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0425, loss_cls: 0.1680, acc: 93.6382, loss_bbox: 0.2182, loss_mask: 0.2214, loss: 0.6638 2024-05-31 05:37:59,651 - mmdet - INFO - Epoch [8][4050/7330] lr: 1.000e-04, eta: 10:04:37, time: 1.141, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0437, loss_cls: 0.1715, acc: 93.5903, loss_bbox: 0.2206, loss_mask: 0.2239, loss: 0.6739 2024-05-31 05:38:56,782 - mmdet - INFO - Epoch [8][4100/7330] lr: 1.000e-04, eta: 10:03:42, time: 1.143, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0432, loss_cls: 0.1744, acc: 93.5549, loss_bbox: 0.2229, loss_mask: 0.2186, loss: 0.6746 2024-05-31 05:39:51,009 - mmdet - INFO - Epoch [8][4150/7330] lr: 1.000e-04, eta: 10:02:46, time: 1.084, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0397, loss_cls: 0.1616, acc: 93.9985, loss_bbox: 0.2075, loss_mask: 0.2185, loss: 0.6409 2024-05-31 05:40:45,990 - mmdet - INFO - Epoch [8][4200/7330] lr: 1.000e-04, eta: 10:01:50, time: 1.100, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0411, loss_cls: 0.1748, acc: 93.3799, loss_bbox: 0.2249, loss_mask: 0.2324, loss: 0.6886 2024-05-31 05:41:43,684 - mmdet - INFO - Epoch [8][4250/7330] lr: 1.000e-04, eta: 10:00:55, time: 1.154, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0426, loss_cls: 0.1652, acc: 93.7837, loss_bbox: 0.2145, loss_mask: 0.2242, loss: 0.6612 2024-05-31 05:42:37,974 - mmdet - INFO - Epoch [8][4300/7330] lr: 1.000e-04, eta: 9:59:59, time: 1.086, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0397, loss_cls: 0.1630, acc: 93.9814, loss_bbox: 0.2062, loss_mask: 0.2173, loss: 0.6401 2024-05-31 05:43:37,290 - mmdet - INFO - Epoch [8][4350/7330] lr: 1.000e-04, eta: 9:59:04, time: 1.144, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0453, loss_cls: 0.1765, acc: 93.3188, loss_bbox: 0.2237, loss_mask: 0.2255, loss: 0.6866 2024-05-31 05:44:34,952 - mmdet - INFO - Epoch [8][4400/7330] lr: 1.000e-04, eta: 9:58:11, time: 1.196, data_time: 0.118, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0439, loss_cls: 0.1682, acc: 93.6311, loss_bbox: 0.2197, loss_mask: 0.2235, loss: 0.6708 
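Note: every iteration record in this log follows the same key/value layout (Epoch [epoch][iter/total], lr, eta, time, data_time, memory, the individual loss terms, then the total loss). A minimal stdlib sketch for summarizing the total loss per epoch from a saved copy of this log is given below; the file name train.log is only an illustrative assumption, not a path produced by this run.

# Minimal sketch: summarize total loss per epoch from a saved mmdet training log.
# Assumption: the raw log text is available locally as "train.log" (hypothetical name).
import re
from collections import defaultdict

# One match per iteration record: epoch id, iteration id, final "loss:" value.
RECORD = re.compile(r"Epoch \[(\d+)\]\[(\d+)/\d+\].*?loss: (\d+\.\d+)", re.S)

with open("train.log") as f:
    text = f.read()

losses = defaultdict(list)  # epoch -> list of total-loss values
for epoch, _, loss in RECORD.findall(text):
    losses[int(epoch)].append(float(loss))

for epoch in sorted(losses):
    vals = losses[epoch]
    print(f"epoch {epoch}: {len(vals)} records, mean total loss {sum(vals) / len(vals):.4f}")

With the 50-iteration logging interval used here, a full 7330-iteration epoch contributes 146 such records to the average.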
2024-05-31 05:45:30,419 - mmdet - INFO - Epoch [8][4450/7330] lr: 1.000e-04, eta: 9:57:15, time: 1.109, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0457, loss_cls: 0.1797, acc: 93.3135, loss_bbox: 0.2311, loss_mask: 0.2262, loss: 0.6993 2024-05-31 05:46:27,784 - mmdet - INFO - Epoch [8][4500/7330] lr: 1.000e-04, eta: 9:56:20, time: 1.147, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0406, loss_cls: 0.1652, acc: 93.7893, loss_bbox: 0.2170, loss_mask: 0.2243, loss: 0.6622 2024-05-31 05:47:23,235 - mmdet - INFO - Epoch [8][4550/7330] lr: 1.000e-04, eta: 9:55:25, time: 1.109, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0157, loss_rpn_bbox: 0.0414, loss_cls: 0.1661, acc: 93.8650, loss_bbox: 0.2083, loss_mask: 0.2193, loss: 0.6508 2024-05-31 05:48:22,703 - mmdet - INFO - Epoch [8][4600/7330] lr: 1.000e-04, eta: 9:54:31, time: 1.189, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0409, loss_cls: 0.1734, acc: 93.4609, loss_bbox: 0.2247, loss_mask: 0.2219, loss: 0.6755 2024-05-31 05:49:18,250 - mmdet - INFO - Epoch [8][4650/7330] lr: 1.000e-04, eta: 9:53:35, time: 1.111, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0445, loss_cls: 0.1714, acc: 93.5278, loss_bbox: 0.2228, loss_mask: 0.2256, loss: 0.6807 2024-05-31 05:50:15,147 - mmdet - INFO - Epoch [8][4700/7330] lr: 1.000e-04, eta: 9:52:40, time: 1.138, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0426, loss_cls: 0.1715, acc: 93.5974, loss_bbox: 0.2259, loss_mask: 0.2235, loss: 0.6782 2024-05-31 05:51:10,275 - mmdet - INFO - Epoch [8][4750/7330] lr: 1.000e-04, eta: 9:51:44, time: 1.102, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0444, loss_cls: 0.1742, acc: 93.4404, loss_bbox: 0.2280, loss_mask: 0.2295, loss: 0.6910 2024-05-31 05:52:07,510 - mmdet - INFO - Epoch [8][4800/7330] lr: 1.000e-04, eta: 9:50:50, time: 1.145, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0423, loss_cls: 0.1702, acc: 93.6714, loss_bbox: 0.2207, loss_mask: 0.2223, loss: 0.6700 2024-05-31 05:53:03,216 - mmdet - INFO - Epoch [8][4850/7330] lr: 1.000e-04, eta: 9:49:54, time: 1.114, data_time: 0.104, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0412, loss_cls: 0.1669, acc: 93.7581, loss_bbox: 0.2174, loss_mask: 0.2222, loss: 0.6627 2024-05-31 05:54:00,698 - mmdet - INFO - Epoch [8][4900/7330] lr: 1.000e-04, eta: 9:48:58, time: 1.091, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0165, loss_rpn_bbox: 0.0443, loss_cls: 0.1807, acc: 93.3030, loss_bbox: 0.2340, loss_mask: 0.2333, loss: 0.7088 2024-05-31 05:55:00,770 - mmdet - INFO - Epoch [8][4950/7330] lr: 1.000e-04, eta: 9:48:06, time: 1.260, data_time: 0.149, memory: 24290, loss_rpn_cls: 0.0139, loss_rpn_bbox: 0.0414, loss_cls: 0.1684, acc: 93.7732, loss_bbox: 0.2205, loss_mask: 0.2218, loss: 0.6660 2024-05-31 05:55:55,135 - mmdet - INFO - Epoch [8][5000/7330] lr: 1.000e-04, eta: 9:47:10, time: 1.087, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0417, loss_cls: 0.1684, acc: 93.7937, loss_bbox: 0.2135, loss_mask: 0.2160, loss: 0.6548 2024-05-31 05:56:51,740 - mmdet - INFO - Epoch [8][5050/7330] lr: 1.000e-04, eta: 9:46:15, time: 1.132, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0426, loss_cls: 0.1652, acc: 93.8213, loss_bbox: 0.2167, loss_mask: 0.2224, loss: 0.6622 2024-05-31 05:57:46,490 - mmdet - INFO - Epoch [8][5100/7330] lr: 1.000e-04, eta: 9:45:19, time: 1.095, 
data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0430, loss_cls: 0.1733, acc: 93.5784, loss_bbox: 0.2247, loss_mask: 0.2296, loss: 0.6856 2024-05-31 05:58:44,739 - mmdet - INFO - Epoch [8][5150/7330] lr: 1.000e-04, eta: 9:44:24, time: 1.165, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0440, loss_cls: 0.1740, acc: 93.5828, loss_bbox: 0.2204, loss_mask: 0.2240, loss: 0.6792 2024-05-31 05:59:39,092 - mmdet - INFO - Epoch [8][5200/7330] lr: 1.000e-04, eta: 9:43:28, time: 1.087, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0406, loss_cls: 0.1579, acc: 94.1082, loss_bbox: 0.2108, loss_mask: 0.2199, loss: 0.6451 2024-05-31 06:00:36,514 - mmdet - INFO - Epoch [8][5250/7330] lr: 1.000e-04, eta: 9:42:33, time: 1.148, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0144, loss_rpn_bbox: 0.0427, loss_cls: 0.1635, acc: 93.8477, loss_bbox: 0.2150, loss_mask: 0.2263, loss: 0.6619 2024-05-31 06:01:34,214 - mmdet - INFO - Epoch [8][5300/7330] lr: 1.000e-04, eta: 9:41:39, time: 1.154, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0168, loss_rpn_bbox: 0.0448, loss_cls: 0.1770, acc: 93.3250, loss_bbox: 0.2240, loss_mask: 0.2246, loss: 0.6873 2024-05-31 06:02:30,968 - mmdet - INFO - Epoch [8][5350/7330] lr: 1.000e-04, eta: 9:40:44, time: 1.135, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0416, loss_cls: 0.1719, acc: 93.5376, loss_bbox: 0.2215, loss_mask: 0.2242, loss: 0.6755 2024-05-31 06:03:25,222 - mmdet - INFO - Epoch [8][5400/7330] lr: 1.000e-04, eta: 9:39:47, time: 1.085, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0402, loss_cls: 0.1678, acc: 93.7449, loss_bbox: 0.2172, loss_mask: 0.2271, loss: 0.6665 2024-05-31 06:04:23,906 - mmdet - INFO - Epoch [8][5450/7330] lr: 1.000e-04, eta: 9:38:53, time: 1.174, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0161, loss_rpn_bbox: 0.0443, loss_cls: 0.1737, acc: 93.4666, loss_bbox: 0.2251, loss_mask: 0.2227, loss: 0.6819 2024-05-31 06:05:20,934 - mmdet - INFO - Epoch [8][5500/7330] lr: 1.000e-04, eta: 9:37:58, time: 1.141, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0423, loss_cls: 0.1676, acc: 93.7312, loss_bbox: 0.2230, loss_mask: 0.2274, loss: 0.6767 2024-05-31 06:06:20,072 - mmdet - INFO - Epoch [8][5550/7330] lr: 1.000e-04, eta: 9:37:05, time: 1.183, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0153, loss_rpn_bbox: 0.0408, loss_cls: 0.1672, acc: 93.7957, loss_bbox: 0.2118, loss_mask: 0.2288, loss: 0.6638 2024-05-31 06:07:14,985 - mmdet - INFO - Epoch [8][5600/7330] lr: 1.000e-04, eta: 9:36:09, time: 1.098, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0154, loss_rpn_bbox: 0.0397, loss_cls: 0.1633, acc: 93.9910, loss_bbox: 0.2111, loss_mask: 0.2173, loss: 0.6468 2024-05-31 06:08:12,899 - mmdet - INFO - Epoch [8][5650/7330] lr: 1.000e-04, eta: 9:35:14, time: 1.158, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0438, loss_cls: 0.1777, acc: 93.2629, loss_bbox: 0.2268, loss_mask: 0.2273, loss: 0.6911 2024-05-31 06:09:07,192 - mmdet - INFO - Epoch [8][5700/7330] lr: 1.000e-04, eta: 9:34:18, time: 1.086, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0414, loss_cls: 0.1681, acc: 93.7419, loss_bbox: 0.2182, loss_mask: 0.2206, loss: 0.6626 2024-05-31 06:10:01,651 - mmdet - INFO - Epoch [8][5750/7330] lr: 1.000e-04, eta: 9:33:21, time: 1.089, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0433, loss_cls: 0.1697, acc: 93.5422, 
loss_bbox: 0.2192, loss_mask: 0.2274, loss: 0.6738 2024-05-31 06:10:55,828 - mmdet - INFO - Epoch [8][5800/7330] lr: 1.000e-04, eta: 9:32:25, time: 1.084, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0400, loss_cls: 0.1588, acc: 93.9929, loss_bbox: 0.2054, loss_mask: 0.2117, loss: 0.6300 2024-05-31 06:11:54,536 - mmdet - INFO - Epoch [8][5850/7330] lr: 1.000e-04, eta: 9:31:31, time: 1.174, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0415, loss_cls: 0.1729, acc: 93.4734, loss_bbox: 0.2221, loss_mask: 0.2218, loss: 0.6734 2024-05-31 06:12:52,107 - mmdet - INFO - Epoch [8][5900/7330] lr: 1.000e-04, eta: 9:30:36, time: 1.151, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0424, loss_cls: 0.1733, acc: 93.5852, loss_bbox: 0.2194, loss_mask: 0.2227, loss: 0.6733 2024-05-31 06:13:46,044 - mmdet - INFO - Epoch [8][5950/7330] lr: 1.000e-04, eta: 9:29:40, time: 1.079, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0146, loss_rpn_bbox: 0.0418, loss_cls: 0.1633, acc: 93.8247, loss_bbox: 0.2164, loss_mask: 0.2264, loss: 0.6623 2024-05-31 06:14:44,819 - mmdet - INFO - Epoch [8][6000/7330] lr: 1.000e-04, eta: 9:28:46, time: 1.176, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0452, loss_cls: 0.1699, acc: 93.6599, loss_bbox: 0.2189, loss_mask: 0.2275, loss: 0.6771 2024-05-31 06:15:41,522 - mmdet - INFO - Epoch [8][6050/7330] lr: 1.000e-04, eta: 9:27:51, time: 1.134, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0171, loss_rpn_bbox: 0.0449, loss_cls: 0.1800, acc: 93.2568, loss_bbox: 0.2310, loss_mask: 0.2217, loss: 0.6947 2024-05-31 06:16:38,505 - mmdet - INFO - Epoch [8][6100/7330] lr: 1.000e-04, eta: 9:26:56, time: 1.140, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0438, loss_cls: 0.1672, acc: 93.7661, loss_bbox: 0.2181, loss_mask: 0.2260, loss: 0.6706 2024-05-31 06:17:37,489 - mmdet - INFO - Epoch [8][6150/7330] lr: 1.000e-04, eta: 9:26:02, time: 1.180, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0160, loss_rpn_bbox: 0.0440, loss_cls: 0.1706, acc: 93.5952, loss_bbox: 0.2259, loss_mask: 0.2253, loss: 0.6820 2024-05-31 06:18:31,945 - mmdet - INFO - Epoch [8][6200/7330] lr: 1.000e-04, eta: 9:25:05, time: 1.089, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0151, loss_rpn_bbox: 0.0391, loss_cls: 0.1654, acc: 93.8965, loss_bbox: 0.2092, loss_mask: 0.2219, loss: 0.6507 2024-05-31 06:19:27,051 - mmdet - INFO - Epoch [8][6250/7330] lr: 1.000e-04, eta: 9:24:09, time: 1.102, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0167, loss_rpn_bbox: 0.0411, loss_cls: 0.1689, acc: 93.6123, loss_bbox: 0.2181, loss_mask: 0.2219, loss: 0.6667 2024-05-31 06:20:22,183 - mmdet - INFO - Epoch [8][6300/7330] lr: 1.000e-04, eta: 9:23:13, time: 1.103, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0152, loss_rpn_bbox: 0.0430, loss_cls: 0.1762, acc: 93.4260, loss_bbox: 0.2212, loss_mask: 0.2251, loss: 0.6807 2024-05-31 06:21:17,247 - mmdet - INFO - Epoch [8][6350/7330] lr: 1.000e-04, eta: 9:22:17, time: 1.101, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0145, loss_rpn_bbox: 0.0408, loss_cls: 0.1659, acc: 93.8967, loss_bbox: 0.2164, loss_mask: 0.2190, loss: 0.6566 2024-05-31 06:22:12,121 - mmdet - INFO - Epoch [8][6400/7330] lr: 1.000e-04, eta: 9:21:21, time: 1.097, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0404, loss_cls: 0.1659, acc: 93.8308, loss_bbox: 0.2167, loss_mask: 0.2240, loss: 0.6619 2024-05-31 06:23:13,345 - mmdet - INFO - Epoch 
[8][6450/7330] lr: 1.000e-04, eta: 9:20:29, time: 1.225, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0143, loss_rpn_bbox: 0.0429, loss_cls: 0.1617, acc: 93.9849, loss_bbox: 0.2127, loss_mask: 0.2249, loss: 0.6564 2024-05-31 06:24:08,609 - mmdet - INFO - Epoch [8][6500/7330] lr: 1.000e-04, eta: 9:19:33, time: 1.105, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0164, loss_rpn_bbox: 0.0434, loss_cls: 0.1773, acc: 93.3936, loss_bbox: 0.2216, loss_mask: 0.2281, loss: 0.6868 2024-05-31 06:25:06,334 - mmdet - INFO - Epoch [8][6550/7330] lr: 1.000e-04, eta: 9:18:38, time: 1.154, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0147, loss_rpn_bbox: 0.0415, loss_cls: 0.1649, acc: 93.7705, loss_bbox: 0.2160, loss_mask: 0.2216, loss: 0.6587 2024-05-31 06:26:03,293 - mmdet - INFO - Epoch [8][6600/7330] lr: 1.000e-04, eta: 9:17:43, time: 1.139, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0170, loss_rpn_bbox: 0.0471, loss_cls: 0.1829, acc: 93.1147, loss_bbox: 0.2349, loss_mask: 0.2319, loss: 0.7139 2024-05-31 06:26:59,768 - mmdet - INFO - Epoch [8][6650/7330] lr: 1.000e-04, eta: 9:16:48, time: 1.129, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0163, loss_rpn_bbox: 0.0441, loss_cls: 0.1737, acc: 93.5789, loss_bbox: 0.2230, loss_mask: 0.2273, loss: 0.6844 2024-05-31 06:27:56,280 - mmdet - INFO - Epoch [8][6700/7330] lr: 1.000e-04, eta: 9:15:53, time: 1.130, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0150, loss_rpn_bbox: 0.0406, loss_cls: 0.1715, acc: 93.6267, loss_bbox: 0.2163, loss_mask: 0.2182, loss: 0.6617 2024-05-31 06:28:50,502 - mmdet - INFO - Epoch [8][6750/7330] lr: 1.000e-04, eta: 9:14:56, time: 1.084, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0158, loss_rpn_bbox: 0.0408, loss_cls: 0.1713, acc: 93.4932, loss_bbox: 0.2226, loss_mask: 0.2250, loss: 0.6754 2024-05-31 06:29:47,361 - mmdet - INFO - Epoch [8][6800/7330] lr: 1.000e-04, eta: 9:14:01, time: 1.137, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0156, loss_rpn_bbox: 0.0439, loss_cls: 0.1726, acc: 93.5784, loss_bbox: 0.2218, loss_mask: 0.2208, loss: 0.6747 2024-05-31 06:30:41,789 - mmdet - INFO - Epoch [8][6850/7330] lr: 1.000e-04, eta: 9:13:05, time: 1.089, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0422, loss_cls: 0.1728, acc: 93.5515, loss_bbox: 0.2198, loss_mask: 0.2292, loss: 0.6789 2024-05-31 06:31:36,579 - mmdet - INFO - Epoch [8][6900/7330] lr: 1.000e-04, eta: 9:12:09, time: 1.096, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0159, loss_rpn_bbox: 0.0427, loss_cls: 0.1673, acc: 93.7637, loss_bbox: 0.2187, loss_mask: 0.2250, loss: 0.6695 2024-05-31 06:32:30,035 - mmdet - INFO - Epoch [8][6950/7330] lr: 1.000e-04, eta: 9:11:12, time: 1.069, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0378, loss_cls: 0.1590, acc: 94.0554, loss_bbox: 0.2077, loss_mask: 0.2133, loss: 0.6314 2024-05-31 06:33:27,007 - mmdet - INFO - Epoch [8][7000/7330] lr: 1.000e-04, eta: 9:10:17, time: 1.139, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0155, loss_rpn_bbox: 0.0413, loss_cls: 0.1717, acc: 93.6431, loss_bbox: 0.2240, loss_mask: 0.2201, loss: 0.6726 2024-05-31 06:34:25,421 - mmdet - INFO - Epoch [8][7050/7330] lr: 1.000e-04, eta: 9:09:22, time: 1.168, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0148, loss_rpn_bbox: 0.0408, loss_cls: 0.1645, acc: 93.8828, loss_bbox: 0.2115, loss_mask: 0.2121, loss: 0.6437 2024-05-31 06:35:21,767 - mmdet - INFO - Epoch [8][7100/7330] lr: 1.000e-04, eta: 9:08:27, time: 1.127, data_time: 0.059, memory: 24290, loss_rpn_cls: 
0.0138, loss_rpn_bbox: 0.0411, loss_cls: 0.1706, acc: 93.6282, loss_bbox: 0.2210, loss_mask: 0.2246, loss: 0.6711
2024-05-31 06:36:20,983 - mmdet - INFO - Epoch [8][7150/7330] lr: 1.000e-04, eta: 9:07:33, time: 1.185, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0430, loss_cls: 0.1788, acc: 93.2661, loss_bbox: 0.2258, loss_mask: 0.2230, loss: 0.6854
2024-05-31 06:37:17,598 - mmdet - INFO - Epoch [8][7200/7330] lr: 1.000e-04, eta: 9:06:38, time: 1.132, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0409, loss_cls: 0.1665, acc: 93.7039, loss_bbox: 0.2147, loss_mask: 0.2169, loss: 0.6531
2024-05-31 06:38:11,668 - mmdet - INFO - Epoch [8][7250/7330] lr: 1.000e-04, eta: 9:05:41, time: 1.081, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0405, loss_cls: 0.1742, acc: 93.5974, loss_bbox: 0.2172, loss_mask: 0.2196, loss: 0.6664
2024-05-31 06:39:06,125 - mmdet - INFO - Epoch [8][7300/7330] lr: 1.000e-04, eta: 9:04:45, time: 1.089, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0149, loss_rpn_bbox: 0.0438, loss_cls: 0.1735, acc: 93.4543, loss_bbox: 0.2222, loss_mask: 0.2236, loss: 0.6780
2024-05-31 06:39:39,153 - mmdet - INFO - Saving checkpoint at 8 epochs
2024-05-31 06:42:00,971 - mmdet - INFO - Evaluating bbox...
2024-05-31 06:42:22,030 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.475
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.711
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.520
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.315
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.513
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.634
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.595
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.595
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.595
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.420
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.637
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.751
2024-05-31 06:42:22,030 - mmdet - INFO - Evaluating segm...
2024-05-31 06:42:48,353 - mmdet - INFO - Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.429
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.675
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.458
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.232
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.462
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.640
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.542
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.542
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.542
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.350
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.587
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.721
2024-05-31 06:42:48,674 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-31 06:42:48,675 - mmdet - INFO - Epoch(val) [8][625] bbox_mAP: 0.4750, bbox_mAP_50: 0.7110, bbox_mAP_75: 0.5200, bbox_mAP_s: 0.3150, bbox_mAP_m: 0.5130, bbox_mAP_l: 0.6340, bbox_mAP_copypaste: 0.475 0.711 0.520 0.315 0.513 0.634, segm_mAP: 0.4290, segm_mAP_50: 0.6750, segm_mAP_75: 0.4580, segm_mAP_s: 0.2320, segm_mAP_m: 0.4620, segm_mAP_l: 0.6400, segm_mAP_copypaste: 0.429 0.675 0.458 0.232 0.462 0.640
2024-05-31 06:43:55,664 - mmdet - INFO - Epoch [9][50/7330] lr: 1.000e-05, eta: 9:03:05, time: 1.339, data_time: 0.143, memory: 24290, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0400, loss_cls: 0.1563, acc: 94.0723, loss_bbox: 0.2040, loss_mask: 0.2130, loss: 0.6272
2024-05-31 06:44:50,390 - mmdet - INFO - Epoch [9][100/7330] lr: 1.000e-05, eta: 9:02:09, time: 1.094, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0418, loss_cls: 0.1615, acc: 93.7993, loss_bbox: 0.2133, loss_mask: 0.2179, loss: 0.6475
2024-05-31 06:45:44,940 - mmdet - INFO - Epoch [9][150/7330] lr: 1.000e-05, eta: 9:01:13, time: 1.091, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0135, loss_rpn_bbox: 0.0403, loss_cls: 0.1560, acc: 94.1750, loss_bbox: 0.2055, loss_mask: 0.2140, loss: 0.6294
2024-05-31 06:46:38,721 - mmdet - INFO - Epoch [9][200/7330] lr: 1.000e-05, eta: 9:00:16, time: 1.076, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0404, loss_cls: 0.1515, acc: 94.2122, loss_bbox: 0.1986, loss_mask: 0.2130, loss: 0.6157
2024-05-31 06:47:36,009 - mmdet - INFO - Epoch [9][250/7330] lr: 1.000e-05, eta: 8:59:21, time: 1.146, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0430, loss_cls: 0.1700, acc: 93.5312, loss_bbox: 0.2248, loss_mask: 0.2268, loss: 0.6788
2024-05-31 06:48:30,378 - mmdet - INFO - Epoch [9][300/7330] lr: 1.000e-05, eta: 8:58:25, time: 1.087, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0406, loss_cls: 0.1546, acc: 94.1067, loss_bbox: 0.2075, loss_mask: 0.2145, loss: 0.6303
2024-05-31 06:49:24,913 - mmdet - INFO - Epoch [9][350/7330] lr: 1.000e-05, eta: 8:57:29, time: 1.091, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0392, loss_cls: 0.1505, acc: 94.2622, loss_bbox: 0.2039, loss_mask: 0.2141, loss: 0.6194
2024-05-31 06:50:19,293 - mmdet - INFO - Epoch [9][400/7330] lr: 1.000e-05, eta: 8:56:32, time: 1.088, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0390, loss_cls: 0.1558, acc: 94.0979, loss_bbox: 0.2050,
loss_mask: 0.2129, loss: 0.6248 2024-05-31 06:51:16,295 - mmdet - INFO - Epoch [9][450/7330] lr: 1.000e-05, eta: 8:55:37, time: 1.140, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0373, loss_cls: 0.1482, acc: 94.3245, loss_bbox: 0.2001, loss_mask: 0.2119, loss: 0.6096 2024-05-31 06:52:15,702 - mmdet - INFO - Epoch [9][500/7330] lr: 1.000e-05, eta: 8:54:44, time: 1.188, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0397, loss_cls: 0.1540, acc: 94.1575, loss_bbox: 0.2051, loss_mask: 0.2159, loss: 0.6275 2024-05-31 06:53:12,561 - mmdet - INFO - Epoch [9][550/7330] lr: 1.000e-05, eta: 8:53:49, time: 1.137, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0390, loss_cls: 0.1578, acc: 93.9502, loss_bbox: 0.2088, loss_mask: 0.2128, loss: 0.6313 2024-05-31 06:54:09,482 - mmdet - INFO - Epoch [9][600/7330] lr: 1.000e-05, eta: 8:52:53, time: 1.138, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0414, loss_cls: 0.1645, acc: 93.7227, loss_bbox: 0.2122, loss_mask: 0.2145, loss: 0.6462 2024-05-31 06:55:03,761 - mmdet - INFO - Epoch [9][650/7330] lr: 1.000e-05, eta: 8:51:57, time: 1.086, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0410, loss_cls: 0.1558, acc: 94.1426, loss_bbox: 0.2051, loss_mask: 0.2183, loss: 0.6333 2024-05-31 06:55:57,545 - mmdet - INFO - Epoch [9][700/7330] lr: 1.000e-05, eta: 8:51:01, time: 1.076, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0400, loss_cls: 0.1545, acc: 94.0723, loss_bbox: 0.2088, loss_mask: 0.2171, loss: 0.6331 2024-05-31 06:56:56,676 - mmdet - INFO - Epoch [9][750/7330] lr: 1.000e-05, eta: 8:50:07, time: 1.183, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0385, loss_cls: 0.1561, acc: 94.0754, loss_bbox: 0.2045, loss_mask: 0.2141, loss: 0.6256 2024-05-31 06:57:52,904 - mmdet - INFO - Epoch [9][800/7330] lr: 1.000e-05, eta: 8:49:11, time: 1.124, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0394, loss_cls: 0.1526, acc: 94.1335, loss_bbox: 0.2018, loss_mask: 0.2085, loss: 0.6144 2024-05-31 06:58:47,480 - mmdet - INFO - Epoch [9][850/7330] lr: 1.000e-05, eta: 8:48:15, time: 1.091, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0141, loss_rpn_bbox: 0.0418, loss_cls: 0.1600, acc: 93.9099, loss_bbox: 0.2122, loss_mask: 0.2173, loss: 0.6454 2024-05-31 06:59:41,616 - mmdet - INFO - Epoch [9][900/7330] lr: 1.000e-05, eta: 8:47:19, time: 1.083, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0394, loss_cls: 0.1527, acc: 94.1667, loss_bbox: 0.2056, loss_mask: 0.2112, loss: 0.6212 2024-05-31 07:00:38,438 - mmdet - INFO - Epoch [9][950/7330] lr: 1.000e-05, eta: 8:46:23, time: 1.136, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0377, loss_cls: 0.1584, acc: 93.9531, loss_bbox: 0.2031, loss_mask: 0.2148, loss: 0.6265 2024-05-31 07:01:34,495 - mmdet - INFO - Epoch [9][1000/7330] lr: 1.000e-05, eta: 8:45:28, time: 1.121, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0378, loss_cls: 0.1561, acc: 94.0500, loss_bbox: 0.2049, loss_mask: 0.2127, loss: 0.6233 2024-05-31 07:02:31,576 - mmdet - INFO - Epoch [9][1050/7330] lr: 1.000e-05, eta: 8:44:33, time: 1.142, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0426, loss_cls: 0.1594, acc: 93.7971, loss_bbox: 0.2118, loss_mask: 0.2175, loss: 0.6440 2024-05-31 07:03:25,856 - mmdet - INFO - Epoch [9][1100/7330] lr: 1.000e-05, eta: 
8:43:37, time: 1.086, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0367, loss_cls: 0.1484, acc: 94.2888, loss_bbox: 0.1974, loss_mask: 0.2080, loss: 0.6022 2024-05-31 07:04:22,203 - mmdet - INFO - Epoch [9][1150/7330] lr: 1.000e-05, eta: 8:42:41, time: 1.127, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0388, loss_cls: 0.1548, acc: 94.0693, loss_bbox: 0.2078, loss_mask: 0.2181, loss: 0.6322 2024-05-31 07:05:20,637 - mmdet - INFO - Epoch [9][1200/7330] lr: 1.000e-05, eta: 8:41:47, time: 1.169, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0361, loss_cls: 0.1464, acc: 94.4534, loss_bbox: 0.1923, loss_mask: 0.2089, loss: 0.5956 2024-05-31 07:06:14,412 - mmdet - INFO - Epoch [9][1250/7330] lr: 1.000e-05, eta: 8:40:50, time: 1.075, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0383, loss_cls: 0.1525, acc: 94.2197, loss_bbox: 0.2038, loss_mask: 0.2130, loss: 0.6186 2024-05-31 07:07:08,137 - mmdet - INFO - Epoch [9][1300/7330] lr: 1.000e-05, eta: 8:39:54, time: 1.075, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0383, loss_cls: 0.1515, acc: 94.2388, loss_bbox: 0.2002, loss_mask: 0.2102, loss: 0.6122 2024-05-31 07:08:09,629 - mmdet - INFO - Epoch [9][1350/7330] lr: 1.000e-05, eta: 8:39:01, time: 1.230, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0381, loss_cls: 0.1512, acc: 94.2637, loss_bbox: 0.2010, loss_mask: 0.2105, loss: 0.6135 2024-05-31 07:09:03,803 - mmdet - INFO - Epoch [9][1400/7330] lr: 1.000e-05, eta: 8:38:05, time: 1.084, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0400, loss_cls: 0.1608, acc: 93.8442, loss_bbox: 0.2083, loss_mask: 0.2163, loss: 0.6383 2024-05-31 07:09:57,794 - mmdet - INFO - Epoch [9][1450/7330] lr: 1.000e-05, eta: 8:37:08, time: 1.080, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0375, loss_cls: 0.1496, acc: 94.2651, loss_bbox: 0.1994, loss_mask: 0.2129, loss: 0.6113 2024-05-31 07:10:54,725 - mmdet - INFO - Epoch [9][1500/7330] lr: 1.000e-05, eta: 8:36:13, time: 1.139, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0399, loss_cls: 0.1558, acc: 94.0671, loss_bbox: 0.1995, loss_mask: 0.2170, loss: 0.6250 2024-05-31 07:11:47,976 - mmdet - INFO - Epoch [9][1550/7330] lr: 1.000e-05, eta: 8:35:16, time: 1.065, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0386, loss_cls: 0.1527, acc: 94.0657, loss_bbox: 0.2051, loss_mask: 0.2115, loss: 0.6202 2024-05-31 07:12:47,146 - mmdet - INFO - Epoch [9][1600/7330] lr: 1.000e-05, eta: 8:34:22, time: 1.183, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0398, loss_cls: 0.1554, acc: 94.0203, loss_bbox: 0.2086, loss_mask: 0.2138, loss: 0.6302 2024-05-31 07:13:41,017 - mmdet - INFO - Epoch [9][1650/7330] lr: 1.000e-05, eta: 8:33:26, time: 1.077, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0382, loss_cls: 0.1591, acc: 93.9492, loss_bbox: 0.2102, loss_mask: 0.2140, loss: 0.6330 2024-05-31 07:14:37,336 - mmdet - INFO - Epoch [9][1700/7330] lr: 1.000e-05, eta: 8:32:30, time: 1.126, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0383, loss_cls: 0.1562, acc: 93.9954, loss_bbox: 0.2046, loss_mask: 0.2073, loss: 0.6190 2024-05-31 07:15:31,572 - mmdet - INFO - Epoch [9][1750/7330] lr: 1.000e-05, eta: 8:31:34, time: 1.085, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0378, loss_cls: 
0.1478, acc: 94.3318, loss_bbox: 0.2007, loss_mask: 0.2089, loss: 0.6068 2024-05-31 07:16:29,006 - mmdet - INFO - Epoch [9][1800/7330] lr: 1.000e-05, eta: 8:30:39, time: 1.149, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0142, loss_rpn_bbox: 0.0397, loss_cls: 0.1546, acc: 94.1277, loss_bbox: 0.2021, loss_mask: 0.2136, loss: 0.6241 2024-05-31 07:17:23,573 - mmdet - INFO - Epoch [9][1850/7330] lr: 1.000e-05, eta: 8:29:43, time: 1.091, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0379, loss_cls: 0.1565, acc: 94.0884, loss_bbox: 0.2072, loss_mask: 0.2121, loss: 0.6259 2024-05-31 07:18:21,876 - mmdet - INFO - Epoch [9][1900/7330] lr: 1.000e-05, eta: 8:28:48, time: 1.166, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0365, loss_cls: 0.1492, acc: 94.3022, loss_bbox: 0.1998, loss_mask: 0.2121, loss: 0.6102 2024-05-31 07:19:18,259 - mmdet - INFO - Epoch [9][1950/7330] lr: 1.000e-05, eta: 8:27:53, time: 1.128, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0400, loss_cls: 0.1547, acc: 94.0840, loss_bbox: 0.2049, loss_mask: 0.2152, loss: 0.6272 2024-05-31 07:20:13,054 - mmdet - INFO - Epoch [9][2000/7330] lr: 1.000e-05, eta: 8:26:57, time: 1.096, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0396, loss_cls: 0.1476, acc: 94.2810, loss_bbox: 0.1973, loss_mask: 0.2131, loss: 0.6098 2024-05-31 07:21:09,066 - mmdet - INFO - Epoch [9][2050/7330] lr: 1.000e-05, eta: 8:26:02, time: 1.120, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0378, loss_cls: 0.1508, acc: 94.1865, loss_bbox: 0.2049, loss_mask: 0.2159, loss: 0.6210 2024-05-31 07:22:03,104 - mmdet - INFO - Epoch [9][2100/7330] lr: 1.000e-05, eta: 8:25:05, time: 1.081, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0384, loss_cls: 0.1498, acc: 94.2712, loss_bbox: 0.2066, loss_mask: 0.2172, loss: 0.6239 2024-05-31 07:23:01,542 - mmdet - INFO - Epoch [9][2150/7330] lr: 1.000e-05, eta: 8:24:11, time: 1.169, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0376, loss_cls: 0.1479, acc: 94.3381, loss_bbox: 0.1973, loss_mask: 0.2140, loss: 0.6093 2024-05-31 07:23:56,020 - mmdet - INFO - Epoch [9][2200/7330] lr: 1.000e-05, eta: 8:23:15, time: 1.090, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0374, loss_cls: 0.1435, acc: 94.4636, loss_bbox: 0.1927, loss_mask: 0.2056, loss: 0.5911 2024-05-31 07:24:52,017 - mmdet - INFO - Epoch [9][2250/7330] lr: 1.000e-05, eta: 8:22:19, time: 1.120, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0396, loss_cls: 0.1556, acc: 94.0652, loss_bbox: 0.2097, loss_mask: 0.2170, loss: 0.6337 2024-05-31 07:25:46,139 - mmdet - INFO - Epoch [9][2300/7330] lr: 1.000e-05, eta: 8:21:23, time: 1.082, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0391, loss_cls: 0.1569, acc: 94.0264, loss_bbox: 0.2067, loss_mask: 0.2164, loss: 0.6318 2024-05-31 07:26:40,860 - mmdet - INFO - Epoch [9][2350/7330] lr: 1.000e-05, eta: 8:20:27, time: 1.094, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0394, loss_cls: 0.1554, acc: 94.0737, loss_bbox: 0.2068, loss_mask: 0.2097, loss: 0.6234 2024-05-31 07:27:40,498 - mmdet - INFO - Epoch [9][2400/7330] lr: 1.000e-05, eta: 8:19:33, time: 1.193, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0405, loss_cls: 0.1599, acc: 93.8188, loss_bbox: 0.2130, loss_mask: 0.2176, loss: 0.6437 2024-05-31 07:28:37,096 - mmdet - 
INFO - Epoch [9][2450/7330] lr: 1.000e-05, eta: 8:18:37, time: 1.132, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0379, loss_cls: 0.1492, acc: 94.2007, loss_bbox: 0.1995, loss_mask: 0.2091, loss: 0.6069 2024-05-31 07:29:32,885 - mmdet - INFO - Epoch [9][2500/7330] lr: 1.000e-05, eta: 8:17:42, time: 1.116, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0361, loss_cls: 0.1452, acc: 94.3899, loss_bbox: 0.1944, loss_mask: 0.2081, loss: 0.5948 2024-05-31 07:30:27,365 - mmdet - INFO - Epoch [9][2550/7330] lr: 1.000e-05, eta: 8:16:46, time: 1.090, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0422, loss_cls: 0.1600, acc: 93.9131, loss_bbox: 0.2132, loss_mask: 0.2158, loss: 0.6444 2024-05-31 07:31:24,283 - mmdet - INFO - Epoch [9][2600/7330] lr: 1.000e-05, eta: 8:15:50, time: 1.138, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0364, loss_cls: 0.1488, acc: 94.3259, loss_bbox: 0.1989, loss_mask: 0.2065, loss: 0.6022 2024-05-31 07:32:20,446 - mmdet - INFO - Epoch [9][2650/7330] lr: 1.000e-05, eta: 8:14:55, time: 1.123, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0381, loss_cls: 0.1447, acc: 94.4207, loss_bbox: 0.1963, loss_mask: 0.2087, loss: 0.5992 2024-05-31 07:33:16,396 - mmdet - INFO - Epoch [9][2700/7330] lr: 1.000e-05, eta: 8:13:59, time: 1.119, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0374, loss_cls: 0.1498, acc: 94.3386, loss_bbox: 0.1970, loss_mask: 0.2128, loss: 0.6082 2024-05-31 07:34:10,913 - mmdet - INFO - Epoch [9][2750/7330] lr: 1.000e-05, eta: 8:13:03, time: 1.090, data_time: 0.050, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0382, loss_cls: 0.1528, acc: 94.2346, loss_bbox: 0.2030, loss_mask: 0.2131, loss: 0.6190 2024-05-31 07:35:06,883 - mmdet - INFO - Epoch [9][2800/7330] lr: 1.000e-05, eta: 8:12:08, time: 1.119, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0385, loss_cls: 0.1534, acc: 94.2490, loss_bbox: 0.2029, loss_mask: 0.2194, loss: 0.6263 2024-05-31 07:35:59,362 - mmdet - INFO - Epoch [9][2850/7330] lr: 1.000e-05, eta: 8:11:11, time: 1.050, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0349, loss_cls: 0.1407, acc: 94.6487, loss_bbox: 0.1883, loss_mask: 0.2022, loss: 0.5768 2024-05-31 07:36:54,976 - mmdet - INFO - Epoch [9][2900/7330] lr: 1.000e-05, eta: 8:10:15, time: 1.112, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0419, loss_cls: 0.1574, acc: 93.9514, loss_bbox: 0.2142, loss_mask: 0.2163, loss: 0.6421 2024-05-31 07:37:48,666 - mmdet - INFO - Epoch [9][2950/7330] lr: 1.000e-05, eta: 8:09:18, time: 1.074, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0376, loss_cls: 0.1486, acc: 94.2859, loss_bbox: 0.2015, loss_mask: 0.2115, loss: 0.6105 2024-05-31 07:38:50,304 - mmdet - INFO - Epoch [9][3000/7330] lr: 1.000e-05, eta: 8:08:25, time: 1.233, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0398, loss_cls: 0.1505, acc: 94.1895, loss_bbox: 0.2017, loss_mask: 0.2074, loss: 0.6116 2024-05-31 07:39:43,905 - mmdet - INFO - Epoch [9][3050/7330] lr: 1.000e-05, eta: 8:07:29, time: 1.072, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0357, loss_cls: 0.1441, acc: 94.5317, loss_bbox: 0.1953, loss_mask: 0.2080, loss: 0.5937 2024-05-31 07:40:38,188 - mmdet - INFO - Epoch [9][3100/7330] lr: 1.000e-05, eta: 8:06:33, time: 1.086, data_time: 0.067, memory: 24290, 
loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0373, loss_cls: 0.1501, acc: 94.2141, loss_bbox: 0.2005, loss_mask: 0.2137, loss: 0.6130 2024-05-31 07:41:37,009 - mmdet - INFO - Epoch [9][3150/7330] lr: 1.000e-05, eta: 8:05:38, time: 1.177, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0373, loss_cls: 0.1470, acc: 94.3457, loss_bbox: 0.1972, loss_mask: 0.2127, loss: 0.6053 2024-05-31 07:42:31,623 - mmdet - INFO - Epoch [9][3200/7330] lr: 1.000e-05, eta: 8:04:42, time: 1.092, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0394, loss_cls: 0.1487, acc: 94.2236, loss_bbox: 0.1982, loss_mask: 0.2107, loss: 0.6088 2024-05-31 07:43:28,350 - mmdet - INFO - Epoch [9][3250/7330] lr: 1.000e-05, eta: 8:03:47, time: 1.135, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0393, loss_cls: 0.1486, acc: 94.3362, loss_bbox: 0.2020, loss_mask: 0.2130, loss: 0.6157 2024-05-31 07:44:23,007 - mmdet - INFO - Epoch [9][3300/7330] lr: 1.000e-05, eta: 8:02:51, time: 1.093, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0392, loss_cls: 0.1497, acc: 94.2598, loss_bbox: 0.2021, loss_mask: 0.2041, loss: 0.6064 2024-05-31 07:45:19,353 - mmdet - INFO - Epoch [9][3350/7330] lr: 1.000e-05, eta: 8:01:55, time: 1.127, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0402, loss_cls: 0.1488, acc: 94.3896, loss_bbox: 0.1989, loss_mask: 0.2113, loss: 0.6113 2024-05-31 07:46:13,571 - mmdet - INFO - Epoch [9][3400/7330] lr: 1.000e-05, eta: 8:00:59, time: 1.084, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0384, loss_cls: 0.1508, acc: 94.1826, loss_bbox: 0.2016, loss_mask: 0.2113, loss: 0.6138 2024-05-31 07:47:08,055 - mmdet - INFO - Epoch [9][3450/7330] lr: 1.000e-05, eta: 8:00:03, time: 1.090, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0376, loss_cls: 0.1461, acc: 94.3503, loss_bbox: 0.1973, loss_mask: 0.2108, loss: 0.6036 2024-05-31 07:48:02,824 - mmdet - INFO - Epoch [9][3500/7330] lr: 1.000e-05, eta: 7:59:07, time: 1.095, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0399, loss_cls: 0.1537, acc: 94.2170, loss_bbox: 0.2079, loss_mask: 0.2129, loss: 0.6271 2024-05-31 07:49:03,470 - mmdet - INFO - Epoch [9][3550/7330] lr: 1.000e-05, eta: 7:58:13, time: 1.213, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0381, loss_cls: 0.1484, acc: 94.3828, loss_bbox: 0.2021, loss_mask: 0.2145, loss: 0.6136 2024-05-31 07:50:01,497 - mmdet - INFO - Epoch [9][3600/7330] lr: 1.000e-05, eta: 7:57:19, time: 1.160, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0371, loss_cls: 0.1464, acc: 94.3928, loss_bbox: 0.1920, loss_mask: 0.2083, loss: 0.5949 2024-05-31 07:50:55,956 - mmdet - INFO - Epoch [9][3650/7330] lr: 1.000e-05, eta: 7:56:22, time: 1.089, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0384, loss_cls: 0.1537, acc: 94.1013, loss_bbox: 0.2053, loss_mask: 0.2129, loss: 0.6234 2024-05-31 07:51:52,829 - mmdet - INFO - Epoch [9][3700/7330] lr: 1.000e-05, eta: 7:55:27, time: 1.137, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0407, loss_cls: 0.1582, acc: 93.9688, loss_bbox: 0.2124, loss_mask: 0.2152, loss: 0.6383 2024-05-31 07:52:48,101 - mmdet - INFO - Epoch [9][3750/7330] lr: 1.000e-05, eta: 7:54:31, time: 1.106, data_time: 0.103, memory: 24290, loss_rpn_cls: 0.0130, loss_rpn_bbox: 0.0392, loss_cls: 0.1512, acc: 94.2173, loss_bbox: 0.2042, loss_mask: 
0.2135, loss: 0.6212 2024-05-31 07:53:46,849 - mmdet - INFO - Epoch [9][3800/7330] lr: 1.000e-05, eta: 7:53:37, time: 1.175, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0389, loss_cls: 0.1511, acc: 94.2100, loss_bbox: 0.2062, loss_mask: 0.2155, loss: 0.6244 2024-05-31 07:54:41,928 - mmdet - INFO - Epoch [9][3850/7330] lr: 1.000e-05, eta: 7:52:41, time: 1.102, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0137, loss_rpn_bbox: 0.0419, loss_cls: 0.1556, acc: 93.9980, loss_bbox: 0.2047, loss_mask: 0.2139, loss: 0.6298 2024-05-31 07:55:37,548 - mmdet - INFO - Epoch [9][3900/7330] lr: 1.000e-05, eta: 7:51:45, time: 1.112, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0348, loss_cls: 0.1453, acc: 94.3276, loss_bbox: 0.1953, loss_mask: 0.2106, loss: 0.5975 2024-05-31 07:56:32,352 - mmdet - INFO - Epoch [9][3950/7330] lr: 1.000e-05, eta: 7:50:49, time: 1.096, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0390, loss_cls: 0.1472, acc: 94.3196, loss_bbox: 0.1964, loss_mask: 0.2074, loss: 0.6017 2024-05-31 07:57:26,995 - mmdet - INFO - Epoch [9][4000/7330] lr: 1.000e-05, eta: 7:49:53, time: 1.093, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0390, loss_cls: 0.1519, acc: 94.2375, loss_bbox: 0.2013, loss_mask: 0.2125, loss: 0.6161 2024-05-31 07:58:24,791 - mmdet - INFO - Epoch [9][4050/7330] lr: 1.000e-05, eta: 7:48:58, time: 1.155, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0356, loss_cls: 0.1495, acc: 94.2776, loss_bbox: 0.2022, loss_mask: 0.2119, loss: 0.6113 2024-05-31 07:59:21,051 - mmdet - INFO - Epoch [9][4100/7330] lr: 1.000e-05, eta: 7:48:03, time: 1.126, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0358, loss_cls: 0.1414, acc: 94.5610, loss_bbox: 0.1902, loss_mask: 0.2094, loss: 0.5883 2024-05-31 08:00:15,702 - mmdet - INFO - Epoch [9][4150/7330] lr: 1.000e-05, eta: 7:47:07, time: 1.093, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0380, loss_cls: 0.1468, acc: 94.3821, loss_bbox: 0.1971, loss_mask: 0.2105, loss: 0.6037 2024-05-31 08:01:09,560 - mmdet - INFO - Epoch [9][4200/7330] lr: 1.000e-05, eta: 7:46:10, time: 1.077, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0372, loss_cls: 0.1527, acc: 94.1934, loss_bbox: 0.2037, loss_mask: 0.2150, loss: 0.6209 2024-05-31 08:02:09,112 - mmdet - INFO - Epoch [9][4250/7330] lr: 1.000e-05, eta: 7:45:16, time: 1.191, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0391, loss_cls: 0.1506, acc: 94.2507, loss_bbox: 0.2031, loss_mask: 0.2105, loss: 0.6157 2024-05-31 08:03:03,674 - mmdet - INFO - Epoch [9][4300/7330] lr: 1.000e-05, eta: 7:44:20, time: 1.091, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0134, loss_rpn_bbox: 0.0394, loss_cls: 0.1514, acc: 94.2288, loss_bbox: 0.2043, loss_mask: 0.2149, loss: 0.6235 2024-05-31 08:03:59,811 - mmdet - INFO - Epoch [9][4350/7330] lr: 1.000e-05, eta: 7:43:25, time: 1.123, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0372, loss_cls: 0.1518, acc: 94.1899, loss_bbox: 0.1987, loss_mask: 0.2094, loss: 0.6090 2024-05-31 08:04:58,566 - mmdet - INFO - Epoch [9][4400/7330] lr: 1.000e-05, eta: 7:42:30, time: 1.175, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0397, loss_cls: 0.1500, acc: 94.2141, loss_bbox: 0.2010, loss_mask: 0.2130, loss: 0.6150 2024-05-31 08:05:53,760 - mmdet - INFO - Epoch [9][4450/7330] lr: 1.000e-05, eta: 
7:41:34, time: 1.104, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0401, loss_cls: 0.1558, acc: 94.0693, loss_bbox: 0.2042, loss_mask: 0.2163, loss: 0.6291 2024-05-31 08:06:47,906 - mmdet - INFO - Epoch [9][4500/7330] lr: 1.000e-05, eta: 7:40:38, time: 1.083, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0378, loss_cls: 0.1432, acc: 94.5127, loss_bbox: 0.1959, loss_mask: 0.2095, loss: 0.5969 2024-05-31 08:07:44,256 - mmdet - INFO - Epoch [9][4550/7330] lr: 1.000e-05, eta: 7:39:43, time: 1.127, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0381, loss_cls: 0.1465, acc: 94.4468, loss_bbox: 0.1951, loss_mask: 0.2051, loss: 0.5964 2024-05-31 08:08:40,736 - mmdet - INFO - Epoch [9][4600/7330] lr: 1.000e-05, eta: 7:38:47, time: 1.130, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0390, loss_cls: 0.1522, acc: 94.1570, loss_bbox: 0.2019, loss_mask: 0.2090, loss: 0.6138 2024-05-31 08:09:37,486 - mmdet - INFO - Epoch [9][4650/7330] lr: 1.000e-05, eta: 7:37:52, time: 1.135, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0385, loss_cls: 0.1466, acc: 94.4019, loss_bbox: 0.1953, loss_mask: 0.2094, loss: 0.6020 2024-05-31 08:10:31,848 - mmdet - INFO - Epoch [9][4700/7330] lr: 1.000e-05, eta: 7:36:56, time: 1.087, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0383, loss_cls: 0.1471, acc: 94.2683, loss_bbox: 0.1998, loss_mask: 0.2107, loss: 0.6071 2024-05-31 08:11:26,433 - mmdet - INFO - Epoch [9][4750/7330] lr: 1.000e-05, eta: 7:36:00, time: 1.092, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0382, loss_cls: 0.1472, acc: 94.3984, loss_bbox: 0.1998, loss_mask: 0.2106, loss: 0.6072 2024-05-31 08:12:23,081 - mmdet - INFO - Epoch [9][4800/7330] lr: 1.000e-05, eta: 7:35:04, time: 1.133, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0397, loss_cls: 0.1520, acc: 94.2322, loss_bbox: 0.2047, loss_mask: 0.2138, loss: 0.6229 2024-05-31 08:13:21,321 - mmdet - INFO - Epoch [9][4850/7330] lr: 1.000e-05, eta: 7:34:10, time: 1.165, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0364, loss_cls: 0.1425, acc: 94.5466, loss_bbox: 0.1905, loss_mask: 0.2066, loss: 0.5875 2024-05-31 08:14:15,580 - mmdet - INFO - Epoch [9][4900/7330] lr: 1.000e-05, eta: 7:33:14, time: 1.085, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0377, loss_cls: 0.1549, acc: 94.0154, loss_bbox: 0.2064, loss_mask: 0.2112, loss: 0.6226 2024-05-31 08:15:11,854 - mmdet - INFO - Epoch [9][4950/7330] lr: 1.000e-05, eta: 7:32:18, time: 1.125, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0375, loss_cls: 0.1392, acc: 94.7168, loss_bbox: 0.1885, loss_mask: 0.2053, loss: 0.5820 2024-05-31 08:16:08,600 - mmdet - INFO - Epoch [9][5000/7330] lr: 1.000e-05, eta: 7:31:23, time: 1.135, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0368, loss_cls: 0.1472, acc: 94.3467, loss_bbox: 0.1989, loss_mask: 0.2122, loss: 0.6059 2024-05-31 08:17:03,438 - mmdet - INFO - Epoch [9][5050/7330] lr: 1.000e-05, eta: 7:30:27, time: 1.097, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0415, loss_cls: 0.1545, acc: 94.1775, loss_bbox: 0.2016, loss_mask: 0.2156, loss: 0.6253 2024-05-31 08:18:00,143 - mmdet - INFO - Epoch [9][5100/7330] lr: 1.000e-05, eta: 7:29:32, time: 1.134, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0390, loss_cls: 
0.1472, acc: 94.2773, loss_bbox: 0.1989, loss_mask: 0.2103, loss: 0.6072 2024-05-31 08:18:56,150 - mmdet - INFO - Epoch [9][5150/7330] lr: 1.000e-05, eta: 7:28:36, time: 1.120, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0367, loss_cls: 0.1462, acc: 94.3528, loss_bbox: 0.1981, loss_mask: 0.2078, loss: 0.5992 2024-05-31 08:19:52,156 - mmdet - INFO - Epoch [9][5200/7330] lr: 1.000e-05, eta: 7:27:40, time: 1.120, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0367, loss_cls: 0.1444, acc: 94.5591, loss_bbox: 0.1979, loss_mask: 0.2096, loss: 0.5986 2024-05-31 08:20:46,837 - mmdet - INFO - Epoch [9][5250/7330] lr: 1.000e-05, eta: 7:26:44, time: 1.094, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0363, loss_cls: 0.1427, acc: 94.4509, loss_bbox: 0.1952, loss_mask: 0.2171, loss: 0.6024 2024-05-31 08:21:40,947 - mmdet - INFO - Epoch [9][5300/7330] lr: 1.000e-05, eta: 7:25:48, time: 1.082, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0406, loss_cls: 0.1501, acc: 94.2825, loss_bbox: 0.2007, loss_mask: 0.2133, loss: 0.6171 2024-05-31 08:22:37,336 - mmdet - INFO - Epoch [9][5350/7330] lr: 1.000e-05, eta: 7:24:53, time: 1.128, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0374, loss_cls: 0.1477, acc: 94.4041, loss_bbox: 0.1971, loss_mask: 0.2129, loss: 0.6073 2024-05-31 08:23:34,423 - mmdet - INFO - Epoch [9][5400/7330] lr: 1.000e-05, eta: 7:23:57, time: 1.142, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0140, loss_rpn_bbox: 0.0394, loss_cls: 0.1495, acc: 94.2903, loss_bbox: 0.2012, loss_mask: 0.2088, loss: 0.6128 2024-05-31 08:24:30,943 - mmdet - INFO - Epoch [9][5450/7330] lr: 1.000e-05, eta: 7:23:02, time: 1.130, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0397, loss_cls: 0.1600, acc: 93.8240, loss_bbox: 0.2134, loss_mask: 0.2183, loss: 0.6441 2024-05-31 08:25:27,000 - mmdet - INFO - Epoch [9][5500/7330] lr: 1.000e-05, eta: 7:22:07, time: 1.121, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0362, loss_cls: 0.1505, acc: 94.2644, loss_bbox: 0.1984, loss_mask: 0.2105, loss: 0.6068 2024-05-31 08:26:21,396 - mmdet - INFO - Epoch [9][5550/7330] lr: 1.000e-05, eta: 7:21:10, time: 1.088, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0384, loss_cls: 0.1478, acc: 94.3591, loss_bbox: 0.1949, loss_mask: 0.2108, loss: 0.6049 2024-05-31 08:27:17,709 - mmdet - INFO - Epoch [9][5600/7330] lr: 1.000e-05, eta: 7:20:15, time: 1.127, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0365, loss_cls: 0.1453, acc: 94.3787, loss_bbox: 0.2007, loss_mask: 0.2116, loss: 0.6060 2024-05-31 08:28:12,232 - mmdet - INFO - Epoch [9][5650/7330] lr: 1.000e-05, eta: 7:19:19, time: 1.090, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0376, loss_cls: 0.1480, acc: 94.2954, loss_bbox: 0.1993, loss_mask: 0.2094, loss: 0.6068 2024-05-31 08:29:12,987 - mmdet - INFO - Epoch [9][5700/7330] lr: 1.000e-05, eta: 7:18:25, time: 1.215, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0386, loss_cls: 0.1520, acc: 94.2212, loss_bbox: 0.2050, loss_mask: 0.2094, loss: 0.6160 2024-05-31 08:30:07,041 - mmdet - INFO - Epoch [9][5750/7330] lr: 1.000e-05, eta: 7:17:29, time: 1.081, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0379, loss_cls: 0.1410, acc: 94.5908, loss_bbox: 0.1927, loss_mask: 0.2063, loss: 0.5886 2024-05-31 08:31:01,616 - mmdet - 
INFO - Epoch [9][5800/7330] lr: 1.000e-05, eta: 7:16:33, time: 1.092, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0127, loss_rpn_bbox: 0.0402, loss_cls: 0.1516, acc: 94.1846, loss_bbox: 0.2032, loss_mask: 0.2132, loss: 0.6210 2024-05-31 08:31:58,553 - mmdet - INFO - Epoch [9][5850/7330] lr: 1.000e-05, eta: 7:15:37, time: 1.139, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0381, loss_cls: 0.1453, acc: 94.3870, loss_bbox: 0.1955, loss_mask: 0.2101, loss: 0.6004 2024-05-31 08:32:52,809 - mmdet - INFO - Epoch [9][5900/7330] lr: 1.000e-05, eta: 7:14:41, time: 1.085, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0382, loss_cls: 0.1562, acc: 94.1211, loss_bbox: 0.2054, loss_mask: 0.2099, loss: 0.6206 2024-05-31 08:33:49,319 - mmdet - INFO - Epoch [9][5950/7330] lr: 1.000e-05, eta: 7:13:46, time: 1.130, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0378, loss_cls: 0.1488, acc: 94.2913, loss_bbox: 0.2009, loss_mask: 0.2116, loss: 0.6115 2024-05-31 08:34:44,104 - mmdet - INFO - Epoch [9][6000/7330] lr: 1.000e-05, eta: 7:12:50, time: 1.096, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0354, loss_cls: 0.1452, acc: 94.4546, loss_bbox: 0.1978, loss_mask: 0.2084, loss: 0.5966 2024-05-31 08:35:43,726 - mmdet - INFO - Epoch [9][6050/7330] lr: 1.000e-05, eta: 7:11:56, time: 1.192, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0390, loss_cls: 0.1520, acc: 94.1030, loss_bbox: 0.2040, loss_mask: 0.2169, loss: 0.6233 2024-05-31 08:36:37,973 - mmdet - INFO - Epoch [9][6100/7330] lr: 1.000e-05, eta: 7:10:59, time: 1.085, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0362, loss_cls: 0.1481, acc: 94.3113, loss_bbox: 0.1998, loss_mask: 0.2112, loss: 0.6066 2024-05-31 08:37:33,985 - mmdet - INFO - Epoch [9][6150/7330] lr: 1.000e-05, eta: 7:10:04, time: 1.120, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0375, loss_cls: 0.1476, acc: 94.3503, loss_bbox: 0.1990, loss_mask: 0.2128, loss: 0.6080 2024-05-31 08:38:28,941 - mmdet - INFO - Epoch [9][6200/7330] lr: 1.000e-05, eta: 7:09:08, time: 1.099, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0393, loss_cls: 0.1552, acc: 94.0544, loss_bbox: 0.2113, loss_mask: 0.2148, loss: 0.6335 2024-05-31 08:39:25,936 - mmdet - INFO - Epoch [9][6250/7330] lr: 1.000e-05, eta: 7:08:13, time: 1.140, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0376, loss_cls: 0.1503, acc: 94.2151, loss_bbox: 0.2014, loss_mask: 0.2093, loss: 0.6096 2024-05-31 08:40:26,006 - mmdet - INFO - Epoch [9][6300/7330] lr: 1.000e-05, eta: 7:07:19, time: 1.201, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0138, loss_rpn_bbox: 0.0405, loss_cls: 0.1574, acc: 94.1003, loss_bbox: 0.2051, loss_mask: 0.2108, loss: 0.6275 2024-05-31 08:41:20,529 - mmdet - INFO - Epoch [9][6350/7330] lr: 1.000e-05, eta: 7:06:22, time: 1.090, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0133, loss_rpn_bbox: 0.0381, loss_cls: 0.1527, acc: 94.1384, loss_bbox: 0.2046, loss_mask: 0.2155, loss: 0.6242 2024-05-31 08:42:17,231 - mmdet - INFO - Epoch [9][6400/7330] lr: 1.000e-05, eta: 7:05:27, time: 1.134, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0402, loss_cls: 0.1542, acc: 94.1201, loss_bbox: 0.2055, loss_mask: 0.2140, loss: 0.6259 2024-05-31 08:43:10,636 - mmdet - INFO - Epoch [9][6450/7330] lr: 1.000e-05, eta: 7:04:31, time: 1.068, data_time: 0.066, memory: 24290, 
loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0366, loss_cls: 0.1452, acc: 94.4248, loss_bbox: 0.1922, loss_mask: 0.2094, loss: 0.5943 2024-05-31 08:44:07,245 - mmdet - INFO - Epoch [9][6500/7330] lr: 1.000e-05, eta: 7:03:35, time: 1.132, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0375, loss_cls: 0.1484, acc: 94.3159, loss_bbox: 0.1995, loss_mask: 0.2122, loss: 0.6100 2024-05-31 08:45:01,458 - mmdet - INFO - Epoch [9][6550/7330] lr: 1.000e-05, eta: 7:02:39, time: 1.084, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0378, loss_cls: 0.1501, acc: 94.2349, loss_bbox: 0.2036, loss_mask: 0.2137, loss: 0.6169 2024-05-31 08:46:02,615 - mmdet - INFO - Epoch [9][6600/7330] lr: 1.000e-05, eta: 7:01:45, time: 1.223, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0383, loss_cls: 0.1509, acc: 94.2446, loss_bbox: 0.2024, loss_mask: 0.2062, loss: 0.6102 2024-05-31 08:46:57,794 - mmdet - INFO - Epoch [9][6650/7330] lr: 1.000e-05, eta: 7:00:49, time: 1.104, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0132, loss_rpn_bbox: 0.0427, loss_cls: 0.1601, acc: 93.8081, loss_bbox: 0.2174, loss_mask: 0.2179, loss: 0.6513 2024-05-31 08:47:51,884 - mmdet - INFO - Epoch [9][6700/7330] lr: 1.000e-05, eta: 6:59:53, time: 1.082, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0402, loss_cls: 0.1455, acc: 94.3521, loss_bbox: 0.2051, loss_mask: 0.2174, loss: 0.6195 2024-05-31 08:48:47,544 - mmdet - INFO - Epoch [9][6750/7330] lr: 1.000e-05, eta: 6:58:57, time: 1.113, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0360, loss_cls: 0.1416, acc: 94.5115, loss_bbox: 0.1922, loss_mask: 0.2044, loss: 0.5845 2024-05-31 08:49:44,458 - mmdet - INFO - Epoch [9][6800/7330] lr: 1.000e-05, eta: 6:58:02, time: 1.138, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0366, loss_cls: 0.1450, acc: 94.4099, loss_bbox: 0.1920, loss_mask: 0.2093, loss: 0.5933 2024-05-31 08:50:40,830 - mmdet - INFO - Epoch [9][6850/7330] lr: 1.000e-05, eta: 6:57:07, time: 1.127, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0393, loss_cls: 0.1500, acc: 94.2739, loss_bbox: 0.2012, loss_mask: 0.2136, loss: 0.6161 2024-05-31 08:51:35,954 - mmdet - INFO - Epoch [9][6900/7330] lr: 1.000e-05, eta: 6:56:11, time: 1.102, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0400, loss_cls: 0.1569, acc: 94.0522, loss_bbox: 0.2085, loss_mask: 0.2153, loss: 0.6329 2024-05-31 08:52:35,024 - mmdet - INFO - Epoch [9][6950/7330] lr: 1.000e-05, eta: 6:55:16, time: 1.181, data_time: 0.096, memory: 24290, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0392, loss_cls: 0.1500, acc: 94.2734, loss_bbox: 0.2019, loss_mask: 0.2109, loss: 0.6145 2024-05-31 08:53:28,617 - mmdet - INFO - Epoch [9][7000/7330] lr: 1.000e-05, eta: 6:54:20, time: 1.072, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0354, loss_cls: 0.1407, acc: 94.5959, loss_bbox: 0.1927, loss_mask: 0.2108, loss: 0.5904 2024-05-31 08:54:24,773 - mmdet - INFO - Epoch [9][7050/7330] lr: 1.000e-05, eta: 6:53:24, time: 1.123, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0359, loss_cls: 0.1434, acc: 94.5979, loss_bbox: 0.1952, loss_mask: 0.2121, loss: 0.5968 2024-05-31 08:55:21,466 - mmdet - INFO - Epoch [9][7100/7330] lr: 1.000e-05, eta: 6:52:29, time: 1.134, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0392, loss_cls: 0.1520, acc: 94.1514, loss_bbox: 0.2011, loss_mask: 
0.2135, loss: 0.6170
2024-05-31 08:56:17,581 - mmdet - INFO - Epoch [9][7150/7330] lr: 1.000e-05, eta: 6:51:33, time: 1.122, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0353, loss_cls: 0.1425, acc: 94.5837, loss_bbox: 0.1927, loss_mask: 0.2099, loss: 0.5919
2024-05-31 08:57:12,672 - mmdet - INFO - Epoch [9][7200/7330] lr: 1.000e-05, eta: 6:50:38, time: 1.102, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0412, loss_cls: 0.1480, acc: 94.2402, loss_bbox: 0.2047, loss_mask: 0.2169, loss: 0.6225
2024-05-31 08:58:06,788 - mmdet - INFO - Epoch [9][7250/7330] lr: 1.000e-05, eta: 6:49:41, time: 1.082, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0386, loss_cls: 0.1443, acc: 94.4331, loss_bbox: 0.1975, loss_mask: 0.2098, loss: 0.6024
2024-05-31 08:59:03,421 - mmdet - INFO - Epoch [9][7300/7330] lr: 1.000e-05, eta: 6:48:46, time: 1.133, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0380, loss_cls: 0.1512, acc: 94.2168, loss_bbox: 0.2056, loss_mask: 0.2138, loss: 0.6215
2024-05-31 08:59:36,663 - mmdet - INFO - Saving checkpoint at 9 epochs
2024-05-31 09:01:57,657 - mmdet - INFO - Evaluating bbox...
2024-05-31 09:02:17,648 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.490
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.722
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.537
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.322
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.528
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.652
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.426
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.646
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.761
2024-05-31 09:02:17,648 - mmdet - INFO - Evaluating segm...
2024-05-31 09:02:42,696 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.440
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.687
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.473
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.237
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.473
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.648
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.548
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.548
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.548
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.358
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.592
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.725
2024-05-31 09:02:42,991 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-31 09:02:42,992 - mmdet - INFO - Epoch(val) [9][625] bbox_mAP: 0.4900, bbox_mAP_50: 0.7220, bbox_mAP_75: 0.5370, bbox_mAP_s: 0.3220, bbox_mAP_m: 0.5280, bbox_mAP_l: 0.6520, bbox_mAP_copypaste: 0.490 0.722 0.537 0.322 0.528 0.652, segm_mAP: 0.4400, segm_mAP_50: 0.6870, segm_mAP_75: 0.4730, segm_mAP_s: 0.2370, segm_mAP_m: 0.4730, segm_mAP_l: 0.6480, segm_mAP_copypaste: 0.440 0.687 0.473 0.237 0.473 0.648
2024-05-31 09:03:43,960 - mmdet - INFO - Epoch [10][50/7330] lr: 1.000e-05, eta: 6:47:07, time: 1.219, data_time: 0.163, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0404, loss_cls: 0.1496, acc: 94.2932, loss_bbox: 0.1995, loss_mask: 0.2114, loss: 0.6125
2024-05-31 09:04:37,535 - mmdet - INFO - Epoch [10][100/7330] lr: 1.000e-05, eta: 6:46:11, time: 1.072, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0386, loss_cls: 0.1475, acc: 94.2883, loss_bbox: 0.1963, loss_mask: 0.2144, loss: 0.6077
2024-05-31 09:05:31,396 - mmdet - INFO - Epoch [10][150/7330] lr: 1.000e-05, eta: 6:45:15, time: 1.077, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0376, loss_cls: 0.1463, acc: 94.3955, loss_bbox: 0.1989, loss_mask: 0.2137, loss: 0.6080
2024-05-31 09:06:27,456 - mmdet - INFO - Epoch [10][200/7330] lr: 1.000e-05, eta: 6:44:19, time: 1.121, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0378, loss_cls: 0.1451, acc: 94.4639, loss_bbox: 0.1951, loss_mask: 0.2099, loss: 0.5995
2024-05-31 09:07:21,428 - mmdet - INFO - Epoch [10][250/7330] lr: 1.000e-05, eta: 6:43:23, time: 1.080, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0363, loss_cls: 0.1427, acc: 94.5298, loss_bbox: 0.1902, loss_mask: 0.2079, loss: 0.5890
2024-05-31 09:08:17,533 - mmdet - INFO - Epoch [10][300/7330] lr: 1.000e-05, eta: 6:42:28, time: 1.122, data_time: 0.052, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0378, loss_cls: 0.1484, acc: 94.3159, loss_bbox: 0.1990, loss_mask: 0.2087, loss: 0.6052
2024-05-31 09:09:11,751 - mmdet - INFO - Epoch [10][350/7330] lr: 1.000e-05, eta: 6:41:31, time: 1.084, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0365, loss_cls: 0.1433, acc: 94.4946, loss_bbox: 0.1948, loss_mask: 0.2047, loss: 0.5904
2024-05-31 09:10:08,514 - mmdet - INFO - Epoch [10][400/7330] lr: 1.000e-05, eta: 6:40:36, time: 1.135, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0373, loss_cls: 0.1471, acc: 94.3525, loss_bbox:
0.2003, loss_mask: 0.2101, loss: 0.6063 2024-05-31 09:11:03,519 - mmdet - INFO - Epoch [10][450/7330] lr: 1.000e-05, eta: 6:39:40, time: 1.100, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0408, loss_cls: 0.1596, acc: 93.8860, loss_bbox: 0.2101, loss_mask: 0.2161, loss: 0.6390 2024-05-31 09:11:57,771 - mmdet - INFO - Epoch [10][500/7330] lr: 1.000e-05, eta: 6:38:44, time: 1.085, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0386, loss_cls: 0.1474, acc: 94.3206, loss_bbox: 0.2001, loss_mask: 0.2121, loss: 0.6093 2024-05-31 09:12:51,360 - mmdet - INFO - Epoch [10][550/7330] lr: 1.000e-05, eta: 6:37:48, time: 1.072, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0390, loss_cls: 0.1492, acc: 94.2080, loss_bbox: 0.2060, loss_mask: 0.2148, loss: 0.6207 2024-05-31 09:13:47,142 - mmdet - INFO - Epoch [10][600/7330] lr: 1.000e-05, eta: 6:36:52, time: 1.116, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0373, loss_cls: 0.1464, acc: 94.3518, loss_bbox: 0.1997, loss_mask: 0.2066, loss: 0.6016 2024-05-31 09:14:41,514 - mmdet - INFO - Epoch [10][650/7330] lr: 1.000e-05, eta: 6:35:56, time: 1.087, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0357, loss_cls: 0.1442, acc: 94.4763, loss_bbox: 0.1977, loss_mask: 0.2102, loss: 0.5984 2024-05-31 09:15:47,346 - mmdet - INFO - Epoch [10][700/7330] lr: 1.000e-05, eta: 6:35:04, time: 1.317, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0367, loss_cls: 0.1452, acc: 94.3877, loss_bbox: 0.1942, loss_mask: 0.2143, loss: 0.6014 2024-05-31 09:16:41,769 - mmdet - INFO - Epoch [10][750/7330] lr: 1.000e-05, eta: 6:34:08, time: 1.089, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0354, loss_cls: 0.1456, acc: 94.4380, loss_bbox: 0.1923, loss_mask: 0.2068, loss: 0.5905 2024-05-31 09:17:35,355 - mmdet - INFO - Epoch [10][800/7330] lr: 1.000e-05, eta: 6:33:11, time: 1.072, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0389, loss_cls: 0.1484, acc: 94.3645, loss_bbox: 0.1963, loss_mask: 0.2098, loss: 0.6052 2024-05-31 09:18:32,520 - mmdet - INFO - Epoch [10][850/7330] lr: 1.000e-05, eta: 6:32:16, time: 1.143, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0386, loss_cls: 0.1464, acc: 94.3918, loss_bbox: 0.1999, loss_mask: 0.2088, loss: 0.6047 2024-05-31 09:19:26,254 - mmdet - INFO - Epoch [10][900/7330] lr: 1.000e-05, eta: 6:31:20, time: 1.075, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0377, loss_cls: 0.1455, acc: 94.4360, loss_bbox: 0.1991, loss_mask: 0.2106, loss: 0.6038 2024-05-31 09:20:22,039 - mmdet - INFO - Epoch [10][950/7330] lr: 1.000e-05, eta: 6:30:24, time: 1.116, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0368, loss_cls: 0.1442, acc: 94.4324, loss_bbox: 0.1909, loss_mask: 0.2090, loss: 0.5924 2024-05-31 09:21:16,784 - mmdet - INFO - Epoch [10][1000/7330] lr: 1.000e-05, eta: 6:29:28, time: 1.095, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0387, loss_cls: 0.1460, acc: 94.3335, loss_bbox: 0.1959, loss_mask: 0.2092, loss: 0.6000 2024-05-31 09:22:11,385 - mmdet - INFO - Epoch [10][1050/7330] lr: 1.000e-05, eta: 6:28:32, time: 1.092, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0136, loss_rpn_bbox: 0.0416, loss_cls: 0.1556, acc: 94.0332, loss_bbox: 0.2117, loss_mask: 0.2112, loss: 0.6337 2024-05-31 09:23:05,192 - mmdet - INFO - Epoch [10][1100/7330] lr: 
1.000e-05, eta: 6:27:36, time: 1.076, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0372, loss_cls: 0.1387, acc: 94.5942, loss_bbox: 0.1938, loss_mask: 0.2033, loss: 0.5847 2024-05-31 09:23:59,874 - mmdet - INFO - Epoch [10][1150/7330] lr: 1.000e-05, eta: 6:26:40, time: 1.094, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0408, loss_cls: 0.1576, acc: 93.9558, loss_bbox: 0.2105, loss_mask: 0.2107, loss: 0.6313 2024-05-31 09:24:58,668 - mmdet - INFO - Epoch [10][1200/7330] lr: 1.000e-05, eta: 6:25:45, time: 1.176, data_time: 0.095, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0384, loss_cls: 0.1506, acc: 94.2136, loss_bbox: 0.1995, loss_mask: 0.2091, loss: 0.6100 2024-05-31 09:26:00,976 - mmdet - INFO - Epoch [10][1250/7330] lr: 1.000e-05, eta: 6:24:52, time: 1.246, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0131, loss_rpn_bbox: 0.0399, loss_cls: 0.1496, acc: 94.3096, loss_bbox: 0.2004, loss_mask: 0.2125, loss: 0.6155 2024-05-31 09:26:55,533 - mmdet - INFO - Epoch [10][1300/7330] lr: 1.000e-05, eta: 6:23:56, time: 1.091, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0402, loss_cls: 0.1460, acc: 94.4519, loss_bbox: 0.1924, loss_mask: 0.2072, loss: 0.5972 2024-05-31 09:27:54,642 - mmdet - INFO - Epoch [10][1350/7330] lr: 1.000e-05, eta: 6:23:01, time: 1.182, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0400, loss_cls: 0.1462, acc: 94.4282, loss_bbox: 0.1972, loss_mask: 0.2098, loss: 0.6046 2024-05-31 09:28:49,142 - mmdet - INFO - Epoch [10][1400/7330] lr: 1.000e-05, eta: 6:22:05, time: 1.090, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0408, loss_cls: 0.1553, acc: 94.0042, loss_bbox: 0.2072, loss_mask: 0.2147, loss: 0.6305 2024-05-31 09:29:43,615 - mmdet - INFO - Epoch [10][1450/7330] lr: 1.000e-05, eta: 6:21:09, time: 1.090, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0387, loss_cls: 0.1441, acc: 94.4258, loss_bbox: 0.1988, loss_mask: 0.2089, loss: 0.6021 2024-05-31 09:30:39,778 - mmdet - INFO - Epoch [10][1500/7330] lr: 1.000e-05, eta: 6:20:13, time: 1.123, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0355, loss_cls: 0.1371, acc: 94.7241, loss_bbox: 0.1872, loss_mask: 0.2016, loss: 0.5717 2024-05-31 09:31:32,771 - mmdet - INFO - Epoch [10][1550/7330] lr: 1.000e-05, eta: 6:19:17, time: 1.060, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0342, loss_cls: 0.1318, acc: 94.9890, loss_bbox: 0.1797, loss_mask: 0.2029, loss: 0.5591 2024-05-31 09:32:26,186 - mmdet - INFO - Epoch [10][1600/7330] lr: 1.000e-05, eta: 6:18:21, time: 1.068, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0377, loss_cls: 0.1501, acc: 94.2126, loss_bbox: 0.2012, loss_mask: 0.2104, loss: 0.6107 2024-05-31 09:33:20,193 - mmdet - INFO - Epoch [10][1650/7330] lr: 1.000e-05, eta: 6:17:24, time: 1.080, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0363, loss_cls: 0.1434, acc: 94.4524, loss_bbox: 0.1948, loss_mask: 0.2037, loss: 0.5885 2024-05-31 09:34:14,249 - mmdet - INFO - Epoch [10][1700/7330] lr: 1.000e-05, eta: 6:16:28, time: 1.081, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0380, loss_cls: 0.1464, acc: 94.3110, loss_bbox: 0.2000, loss_mask: 0.2107, loss: 0.6066 2024-05-31 09:35:15,201 - mmdet - INFO - Epoch [10][1750/7330] lr: 1.000e-05, eta: 6:15:34, time: 1.219, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0109, 
loss_rpn_bbox: 0.0358, loss_cls: 0.1366, acc: 94.6821, loss_bbox: 0.1884, loss_mask: 0.2096, loss: 0.5812 2024-05-31 09:36:13,597 - mmdet - INFO - Epoch [10][1800/7330] lr: 1.000e-05, eta: 6:14:39, time: 1.168, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0404, loss_cls: 0.1504, acc: 94.2302, loss_bbox: 0.2035, loss_mask: 0.2160, loss: 0.6228 2024-05-31 09:37:06,942 - mmdet - INFO - Epoch [10][1850/7330] lr: 1.000e-05, eta: 6:13:43, time: 1.067, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0354, loss_cls: 0.1329, acc: 94.9961, loss_bbox: 0.1836, loss_mask: 0.2032, loss: 0.5653 2024-05-31 09:38:01,632 - mmdet - INFO - Epoch [10][1900/7330] lr: 1.000e-05, eta: 6:12:47, time: 1.094, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0376, loss_cls: 0.1496, acc: 94.1870, loss_bbox: 0.2005, loss_mask: 0.2131, loss: 0.6123 2024-05-31 09:38:59,777 - mmdet - INFO - Epoch [10][1950/7330] lr: 1.000e-05, eta: 6:11:52, time: 1.163, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0374, loss_cls: 0.1430, acc: 94.5681, loss_bbox: 0.1928, loss_mask: 0.2057, loss: 0.5899 2024-05-31 09:39:53,350 - mmdet - INFO - Epoch [10][2000/7330] lr: 1.000e-05, eta: 6:10:56, time: 1.071, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0371, loss_cls: 0.1471, acc: 94.3306, loss_bbox: 0.2009, loss_mask: 0.2133, loss: 0.6091 2024-05-31 09:40:49,994 - mmdet - INFO - Epoch [10][2050/7330] lr: 1.000e-05, eta: 6:10:00, time: 1.133, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0404, loss_cls: 0.1482, acc: 94.2681, loss_bbox: 0.2021, loss_mask: 0.2119, loss: 0.6143 2024-05-31 09:41:43,983 - mmdet - INFO - Epoch [10][2100/7330] lr: 1.000e-05, eta: 6:09:04, time: 1.080, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0374, loss_cls: 0.1472, acc: 94.4224, loss_bbox: 0.1965, loss_mask: 0.2108, loss: 0.6027 2024-05-31 09:42:38,158 - mmdet - INFO - Epoch [10][2150/7330] lr: 1.000e-05, eta: 6:08:08, time: 1.084, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0377, loss_cls: 0.1503, acc: 94.1038, loss_bbox: 0.2007, loss_mask: 0.2091, loss: 0.6082 2024-05-31 09:43:31,936 - mmdet - INFO - Epoch [10][2200/7330] lr: 1.000e-05, eta: 6:07:12, time: 1.076, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0358, loss_cls: 0.1391, acc: 94.6978, loss_bbox: 0.1904, loss_mask: 0.2085, loss: 0.5841 2024-05-31 09:44:26,528 - mmdet - INFO - Epoch [10][2250/7330] lr: 1.000e-05, eta: 6:06:16, time: 1.092, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0388, loss_cls: 0.1466, acc: 94.3508, loss_bbox: 0.2013, loss_mask: 0.2135, loss: 0.6124 2024-05-31 09:45:22,854 - mmdet - INFO - Epoch [10][2300/7330] lr: 1.000e-05, eta: 6:05:21, time: 1.126, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0121, loss_rpn_bbox: 0.0379, loss_cls: 0.1437, acc: 94.4565, loss_bbox: 0.1941, loss_mask: 0.2023, loss: 0.5901 2024-05-31 09:46:23,680 - mmdet - INFO - Epoch [10][2350/7330] lr: 1.000e-05, eta: 6:04:26, time: 1.217, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0390, loss_cls: 0.1462, acc: 94.3909, loss_bbox: 0.2045, loss_mask: 0.2144, loss: 0.6152 2024-05-31 09:47:20,374 - mmdet - INFO - Epoch [10][2400/7330] lr: 1.000e-05, eta: 6:03:31, time: 1.134, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0381, loss_cls: 0.1448, acc: 94.4028, loss_bbox: 0.1967, loss_mask: 0.2091, 
loss: 0.6004 2024-05-31 09:48:14,218 - mmdet - INFO - Epoch [10][2450/7330] lr: 1.000e-05, eta: 6:02:35, time: 1.077, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0381, loss_cls: 0.1467, acc: 94.3872, loss_bbox: 0.1977, loss_mask: 0.2080, loss: 0.6025 2024-05-31 09:49:08,737 - mmdet - INFO - Epoch [10][2500/7330] lr: 1.000e-05, eta: 6:01:39, time: 1.090, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0402, loss_cls: 0.1489, acc: 94.2432, loss_bbox: 0.2038, loss_mask: 0.2120, loss: 0.6174 2024-05-31 09:50:09,437 - mmdet - INFO - Epoch [10][2550/7330] lr: 1.000e-05, eta: 6:00:45, time: 1.214, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0371, loss_cls: 0.1417, acc: 94.5603, loss_bbox: 0.1929, loss_mask: 0.2128, loss: 0.5961 2024-05-31 09:51:04,452 - mmdet - INFO - Epoch [10][2600/7330] lr: 1.000e-05, eta: 5:59:49, time: 1.100, data_time: 0.104, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0393, loss_cls: 0.1474, acc: 94.3093, loss_bbox: 0.2053, loss_mask: 0.2139, loss: 0.6179 2024-05-31 09:51:59,235 - mmdet - INFO - Epoch [10][2650/7330] lr: 1.000e-05, eta: 5:58:53, time: 1.096, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0400, loss_cls: 0.1570, acc: 93.9978, loss_bbox: 0.2110, loss_mask: 0.2127, loss: 0.6335 2024-05-31 09:52:53,386 - mmdet - INFO - Epoch [10][2700/7330] lr: 1.000e-05, eta: 5:57:57, time: 1.083, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0367, loss_cls: 0.1367, acc: 94.7866, loss_bbox: 0.1868, loss_mask: 0.2038, loss: 0.5749 2024-05-31 09:53:47,175 - mmdet - INFO - Epoch [10][2750/7330] lr: 1.000e-05, eta: 5:57:00, time: 1.076, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0387, loss_cls: 0.1467, acc: 94.2590, loss_bbox: 0.2006, loss_mask: 0.2083, loss: 0.6058 2024-05-31 09:54:40,612 - mmdet - INFO - Epoch [10][2800/7330] lr: 1.000e-05, eta: 5:56:04, time: 1.069, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0370, loss_cls: 0.1433, acc: 94.5066, loss_bbox: 0.1906, loss_mask: 0.2085, loss: 0.5901 2024-05-31 09:55:39,069 - mmdet - INFO - Epoch [10][2850/7330] lr: 1.000e-05, eta: 5:55:09, time: 1.169, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0387, loss_cls: 0.1432, acc: 94.4312, loss_bbox: 0.1935, loss_mask: 0.2038, loss: 0.5904 2024-05-31 09:56:36,588 - mmdet - INFO - Epoch [10][2900/7330] lr: 1.000e-05, eta: 5:54:14, time: 1.150, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0385, loss_cls: 0.1513, acc: 94.1770, loss_bbox: 0.2023, loss_mask: 0.2106, loss: 0.6145 2024-05-31 09:57:32,753 - mmdet - INFO - Epoch [10][2950/7330] lr: 1.000e-05, eta: 5:53:19, time: 1.123, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0394, loss_cls: 0.1488, acc: 94.2795, loss_bbox: 0.2002, loss_mask: 0.2101, loss: 0.6098 2024-05-31 09:58:28,788 - mmdet - INFO - Epoch [10][3000/7330] lr: 1.000e-05, eta: 5:52:23, time: 1.121, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0377, loss_cls: 0.1460, acc: 94.5293, loss_bbox: 0.1955, loss_mask: 0.2114, loss: 0.6019 2024-05-31 09:59:23,641 - mmdet - INFO - Epoch [10][3050/7330] lr: 1.000e-05, eta: 5:51:27, time: 1.097, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0129, loss_rpn_bbox: 0.0401, loss_cls: 0.1535, acc: 94.1184, loss_bbox: 0.2048, loss_mask: 0.2130, loss: 0.6242 2024-05-31 10:00:19,878 - mmdet - INFO - Epoch [10][3100/7330] lr: 1.000e-05, eta: 
5:50:32, time: 1.125, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0370, loss_cls: 0.1463, acc: 94.2871, loss_bbox: 0.1935, loss_mask: 0.2076, loss: 0.5954 2024-05-31 10:01:18,493 - mmdet - INFO - Epoch [10][3150/7330] lr: 1.000e-05, eta: 5:49:37, time: 1.172, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0393, loss_cls: 0.1504, acc: 94.2395, loss_bbox: 0.2018, loss_mask: 0.2117, loss: 0.6141 2024-05-31 10:02:12,985 - mmdet - INFO - Epoch [10][3200/7330] lr: 1.000e-05, eta: 5:48:41, time: 1.090, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0373, loss_cls: 0.1477, acc: 94.3081, loss_bbox: 0.1968, loss_mask: 0.2118, loss: 0.6055 2024-05-31 10:03:06,185 - mmdet - INFO - Epoch [10][3250/7330] lr: 1.000e-05, eta: 5:47:44, time: 1.064, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0349, loss_cls: 0.1374, acc: 94.7212, loss_bbox: 0.1829, loss_mask: 0.2059, loss: 0.5720 2024-05-31 10:03:59,966 - mmdet - INFO - Epoch [10][3300/7330] lr: 1.000e-05, eta: 5:46:48, time: 1.076, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0356, loss_cls: 0.1384, acc: 94.6604, loss_bbox: 0.1916, loss_mask: 0.2064, loss: 0.5820 2024-05-31 10:04:56,444 - mmdet - INFO - Epoch [10][3350/7330] lr: 1.000e-05, eta: 5:45:53, time: 1.130, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0403, loss_cls: 0.1489, acc: 94.2302, loss_bbox: 0.2006, loss_mask: 0.2086, loss: 0.6103 2024-05-31 10:05:50,349 - mmdet - INFO - Epoch [10][3400/7330] lr: 1.000e-05, eta: 5:44:57, time: 1.078, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0386, loss_cls: 0.1505, acc: 94.2659, loss_bbox: 0.2065, loss_mask: 0.2137, loss: 0.6202 2024-05-31 10:06:46,617 - mmdet - INFO - Epoch [10][3450/7330] lr: 1.000e-05, eta: 5:44:01, time: 1.125, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0393, loss_cls: 0.1489, acc: 94.3071, loss_bbox: 0.2010, loss_mask: 0.2125, loss: 0.6140 2024-05-31 10:07:42,762 - mmdet - INFO - Epoch [10][3500/7330] lr: 1.000e-05, eta: 5:43:06, time: 1.123, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0374, loss_cls: 0.1454, acc: 94.4607, loss_bbox: 0.1974, loss_mask: 0.2167, loss: 0.6076 2024-05-31 10:08:39,348 - mmdet - INFO - Epoch [10][3550/7330] lr: 1.000e-05, eta: 5:42:10, time: 1.132, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0380, loss_cls: 0.1524, acc: 94.1519, loss_bbox: 0.2054, loss_mask: 0.2103, loss: 0.6175 2024-05-31 10:09:35,312 - mmdet - INFO - Epoch [10][3600/7330] lr: 1.000e-05, eta: 5:41:15, time: 1.119, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0367, loss_cls: 0.1463, acc: 94.4246, loss_bbox: 0.1992, loss_mask: 0.2101, loss: 0.6038 2024-05-31 10:10:30,800 - mmdet - INFO - Epoch [10][3650/7330] lr: 1.000e-05, eta: 5:40:19, time: 1.110, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0377, loss_cls: 0.1413, acc: 94.5364, loss_bbox: 0.1915, loss_mask: 0.2069, loss: 0.5888 2024-05-31 10:11:24,621 - mmdet - INFO - Epoch [10][3700/7330] lr: 1.000e-05, eta: 5:39:23, time: 1.076, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0372, loss_cls: 0.1459, acc: 94.3877, loss_bbox: 0.2001, loss_mask: 0.2077, loss: 0.6029 2024-05-31 10:12:23,300 - mmdet - INFO - Epoch [10][3750/7330] lr: 1.000e-05, eta: 5:38:28, time: 1.174, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 
0.0358, loss_cls: 0.1440, acc: 94.3936, loss_bbox: 0.1986, loss_mask: 0.2095, loss: 0.5991 2024-05-31 10:13:16,196 - mmdet - INFO - Epoch [10][3800/7330] lr: 1.000e-05, eta: 5:37:31, time: 1.058, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0348, loss_cls: 0.1399, acc: 94.5669, loss_bbox: 0.1927, loss_mask: 0.2089, loss: 0.5867 2024-05-31 10:14:09,796 - mmdet - INFO - Epoch [10][3850/7330] lr: 1.000e-05, eta: 5:36:35, time: 1.072, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0360, loss_cls: 0.1411, acc: 94.5840, loss_bbox: 0.1878, loss_mask: 0.2043, loss: 0.5800 2024-05-31 10:15:04,015 - mmdet - INFO - Epoch [10][3900/7330] lr: 1.000e-05, eta: 5:35:39, time: 1.084, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0381, loss_cls: 0.1475, acc: 94.3530, loss_bbox: 0.1992, loss_mask: 0.2085, loss: 0.6052 2024-05-31 10:16:00,087 - mmdet - INFO - Epoch [10][3950/7330] lr: 1.000e-05, eta: 5:34:44, time: 1.121, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0399, loss_cls: 0.1505, acc: 94.2263, loss_bbox: 0.2025, loss_mask: 0.2118, loss: 0.6163 2024-05-31 10:16:53,964 - mmdet - INFO - Epoch [10][4000/7330] lr: 1.000e-05, eta: 5:33:48, time: 1.077, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0374, loss_cls: 0.1433, acc: 94.5012, loss_bbox: 0.1923, loss_mask: 0.2058, loss: 0.5901 2024-05-31 10:17:48,287 - mmdet - INFO - Epoch [10][4050/7330] lr: 1.000e-05, eta: 5:32:52, time: 1.086, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0397, loss_cls: 0.1511, acc: 94.2510, loss_bbox: 0.2057, loss_mask: 0.2146, loss: 0.6238 2024-05-31 10:18:46,394 - mmdet - INFO - Epoch [10][4100/7330] lr: 1.000e-05, eta: 5:31:56, time: 1.162, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0378, loss_cls: 0.1511, acc: 94.2600, loss_bbox: 0.2002, loss_mask: 0.2123, loss: 0.6126 2024-05-31 10:19:42,388 - mmdet - INFO - Epoch [10][4150/7330] lr: 1.000e-05, eta: 5:31:01, time: 1.120, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0374, loss_cls: 0.1488, acc: 94.3232, loss_bbox: 0.2028, loss_mask: 0.2167, loss: 0.6171 2024-05-31 10:20:40,122 - mmdet - INFO - Epoch [10][4200/7330] lr: 1.000e-05, eta: 5:30:06, time: 1.155, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0364, loss_cls: 0.1384, acc: 94.6230, loss_bbox: 0.1887, loss_mask: 0.2059, loss: 0.5805 2024-05-31 10:21:33,488 - mmdet - INFO - Epoch [10][4250/7330] lr: 1.000e-05, eta: 5:29:10, time: 1.067, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0367, loss_cls: 0.1429, acc: 94.5166, loss_bbox: 0.1968, loss_mask: 0.2062, loss: 0.5932 2024-05-31 10:22:30,355 - mmdet - INFO - Epoch [10][4300/7330] lr: 1.000e-05, eta: 5:28:14, time: 1.137, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0393, loss_cls: 0.1504, acc: 94.2625, loss_bbox: 0.2036, loss_mask: 0.2107, loss: 0.6154 2024-05-31 10:23:26,111 - mmdet - INFO - Epoch [10][4350/7330] lr: 1.000e-05, eta: 5:27:18, time: 1.115, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0377, loss_cls: 0.1446, acc: 94.4614, loss_bbox: 0.1939, loss_mask: 0.2095, loss: 0.5967 2024-05-31 10:24:20,420 - mmdet - INFO - Epoch [10][4400/7330] lr: 1.000e-05, eta: 5:26:22, time: 1.086, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0383, loss_cls: 0.1481, acc: 94.3044, loss_bbox: 0.1975, loss_mask: 0.2111, loss: 0.6066 
2024-05-31 10:25:13,395 - mmdet - INFO - Epoch [10][4450/7330] lr: 1.000e-05, eta: 5:25:26, time: 1.060, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0362, loss_cls: 0.1433, acc: 94.5632, loss_bbox: 0.1909, loss_mask: 0.2065, loss: 0.5881 2024-05-31 10:26:06,778 - mmdet - INFO - Epoch [10][4500/7330] lr: 1.000e-05, eta: 5:24:30, time: 1.068, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0363, loss_cls: 0.1447, acc: 94.4897, loss_bbox: 0.1922, loss_mask: 0.2066, loss: 0.5900 2024-05-31 10:27:03,155 - mmdet - INFO - Epoch [10][4550/7330] lr: 1.000e-05, eta: 5:23:34, time: 1.128, data_time: 0.053, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0385, loss_cls: 0.1440, acc: 94.4812, loss_bbox: 0.1938, loss_mask: 0.2078, loss: 0.5946 2024-05-31 10:27:58,968 - mmdet - INFO - Epoch [10][4600/7330] lr: 1.000e-05, eta: 5:22:39, time: 1.116, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0379, loss_cls: 0.1429, acc: 94.5405, loss_bbox: 0.1921, loss_mask: 0.2032, loss: 0.5870 2024-05-31 10:28:52,363 - mmdet - INFO - Epoch [10][4650/7330] lr: 1.000e-05, eta: 5:21:43, time: 1.068, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0363, loss_cls: 0.1461, acc: 94.4221, loss_bbox: 0.1934, loss_mask: 0.2058, loss: 0.5924 2024-05-31 10:29:50,262 - mmdet - INFO - Epoch [10][4700/7330] lr: 1.000e-05, eta: 5:20:47, time: 1.158, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0357, loss_cls: 0.1431, acc: 94.5024, loss_bbox: 0.1947, loss_mask: 0.2108, loss: 0.5955 2024-05-31 10:30:48,347 - mmdet - INFO - Epoch [10][4750/7330] lr: 1.000e-05, eta: 5:19:52, time: 1.162, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0375, loss_cls: 0.1436, acc: 94.4231, loss_bbox: 0.1966, loss_mask: 0.2075, loss: 0.5955 2024-05-31 10:31:43,220 - mmdet - INFO - Epoch [10][4800/7330] lr: 1.000e-05, eta: 5:18:57, time: 1.097, data_time: 0.095, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0394, loss_cls: 0.1459, acc: 94.3850, loss_bbox: 0.1968, loss_mask: 0.2092, loss: 0.6027 2024-05-31 10:32:37,516 - mmdet - INFO - Epoch [10][4850/7330] lr: 1.000e-05, eta: 5:18:01, time: 1.086, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0380, loss_cls: 0.1453, acc: 94.3179, loss_bbox: 0.1980, loss_mask: 0.2084, loss: 0.6007 2024-05-31 10:33:34,018 - mmdet - INFO - Epoch [10][4900/7330] lr: 1.000e-05, eta: 5:17:05, time: 1.130, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0369, loss_cls: 0.1402, acc: 94.6597, loss_bbox: 0.1862, loss_mask: 0.2090, loss: 0.5843 2024-05-31 10:34:29,715 - mmdet - INFO - Epoch [10][4950/7330] lr: 1.000e-05, eta: 5:16:09, time: 1.114, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0386, loss_cls: 0.1481, acc: 94.3242, loss_bbox: 0.1961, loss_mask: 0.2064, loss: 0.6015 2024-05-31 10:35:23,333 - mmdet - INFO - Epoch [10][5000/7330] lr: 1.000e-05, eta: 5:15:13, time: 1.072, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0365, loss_cls: 0.1405, acc: 94.5237, loss_bbox: 0.1905, loss_mask: 0.2063, loss: 0.5850 2024-05-31 10:36:17,405 - mmdet - INFO - Epoch [10][5050/7330] lr: 1.000e-05, eta: 5:14:17, time: 1.081, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0379, loss_cls: 0.1465, acc: 94.2427, loss_bbox: 0.1987, loss_mask: 0.2131, loss: 0.6080 2024-05-31 10:37:13,408 - mmdet - INFO - Epoch [10][5100/7330] lr: 1.000e-05, eta: 5:13:22, 
time: 1.120, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0379, loss_cls: 0.1511, acc: 94.2202, loss_bbox: 0.2053, loss_mask: 0.2121, loss: 0.6175 2024-05-31 10:38:07,730 - mmdet - INFO - Epoch [10][5150/7330] lr: 1.000e-05, eta: 5:12:26, time: 1.086, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0410, loss_cls: 0.1502, acc: 94.1475, loss_bbox: 0.2050, loss_mask: 0.2103, loss: 0.6194 2024-05-31 10:39:06,137 - mmdet - INFO - Epoch [10][5200/7330] lr: 1.000e-05, eta: 5:11:31, time: 1.168, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0350, loss_cls: 0.1453, acc: 94.4006, loss_bbox: 0.1974, loss_mask: 0.2065, loss: 0.5948 2024-05-31 10:40:01,863 - mmdet - INFO - Epoch [10][5250/7330] lr: 1.000e-05, eta: 5:10:35, time: 1.114, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0367, loss_cls: 0.1417, acc: 94.5122, loss_bbox: 0.1931, loss_mask: 0.2077, loss: 0.5894 2024-05-31 10:41:00,423 - mmdet - INFO - Epoch [10][5300/7330] lr: 1.000e-05, eta: 5:09:40, time: 1.171, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0406, loss_cls: 0.1516, acc: 94.1985, loss_bbox: 0.2050, loss_mask: 0.2102, loss: 0.6180 2024-05-31 10:41:53,936 - mmdet - INFO - Epoch [10][5350/7330] lr: 1.000e-05, eta: 5:08:44, time: 1.070, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0375, loss_cls: 0.1460, acc: 94.4053, loss_bbox: 0.1967, loss_mask: 0.2096, loss: 0.6008 2024-05-31 10:42:47,596 - mmdet - INFO - Epoch [10][5400/7330] lr: 1.000e-05, eta: 5:07:48, time: 1.073, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0389, loss_cls: 0.1501, acc: 94.2478, loss_bbox: 0.2003, loss_mask: 0.2115, loss: 0.6125 2024-05-31 10:43:41,284 - mmdet - INFO - Epoch [10][5450/7330] lr: 1.000e-05, eta: 5:06:52, time: 1.074, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0360, loss_cls: 0.1438, acc: 94.4177, loss_bbox: 0.1900, loss_mask: 0.2095, loss: 0.5901 2024-05-31 10:44:37,364 - mmdet - INFO - Epoch [10][5500/7330] lr: 1.000e-05, eta: 5:05:56, time: 1.122, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0361, loss_cls: 0.1399, acc: 94.6208, loss_bbox: 0.1912, loss_mask: 0.2063, loss: 0.5843 2024-05-31 10:45:30,887 - mmdet - INFO - Epoch [10][5550/7330] lr: 1.000e-05, eta: 5:05:00, time: 1.071, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0370, loss_cls: 0.1487, acc: 94.2332, loss_bbox: 0.2023, loss_mask: 0.2102, loss: 0.6105 2024-05-31 10:46:26,934 - mmdet - INFO - Epoch [10][5600/7330] lr: 1.000e-05, eta: 5:04:04, time: 1.121, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0372, loss_cls: 0.1428, acc: 94.5322, loss_bbox: 0.1965, loss_mask: 0.2142, loss: 0.6022 2024-05-31 10:47:20,908 - mmdet - INFO - Epoch [10][5650/7330] lr: 1.000e-05, eta: 5:03:08, time: 1.079, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0364, loss_cls: 0.1427, acc: 94.5466, loss_bbox: 0.1943, loss_mask: 0.2116, loss: 0.5962 2024-05-31 10:48:17,412 - mmdet - INFO - Epoch [10][5700/7330] lr: 1.000e-05, eta: 5:02:13, time: 1.130, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0384, loss_cls: 0.1437, acc: 94.4651, loss_bbox: 0.1939, loss_mask: 0.2076, loss: 0.5951 2024-05-31 10:49:13,120 - mmdet - INFO - Epoch [10][5750/7330] lr: 1.000e-05, eta: 5:01:17, time: 1.114, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0378, 
loss_cls: 0.1416, acc: 94.5164, loss_bbox: 0.1966, loss_mask: 0.2082, loss: 0.5951 2024-05-31 10:50:10,840 - mmdet - INFO - Epoch [10][5800/7330] lr: 1.000e-05, eta: 5:00:22, time: 1.154, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0365, loss_cls: 0.1387, acc: 94.6851, loss_bbox: 0.1887, loss_mask: 0.2015, loss: 0.5763 2024-05-31 10:51:07,310 - mmdet - INFO - Epoch [10][5850/7330] lr: 1.000e-05, eta: 4:59:27, time: 1.129, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0356, loss_cls: 0.1414, acc: 94.5413, loss_bbox: 0.1956, loss_mask: 0.2079, loss: 0.5908 2024-05-31 10:52:03,444 - mmdet - INFO - Epoch [10][5900/7330] lr: 1.000e-05, eta: 4:58:31, time: 1.123, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0363, loss_cls: 0.1387, acc: 94.7073, loss_bbox: 0.1913, loss_mask: 0.2096, loss: 0.5868 2024-05-31 10:52:58,199 - mmdet - INFO - Epoch [10][5950/7330] lr: 1.000e-05, eta: 4:57:35, time: 1.095, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0393, loss_cls: 0.1499, acc: 94.1716, loss_bbox: 0.2036, loss_mask: 0.2143, loss: 0.6183 2024-05-31 10:53:52,278 - mmdet - INFO - Epoch [10][6000/7330] lr: 1.000e-05, eta: 4:56:39, time: 1.082, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0379, loss_cls: 0.1474, acc: 94.3240, loss_bbox: 0.1998, loss_mask: 0.2112, loss: 0.6074 2024-05-31 10:54:46,372 - mmdet - INFO - Epoch [10][6050/7330] lr: 1.000e-05, eta: 4:55:43, time: 1.082, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0125, loss_rpn_bbox: 0.0387, loss_cls: 0.1500, acc: 94.2781, loss_bbox: 0.1977, loss_mask: 0.2062, loss: 0.6050 2024-05-31 10:55:42,126 - mmdet - INFO - Epoch [10][6100/7330] lr: 1.000e-05, eta: 4:54:47, time: 1.115, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0368, loss_cls: 0.1439, acc: 94.3860, loss_bbox: 0.1972, loss_mask: 0.2074, loss: 0.5965 2024-05-31 10:56:35,373 - mmdet - INFO - Epoch [10][6150/7330] lr: 1.000e-05, eta: 4:53:51, time: 1.065, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0355, loss_cls: 0.1465, acc: 94.3188, loss_bbox: 0.1969, loss_mask: 0.2112, loss: 0.6008 2024-05-31 10:57:31,351 - mmdet - INFO - Epoch [10][6200/7330] lr: 1.000e-05, eta: 4:52:56, time: 1.120, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0379, loss_cls: 0.1419, acc: 94.4587, loss_bbox: 0.1926, loss_mask: 0.2063, loss: 0.5900 2024-05-31 10:58:24,902 - mmdet - INFO - Epoch [10][6250/7330] lr: 1.000e-05, eta: 4:52:00, time: 1.071, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0365, loss_cls: 0.1425, acc: 94.4971, loss_bbox: 0.1937, loss_mask: 0.2088, loss: 0.5921 2024-05-31 10:59:21,388 - mmdet - INFO - Epoch [10][6300/7330] lr: 1.000e-05, eta: 4:51:04, time: 1.130, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0408, loss_cls: 0.1589, acc: 93.9700, loss_bbox: 0.2099, loss_mask: 0.2106, loss: 0.6320 2024-05-31 11:00:18,986 - mmdet - INFO - Epoch [10][6350/7330] lr: 1.000e-05, eta: 4:50:09, time: 1.152, data_time: 0.051, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0358, loss_cls: 0.1373, acc: 94.8179, loss_bbox: 0.1853, loss_mask: 0.2053, loss: 0.5740 2024-05-31 11:01:14,617 - mmdet - INFO - Epoch [10][6400/7330] lr: 1.000e-05, eta: 4:49:13, time: 1.113, data_time: 0.056, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0351, loss_cls: 0.1320, acc: 94.9336, loss_bbox: 0.1823, loss_mask: 0.1994, loss: 0.5585 2024-05-31 
11:02:10,993 - mmdet - INFO - Epoch [10][6450/7330] lr: 1.000e-05, eta: 4:48:18, time: 1.128, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0385, loss_cls: 0.1460, acc: 94.4290, loss_bbox: 0.2029, loss_mask: 0.2145, loss: 0.6121 2024-05-31 11:03:07,150 - mmdet - INFO - Epoch [10][6500/7330] lr: 1.000e-05, eta: 4:47:22, time: 1.123, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0368, loss_cls: 0.1427, acc: 94.5227, loss_bbox: 0.1950, loss_mask: 0.2066, loss: 0.5926 2024-05-31 11:04:00,677 - mmdet - INFO - Epoch [10][6550/7330] lr: 1.000e-05, eta: 4:46:26, time: 1.070, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0364, loss_cls: 0.1454, acc: 94.3911, loss_bbox: 0.1981, loss_mask: 0.2081, loss: 0.5990 2024-05-31 11:04:54,796 - mmdet - INFO - Epoch [10][6600/7330] lr: 1.000e-05, eta: 4:45:30, time: 1.082, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0353, loss_cls: 0.1416, acc: 94.5850, loss_bbox: 0.1916, loss_mask: 0.2087, loss: 0.5881 2024-05-31 11:05:48,549 - mmdet - INFO - Epoch [10][6650/7330] lr: 1.000e-05, eta: 4:44:34, time: 1.075, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0372, loss_cls: 0.1527, acc: 94.1421, loss_bbox: 0.2067, loss_mask: 0.2093, loss: 0.6179 2024-05-31 11:06:44,985 - mmdet - INFO - Epoch [10][6700/7330] lr: 1.000e-05, eta: 4:43:39, time: 1.129, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0372, loss_cls: 0.1435, acc: 94.5198, loss_bbox: 0.1948, loss_mask: 0.2063, loss: 0.5936 2024-05-31 11:07:39,749 - mmdet - INFO - Epoch [10][6750/7330] lr: 1.000e-05, eta: 4:42:43, time: 1.095, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0398, loss_cls: 0.1537, acc: 94.0815, loss_bbox: 0.2111, loss_mask: 0.2126, loss: 0.6281 2024-05-31 11:08:35,716 - mmdet - INFO - Epoch [10][6800/7330] lr: 1.000e-05, eta: 4:41:47, time: 1.119, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0388, loss_cls: 0.1489, acc: 94.2771, loss_bbox: 0.2042, loss_mask: 0.2167, loss: 0.6200 2024-05-31 11:09:29,981 - mmdet - INFO - Epoch [10][6850/7330] lr: 1.000e-05, eta: 4:40:51, time: 1.085, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0377, loss_cls: 0.1467, acc: 94.3186, loss_bbox: 0.1988, loss_mask: 0.2074, loss: 0.6014 2024-05-31 11:10:30,313 - mmdet - INFO - Epoch [10][6900/7330] lr: 1.000e-05, eta: 4:39:56, time: 1.207, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0358, loss_cls: 0.1395, acc: 94.6221, loss_bbox: 0.1926, loss_mask: 0.2036, loss: 0.5822 2024-05-31 11:11:26,419 - mmdet - INFO - Epoch [10][6950/7330] lr: 1.000e-05, eta: 4:39:01, time: 1.122, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0376, loss_cls: 0.1476, acc: 94.3760, loss_bbox: 0.2008, loss_mask: 0.2126, loss: 0.6097 2024-05-31 11:12:22,084 - mmdet - INFO - Epoch [10][7000/7330] lr: 1.000e-05, eta: 4:38:05, time: 1.113, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0366, loss_cls: 0.1401, acc: 94.6433, loss_bbox: 0.1908, loss_mask: 0.2089, loss: 0.5878 2024-05-31 11:13:16,079 - mmdet - INFO - Epoch [10][7050/7330] lr: 1.000e-05, eta: 4:37:09, time: 1.080, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0372, loss_cls: 0.1438, acc: 94.3958, loss_bbox: 0.1944, loss_mask: 0.2051, loss: 0.5920 2024-05-31 11:14:12,056 - mmdet - INFO - Epoch [10][7100/7330] lr: 1.000e-05, eta: 4:36:14, time: 1.119, 
data_time: 0.049, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0356, loss_cls: 0.1372, acc: 94.8145, loss_bbox: 0.1825, loss_mask: 0.1982, loss: 0.5637
2024-05-31 11:15:05,892 - mmdet - INFO - Epoch [10][7150/7330] lr: 1.000e-05, eta: 4:35:18, time: 1.077, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0397, loss_cls: 0.1461, acc: 94.3083, loss_bbox: 0.1981, loss_mask: 0.2108, loss: 0.6066
2024-05-31 11:15:59,846 - mmdet - INFO - Epoch [10][7200/7330] lr: 1.000e-05, eta: 4:34:22, time: 1.079, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0367, loss_cls: 0.1458, acc: 94.3865, loss_bbox: 0.1988, loss_mask: 0.2047, loss: 0.5964
2024-05-31 11:16:53,783 - mmdet - INFO - Epoch [10][7250/7330] lr: 1.000e-05, eta: 4:33:26, time: 1.079, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0386, loss_cls: 0.1500, acc: 94.2729, loss_bbox: 0.2033, loss_mask: 0.2133, loss: 0.6156
2024-05-31 11:17:50,468 - mmdet - INFO - Epoch [10][7300/7330] lr: 1.000e-05, eta: 4:32:30, time: 1.134, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0374, loss_cls: 0.1433, acc: 94.4883, loss_bbox: 0.1949, loss_mask: 0.2074, loss: 0.5948
2024-05-31 11:18:23,550 - mmdet - INFO - Saving checkpoint at 10 epochs
2024-05-31 11:20:42,006 - mmdet - INFO - Evaluating bbox...
2024-05-31 11:21:01,559 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.492
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.720
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.541
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.329
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.528
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.659
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.604
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.604
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.604
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.427
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.646
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.766
2024-05-31 11:21:01,559 - mmdet - INFO - Evaluating segm...
2024-05-31 11:21:25,870 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.440
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.686
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.471
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.233
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.472
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.649
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.546
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.546
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.546
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.355
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.591
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.725
2024-05-31 11:21:26,161 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-31 11:21:26,164 - mmdet - INFO - Epoch(val) [10][625] bbox_mAP: 0.4920, bbox_mAP_50: 0.7200, bbox_mAP_75: 0.5410, bbox_mAP_s: 0.3290, bbox_mAP_m: 0.5280, bbox_mAP_l: 0.6590, bbox_mAP_copypaste: 0.492 0.720 0.541 0.329 0.528 0.659, segm_mAP: 0.4400, segm_mAP_50: 0.6860, segm_mAP_75: 0.4710, segm_mAP_s: 0.2330, segm_mAP_m: 0.4720, segm_mAP_l: 0.6490, segm_mAP_copypaste: 0.440 0.686 0.471 0.233 0.472 0.649
2024-05-31 11:22:34,632 - mmdet - INFO - Epoch [11][50/7330] lr: 1.000e-05, eta: 4:30:57, time: 1.369, data_time: 0.153, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0377, loss_cls: 0.1403, acc: 94.5435, loss_bbox: 0.1949, loss_mask: 0.2095, loss: 0.5936
2024-05-31 11:23:28,854 - mmdet - INFO - Epoch [11][100/7330] lr: 1.000e-05, eta: 4:30:01, time: 1.084, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0361, loss_cls: 0.1415, acc: 94.5457, loss_bbox: 0.1969, loss_mask: 0.2074, loss: 0.5926
2024-05-31 11:24:25,084 - mmdet - INFO - Epoch [11][150/7330] lr: 1.000e-05, eta: 4:29:06, time: 1.125, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0378, loss_cls: 0.1425, acc: 94.4565, loss_bbox: 0.1923, loss_mask: 0.2116, loss: 0.5950
2024-05-31 11:25:19,065 - mmdet - INFO - Epoch [11][200/7330] lr: 1.000e-05, eta: 4:28:10, time: 1.080, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0365, loss_cls: 0.1421, acc: 94.5762, loss_bbox: 0.1958, loss_mask: 0.2075, loss: 0.5927
2024-05-31 11:26:15,447 - mmdet - INFO - Epoch [11][250/7330] lr: 1.000e-05, eta: 4:27:14, time: 1.128, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0372, loss_cls: 0.1390, acc: 94.6797, loss_bbox: 0.1922, loss_mask: 0.2073, loss: 0.5861
2024-05-31 11:27:09,080 - mmdet - INFO - Epoch [11][300/7330] lr: 1.000e-05, eta: 4:26:18, time: 1.073, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0370, loss_cls: 0.1363, acc: 94.7417, loss_bbox: 0.1881, loss_mask: 0.2053, loss: 0.5763
2024-05-31 11:28:07,628 - mmdet - INFO - Epoch [11][350/7330] lr: 1.000e-05, eta: 4:25:23, time: 1.171, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0350, loss_cls: 0.1373, acc: 94.8062, loss_bbox: 0.1869, loss_mask: 0.2027, loss: 0.5718
2024-05-31 11:29:02,177 - mmdet - INFO - Epoch [11][400/7330] lr: 1.000e-05, eta: 4:24:27, time: 1.091, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0384, loss_cls: 0.1456, acc: 94.3396, loss_bbox:
0.2005, loss_mask: 0.2093, loss: 0.6055 2024-05-31 11:29:58,827 - mmdet - INFO - Epoch [11][450/7330] lr: 1.000e-05, eta: 4:23:32, time: 1.133, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0386, loss_cls: 0.1546, acc: 94.0415, loss_bbox: 0.2103, loss_mask: 0.2167, loss: 0.6322 2024-05-31 11:30:52,425 - mmdet - INFO - Epoch [11][500/7330] lr: 1.000e-05, eta: 4:22:36, time: 1.072, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0347, loss_cls: 0.1380, acc: 94.6765, loss_bbox: 0.1902, loss_mask: 0.2052, loss: 0.5791 2024-05-31 11:31:46,968 - mmdet - INFO - Epoch [11][550/7330] lr: 1.000e-05, eta: 4:21:40, time: 1.091, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0373, loss_cls: 0.1455, acc: 94.4343, loss_bbox: 0.2000, loss_mask: 0.2066, loss: 0.6002 2024-05-31 11:32:44,194 - mmdet - INFO - Epoch [11][600/7330] lr: 1.000e-05, eta: 4:20:45, time: 1.145, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0379, loss_cls: 0.1380, acc: 94.6765, loss_bbox: 0.1883, loss_mask: 0.2049, loss: 0.5795 2024-05-31 11:33:42,267 - mmdet - INFO - Epoch [11][650/7330] lr: 1.000e-05, eta: 4:19:49, time: 1.161, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0376, loss_cls: 0.1462, acc: 94.3779, loss_bbox: 0.1992, loss_mask: 0.2107, loss: 0.6046 2024-05-31 11:34:36,429 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 11:34:36,429 - mmdet - INFO - Epoch [11][700/7330] lr: 1.000e-05, eta: 4:18:53, time: 1.083, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0387, loss_cls: 0.1490, acc: 94.3198, loss_bbox: 0.2026, loss_mask: 0.2103, loss: 0.6126 2024-05-31 11:35:31,977 - mmdet - INFO - Epoch [11][750/7330] lr: 1.000e-05, eta: 4:17:58, time: 1.111, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0367, loss_cls: 0.1340, acc: 94.8376, loss_bbox: 0.1883, loss_mask: 0.2052, loss: 0.5748 2024-05-31 11:36:29,004 - mmdet - INFO - Epoch [11][800/7330] lr: 1.000e-05, eta: 4:17:02, time: 1.141, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0374, loss_cls: 0.1429, acc: 94.4785, loss_bbox: 0.1946, loss_mask: 0.2086, loss: 0.5943 2024-05-31 11:37:24,679 - mmdet - INFO - Epoch [11][850/7330] lr: 1.000e-05, eta: 4:16:07, time: 1.113, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0343, loss_cls: 0.1389, acc: 94.6917, loss_bbox: 0.1916, loss_mask: 0.2070, loss: 0.5832 2024-05-31 11:38:21,134 - mmdet - INFO - Epoch [11][900/7330] lr: 1.000e-05, eta: 4:15:11, time: 1.129, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0365, loss_cls: 0.1401, acc: 94.5398, loss_bbox: 0.1898, loss_mask: 0.2022, loss: 0.5796 2024-05-31 11:39:15,258 - mmdet - INFO - Epoch [11][950/7330] lr: 1.000e-05, eta: 4:14:15, time: 1.083, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0382, loss_cls: 0.1444, acc: 94.3748, loss_bbox: 0.1971, loss_mask: 0.2112, loss: 0.6019 2024-05-31 11:40:09,495 - mmdet - INFO - Epoch [11][1000/7330] lr: 1.000e-05, eta: 4:13:19, time: 1.085, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0385, loss_cls: 0.1408, acc: 94.5376, loss_bbox: 0.1925, loss_mask: 0.2033, loss: 0.5866 2024-05-31 11:41:06,076 - mmdet - INFO - Epoch [11][1050/7330] lr: 1.000e-05, eta: 4:12:24, time: 1.132, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0396, loss_cls: 0.1449, acc: 94.2959, 
loss_bbox: 0.2009, loss_mask: 0.2116, loss: 0.6088 2024-05-31 11:42:00,136 - mmdet - INFO - Epoch [11][1100/7330] lr: 1.000e-05, eta: 4:11:28, time: 1.081, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0364, loss_cls: 0.1449, acc: 94.3936, loss_bbox: 0.1970, loss_mask: 0.2089, loss: 0.5975 2024-05-31 11:42:56,401 - mmdet - INFO - Epoch [11][1150/7330] lr: 1.000e-05, eta: 4:10:33, time: 1.125, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0380, loss_cls: 0.1457, acc: 94.3481, loss_bbox: 0.2010, loss_mask: 0.2089, loss: 0.6052 2024-05-31 11:43:52,898 - mmdet - INFO - Epoch [11][1200/7330] lr: 1.000e-05, eta: 4:09:37, time: 1.130, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0362, loss_cls: 0.1385, acc: 94.6050, loss_bbox: 0.1906, loss_mask: 0.2080, loss: 0.5837 2024-05-31 11:44:49,197 - mmdet - INFO - Epoch [11][1250/7330] lr: 1.000e-05, eta: 4:08:42, time: 1.126, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0369, loss_cls: 0.1435, acc: 94.4761, loss_bbox: 0.1937, loss_mask: 0.2098, loss: 0.5945 2024-05-31 11:45:44,214 - mmdet - INFO - Epoch [11][1300/7330] lr: 1.000e-05, eta: 4:07:46, time: 1.100, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0397, loss_cls: 0.1475, acc: 94.3105, loss_bbox: 0.2029, loss_mask: 0.2145, loss: 0.6167 2024-05-31 11:46:43,061 - mmdet - INFO - Epoch [11][1350/7330] lr: 1.000e-05, eta: 4:06:51, time: 1.177, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0387, loss_cls: 0.1409, acc: 94.5874, loss_bbox: 0.1952, loss_mask: 0.2110, loss: 0.5957 2024-05-31 11:47:41,549 - mmdet - INFO - Epoch [11][1400/7330] lr: 1.000e-05, eta: 4:05:56, time: 1.170, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0388, loss_cls: 0.1345, acc: 94.7979, loss_bbox: 0.1867, loss_mask: 0.2064, loss: 0.5762 2024-05-31 11:48:36,163 - mmdet - INFO - Epoch [11][1450/7330] lr: 1.000e-05, eta: 4:05:00, time: 1.092, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0370, loss_cls: 0.1460, acc: 94.3416, loss_bbox: 0.1976, loss_mask: 0.2071, loss: 0.5985 2024-05-31 11:49:30,764 - mmdet - INFO - Epoch [11][1500/7330] lr: 1.000e-05, eta: 4:04:04, time: 1.092, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0388, loss_cls: 0.1481, acc: 94.3635, loss_bbox: 0.1972, loss_mask: 0.2108, loss: 0.6066 2024-05-31 11:50:25,006 - mmdet - INFO - Epoch [11][1550/7330] lr: 1.000e-05, eta: 4:03:08, time: 1.085, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0384, loss_cls: 0.1479, acc: 94.3120, loss_bbox: 0.1965, loss_mask: 0.2079, loss: 0.6020 2024-05-31 11:51:19,241 - mmdet - INFO - Epoch [11][1600/7330] lr: 1.000e-05, eta: 4:02:12, time: 1.085, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0350, loss_cls: 0.1391, acc: 94.6367, loss_bbox: 0.1923, loss_mask: 0.2037, loss: 0.5802 2024-05-31 11:52:15,900 - mmdet - INFO - Epoch [11][1650/7330] lr: 1.000e-05, eta: 4:01:17, time: 1.133, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0376, loss_cls: 0.1433, acc: 94.4622, loss_bbox: 0.1967, loss_mask: 0.2125, loss: 0.6005 2024-05-31 11:53:12,908 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 11:53:12,908 - mmdet - INFO - Epoch [11][1700/7330] lr: 1.000e-05, eta: 4:00:21, time: 1.140, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0363, loss_cls: 0.1464, 
acc: 94.3579, loss_bbox: 0.1984, loss_mask: 0.2086, loss: 0.5999 2024-05-31 11:54:06,996 - mmdet - INFO - Epoch [11][1750/7330] lr: 1.000e-05, eta: 3:59:25, time: 1.082, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0351, loss_cls: 0.1337, acc: 94.7744, loss_bbox: 0.1823, loss_mask: 0.2027, loss: 0.5641 2024-05-31 11:55:06,548 - mmdet - INFO - Epoch [11][1800/7330] lr: 1.000e-05, eta: 3:58:30, time: 1.191, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0383, loss_cls: 0.1467, acc: 94.3215, loss_bbox: 0.2017, loss_mask: 0.2122, loss: 0.6098 2024-05-31 11:56:01,072 - mmdet - INFO - Epoch [11][1850/7330] lr: 1.000e-05, eta: 3:57:35, time: 1.091, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0373, loss_cls: 0.1424, acc: 94.5278, loss_bbox: 0.1949, loss_mask: 0.2053, loss: 0.5911 2024-05-31 11:56:57,350 - mmdet - INFO - Epoch [11][1900/7330] lr: 1.000e-05, eta: 3:56:39, time: 1.126, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0370, loss_cls: 0.1498, acc: 94.2544, loss_bbox: 0.2033, loss_mask: 0.2147, loss: 0.6160 2024-05-31 11:57:58,630 - mmdet - INFO - Epoch [11][1950/7330] lr: 1.000e-05, eta: 3:55:44, time: 1.226, data_time: 0.100, memory: 24290, loss_rpn_cls: 0.0117, loss_rpn_bbox: 0.0393, loss_cls: 0.1503, acc: 94.2302, loss_bbox: 0.2008, loss_mask: 0.2072, loss: 0.6093 2024-05-31 11:58:53,047 - mmdet - INFO - Epoch [11][2000/7330] lr: 1.000e-05, eta: 3:54:49, time: 1.088, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0370, loss_cls: 0.1423, acc: 94.5630, loss_bbox: 0.1914, loss_mask: 0.2077, loss: 0.5903 2024-05-31 11:59:48,105 - mmdet - INFO - Epoch [11][2050/7330] lr: 1.000e-05, eta: 3:53:53, time: 1.101, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0374, loss_cls: 0.1416, acc: 94.4778, loss_bbox: 0.1938, loss_mask: 0.2104, loss: 0.5940 2024-05-31 12:00:42,157 - mmdet - INFO - Epoch [11][2100/7330] lr: 1.000e-05, eta: 3:52:57, time: 1.081, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0357, loss_cls: 0.1412, acc: 94.6309, loss_bbox: 0.1859, loss_mask: 0.2009, loss: 0.5741 2024-05-31 12:01:36,750 - mmdet - INFO - Epoch [11][2150/7330] lr: 1.000e-05, eta: 3:52:01, time: 1.092, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0388, loss_cls: 0.1438, acc: 94.4893, loss_bbox: 0.2017, loss_mask: 0.2051, loss: 0.6008 2024-05-31 12:02:31,055 - mmdet - INFO - Epoch [11][2200/7330] lr: 1.000e-05, eta: 3:51:05, time: 1.086, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0362, loss_cls: 0.1374, acc: 94.7764, loss_bbox: 0.1869, loss_mask: 0.2034, loss: 0.5744 2024-05-31 12:03:31,155 - mmdet - INFO - Epoch [11][2250/7330] lr: 1.000e-05, eta: 3:50:10, time: 1.202, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0390, loss_cls: 0.1419, acc: 94.4788, loss_bbox: 0.1974, loss_mask: 0.2101, loss: 0.5988 2024-05-31 12:04:25,490 - mmdet - INFO - Epoch [11][2300/7330] lr: 1.000e-05, eta: 3:49:14, time: 1.087, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0360, loss_cls: 0.1371, acc: 94.7188, loss_bbox: 0.1856, loss_mask: 0.1996, loss: 0.5694 2024-05-31 12:05:20,406 - mmdet - INFO - Epoch [11][2350/7330] lr: 1.000e-05, eta: 3:48:19, time: 1.098, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0370, loss_cls: 0.1457, acc: 94.4663, loss_bbox: 0.2009, loss_mask: 0.2115, loss: 0.6061 2024-05-31 12:06:19,489 - mmdet 
- INFO - Epoch [11][2400/7330] lr: 1.000e-05, eta: 3:47:24, time: 1.182, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0376, loss_cls: 0.1415, acc: 94.5496, loss_bbox: 0.1967, loss_mask: 0.2061, loss: 0.5929 2024-05-31 12:07:16,135 - mmdet - INFO - Epoch [11][2450/7330] lr: 1.000e-05, eta: 3:46:28, time: 1.133, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0376, loss_cls: 0.1435, acc: 94.4729, loss_bbox: 0.1963, loss_mask: 0.2078, loss: 0.5957 2024-05-31 12:08:15,130 - mmdet - INFO - Epoch [11][2500/7330] lr: 1.000e-05, eta: 3:45:33, time: 1.180, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0382, loss_cls: 0.1460, acc: 94.3767, loss_bbox: 0.1976, loss_mask: 0.2104, loss: 0.6026 2024-05-31 12:09:11,661 - mmdet - INFO - Epoch [11][2550/7330] lr: 1.000e-05, eta: 3:44:38, time: 1.131, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0368, loss_cls: 0.1396, acc: 94.5664, loss_bbox: 0.1899, loss_mask: 0.2093, loss: 0.5860 2024-05-31 12:10:06,032 - mmdet - INFO - Epoch [11][2600/7330] lr: 1.000e-05, eta: 3:43:42, time: 1.087, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0365, loss_cls: 0.1433, acc: 94.5215, loss_bbox: 0.1917, loss_mask: 0.2079, loss: 0.5896 2024-05-31 12:11:00,616 - mmdet - INFO - Epoch [11][2650/7330] lr: 1.000e-05, eta: 3:42:46, time: 1.092, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0384, loss_cls: 0.1499, acc: 94.3330, loss_bbox: 0.2011, loss_mask: 0.2092, loss: 0.6095 2024-05-31 12:11:55,227 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 12:11:55,227 - mmdet - INFO - Epoch [11][2700/7330] lr: 1.000e-05, eta: 3:41:50, time: 1.092, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0384, loss_cls: 0.1453, acc: 94.4209, loss_bbox: 0.1941, loss_mask: 0.2081, loss: 0.5972 2024-05-31 12:12:48,816 - mmdet - INFO - Epoch [11][2750/7330] lr: 1.000e-05, eta: 3:40:54, time: 1.072, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0376, loss_cls: 0.1447, acc: 94.3911, loss_bbox: 0.1988, loss_mask: 0.2092, loss: 0.6017 2024-05-31 12:13:45,351 - mmdet - INFO - Epoch [11][2800/7330] lr: 1.000e-05, eta: 3:39:59, time: 1.131, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0382, loss_cls: 0.1436, acc: 94.4663, loss_bbox: 0.1954, loss_mask: 0.2089, loss: 0.5968 2024-05-31 12:14:42,598 - mmdet - INFO - Epoch [11][2850/7330] lr: 1.000e-05, eta: 3:39:03, time: 1.145, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0379, loss_cls: 0.1397, acc: 94.5752, loss_bbox: 0.1931, loss_mask: 0.2061, loss: 0.5879 2024-05-31 12:15:37,300 - mmdet - INFO - Epoch [11][2900/7330] lr: 1.000e-05, eta: 3:38:07, time: 1.094, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0387, loss_cls: 0.1485, acc: 94.3689, loss_bbox: 0.1984, loss_mask: 0.2123, loss: 0.6102 2024-05-31 12:16:31,884 - mmdet - INFO - Epoch [11][2950/7330] lr: 1.000e-05, eta: 3:37:12, time: 1.092, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0371, loss_cls: 0.1471, acc: 94.3699, loss_bbox: 0.2011, loss_mask: 0.2057, loss: 0.6023 2024-05-31 12:17:37,999 - mmdet - INFO - Epoch [11][3000/7330] lr: 1.000e-05, eta: 3:36:18, time: 1.322, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0398, loss_cls: 0.1424, acc: 94.5205, loss_bbox: 0.1907, loss_mask: 0.2077, loss: 0.5913 2024-05-31 
12:18:32,840 - mmdet - INFO - Epoch [11][3050/7330] lr: 1.000e-05, eta: 3:35:22, time: 1.097, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0395, loss_cls: 0.1480, acc: 94.3267, loss_bbox: 0.1994, loss_mask: 0.2085, loss: 0.6064 2024-05-31 12:19:29,114 - mmdet - INFO - Epoch [11][3100/7330] lr: 1.000e-05, eta: 3:34:26, time: 1.125, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0379, loss_cls: 0.1421, acc: 94.5488, loss_bbox: 0.1934, loss_mask: 0.2077, loss: 0.5922 2024-05-31 12:20:23,336 - mmdet - INFO - Epoch [11][3150/7330] lr: 1.000e-05, eta: 3:33:30, time: 1.084, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0367, loss_cls: 0.1360, acc: 94.7388, loss_bbox: 0.1920, loss_mask: 0.2078, loss: 0.5823 2024-05-31 12:21:18,502 - mmdet - INFO - Epoch [11][3200/7330] lr: 1.000e-05, eta: 3:32:35, time: 1.103, data_time: 0.095, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0379, loss_cls: 0.1471, acc: 94.2935, loss_bbox: 0.1996, loss_mask: 0.2100, loss: 0.6062 2024-05-31 12:22:13,053 - mmdet - INFO - Epoch [11][3250/7330] lr: 1.000e-05, eta: 3:31:39, time: 1.091, data_time: 0.098, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0353, loss_cls: 0.1413, acc: 94.5659, loss_bbox: 0.1942, loss_mask: 0.2072, loss: 0.5892 2024-05-31 12:23:10,449 - mmdet - INFO - Epoch [11][3300/7330] lr: 1.000e-05, eta: 3:30:43, time: 1.148, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0394, loss_cls: 0.1527, acc: 94.1594, loss_bbox: 0.2015, loss_mask: 0.2099, loss: 0.6150 2024-05-31 12:24:04,414 - mmdet - INFO - Epoch [11][3350/7330] lr: 1.000e-05, eta: 3:29:48, time: 1.079, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0361, loss_cls: 0.1448, acc: 94.4236, loss_bbox: 0.1944, loss_mask: 0.2064, loss: 0.5929 2024-05-31 12:24:59,081 - mmdet - INFO - Epoch [11][3400/7330] lr: 1.000e-05, eta: 3:28:52, time: 1.093, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0378, loss_cls: 0.1469, acc: 94.3540, loss_bbox: 0.1951, loss_mask: 0.2070, loss: 0.5978 2024-05-31 12:25:56,305 - mmdet - INFO - Epoch [11][3450/7330] lr: 1.000e-05, eta: 3:27:56, time: 1.144, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0385, loss_cls: 0.1434, acc: 94.5261, loss_bbox: 0.1957, loss_mask: 0.2102, loss: 0.5994 2024-05-31 12:26:53,491 - mmdet - INFO - Epoch [11][3500/7330] lr: 1.000e-05, eta: 3:27:01, time: 1.144, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0377, loss_cls: 0.1444, acc: 94.4104, loss_bbox: 0.1964, loss_mask: 0.2102, loss: 0.5998 2024-05-31 12:27:52,512 - mmdet - INFO - Epoch [11][3550/7330] lr: 1.000e-05, eta: 3:26:06, time: 1.180, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0375, loss_cls: 0.1460, acc: 94.3613, loss_bbox: 0.2012, loss_mask: 0.2099, loss: 0.6058 2024-05-31 12:28:51,817 - mmdet - INFO - Epoch [11][3600/7330] lr: 1.000e-05, eta: 3:25:11, time: 1.186, data_time: 0.100, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0383, loss_cls: 0.1454, acc: 94.3762, loss_bbox: 0.1973, loss_mask: 0.2110, loss: 0.6033 2024-05-31 12:29:46,401 - mmdet - INFO - Epoch [11][3650/7330] lr: 1.000e-05, eta: 3:24:15, time: 1.092, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0381, loss_cls: 0.1418, acc: 94.4758, loss_bbox: 0.1966, loss_mask: 0.2094, loss: 0.5964 2024-05-31 12:30:43,619 - mmdet - INFO - Exp name: 
mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 12:30:43,619 - mmdet - INFO - Epoch [11][3700/7330] lr: 1.000e-05, eta: 3:23:19, time: 1.144, data_time: 0.092, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0374, loss_cls: 0.1466, acc: 94.3411, loss_bbox: 0.1955, loss_mask: 0.2109, loss: 0.6015 2024-05-31 12:31:38,093 - mmdet - INFO - Epoch [11][3750/7330] lr: 1.000e-05, eta: 3:22:24, time: 1.089, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0377, loss_cls: 0.1398, acc: 94.5854, loss_bbox: 0.1953, loss_mask: 0.2068, loss: 0.5902 2024-05-31 12:32:33,181 - mmdet - INFO - Epoch [11][3800/7330] lr: 1.000e-05, eta: 3:21:28, time: 1.102, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0374, loss_cls: 0.1491, acc: 94.2188, loss_bbox: 0.1998, loss_mask: 0.2105, loss: 0.6078 2024-05-31 12:33:30,091 - mmdet - INFO - Epoch [11][3850/7330] lr: 1.000e-05, eta: 3:20:32, time: 1.138, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0374, loss_cls: 0.1402, acc: 94.5571, loss_bbox: 0.1936, loss_mask: 0.2080, loss: 0.5895 2024-05-31 12:34:24,630 - mmdet - INFO - Epoch [11][3900/7330] lr: 1.000e-05, eta: 3:19:36, time: 1.091, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0407, loss_cls: 0.1474, acc: 94.3093, loss_bbox: 0.2018, loss_mask: 0.2141, loss: 0.6155 2024-05-31 12:35:18,436 - mmdet - INFO - Epoch [11][3950/7330] lr: 1.000e-05, eta: 3:18:41, time: 1.076, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0351, loss_cls: 0.1347, acc: 94.8054, loss_bbox: 0.1865, loss_mask: 0.2042, loss: 0.5712 2024-05-31 12:36:12,653 - mmdet - INFO - Epoch [11][4000/7330] lr: 1.000e-05, eta: 3:17:45, time: 1.084, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0364, loss_cls: 0.1349, acc: 94.7932, loss_bbox: 0.1854, loss_mask: 0.2025, loss: 0.5703 2024-05-31 12:37:13,689 - mmdet - INFO - Epoch [11][4050/7330] lr: 1.000e-05, eta: 3:16:50, time: 1.221, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0368, loss_cls: 0.1449, acc: 94.4294, loss_bbox: 0.1993, loss_mask: 0.2093, loss: 0.6009 2024-05-31 12:38:13,538 - mmdet - INFO - Epoch [11][4100/7330] lr: 1.000e-05, eta: 3:15:55, time: 1.197, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0368, loss_cls: 0.1430, acc: 94.4797, loss_bbox: 0.1953, loss_mask: 0.2041, loss: 0.5896 2024-05-31 12:39:08,143 - mmdet - INFO - Epoch [11][4150/7330] lr: 1.000e-05, eta: 3:14:59, time: 1.092, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0375, loss_cls: 0.1428, acc: 94.5154, loss_bbox: 0.1966, loss_mask: 0.2072, loss: 0.5949 2024-05-31 12:40:06,504 - mmdet - INFO - Epoch [11][4200/7330] lr: 1.000e-05, eta: 3:14:04, time: 1.167, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0372, loss_cls: 0.1411, acc: 94.5264, loss_bbox: 0.1924, loss_mask: 0.2043, loss: 0.5848 2024-05-31 12:41:01,209 - mmdet - INFO - Epoch [11][4250/7330] lr: 1.000e-05, eta: 3:13:08, time: 1.094, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0361, loss_cls: 0.1406, acc: 94.5769, loss_bbox: 0.1920, loss_mask: 0.2105, loss: 0.5896 2024-05-31 12:41:58,068 - mmdet - INFO - Epoch [11][4300/7330] lr: 1.000e-05, eta: 3:12:12, time: 1.138, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0374, loss_cls: 0.1406, acc: 94.5715, loss_bbox: 0.1951, loss_mask: 0.2053, loss: 0.5894 2024-05-31 12:42:52,941 - mmdet - 
INFO - Epoch [11][4350/7330] lr: 1.000e-05, eta: 3:11:17, time: 1.097, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0376, loss_cls: 0.1429, acc: 94.4346, loss_bbox: 0.2006, loss_mask: 0.2086, loss: 0.6010 2024-05-31 12:43:50,198 - mmdet - INFO - Epoch [11][4400/7330] lr: 1.000e-05, eta: 3:10:21, time: 1.145, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0370, loss_cls: 0.1419, acc: 94.5522, loss_bbox: 0.1906, loss_mask: 0.2043, loss: 0.5852 2024-05-31 12:44:45,375 - mmdet - INFO - Epoch [11][4450/7330] lr: 1.000e-05, eta: 3:09:25, time: 1.104, data_time: 0.094, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0387, loss_cls: 0.1427, acc: 94.5286, loss_bbox: 0.1954, loss_mask: 0.2117, loss: 0.5995 2024-05-31 12:45:39,755 - mmdet - INFO - Epoch [11][4500/7330] lr: 1.000e-05, eta: 3:08:30, time: 1.087, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0349, loss_cls: 0.1376, acc: 94.7488, loss_bbox: 0.1858, loss_mask: 0.2029, loss: 0.5722 2024-05-31 12:46:34,000 - mmdet - INFO - Epoch [11][4550/7330] lr: 1.000e-05, eta: 3:07:34, time: 1.085, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0351, loss_cls: 0.1320, acc: 94.9175, loss_bbox: 0.1809, loss_mask: 0.2025, loss: 0.5606 2024-05-31 12:47:31,375 - mmdet - INFO - Epoch [11][4600/7330] lr: 1.000e-05, eta: 3:06:38, time: 1.148, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0375, loss_cls: 0.1478, acc: 94.3064, loss_bbox: 0.2003, loss_mask: 0.2091, loss: 0.6065 2024-05-31 12:48:30,159 - mmdet - INFO - Epoch [11][4650/7330] lr: 1.000e-05, eta: 3:05:43, time: 1.176, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0365, loss_cls: 0.1410, acc: 94.6013, loss_bbox: 0.1905, loss_mask: 0.2043, loss: 0.5836 2024-05-31 12:49:28,421 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 12:49:28,421 - mmdet - INFO - Epoch [11][4700/7330] lr: 1.000e-05, eta: 3:04:48, time: 1.165, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0376, loss_cls: 0.1504, acc: 94.2185, loss_bbox: 0.2028, loss_mask: 0.2094, loss: 0.6109 2024-05-31 12:50:22,555 - mmdet - INFO - Epoch [11][4750/7330] lr: 1.000e-05, eta: 3:03:52, time: 1.083, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0359, loss_cls: 0.1409, acc: 94.5627, loss_bbox: 0.1919, loss_mask: 0.2069, loss: 0.5861 2024-05-31 12:51:21,010 - mmdet - INFO - Epoch [11][4800/7330] lr: 1.000e-05, eta: 3:02:57, time: 1.169, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0096, loss_rpn_bbox: 0.0355, loss_cls: 0.1383, acc: 94.6282, loss_bbox: 0.1895, loss_mask: 0.2061, loss: 0.5791 2024-05-31 12:52:16,167 - mmdet - INFO - Epoch [11][4850/7330] lr: 1.000e-05, eta: 3:02:01, time: 1.103, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0389, loss_cls: 0.1425, acc: 94.4546, loss_bbox: 0.1927, loss_mask: 0.2051, loss: 0.5903 2024-05-31 12:53:12,726 - mmdet - INFO - Epoch [11][4900/7330] lr: 1.000e-05, eta: 3:01:05, time: 1.131, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0358, loss_cls: 0.1378, acc: 94.7366, loss_bbox: 0.1905, loss_mask: 0.2020, loss: 0.5762 2024-05-31 12:54:10,747 - mmdet - INFO - Epoch [11][4950/7330] lr: 1.000e-05, eta: 3:00:10, time: 1.160, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0124, loss_rpn_bbox: 0.0394, loss_cls: 0.1504, acc: 94.2339, loss_bbox: 0.2023, loss_mask: 0.2081, loss: 0.6127 2024-05-31 
12:55:06,166 - mmdet - INFO - Epoch [11][5000/7330] lr: 1.000e-05, eta: 2:59:14, time: 1.108, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0378, loss_cls: 0.1392, acc: 94.5979, loss_bbox: 0.1882, loss_mask: 0.2022, loss: 0.5786 2024-05-31 12:56:00,666 - mmdet - INFO - Epoch [11][5050/7330] lr: 1.000e-05, eta: 2:58:18, time: 1.090, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0389, loss_cls: 0.1464, acc: 94.4050, loss_bbox: 0.1966, loss_mask: 0.2055, loss: 0.5987 2024-05-31 12:56:57,908 - mmdet - INFO - Epoch [11][5100/7330] lr: 1.000e-05, eta: 2:57:23, time: 1.145, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0404, loss_cls: 0.1499, acc: 94.1418, loss_bbox: 0.2053, loss_mask: 0.2099, loss: 0.6161 2024-05-31 12:57:55,302 - mmdet - INFO - Epoch [11][5150/7330] lr: 1.000e-05, eta: 2:56:27, time: 1.148, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0377, loss_cls: 0.1463, acc: 94.3987, loss_bbox: 0.1978, loss_mask: 0.2110, loss: 0.6040 2024-05-31 12:58:52,118 - mmdet - INFO - Epoch [11][5200/7330] lr: 1.000e-05, eta: 2:55:32, time: 1.137, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0351, loss_cls: 0.1412, acc: 94.6628, loss_bbox: 0.1918, loss_mask: 0.2076, loss: 0.5857 2024-05-31 12:59:47,758 - mmdet - INFO - Epoch [11][5250/7330] lr: 1.000e-05, eta: 2:54:36, time: 1.113, data_time: 0.090, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0393, loss_cls: 0.1492, acc: 94.2473, loss_bbox: 0.2015, loss_mask: 0.2133, loss: 0.6142 2024-05-31 13:00:44,672 - mmdet - INFO - Epoch [11][5300/7330] lr: 1.000e-05, eta: 2:53:41, time: 1.138, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0386, loss_cls: 0.1382, acc: 94.7134, loss_bbox: 0.1881, loss_mask: 0.2066, loss: 0.5823 2024-05-31 13:01:41,764 - mmdet - INFO - Epoch [11][5350/7330] lr: 1.000e-05, eta: 2:52:45, time: 1.142, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0378, loss_cls: 0.1438, acc: 94.4978, loss_bbox: 0.1944, loss_mask: 0.2080, loss: 0.5942 2024-05-31 13:02:39,312 - mmdet - INFO - Epoch [11][5400/7330] lr: 1.000e-05, eta: 2:51:50, time: 1.151, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0388, loss_cls: 0.1432, acc: 94.4585, loss_bbox: 0.1964, loss_mask: 0.2102, loss: 0.5993 2024-05-31 13:03:34,266 - mmdet - INFO - Epoch [11][5450/7330] lr: 1.000e-05, eta: 2:50:54, time: 1.099, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0381, loss_cls: 0.1455, acc: 94.4087, loss_bbox: 0.1977, loss_mask: 0.2103, loss: 0.6035 2024-05-31 13:04:33,321 - mmdet - INFO - Epoch [11][5500/7330] lr: 1.000e-05, eta: 2:49:59, time: 1.181, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0118, loss_rpn_bbox: 0.0385, loss_cls: 0.1465, acc: 94.3225, loss_bbox: 0.2010, loss_mask: 0.2153, loss: 0.6131 2024-05-31 13:05:26,280 - mmdet - INFO - Epoch [11][5550/7330] lr: 1.000e-05, eta: 2:49:03, time: 1.059, data_time: 0.054, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0341, loss_cls: 0.1399, acc: 94.5898, loss_bbox: 0.1892, loss_mask: 0.2113, loss: 0.5850 2024-05-31 13:06:20,987 - mmdet - INFO - Epoch [11][5600/7330] lr: 1.000e-05, eta: 2:48:07, time: 1.094, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0364, loss_cls: 0.1392, acc: 94.5874, loss_bbox: 0.1905, loss_mask: 0.2027, loss: 0.5793 2024-05-31 13:07:17,516 - mmdet - INFO - Epoch [11][5650/7330] lr: 1.000e-05, eta: 2:47:11, time: 1.131, 
data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0381, loss_cls: 0.1410, acc: 94.6172, loss_bbox: 0.1902, loss_mask: 0.2061, loss: 0.5866 2024-05-31 13:08:13,852 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 13:08:13,852 - mmdet - INFO - Epoch [11][5700/7330] lr: 1.000e-05, eta: 2:46:16, time: 1.127, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0370, loss_cls: 0.1417, acc: 94.5415, loss_bbox: 0.1933, loss_mask: 0.2111, loss: 0.5944 2024-05-31 13:09:10,985 - mmdet - INFO - Epoch [11][5750/7330] lr: 1.000e-05, eta: 2:45:20, time: 1.142, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0350, loss_cls: 0.1360, acc: 94.7400, loss_bbox: 0.1879, loss_mask: 0.2020, loss: 0.5706 2024-05-31 13:10:04,409 - mmdet - INFO - Epoch [11][5800/7330] lr: 1.000e-05, eta: 2:44:24, time: 1.069, data_time: 0.057, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0350, loss_cls: 0.1357, acc: 94.6855, loss_bbox: 0.1873, loss_mask: 0.2063, loss: 0.5746 2024-05-31 13:10:58,270 - mmdet - INFO - Epoch [11][5850/7330] lr: 1.000e-05, eta: 2:43:29, time: 1.077, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0347, loss_cls: 0.1383, acc: 94.6907, loss_bbox: 0.1898, loss_mask: 0.2042, loss: 0.5789 2024-05-31 13:11:55,742 - mmdet - INFO - Epoch [11][5900/7330] lr: 1.000e-05, eta: 2:42:33, time: 1.149, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0126, loss_rpn_bbox: 0.0400, loss_cls: 0.1493, acc: 94.2263, loss_bbox: 0.2043, loss_mask: 0.2152, loss: 0.6215 2024-05-31 13:12:54,582 - mmdet - INFO - Epoch [11][5950/7330] lr: 1.000e-05, eta: 2:41:38, time: 1.177, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0384, loss_cls: 0.1427, acc: 94.4858, loss_bbox: 0.1910, loss_mask: 0.2023, loss: 0.5856 2024-05-31 13:13:51,496 - mmdet - INFO - Epoch [11][6000/7330] lr: 1.000e-05, eta: 2:40:42, time: 1.138, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0382, loss_cls: 0.1473, acc: 94.3467, loss_bbox: 0.2007, loss_mask: 0.2098, loss: 0.6071 2024-05-31 13:14:46,246 - mmdet - INFO - Epoch [11][6050/7330] lr: 1.000e-05, eta: 2:39:46, time: 1.095, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0389, loss_cls: 0.1481, acc: 94.2366, loss_bbox: 0.2002, loss_mask: 0.2115, loss: 0.6099 2024-05-31 13:15:42,435 - mmdet - INFO - Epoch [11][6100/7330] lr: 1.000e-05, eta: 2:38:51, time: 1.124, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0353, loss_cls: 0.1458, acc: 94.3818, loss_bbox: 0.1985, loss_mask: 0.2099, loss: 0.5997 2024-05-31 13:16:37,245 - mmdet - INFO - Epoch [11][6150/7330] lr: 1.000e-05, eta: 2:37:55, time: 1.096, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0128, loss_rpn_bbox: 0.0347, loss_cls: 0.1421, acc: 94.5706, loss_bbox: 0.1939, loss_mask: 0.2044, loss: 0.5879 2024-05-31 13:17:34,467 - mmdet - INFO - Epoch [11][6200/7330] lr: 1.000e-05, eta: 2:37:00, time: 1.144, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0355, loss_cls: 0.1422, acc: 94.5786, loss_bbox: 0.1912, loss_mask: 0.2057, loss: 0.5845 2024-05-31 13:18:32,757 - mmdet - INFO - Epoch [11][6250/7330] lr: 1.000e-05, eta: 2:36:04, time: 1.166, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0361, loss_cls: 0.1375, acc: 94.7192, loss_bbox: 0.1892, loss_mask: 0.2078, loss: 0.5815 2024-05-31 13:19:27,713 - mmdet - INFO - Epoch [11][6300/7330] lr: 1.000e-05, eta: 2:35:08, 
time: 1.099, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0377, loss_cls: 0.1414, acc: 94.5925, loss_bbox: 0.1930, loss_mask: 0.2067, loss: 0.5900 2024-05-31 13:20:21,910 - mmdet - INFO - Epoch [11][6350/7330] lr: 1.000e-05, eta: 2:34:13, time: 1.084, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0369, loss_cls: 0.1414, acc: 94.4673, loss_bbox: 0.1930, loss_mask: 0.2060, loss: 0.5886 2024-05-31 13:21:16,712 - mmdet - INFO - Epoch [11][6400/7330] lr: 1.000e-05, eta: 2:33:17, time: 1.096, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0386, loss_cls: 0.1453, acc: 94.3469, loss_bbox: 0.1990, loss_mask: 0.2118, loss: 0.6062 2024-05-31 13:22:10,515 - mmdet - INFO - Epoch [11][6450/7330] lr: 1.000e-05, eta: 2:32:21, time: 1.076, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0338, loss_cls: 0.1363, acc: 94.7791, loss_bbox: 0.1830, loss_mask: 0.2015, loss: 0.5647 2024-05-31 13:23:06,843 - mmdet - INFO - Epoch [11][6500/7330] lr: 1.000e-05, eta: 2:31:25, time: 1.127, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0360, loss_cls: 0.1397, acc: 94.6624, loss_bbox: 0.1887, loss_mask: 0.2043, loss: 0.5792 2024-05-31 13:24:09,032 - mmdet - INFO - Epoch [11][6550/7330] lr: 1.000e-05, eta: 2:30:30, time: 1.244, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0363, loss_cls: 0.1404, acc: 94.6189, loss_bbox: 0.1930, loss_mask: 0.2084, loss: 0.5896 2024-05-31 13:25:03,958 - mmdet - INFO - Epoch [11][6600/7330] lr: 1.000e-05, eta: 2:29:35, time: 1.099, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0351, loss_cls: 0.1392, acc: 94.5657, loss_bbox: 0.1934, loss_mask: 0.2046, loss: 0.5826 2024-05-31 13:25:58,043 - mmdet - INFO - Epoch [11][6650/7330] lr: 1.000e-05, eta: 2:28:39, time: 1.082, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0355, loss_cls: 0.1379, acc: 94.6423, loss_bbox: 0.1887, loss_mask: 0.2057, loss: 0.5784 2024-05-31 13:26:55,638 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py 2024-05-31 13:26:55,638 - mmdet - INFO - Epoch [11][6700/7330] lr: 1.000e-05, eta: 2:27:43, time: 1.152, data_time: 0.096, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0379, loss_cls: 0.1440, acc: 94.4434, loss_bbox: 0.1976, loss_mask: 0.2091, loss: 0.6000 2024-05-31 13:27:52,663 - mmdet - INFO - Epoch [11][6750/7330] lr: 1.000e-05, eta: 2:26:48, time: 1.140, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0386, loss_cls: 0.1466, acc: 94.3757, loss_bbox: 0.2006, loss_mask: 0.2041, loss: 0.6014 2024-05-31 13:28:51,985 - mmdet - INFO - Epoch [11][6800/7330] lr: 1.000e-05, eta: 2:25:52, time: 1.186, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0363, loss_cls: 0.1383, acc: 94.6636, loss_bbox: 0.1947, loss_mask: 0.2077, loss: 0.5878 2024-05-31 13:29:46,119 - mmdet - INFO - Epoch [11][6850/7330] lr: 1.000e-05, eta: 2:24:57, time: 1.083, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0353, loss_cls: 0.1391, acc: 94.5996, loss_bbox: 0.1885, loss_mask: 0.2063, loss: 0.5797 2024-05-31 13:30:40,738 - mmdet - INFO - Epoch [11][6900/7330] lr: 1.000e-05, eta: 2:24:01, time: 1.092, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0369, loss_cls: 0.1405, acc: 94.6218, loss_bbox: 0.1925, loss_mask: 0.2047, loss: 0.5861 2024-05-31 13:31:35,948 - mmdet - INFO - Epoch [11][6950/7330] lr: 1.000e-05, 
eta: 2:23:05, time: 1.104, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0376, loss_cls: 0.1487, acc: 94.3567, loss_bbox: 0.2004, loss_mask: 0.2103, loss: 0.6086
2024-05-31 13:32:30,985 - mmdet - INFO - Epoch [11][7000/7330] lr: 1.000e-05, eta: 2:22:09, time: 1.101, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0388, loss_cls: 0.1423, acc: 94.5127, loss_bbox: 0.1943, loss_mask: 0.2089, loss: 0.5953
2024-05-31 13:33:25,887 - mmdet - INFO - Epoch [11][7050/7330] lr: 1.000e-05, eta: 2:21:14, time: 1.098, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0355, loss_cls: 0.1388, acc: 94.6521, loss_bbox: 0.1877, loss_mask: 0.2041, loss: 0.5769
2024-05-31 13:34:25,164 - mmdet - INFO - Epoch [11][7100/7330] lr: 1.000e-05, eta: 2:20:18, time: 1.186, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0383, loss_cls: 0.1494, acc: 94.2515, loss_bbox: 0.2015, loss_mask: 0.2127, loss: 0.6138
2024-05-31 13:35:24,131 - mmdet - INFO - Epoch [11][7150/7330] lr: 1.000e-05, eta: 2:19:23, time: 1.179, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0095, loss_rpn_bbox: 0.0353, loss_cls: 0.1357, acc: 94.7449, loss_bbox: 0.1848, loss_mask: 0.2050, loss: 0.5704
2024-05-31 13:36:18,332 - mmdet - INFO - Epoch [11][7200/7330] lr: 1.000e-05, eta: 2:18:27, time: 1.084, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0371, loss_cls: 0.1433, acc: 94.4973, loss_bbox: 0.1959, loss_mask: 0.2081, loss: 0.5955
2024-05-31 13:37:15,059 - mmdet - INFO - Epoch [11][7250/7330] lr: 1.000e-05, eta: 2:17:31, time: 1.135, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0358, loss_cls: 0.1435, acc: 94.5044, loss_bbox: 0.1959, loss_mask: 0.2042, loss: 0.5898
2024-05-31 13:38:12,606 - mmdet - INFO - Epoch [11][7300/7330] lr: 1.000e-05, eta: 2:16:36, time: 1.151, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0388, loss_cls: 0.1470, acc: 94.3286, loss_bbox: 0.2011, loss_mask: 0.2095, loss: 0.6088
2024-05-31 13:38:51,156 - mmdet - INFO - Saving checkpoint at 11 epochs
2024-05-31 13:41:10,108 - mmdet - INFO - Evaluating bbox...
2024-05-31 13:41:32,735 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.493
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.723
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.539
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.319
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.528
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.659
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.419
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.646
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.766
2024-05-31 13:41:32,735 - mmdet - INFO - Evaluating segm...
2024-05-31 13:41:58,618 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.440
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.687
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.472
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.231
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.474
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.650
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.547
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.547
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.547
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.352
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.591
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.726
2024-05-31 13:41:58,997 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-31 13:41:59,000 - mmdet - INFO - Epoch(val) [11][625] bbox_mAP: 0.4930, bbox_mAP_50: 0.7230, bbox_mAP_75: 0.5390, bbox_mAP_s: 0.3190, bbox_mAP_m: 0.5280, bbox_mAP_l: 0.6590, bbox_mAP_copypaste: 0.493 0.723 0.539 0.319 0.528 0.659, segm_mAP: 0.4400, segm_mAP_50: 0.6870, segm_mAP_75: 0.4720, segm_mAP_s: 0.2310, segm_mAP_m: 0.4740, segm_mAP_l: 0.6500, segm_mAP_copypaste: 0.440 0.687 0.472 0.231 0.474 0.650
2024-05-31 13:42:59,135 - mmdet - INFO - Epoch [12][50/7330] lr: 1.000e-06, eta: 2:15:04, time: 1.202, data_time: 0.147, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0353, loss_cls: 0.1405, acc: 94.6008, loss_bbox: 0.1903, loss_mask: 0.2084, loss: 0.5851
2024-05-31 13:43:53,624 - mmdet - INFO - Epoch [12][100/7330] lr: 1.000e-06, eta: 2:14:08, time: 1.090, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0373, loss_cls: 0.1414, acc: 94.5913, loss_bbox: 0.1904, loss_mask: 0.2066, loss: 0.5864
2024-05-31 13:44:47,853 - mmdet - INFO - Epoch [12][150/7330] lr: 1.000e-06, eta: 2:13:13, time: 1.085, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0376, loss_cls: 0.1383, acc: 94.6960, loss_bbox: 0.1910, loss_mask: 0.2029, loss: 0.5800
2024-05-31 13:45:41,871 - mmdet - INFO - Epoch [12][200/7330] lr: 1.000e-06, eta: 2:12:17, time: 1.080, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0358, loss_cls: 0.1362, acc: 94.7485, loss_bbox: 0.1891, loss_mask: 0.2051, loss: 0.5761
2024-05-31 13:46:36,076 - mmdet - INFO - Epoch [12][250/7330] lr: 1.000e-06, eta: 2:11:21, time: 1.084, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0384, loss_cls: 0.1466, acc: 94.3103, loss_bbox: 0.1974, loss_mask: 0.2082, loss: 0.6009
2024-05-31 13:47:30,627 - mmdet - INFO - Epoch [12][300/7330] lr: 1.000e-06, eta: 2:10:25, time: 1.091, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0362, loss_cls: 0.1394, acc: 94.5791, loss_bbox: 0.1930, loss_mask: 0.2089, loss: 0.5878
2024-05-31 13:48:24,802 - mmdet - INFO - Epoch [12][350/7330] lr: 1.000e-06, eta: 2:09:29, time: 1.083, data_time: 0.059, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0379, loss_cls: 0.1423, acc: 94.5237, loss_bbox: 0.1995, loss_mask: 0.2077, loss: 0.5986
2024-05-31 13:49:18,336 - mmdet - INFO - Epoch [12][400/7330] lr: 1.000e-06, eta: 2:08:34, time: 1.071, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0380, loss_cls: 0.1355, acc: 94.7551, loss_bbox:
0.1874, loss_mask: 0.2091, loss: 0.5803 2024-05-31 13:50:12,727 - mmdet - INFO - Epoch [12][450/7330] lr: 1.000e-06, eta: 2:07:38, time: 1.088, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0380, loss_cls: 0.1422, acc: 94.5063, loss_bbox: 0.1956, loss_mask: 0.2045, loss: 0.5911 2024-05-31 13:51:06,863 - mmdet - INFO - Epoch [12][500/7330] lr: 1.000e-06, eta: 2:06:42, time: 1.083, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0339, loss_cls: 0.1323, acc: 94.8352, loss_bbox: 0.1816, loss_mask: 0.2038, loss: 0.5613 2024-05-31 13:52:03,611 - mmdet - INFO - Epoch [12][550/7330] lr: 1.000e-06, eta: 2:05:47, time: 1.135, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0386, loss_cls: 0.1475, acc: 94.3225, loss_bbox: 0.2041, loss_mask: 0.2152, loss: 0.6169 2024-05-31 13:52:57,860 - mmdet - INFO - Epoch [12][600/7330] lr: 1.000e-06, eta: 2:04:51, time: 1.085, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0353, loss_cls: 0.1357, acc: 94.8320, loss_bbox: 0.1865, loss_mask: 0.2055, loss: 0.5729 2024-05-31 13:53:57,054 - mmdet - INFO - Epoch [12][650/7330] lr: 1.000e-06, eta: 2:03:55, time: 1.184, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0378, loss_cls: 0.1399, acc: 94.5679, loss_bbox: 0.1944, loss_mask: 0.2052, loss: 0.5882 2024-05-31 13:54:58,539 - mmdet - INFO - Epoch [12][700/7330] lr: 1.000e-06, eta: 2:03:00, time: 1.230, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0366, loss_cls: 0.1358, acc: 94.8230, loss_bbox: 0.1883, loss_mask: 0.2069, loss: 0.5778 2024-05-31 13:55:52,453 - mmdet - INFO - Epoch [12][750/7330] lr: 1.000e-06, eta: 2:02:04, time: 1.078, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0365, loss_cls: 0.1430, acc: 94.4224, loss_bbox: 0.1913, loss_mask: 0.2058, loss: 0.5869 2024-05-31 13:56:48,915 - mmdet - INFO - Epoch [12][800/7330] lr: 1.000e-06, eta: 2:01:09, time: 1.129, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0366, loss_cls: 0.1352, acc: 94.8296, loss_bbox: 0.1869, loss_mask: 0.2028, loss: 0.5730 2024-05-31 13:57:45,094 - mmdet - INFO - Epoch [12][850/7330] lr: 1.000e-06, eta: 2:00:13, time: 1.124, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0375, loss_cls: 0.1442, acc: 94.4736, loss_bbox: 0.1962, loss_mask: 0.2049, loss: 0.5936 2024-05-31 13:58:40,112 - mmdet - INFO - Epoch [12][900/7330] lr: 1.000e-06, eta: 1:59:17, time: 1.100, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0122, loss_rpn_bbox: 0.0399, loss_cls: 0.1540, acc: 93.9739, loss_bbox: 0.2107, loss_mask: 0.2156, loss: 0.6324 2024-05-31 13:59:34,111 - mmdet - INFO - Epoch [12][950/7330] lr: 1.000e-06, eta: 1:58:22, time: 1.080, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0360, loss_cls: 0.1349, acc: 94.8057, loss_bbox: 0.1893, loss_mask: 0.2057, loss: 0.5756 2024-05-31 14:00:28,450 - mmdet - INFO - Epoch [12][1000/7330] lr: 1.000e-06, eta: 1:57:26, time: 1.087, data_time: 0.055, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0368, loss_cls: 0.1347, acc: 94.8376, loss_bbox: 0.1871, loss_mask: 0.2006, loss: 0.5703 2024-05-31 14:01:22,702 - mmdet - INFO - Epoch [12][1050/7330] lr: 1.000e-06, eta: 1:56:30, time: 1.085, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0365, loss_cls: 0.1456, acc: 94.3938, loss_bbox: 0.1974, loss_mask: 0.2059, loss: 0.5954 2024-05-31 14:02:17,445 - mmdet - INFO - Epoch [12][1100/7330] lr: 
1.000e-06, eta: 1:55:34, time: 1.095, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0372, loss_cls: 0.1426, acc: 94.5220, loss_bbox: 0.1948, loss_mask: 0.2074, loss: 0.5922 2024-05-31 14:03:14,073 - mmdet - INFO - Epoch [12][1150/7330] lr: 1.000e-06, eta: 1:54:39, time: 1.133, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0369, loss_cls: 0.1466, acc: 94.2896, loss_bbox: 0.1970, loss_mask: 0.2076, loss: 0.5993 2024-05-31 14:04:14,002 - mmdet - INFO - Epoch [12][1200/7330] lr: 1.000e-06, eta: 1:53:44, time: 1.199, data_time: 0.091, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0404, loss_cls: 0.1433, acc: 94.4446, loss_bbox: 0.2009, loss_mask: 0.2051, loss: 0.6005 2024-05-31 14:05:11,924 - mmdet - INFO - Epoch [12][1250/7330] lr: 1.000e-06, eta: 1:52:48, time: 1.158, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0348, loss_cls: 0.1376, acc: 94.7791, loss_bbox: 0.1864, loss_mask: 0.2026, loss: 0.5712 2024-05-31 14:06:08,206 - mmdet - INFO - Epoch [12][1300/7330] lr: 1.000e-06, eta: 1:51:52, time: 1.126, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0370, loss_cls: 0.1446, acc: 94.4771, loss_bbox: 0.1990, loss_mask: 0.2055, loss: 0.5963 2024-05-31 14:07:02,050 - mmdet - INFO - Epoch [12][1350/7330] lr: 1.000e-06, eta: 1:50:57, time: 1.077, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0357, loss_cls: 0.1430, acc: 94.4292, loss_bbox: 0.1964, loss_mask: 0.2069, loss: 0.5919 2024-05-31 14:07:56,351 - mmdet - INFO - Epoch [12][1400/7330] lr: 1.000e-06, eta: 1:50:01, time: 1.086, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0369, loss_cls: 0.1408, acc: 94.5095, loss_bbox: 0.1932, loss_mask: 0.2091, loss: 0.5901 2024-05-31 14:08:54,263 - mmdet - INFO - Epoch [12][1450/7330] lr: 1.000e-06, eta: 1:49:05, time: 1.158, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0091, loss_rpn_bbox: 0.0341, loss_cls: 0.1394, acc: 94.6836, loss_bbox: 0.1865, loss_mask: 0.2029, loss: 0.5720 2024-05-31 14:09:47,783 - mmdet - INFO - Epoch [12][1500/7330] lr: 1.000e-06, eta: 1:48:10, time: 1.070, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0337, loss_cls: 0.1359, acc: 94.6604, loss_bbox: 0.1903, loss_mask: 0.2031, loss: 0.5727 2024-05-31 14:10:41,956 - mmdet - INFO - Epoch [12][1550/7330] lr: 1.000e-06, eta: 1:47:14, time: 1.083, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0350, loss_cls: 0.1337, acc: 94.8218, loss_bbox: 0.1820, loss_mask: 0.2065, loss: 0.5672 2024-05-31 14:11:36,933 - mmdet - INFO - Epoch [12][1600/7330] lr: 1.000e-06, eta: 1:46:18, time: 1.100, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0397, loss_cls: 0.1459, acc: 94.3689, loss_bbox: 0.1975, loss_mask: 0.2085, loss: 0.6030 2024-05-31 14:12:31,779 - mmdet - INFO - Epoch [12][1650/7330] lr: 1.000e-06, eta: 1:45:22, time: 1.097, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0376, loss_cls: 0.1427, acc: 94.4971, loss_bbox: 0.1958, loss_mask: 0.2097, loss: 0.5968 2024-05-31 14:13:27,723 - mmdet - INFO - Epoch [12][1700/7330] lr: 1.000e-06, eta: 1:44:27, time: 1.119, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0353, loss_cls: 0.1361, acc: 94.7822, loss_bbox: 0.1865, loss_mask: 0.2021, loss: 0.5699 2024-05-31 14:14:27,217 - mmdet - INFO - Epoch [12][1750/7330] lr: 1.000e-06, eta: 1:43:31, time: 1.190, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0099, 
loss_rpn_bbox: 0.0356, loss_cls: 0.1405, acc: 94.6006, loss_bbox: 0.1911, loss_mask: 0.2077, loss: 0.5847 2024-05-31 14:15:23,971 - mmdet - INFO - Epoch [12][1800/7330] lr: 1.000e-06, eta: 1:42:36, time: 1.135, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0382, loss_cls: 0.1464, acc: 94.3271, loss_bbox: 0.1962, loss_mask: 0.2087, loss: 0.5999 2024-05-31 14:16:23,356 - mmdet - INFO - Epoch [12][1850/7330] lr: 1.000e-06, eta: 1:41:40, time: 1.188, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0362, loss_cls: 0.1400, acc: 94.5432, loss_bbox: 0.1897, loss_mask: 0.2049, loss: 0.5809 2024-05-31 14:17:17,116 - mmdet - INFO - Epoch [12][1900/7330] lr: 1.000e-06, eta: 1:40:45, time: 1.075, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0359, loss_cls: 0.1367, acc: 94.7278, loss_bbox: 0.1884, loss_mask: 0.2043, loss: 0.5758 2024-05-31 14:18:11,085 - mmdet - INFO - Epoch [12][1950/7330] lr: 1.000e-06, eta: 1:39:49, time: 1.079, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0376, loss_cls: 0.1394, acc: 94.5776, loss_bbox: 0.1906, loss_mask: 0.2083, loss: 0.5863 2024-05-31 14:19:04,984 - mmdet - INFO - Epoch [12][2000/7330] lr: 1.000e-06, eta: 1:38:53, time: 1.078, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0381, loss_cls: 0.1446, acc: 94.4885, loss_bbox: 0.1971, loss_mask: 0.2091, loss: 0.5998 2024-05-31 14:20:03,840 - mmdet - INFO - Epoch [12][2050/7330] lr: 1.000e-06, eta: 1:37:58, time: 1.177, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0119, loss_rpn_bbox: 0.0404, loss_cls: 0.1457, acc: 94.3604, loss_bbox: 0.2001, loss_mask: 0.2057, loss: 0.6038 2024-05-31 14:20:58,102 - mmdet - INFO - Epoch [12][2100/7330] lr: 1.000e-06, eta: 1:37:02, time: 1.085, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0380, loss_cls: 0.1401, acc: 94.5747, loss_bbox: 0.1914, loss_mask: 0.2093, loss: 0.5894 2024-05-31 14:21:52,527 - mmdet - INFO - Epoch [12][2150/7330] lr: 1.000e-06, eta: 1:36:06, time: 1.089, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0371, loss_cls: 0.1412, acc: 94.5522, loss_bbox: 0.1939, loss_mask: 0.2086, loss: 0.5921 2024-05-31 14:22:46,325 - mmdet - INFO - Epoch [12][2200/7330] lr: 1.000e-06, eta: 1:35:10, time: 1.076, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0363, loss_cls: 0.1394, acc: 94.5850, loss_bbox: 0.1919, loss_mask: 0.2070, loss: 0.5853 2024-05-31 14:23:46,261 - mmdet - INFO - Epoch [12][2250/7330] lr: 1.000e-06, eta: 1:34:15, time: 1.199, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0373, loss_cls: 0.1427, acc: 94.4907, loss_bbox: 0.1945, loss_mask: 0.2065, loss: 0.5918 2024-05-31 14:24:43,112 - mmdet - INFO - Epoch [12][2300/7330] lr: 1.000e-06, eta: 1:33:19, time: 1.137, data_time: 0.088, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0394, loss_cls: 0.1400, acc: 94.6428, loss_bbox: 0.1918, loss_mask: 0.2042, loss: 0.5860 2024-05-31 14:25:39,827 - mmdet - INFO - Epoch [12][2350/7330] lr: 1.000e-06, eta: 1:32:24, time: 1.134, data_time: 0.086, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0368, loss_cls: 0.1441, acc: 94.3655, loss_bbox: 0.1960, loss_mask: 0.2057, loss: 0.5936 2024-05-31 14:26:34,312 - mmdet - INFO - Epoch [12][2400/7330] lr: 1.000e-06, eta: 1:31:28, time: 1.090, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0382, loss_cls: 0.1420, acc: 94.5479, loss_bbox: 0.1917, loss_mask: 0.2055, 
loss: 0.5886 2024-05-31 14:27:33,105 - mmdet - INFO - Epoch [12][2450/7330] lr: 1.000e-06, eta: 1:30:32, time: 1.176, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0367, loss_cls: 0.1425, acc: 94.4893, loss_bbox: 0.1987, loss_mask: 0.2079, loss: 0.5964 2024-05-31 14:28:27,552 - mmdet - INFO - Epoch [12][2500/7330] lr: 1.000e-06, eta: 1:29:37, time: 1.089, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0096, loss_rpn_bbox: 0.0348, loss_cls: 0.1317, acc: 94.8777, loss_bbox: 0.1843, loss_mask: 0.2046, loss: 0.5650 2024-05-31 14:29:21,691 - mmdet - INFO - Epoch [12][2550/7330] lr: 1.000e-06, eta: 1:28:41, time: 1.083, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0379, loss_cls: 0.1426, acc: 94.4773, loss_bbox: 0.2001, loss_mask: 0.2073, loss: 0.5980 2024-05-31 14:30:16,189 - mmdet - INFO - Epoch [12][2600/7330] lr: 1.000e-06, eta: 1:27:45, time: 1.090, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0355, loss_cls: 0.1423, acc: 94.4539, loss_bbox: 0.1949, loss_mask: 0.2038, loss: 0.5871 2024-05-31 14:31:14,296 - mmdet - INFO - Epoch [12][2650/7330] lr: 1.000e-06, eta: 1:26:50, time: 1.162, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0359, loss_cls: 0.1329, acc: 94.9070, loss_bbox: 0.1851, loss_mask: 0.2067, loss: 0.5714 2024-05-31 14:32:08,545 - mmdet - INFO - Epoch [12][2700/7330] lr: 1.000e-06, eta: 1:25:54, time: 1.085, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0375, loss_cls: 0.1409, acc: 94.4937, loss_bbox: 0.1947, loss_mask: 0.2066, loss: 0.5897 2024-05-31 14:33:05,542 - mmdet - INFO - Epoch [12][2750/7330] lr: 1.000e-06, eta: 1:24:58, time: 1.140, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0366, loss_cls: 0.1367, acc: 94.7341, loss_bbox: 0.1908, loss_mask: 0.2088, loss: 0.5831 2024-05-31 14:34:02,415 - mmdet - INFO - Epoch [12][2800/7330] lr: 1.000e-06, eta: 1:24:03, time: 1.137, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0374, loss_cls: 0.1377, acc: 94.6890, loss_bbox: 0.1905, loss_mask: 0.2000, loss: 0.5760 2024-05-31 14:34:58,363 - mmdet - INFO - Epoch [12][2850/7330] lr: 1.000e-06, eta: 1:23:07, time: 1.119, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0378, loss_cls: 0.1428, acc: 94.4910, loss_bbox: 0.1954, loss_mask: 0.2098, loss: 0.5970 2024-05-31 14:35:52,847 - mmdet - INFO - Epoch [12][2900/7330] lr: 1.000e-06, eta: 1:22:11, time: 1.090, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0360, loss_cls: 0.1385, acc: 94.6309, loss_bbox: 0.1899, loss_mask: 0.2032, loss: 0.5785 2024-05-31 14:36:48,581 - mmdet - INFO - Epoch [12][2950/7330] lr: 1.000e-06, eta: 1:21:16, time: 1.115, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0351, loss_cls: 0.1351, acc: 94.8145, loss_bbox: 0.1828, loss_mask: 0.1992, loss: 0.5620 2024-05-31 14:37:43,805 - mmdet - INFO - Epoch [12][3000/7330] lr: 1.000e-06, eta: 1:20:20, time: 1.104, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0393, loss_cls: 0.1413, acc: 94.5105, loss_bbox: 0.1940, loss_mask: 0.2067, loss: 0.5915 2024-05-31 14:38:43,003 - mmdet - INFO - Epoch [12][3050/7330] lr: 1.000e-06, eta: 1:19:25, time: 1.184, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0096, loss_rpn_bbox: 0.0373, loss_cls: 0.1423, acc: 94.4397, loss_bbox: 0.1948, loss_mask: 0.2059, loss: 0.5900 2024-05-31 14:39:38,253 - mmdet - INFO - Epoch [12][3100/7330] lr: 1.000e-06, eta: 
1:18:29, time: 1.105, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0373, loss_cls: 0.1441, acc: 94.3909, loss_bbox: 0.1960, loss_mask: 0.2097, loss: 0.5977 2024-05-31 14:40:31,821 - mmdet - INFO - Epoch [12][3150/7330] lr: 1.000e-06, eta: 1:17:33, time: 1.071, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0382, loss_cls: 0.1409, acc: 94.5488, loss_bbox: 0.1942, loss_mask: 0.2104, loss: 0.5945 2024-05-31 14:41:26,361 - mmdet - INFO - Epoch [12][3200/7330] lr: 1.000e-06, eta: 1:16:37, time: 1.091, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0377, loss_cls: 0.1400, acc: 94.5308, loss_bbox: 0.1953, loss_mask: 0.2044, loss: 0.5885 2024-05-31 14:42:27,797 - mmdet - INFO - Epoch [12][3250/7330] lr: 1.000e-06, eta: 1:15:42, time: 1.229, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0369, loss_cls: 0.1411, acc: 94.5010, loss_bbox: 0.1949, loss_mask: 0.2065, loss: 0.5905 2024-05-31 14:43:25,265 - mmdet - INFO - Epoch [12][3300/7330] lr: 1.000e-06, eta: 1:14:46, time: 1.149, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0378, loss_cls: 0.1418, acc: 94.6055, loss_bbox: 0.1947, loss_mask: 0.2098, loss: 0.5950 2024-05-31 14:44:18,421 - mmdet - INFO - Epoch [12][3350/7330] lr: 1.000e-06, eta: 1:13:51, time: 1.063, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0358, loss_cls: 0.1323, acc: 94.8362, loss_bbox: 0.1841, loss_mask: 0.2035, loss: 0.5661 2024-05-31 14:45:15,465 - mmdet - INFO - Epoch [12][3400/7330] lr: 1.000e-06, eta: 1:12:55, time: 1.141, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0359, loss_cls: 0.1360, acc: 94.7759, loss_bbox: 0.1898, loss_mask: 0.2052, loss: 0.5769 2024-05-31 14:46:09,727 - mmdet - INFO - Epoch [12][3450/7330] lr: 1.000e-06, eta: 1:11:59, time: 1.085, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0389, loss_cls: 0.1446, acc: 94.5029, loss_bbox: 0.1929, loss_mask: 0.2078, loss: 0.5947 2024-05-31 14:47:06,722 - mmdet - INFO - Epoch [12][3500/7330] lr: 1.000e-06, eta: 1:11:04, time: 1.140, data_time: 0.098, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0351, loss_cls: 0.1296, acc: 94.9661, loss_bbox: 0.1797, loss_mask: 0.2001, loss: 0.5549 2024-05-31 14:48:01,270 - mmdet - INFO - Epoch [12][3550/7330] lr: 1.000e-06, eta: 1:10:08, time: 1.091, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0379, loss_cls: 0.1436, acc: 94.5303, loss_bbox: 0.1943, loss_mask: 0.2066, loss: 0.5927 2024-05-31 14:48:56,470 - mmdet - INFO - Epoch [12][3600/7330] lr: 1.000e-06, eta: 1:09:12, time: 1.104, data_time: 0.087, memory: 24290, loss_rpn_cls: 0.0116, loss_rpn_bbox: 0.0375, loss_cls: 0.1393, acc: 94.6592, loss_bbox: 0.1932, loss_mask: 0.2036, loss: 0.5852 2024-05-31 14:49:52,438 - mmdet - INFO - Epoch [12][3650/7330] lr: 1.000e-06, eta: 1:08:17, time: 1.119, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0357, loss_cls: 0.1393, acc: 94.6208, loss_bbox: 0.1866, loss_mask: 0.2026, loss: 0.5749 2024-05-31 14:50:49,233 - mmdet - INFO - Epoch [12][3700/7330] lr: 1.000e-06, eta: 1:07:21, time: 1.136, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0346, loss_cls: 0.1405, acc: 94.6479, loss_bbox: 0.1909, loss_mask: 0.2017, loss: 0.5778 2024-05-31 14:51:44,012 - mmdet - INFO - Epoch [12][3750/7330] lr: 1.000e-06, eta: 1:06:25, time: 1.096, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 
0.0375, loss_cls: 0.1411, acc: 94.4985, loss_bbox: 0.1943, loss_mask: 0.2068, loss: 0.5910 2024-05-31 14:52:41,302 - mmdet - INFO - Epoch [12][3800/7330] lr: 1.000e-06, eta: 1:05:30, time: 1.146, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0388, loss_cls: 0.1375, acc: 94.6562, loss_bbox: 0.1910, loss_mask: 0.2077, loss: 0.5849 2024-05-31 14:53:43,652 - mmdet - INFO - Epoch [12][3850/7330] lr: 1.000e-06, eta: 1:04:34, time: 1.247, data_time: 0.081, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0385, loss_cls: 0.1489, acc: 94.2356, loss_bbox: 0.2009, loss_mask: 0.2111, loss: 0.6108 2024-05-31 14:54:38,010 - mmdet - INFO - Epoch [12][3900/7330] lr: 1.000e-06, eta: 1:03:39, time: 1.087, data_time: 0.071, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0363, loss_cls: 0.1419, acc: 94.5320, loss_bbox: 0.1926, loss_mask: 0.2040, loss: 0.5846 2024-05-31 14:55:34,425 - mmdet - INFO - Epoch [12][3950/7330] lr: 1.000e-06, eta: 1:02:43, time: 1.128, data_time: 0.065, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0349, loss_cls: 0.1368, acc: 94.6990, loss_bbox: 0.1879, loss_mask: 0.2043, loss: 0.5740 2024-05-31 14:56:29,031 - mmdet - INFO - Epoch [12][4000/7330] lr: 1.000e-06, eta: 1:01:47, time: 1.092, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0351, loss_cls: 0.1374, acc: 94.7153, loss_bbox: 0.1901, loss_mask: 0.2045, loss: 0.5772 2024-05-31 14:57:24,085 - mmdet - INFO - Epoch [12][4050/7330] lr: 1.000e-06, eta: 1:00:52, time: 1.101, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0395, loss_cls: 0.1477, acc: 94.2979, loss_bbox: 0.2017, loss_mask: 0.2056, loss: 0.6056 2024-05-31 14:58:21,027 - mmdet - INFO - Epoch [12][4100/7330] lr: 1.000e-06, eta: 0:59:56, time: 1.139, data_time: 0.097, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0368, loss_cls: 0.1381, acc: 94.7090, loss_bbox: 0.1887, loss_mask: 0.2016, loss: 0.5759 2024-05-31 14:59:14,797 - mmdet - INFO - Epoch [12][4150/7330] lr: 1.000e-06, eta: 0:59:00, time: 1.075, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0368, loss_cls: 0.1443, acc: 94.4258, loss_bbox: 0.1982, loss_mask: 0.2061, loss: 0.5967 2024-05-31 15:00:08,569 - mmdet - INFO - Epoch [12][4200/7330] lr: 1.000e-06, eta: 0:58:04, time: 1.075, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0094, loss_rpn_bbox: 0.0336, loss_cls: 0.1377, acc: 94.6987, loss_bbox: 0.1858, loss_mask: 0.2074, loss: 0.5738 2024-05-31 15:01:05,408 - mmdet - INFO - Epoch [12][4250/7330] lr: 1.000e-06, eta: 0:57:09, time: 1.137, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0102, loss_rpn_bbox: 0.0372, loss_cls: 0.1382, acc: 94.6843, loss_bbox: 0.1890, loss_mask: 0.2032, loss: 0.5777 2024-05-31 15:02:05,320 - mmdet - INFO - Epoch [12][4300/7330] lr: 1.000e-06, eta: 0:56:13, time: 1.198, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0379, loss_cls: 0.1419, acc: 94.5371, loss_bbox: 0.1968, loss_mask: 0.2108, loss: 0.5981 2024-05-31 15:02:59,887 - mmdet - INFO - Epoch [12][4350/7330] lr: 1.000e-06, eta: 0:55:18, time: 1.091, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0394, loss_cls: 0.1436, acc: 94.3975, loss_bbox: 0.2011, loss_mask: 0.2139, loss: 0.6081 2024-05-31 15:03:59,336 - mmdet - INFO - Epoch [12][4400/7330] lr: 1.000e-06, eta: 0:54:22, time: 1.189, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0364, loss_cls: 0.1371, acc: 94.7234, loss_bbox: 0.1896, loss_mask: 0.2074, loss: 0.5820 
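
Note on reading these entries: the logged `loss` in each record is the sum of the five Mask R-CNN terms (`loss_rpn_cls`, `loss_rpn_bbox`, `loss_cls`, `loss_bbox`, `loss_mask`); `acc` is the RoI-head classification accuracy, not a loss. For the Epoch [12][4400/7330] entry above: 0.0115 + 0.0364 + 0.1371 + 0.1896 + 0.2074 = 0.5820, matching the reported total. The snippet below is a minimal standalone sketch for spot-checking that relationship when reading a log like this one; it is not part of the training configuration, and the regex, helper name and 1e-3 rounding tolerance are assumptions for illustration only.

import re

# Hypothetical helper: parse one mmdetection-style log record and check that the
# printed total equals the sum of its loss components (up to printed rounding).
LOSS_KEYS = ["loss_rpn_cls", "loss_rpn_bbox", "loss_cls", "loss_bbox", "loss_mask"]

def check_loss_sum(entry: str, tol: float = 1e-3) -> bool:
    # Extract "key: value" pairs such as "loss_cls: 0.1371" or "acc: 94.7234".
    values = {k: float(v) for k, v in re.findall(r"(loss\w*|acc): ([0-9.]+)", entry)}
    total = values["loss"]
    parts = sum(values[k] for k in LOSS_KEYS)
    return abs(total - parts) <= tol

# Values taken verbatim from the Epoch [12][4400/7330] entry above.
line = ("loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0364, loss_cls: 0.1371, "
        "acc: 94.7234, loss_bbox: 0.1896, loss_mask: 0.2074, loss: 0.5820")
print(check_loss_sum(line))  # True: 0.0115+0.0364+0.1371+0.1896+0.2074 = 0.5820
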
2024-05-31 15:04:55,812 - mmdet - INFO - Epoch [12][4450/7330] lr: 1.000e-06, eta: 0:53:26, time: 1.130, data_time: 0.075, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0390, loss_cls: 0.1477, acc: 94.2622, loss_bbox: 0.1953, loss_mask: 0.2069, loss: 0.5997
2024-05-31 15:05:52,403 - mmdet - INFO - Epoch [12][4500/7330] lr: 1.000e-06, eta: 0:52:31, time: 1.132, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0373, loss_cls: 0.1446, acc: 94.3796, loss_bbox: 0.1993, loss_mask: 0.2071, loss: 0.5989
2024-05-31 15:06:46,349 - mmdet - INFO - Epoch [12][4550/7330] lr: 1.000e-06, eta: 0:51:35, time: 1.079, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0120, loss_rpn_bbox: 0.0395, loss_cls: 0.1449, acc: 94.4270, loss_bbox: 0.1976, loss_mask: 0.2093, loss: 0.6033
2024-05-31 15:07:40,403 - mmdet - INFO - Epoch [12][4600/7330] lr: 1.000e-06, eta: 0:50:39, time: 1.081, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0093, loss_rpn_bbox: 0.0371, loss_cls: 0.1377, acc: 94.7141, loss_bbox: 0.1908, loss_mask: 0.2050, loss: 0.5800
2024-05-31 15:08:34,472 - mmdet - INFO - Epoch [12][4650/7330] lr: 1.000e-06, eta: 0:49:44, time: 1.081, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0350, loss_cls: 0.1362, acc: 94.7603, loss_bbox: 0.1879, loss_mask: 0.2026, loss: 0.5723
2024-05-31 15:09:31,056 - mmdet - INFO - Epoch [12][4700/7330] lr: 1.000e-06, eta: 0:48:48, time: 1.132, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0096, loss_rpn_bbox: 0.0354, loss_cls: 0.1385, acc: 94.5830, loss_bbox: 0.1927, loss_mask: 0.2056, loss: 0.5817
2024-05-31 15:10:26,159 - mmdet - INFO - Epoch [12][4750/7330] lr: 1.000e-06, eta: 0:47:52, time: 1.102, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0381, loss_cls: 0.1439, acc: 94.4468, loss_bbox: 0.1955, loss_mask: 0.2095, loss: 0.5984
2024-05-31 15:11:22,874 - mmdet - INFO - Epoch [12][4800/7330] lr: 1.000e-06, eta: 0:46:57, time: 1.134, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0101, loss_rpn_bbox: 0.0347, loss_cls: 0.1371, acc: 94.6816, loss_bbox: 0.1877, loss_mask: 0.1994, loss: 0.5691
2024-05-31 15:12:19,991 - mmdet - INFO - Epoch [12][4850/7330] lr: 1.000e-06, eta: 0:46:01, time: 1.142, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0367, loss_cls: 0.1429, acc: 94.4399, loss_bbox: 0.1944, loss_mask: 0.2082, loss: 0.5927
2024-05-31 15:13:16,575 - mmdet - INFO - Epoch [12][4900/7330] lr: 1.000e-06, eta: 0:45:05, time: 1.132, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0335, loss_cls: 0.1321, acc: 94.9275, loss_bbox: 0.1811, loss_mask: 0.2017, loss: 0.5581
2024-05-31 15:14:13,958 - mmdet - INFO - Epoch [12][4950/7330] lr: 1.000e-06, eta: 0:44:10, time: 1.148, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0123, loss_rpn_bbox: 0.0386, loss_cls: 0.1492, acc: 94.2656, loss_bbox: 0.2032, loss_mask: 0.2092, loss: 0.6126
2024-05-31 15:15:09,403 - mmdet - INFO - Epoch [12][5000/7330] lr: 1.000e-06, eta: 0:43:14, time: 1.109, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0094, loss_rpn_bbox: 0.0368, loss_cls: 0.1368, acc: 94.6658, loss_bbox: 0.1894, loss_mask: 0.2020, loss: 0.5743
2024-05-31 15:16:08,541 - mmdet - INFO - Epoch [12][5050/7330] lr: 1.000e-06, eta: 0:42:18, time: 1.183, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0373, loss_cls: 0.1451, acc: 94.3591, loss_bbox: 0.1900, loss_mask: 0.2066, loss: 0.5897
2024-05-31 15:17:02,417 - mmdet - INFO - Epoch [12][5100/7330] lr: 1.000e-06, eta: 0:41:23, time: 1.077, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0340, loss_cls: 0.1338, acc: 94.8516, loss_bbox: 0.1867, loss_mask: 0.2039, loss: 0.5683
2024-05-31 15:17:56,384 - mmdet - INFO - Epoch [12][5150/7330] lr: 1.000e-06, eta: 0:40:27, time: 1.079, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0370, loss_cls: 0.1394, acc: 94.6375, loss_bbox: 0.1909, loss_mask: 0.2064, loss: 0.5836
2024-05-31 15:18:50,962 - mmdet - INFO - Epoch [12][5200/7330] lr: 1.000e-06, eta: 0:39:31, time: 1.091, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0383, loss_cls: 0.1446, acc: 94.4426, loss_bbox: 0.1930, loss_mask: 0.2085, loss: 0.5952
2024-05-31 15:19:45,467 - mmdet - INFO - Epoch [12][5250/7330] lr: 1.000e-06, eta: 0:38:36, time: 1.090, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0387, loss_cls: 0.1427, acc: 94.4966, loss_bbox: 0.1933, loss_mask: 0.2028, loss: 0.5879
2024-05-31 15:20:42,806 - mmdet - INFO - Epoch [12][5300/7330] lr: 1.000e-06, eta: 0:37:40, time: 1.147, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0405, loss_cls: 0.1443, acc: 94.4392, loss_bbox: 0.1988, loss_mask: 0.2154, loss: 0.6105
2024-05-31 15:21:37,422 - mmdet - INFO - Epoch [12][5350/7330] lr: 1.000e-06, eta: 0:36:44, time: 1.092, data_time: 0.093, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0364, loss_cls: 0.1383, acc: 94.6335, loss_bbox: 0.1900, loss_mask: 0.2028, loss: 0.5774
2024-05-31 15:22:31,488 - mmdet - INFO - Epoch [12][5400/7330] lr: 1.000e-06, eta: 0:35:48, time: 1.081, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0106, loss_rpn_bbox: 0.0369, loss_cls: 0.1452, acc: 94.4429, loss_bbox: 0.1988, loss_mask: 0.2098, loss: 0.6014
2024-05-31 15:23:28,203 - mmdet - INFO - Epoch [12][5450/7330] lr: 1.000e-06, eta: 0:34:53, time: 1.134, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0363, loss_cls: 0.1336, acc: 94.8218, loss_bbox: 0.1846, loss_mask: 0.2011, loss: 0.5653
2024-05-31 15:24:31,174 - mmdet - INFO - Epoch [12][5500/7330] lr: 1.000e-06, eta: 0:33:57, time: 1.259, data_time: 0.079, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0350, loss_cls: 0.1395, acc: 94.5955, loss_bbox: 0.1884, loss_mask: 0.2041, loss: 0.5775
2024-05-31 15:25:28,103 - mmdet - INFO - Epoch [12][5550/7330] lr: 1.000e-06, eta: 0:33:02, time: 1.139, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0091, loss_rpn_bbox: 0.0354, loss_cls: 0.1396, acc: 94.6094, loss_bbox: 0.1925, loss_mask: 0.2069, loss: 0.5834
2024-05-31 15:26:26,251 - mmdet - INFO - Epoch [12][5600/7330] lr: 1.000e-06, eta: 0:32:06, time: 1.163, data_time: 0.085, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0364, loss_cls: 0.1418, acc: 94.5303, loss_bbox: 0.1943, loss_mask: 0.2036, loss: 0.5864
2024-05-31 15:27:22,423 - mmdet - INFO - Epoch [12][5650/7330] lr: 1.000e-06, eta: 0:31:10, time: 1.123, data_time: 0.083, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0363, loss_cls: 0.1394, acc: 94.6741, loss_bbox: 0.1937, loss_mask: 0.2073, loss: 0.5870
2024-05-31 15:28:17,652 - mmdet - INFO - Epoch [12][5700/7330] lr: 1.000e-06, eta: 0:30:15, time: 1.105, data_time: 0.084, memory: 24290, loss_rpn_cls: 0.0108, loss_rpn_bbox: 0.0347, loss_cls: 0.1347, acc: 94.7722, loss_bbox: 0.1860, loss_mask: 0.2003, loss: 0.5664
2024-05-31 15:29:12,049 - mmdet - INFO - Epoch [12][5750/7330] lr: 1.000e-06, eta: 0:29:19, time: 1.088, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0371, loss_cls: 0.1446, acc: 94.4136, loss_bbox: 0.1943, loss_mask: 0.2055, loss: 0.5925
2024-05-31 15:30:06,561 - mmdet - INFO - Epoch [12][5800/7330] lr: 1.000e-06, eta: 0:28:23, time: 1.090, data_time: 0.080, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0358, loss_cls: 0.1362, acc: 94.8000, loss_bbox: 0.1872, loss_mask: 0.2018, loss: 0.5720
2024-05-31 15:31:01,545 - mmdet - INFO - Epoch [12][5850/7330] lr: 1.000e-06, eta: 0:27:28, time: 1.100, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0375, loss_cls: 0.1415, acc: 94.5854, loss_bbox: 0.1916, loss_mask: 0.2048, loss: 0.5861
2024-05-31 15:31:57,942 - mmdet - INFO - Epoch [12][5900/7330] lr: 1.000e-06, eta: 0:26:32, time: 1.128, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0351, loss_cls: 0.1366, acc: 94.6702, loss_bbox: 0.1880, loss_mask: 0.2031, loss: 0.5727
2024-05-31 15:32:52,312 - mmdet - INFO - Epoch [12][5950/7330] lr: 1.000e-06, eta: 0:25:36, time: 1.087, data_time: 0.082, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0355, loss_cls: 0.1449, acc: 94.5452, loss_bbox: 0.1961, loss_mask: 0.2074, loss: 0.5942
2024-05-31 15:33:49,334 - mmdet - INFO - Epoch [12][6000/7330] lr: 1.000e-06, eta: 0:24:41, time: 1.140, data_time: 0.077, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0339, loss_cls: 0.1326, acc: 94.8494, loss_bbox: 0.1837, loss_mask: 0.2015, loss: 0.5613
2024-05-31 15:34:45,680 - mmdet - INFO - Epoch [12][6050/7330] lr: 1.000e-06, eta: 0:23:45, time: 1.127, data_time: 0.070, memory: 24290, loss_rpn_cls: 0.0094, loss_rpn_bbox: 0.0361, loss_cls: 0.1345, acc: 94.7825, loss_bbox: 0.1871, loss_mask: 0.2027, loss: 0.5697
2024-05-31 15:35:46,496 - mmdet - INFO - Epoch [12][6100/7330] lr: 1.000e-06, eta: 0:22:49, time: 1.216, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0371, loss_cls: 0.1348, acc: 94.8164, loss_bbox: 0.1849, loss_mask: 0.2011, loss: 0.5683
2024-05-31 15:36:41,123 - mmdet - INFO - Epoch [12][6150/7330] lr: 1.000e-06, eta: 0:21:54, time: 1.093, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0111, loss_rpn_bbox: 0.0364, loss_cls: 0.1372, acc: 94.6797, loss_bbox: 0.1863, loss_mask: 0.2070, loss: 0.5780
2024-05-31 15:37:39,160 - mmdet - INFO - Epoch [12][6200/7330] lr: 1.000e-06, eta: 0:20:58, time: 1.161, data_time: 0.078, memory: 24290, loss_rpn_cls: 0.0099, loss_rpn_bbox: 0.0350, loss_cls: 0.1342, acc: 94.8125, loss_bbox: 0.1865, loss_mask: 0.2011, loss: 0.5667
2024-05-31 15:38:35,295 - mmdet - INFO - Epoch [12][6250/7330] lr: 1.000e-06, eta: 0:20:02, time: 1.123, data_time: 0.060, memory: 24290, loss_rpn_cls: 0.0113, loss_rpn_bbox: 0.0373, loss_cls: 0.1402, acc: 94.5474, loss_bbox: 0.1935, loss_mask: 0.2103, loss: 0.5926
2024-05-31 15:39:29,214 - mmdet - INFO - Epoch [12][6300/7330] lr: 1.000e-06, eta: 0:19:07, time: 1.078, data_time: 0.058, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0352, loss_cls: 0.1333, acc: 94.7869, loss_bbox: 0.1893, loss_mask: 0.2063, loss: 0.5740
2024-05-31 15:40:23,593 - mmdet - INFO - Epoch [12][6350/7330] lr: 1.000e-06, eta: 0:18:11, time: 1.088, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0115, loss_rpn_bbox: 0.0371, loss_cls: 0.1521, acc: 94.1135, loss_bbox: 0.2029, loss_mask: 0.2092, loss: 0.6128
2024-05-31 15:41:18,558 - mmdet - INFO - Epoch [12][6400/7330] lr: 1.000e-06, eta: 0:17:15, time: 1.099, data_time: 0.068, memory: 24290, loss_rpn_cls: 0.0110, loss_rpn_bbox: 0.0382, loss_cls: 0.1381, acc: 94.6145, loss_bbox: 0.1922, loss_mask: 0.2043, loss: 0.5838
2024-05-31 15:42:13,421 - mmdet - INFO - Epoch [12][6450/7330] lr: 1.000e-06, eta: 0:16:19, time: 1.097, data_time: 0.089, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0375, loss_cls: 0.1389, acc: 94.6509, loss_bbox: 0.1925, loss_mask: 0.2055, loss: 0.5853
2024-05-31 15:43:09,267 - mmdet - INFO - Epoch [12][6500/7330] lr: 1.000e-06, eta: 0:15:24, time: 1.117, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0375, loss_cls: 0.1385, acc: 94.6230, loss_bbox: 0.1915, loss_mask: 0.2072, loss: 0.5861
2024-05-31 15:44:05,734 - mmdet - INFO - Epoch [12][6550/7330] lr: 1.000e-06, eta: 0:14:28, time: 1.130, data_time: 0.069, memory: 24290, loss_rpn_cls: 0.0105, loss_rpn_bbox: 0.0372, loss_cls: 0.1421, acc: 94.5322, loss_bbox: 0.1912, loss_mask: 0.2054, loss: 0.5864
2024-05-31 15:44:59,549 - mmdet - INFO - Epoch [12][6600/7330] lr: 1.000e-06, eta: 0:13:32, time: 1.076, data_time: 0.061, memory: 24290, loss_rpn_cls: 0.0098, loss_rpn_bbox: 0.0358, loss_cls: 0.1323, acc: 94.8689, loss_bbox: 0.1867, loss_mask: 0.2059, loss: 0.5706
2024-05-31 15:45:58,160 - mmdet - INFO - Epoch [12][6650/7330] lr: 1.000e-06, eta: 0:12:37, time: 1.172, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0114, loss_rpn_bbox: 0.0359, loss_cls: 0.1376, acc: 94.7246, loss_bbox: 0.1912, loss_mask: 0.2057, loss: 0.5819
2024-05-31 15:46:56,769 - mmdet - INFO - Epoch [12][6700/7330] lr: 1.000e-06, eta: 0:11:41, time: 1.172, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0365, loss_cls: 0.1396, acc: 94.6414, loss_bbox: 0.1900, loss_mask: 0.2061, loss: 0.5825
2024-05-31 15:47:51,141 - mmdet - INFO - Epoch [12][6750/7330] lr: 1.000e-06, eta: 0:10:45, time: 1.087, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0095, loss_rpn_bbox: 0.0359, loss_cls: 0.1356, acc: 94.7144, loss_bbox: 0.1916, loss_mask: 0.2082, loss: 0.5808
2024-05-31 15:48:50,524 - mmdet - INFO - Epoch [12][6800/7330] lr: 1.000e-06, eta: 0:09:50, time: 1.187, data_time: 0.072, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0355, loss_cls: 0.1391, acc: 94.6091, loss_bbox: 0.1883, loss_mask: 0.2032, loss: 0.5758
2024-05-31 15:49:44,501 - mmdet - INFO - Epoch [12][6850/7330] lr: 1.000e-06, eta: 0:08:54, time: 1.080, data_time: 0.066, memory: 24290, loss_rpn_cls: 0.0112, loss_rpn_bbox: 0.0366, loss_cls: 0.1462, acc: 94.3562, loss_bbox: 0.1997, loss_mask: 0.2125, loss: 0.6061
2024-05-31 15:50:38,045 - mmdet - INFO - Epoch [12][6900/7330] lr: 1.000e-06, eta: 0:07:58, time: 1.071, data_time: 0.074, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0346, loss_cls: 0.1279, acc: 95.0391, loss_bbox: 0.1814, loss_mask: 0.1966, loss: 0.5509
2024-05-31 15:51:31,505 - mmdet - INFO - Epoch [12][6950/7330] lr: 1.000e-06, eta: 0:07:03, time: 1.069, data_time: 0.067, memory: 24290, loss_rpn_cls: 0.0092, loss_rpn_bbox: 0.0332, loss_cls: 0.1337, acc: 94.8037, loss_bbox: 0.1800, loss_mask: 0.1999, loss: 0.5559
2024-05-31 15:52:25,707 - mmdet - INFO - Epoch [12][7000/7330] lr: 1.000e-06, eta: 0:06:07, time: 1.084, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0107, loss_rpn_bbox: 0.0359, loss_cls: 0.1399, acc: 94.5535, loss_bbox: 0.1917, loss_mask: 0.2066, loss: 0.5847
2024-05-31 15:53:23,135 - mmdet - INFO - Epoch [12][7050/7330] lr: 1.000e-06, eta: 0:05:11, time: 1.149, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0379, loss_cls: 0.1393, acc: 94.5906, loss_bbox: 0.1902, loss_mask: 0.1991, loss: 0.5766
2024-05-31 15:54:19,602 - mmdet - INFO - Epoch [12][7100/7330] lr: 1.000e-06, eta: 0:04:16, time: 1.129, data_time: 0.062, memory: 24290, loss_rpn_cls: 0.0097, loss_rpn_bbox: 0.0345, loss_cls: 0.1320, acc: 94.9119, loss_bbox: 0.1838, loss_mask: 0.2020, loss: 0.5620
2024-05-31 15:55:13,725 - mmdet - INFO - Epoch [12][7150/7330] lr: 1.000e-06, eta: 0:03:20, time: 1.082, data_time: 0.063, memory: 24290, loss_rpn_cls: 0.0100, loss_rpn_bbox: 0.0367, loss_cls: 0.1394, acc: 94.6384, loss_bbox: 0.1897, loss_mask: 0.2041, loss: 0.5799
2024-05-31 15:56:10,212 - mmdet - INFO - Epoch [12][7200/7330] lr: 1.000e-06, eta: 0:02:24, time: 1.130, data_time: 0.076, memory: 24290, loss_rpn_cls: 0.0109, loss_rpn_bbox: 0.0367, loss_cls: 0.1402, acc: 94.5732, loss_bbox: 0.1917, loss_mask: 0.2055, loss: 0.5849
2024-05-31 15:57:06,649 - mmdet - INFO - Epoch [12][7250/7330] lr: 1.000e-06, eta: 0:01:29, time: 1.129, data_time: 0.064, memory: 24290, loss_rpn_cls: 0.0103, loss_rpn_bbox: 0.0361, loss_cls: 0.1376, acc: 94.7373, loss_bbox: 0.1881, loss_mask: 0.2023, loss: 0.5743
2024-05-31 15:58:06,005 - mmdet - INFO - Epoch [12][7300/7330] lr: 1.000e-06, eta: 0:00:33, time: 1.187, data_time: 0.073, memory: 24290, loss_rpn_cls: 0.0104, loss_rpn_bbox: 0.0378, loss_cls: 0.1425, acc: 94.4238, loss_bbox: 0.1984, loss_mask: 0.2070, loss: 0.5961
2024-05-31 15:58:39,487 - mmdet - INFO - Saving checkpoint at 12 epochs
2024-05-31 16:01:00,001 - mmdet - INFO - Evaluating bbox...
2024-05-31 16:01:22,755 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.494
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.722
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.542
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.329
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.529
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.659
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.606
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.427
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.647
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.767
2024-05-31 16:01:22,756 - mmdet - INFO - Evaluating segm...
2024-05-31 16:01:48,477 - mmdet - INFO -
 Average Precision (AP) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.441
 Average Precision (AP) @[ IoU=0.50 | area= all | maxDets=1000 ] = 0.689
 Average Precision (AP) @[ IoU=0.75 | area= all | maxDets=1000 ] = 0.473
 Average Precision (AP) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.235
 Average Precision (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.474
 Average Precision (AP) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.648
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=100 ] = 0.546
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=300 ] = 0.546
 Average Recall (AR) @[ IoU=0.50:0.95 | area= all | maxDets=1000 ] = 0.546
 Average Recall (AR) @[ IoU=0.50:0.95 | area= small | maxDets=1000 ] = 0.360
 Average Recall (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=1000 ] = 0.590
 Average Recall (AR) @[ IoU=0.50:0.95 | area= large | maxDets=1000 ] = 0.723
2024-05-31 16:01:48,792 - mmdet - INFO - Exp name: mask_rcnn_deit_tsbl_1792_1568_1120_448_fpn_1x_coco_bs16.py
2024-05-31 16:01:48,795 - mmdet - INFO - Epoch(val) [12][625] bbox_mAP: 0.4940, bbox_mAP_50: 0.7220, bbox_mAP_75: 0.5420, bbox_mAP_s: 0.3290, bbox_mAP_m: 0.5290, bbox_mAP_l: 0.6590, bbox_mAP_copypaste: 0.494 0.722 0.542 0.329 0.529 0.659, segm_mAP: 0.4410, segm_mAP_50: 0.6890, segm_mAP_75: 0.4730, segm_mAP_s: 0.2350, segm_mAP_m: 0.4740, segm_mAP_l: 0.6480, segm_mAP_copypaste: 0.441 0.689 0.473 0.235 0.474 0.648
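
Note: the per-iteration entries and the final Epoch(val) summary above use a fixed "key: value" layout, so the numbers can be recovered from the raw log with a short script. The Python sketch below is illustrative only; the log path train.log, the parse_log helper and the regexes are assumptions written against the line formats shown above, not something shipped with MMDetection.

import re

# Hypothetical path to a raw training log in the format shown above.
LOG_PATH = "train.log"

# Matches per-iteration lines such as:
#   ... Epoch [12][5000/7330] lr: 1.000e-06, ..., loss: 0.5743
# The total "loss: " is the only occurrence of that exact key on the line,
# so the non-greedy match picks it up directly.
ITER_RE = re.compile(r"Epoch \[(\d+)\]\[(\d+)/(\d+)\].*?loss: ([0-9.]+)")

# Matches the final validation summary, e.g.
#   ... Epoch(val) [12][625] bbox_mAP: 0.4940, ..., segm_mAP: 0.4410, ...
VAL_RE = re.compile(r"Epoch\(val\) \[(\d+)\]\[\d+\] (.*)")

def parse_log(path=LOG_PATH):
    """Return (per-iteration total losses, validation metrics) from the log."""
    losses, val_metrics = [], {}
    with open(path) as f:
        for line in f:
            m = ITER_RE.search(line)
            if m:
                epoch, it, total_iters, loss = m.groups()
                losses.append((int(epoch), int(it), float(loss)))
                continue
            m = VAL_RE.search(line)
            if m:
                # Split "bbox_mAP: 0.4940, bbox_mAP_50: 0.7220, ..." into a dict,
                # skipping the space-separated *_copypaste fields.
                for part in m.group(2).split(", "):
                    key, _, value = part.partition(": ")
                    if key.endswith("_copypaste"):
                        continue
                    try:
                        val_metrics[key] = float(value)
                    except ValueError:
                        pass
    return losses, val_metrics

if __name__ == "__main__":
    losses, val_metrics = parse_log()
    # Run against the epoch-12 log above, this should report bbox_mAP 0.494 and segm_mAP 0.441.
    print(len(losses), "iteration entries parsed")
    print("bbox_mAP:", val_metrics.get("bbox_mAP"), "segm_mAP:", val_metrics.get("segm_mAP"))

As a sanity check on the format, the logged total loss is simply the sum of the five component losses: for Epoch [12][7300/7330], 0.0104 + 0.0378 + 0.1425 + 0.1984 + 0.2070 = 0.5961, which matches the logged value (agreement is up to rounding of the individual terms, since each component is printed to four decimals).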